Sample records for parameterized entropy analysis

  1. Using the Maximum Entropy Principle as a Unifying Theory: Characterization and Sampling of Multi-Scaling Processes in Hydrometeorology

    DTIC Science & Technology

    2015-08-20

    evapotranspiration (ET) over oceans may be significantly lower than previously thought. The MEP model parameterized turbulent transfer coefficients...fluxes, ocean freshwater fluxes, regional crop yield among others. An on-going study suggests that the global annual evapotranspiration (ET) over...Bras, Jingfeng Wang. A model of evapotranspiration based on the theory of maximum entropy production, Water Resources Research, (03 2011): 0. doi

  2. QCD equation of state at nonzero chemical potential: continuum results with physical quark masses at order μ²

    NASA Astrophysics Data System (ADS)

    Borsányi, Sz.; Endrődi, G.; Fodor, Z.; Katz, S. D.; Krieg, S.; Ratti, C.; Szabó, K. K.

    2012-08-01

    We determine the equation of state of QCD for nonzero chemical potentials via a Taylor expansion of the pressure. The results are obtained for N f = 2 + 1 flavors of quarks with physical masses, on various lattice spacings. We present results for the pressure, interaction measure, energy density, entropy density, and the speed of sound for small chemical potentials. At low temperatures we compare our results with the Hadron Resonance Gas model. We also express our observables along trajectories of constant entropy over particle number. A simple parameterization is given (the Matlab/Octave script parameterization.m, submitted to the arXiv along with the paper), which can be used to reconstruct the observables as functions of T and μ, or as functions of T and S/N.
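    The reconstruction described here can be sketched generically: to order μ², the pressure is a two-term Taylor series, p/T⁴ = c₀(T) + c₂(T)(μ_B/T)². The coefficient curves below are smooth placeholders, not the lattice fits (those live in the authors' parameterization.m script):

```python
import math

def pressure_over_T4(T, mu_B, c0, c2):
    """Reconstruct p/T^4 at small chemical potential from a Taylor
    expansion truncated at order mu^2:
        p/T^4 = c0(T) + c2(T) * (mu_B / T)**2
    c0 and c2 are callables returning the Taylor coefficients at
    temperature T (placeholders here, not the lattice results)."""
    return c0(T) + c2(T) * (mu_B / T) ** 2

# Toy, smoothly rising coefficient curves -- NOT the lattice fits.
c0 = lambda T: 4.0 * (1.0 - math.exp(-T / 200.0))
c2 = lambda T: 0.1 * (1.0 - math.exp(-T / 200.0))

p0 = pressure_over_T4(250.0, 0.0, c0, c2)    # mu = 0 limit
p1 = pressure_over_T4(250.0, 100.0, c0, c2)  # small positive mu correction
```

    At μ_B = 0 the expansion reduces to c₀(T), and a small chemical potential raises the pressure through the quadratic term, mirroring how the published script rebuilds observables as functions of T and μ.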

  3. Analysis of watershed topography effects on summer precipitation variability in the southwestern United States

    NASA Astrophysics Data System (ADS)

    Sohoulande Djebou, Dagbegnon C.; Singh, Vijay P.; Frauenfeld, Oliver W.

    2014-04-01

    With climate change, precipitation variability is projected to increase. The present study investigates the potential interactions between watershed characteristics and precipitation variability. The watershed is considered as a functional unit that may impact seasonal precipitation. The study uses historical precipitation data from 370 meteorological stations over the last five decades, and digital elevation data from regional watersheds in the southwestern United States. This domain is part of the North American Monsoon region, and the summer period (June-July-August, JJA) was considered. Based on an initial analysis for 1895-2011, the JJA precipitation accounts, on average, for 22-43% of the total annual precipitation, with higher percentages in the arid part of the region. The unique contribution of this research is that entropy theory is used to address precipitation variability in time and space. An entropy-based disorder index was computed for each station's precipitation record. The JJA total precipitation and number of precipitation events were considered in the analysis. The precipitation variability potentially induced by watershed topography was investigated using spatial regionalization combining principal component and cluster analysis. It was found that the disorder in precipitation total and number of events tended to be higher in arid regions. The spatial pattern showed that the entropy-based variability in precipitation amount and number of events gradually increased from east to west in the southwestern United States. Regarding the watershed topography influence on summer precipitation patterns, hilly relief has a stabilizing effect on seasonal precipitation variability in time and space. The results show the necessity to include watershed topography in global and regional climate model parameterizations.
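    An entropy-based disorder index of the kind described can be sketched as follows; this is a generic construction (normalized Shannon entropy of the monthly apportionment of seasonal rainfall), not necessarily the authors' exact index:

```python
import math

def disorder_index(monthly_totals):
    """Entropy-based disorder index for a precipitation record:
    apportion the seasonal total across months, take the Shannon
    entropy of that apportionment, and scale by the maximum entropy
    log(n), so 0 = evenly spread rainfall and 1 = all rain in one
    month.  A generic construction for illustration only."""
    total = sum(monthly_totals)
    if total == 0:
        return 0.0
    probs = [m / total for m in monthly_totals]
    H = -sum(p * math.log(p) for p in probs if p > 0)
    return 1.0 - H / math.log(len(monthly_totals))

even = disorder_index([30.0, 30.0, 30.0])   # JJA rain evenly spread -> 0.0
spiky = disorder_index([90.0, 0.0, 0.0])    # all rain in one month -> 1.0
```

    Stations in arid regions, where the abstract reports higher disorder, would show index values toward the spiky end of this scale.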

  4. Coarse-grained models using local-density potentials optimized with the relative entropy: Application to implicit solvation

    NASA Astrophysics Data System (ADS)

    Sanyal, Tanmoy; Shell, M. Scott

    2016-07-01

    Bottom-up multiscale techniques are frequently used to develop coarse-grained (CG) models for simulations at extended length and time scales but are often limited by a compromise between computational efficiency and accuracy. The conventional approach to CG nonbonded interactions uses pair potentials which, while computationally efficient, can neglect the inherently multibody contributions of the local environment of a site to its energy, due to degrees of freedom that were coarse-grained out. This effect often causes the CG potential to depend strongly on the overall system density, composition, or other properties, which limits its transferability to states other than the one at which it was parameterized. Here, we propose to incorporate multibody effects into CG potentials through additional nonbonded terms, beyond pair interactions, that depend in a mean-field manner on local densities of different atomic species. This approach is analogous to embedded atom and bond-order models that seek to capture multibody electronic effects in metallic systems. We show that the relative entropy coarse-graining framework offers a systematic route to parameterizing such local density potentials. We then characterize this approach in the development of implicit solvation strategies for interactions between model hydrophobes in an aqueous environment.
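    The local-density idea can be sketched in a few lines: each CG site accumulates a smoothed count of its neighbors and contributes an extra energy term through a function of that count. The indicator function and the quadratic potential below are illustrative stand-ins; the actual functional forms are fitted by the relative-entropy optimization described in the abstract:

```python
import math

def indicator(r, r_cut=1.5, width=0.25):
    """Smooth cutoff used to accumulate local densities (a common
    choice of form; the paper's exact indicator may differ)."""
    return 1.0 / (1.0 + math.exp((r - r_cut) / width))

def local_density_energy(positions, f):
    """Mean-field local-density contribution to the CG energy: site i
    accumulates rho_i = sum_j indicator(r_ij) over all other sites,
    then contributes f(rho_i), where f is the parameterized
    local-density potential."""
    energy = 0.0
    for i, pi in enumerate(positions):
        rho = 0.0
        for j, pj in enumerate(positions):
            if i != j:
                rho += indicator(math.dist(pi, pj))
        energy += f(rho)
    return energy

f = lambda rho: 0.5 * (rho - 2.0) ** 2   # hypothetical fitted potential
pts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
E = local_density_energy(pts, f)
```

    Because f depends on the local density rather than on pairs alone, the energy of a site responds to its whole environment, which is the multibody effect the abstract argues pair potentials miss.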

  5. Exploiting Acoustic and Syntactic Features for Automatic Prosody Labeling in a Maximum Entropy Framework

    PubMed Central

    Sridhar, Vivek Kumar Rangarajan; Bangalore, Srinivas; Narayanan, Shrikanth S.

    2009-01-01

    In this paper, we describe a maximum entropy-based automatic prosody labeling framework that exploits both language and speech information. We apply the proposed framework to both prominence and phrase structure detection within the Tones and Break Indices (ToBI) annotation scheme. Our framework utilizes novel syntactic features in the form of supertags and a quantized acoustic–prosodic feature representation that is similar to linear parameterizations of the prosodic contour. The proposed model is trained discriminatively and is robust in the selection of appropriate features for the task of prosody detection. The proposed maximum entropy acoustic–syntactic model achieves pitch accent and boundary tone detection accuracies of 86.0% and 93.1% on the Boston University Radio News corpus, and 79.8% and 90.3% on the Boston Directions corpus. The phrase structure detection through prosodic break index labeling provides accuracies of 84% and 87% on the two corpora, respectively. The reported results are significantly better than previously reported results and demonstrate the strength of the maximum entropy model in jointly modeling simple lexical, syntactic, and acoustic features for automatic prosody labeling. PMID:19603083

  6. Entropy-Based Search Algorithm for Experimental Design

    NASA Astrophysics Data System (ADS)

    Malakar, N. K.; Knuth, K. H.

    2011-03-01

    The scientific method relies on the iterated processes of inference and inquiry. The inference phase consists of selecting the most probable models based on the available data; whereas the inquiry phase consists of using what is known about the models to select the most relevant experiment. Optimizing inquiry involves searching the parameterized space of experiments to select the experiment that promises, on average, to be maximally informative. In the case where it is important to learn about each of the model parameters, the relevance of an experiment is quantified by the Shannon entropy of the distribution of experimental outcomes predicted by a probable set of models. If the set of potential experiments is described by many parameters, we must search this high-dimensional entropy space. Brute force search methods will be slow and computationally expensive. We present an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment for efficient experimental design. This algorithm is inspired by Skilling's nested sampling algorithm used in inference and borrows the concept of a rising threshold applied while a set of experiment samples is maintained. We demonstrate that this algorithm not only selects highly relevant experiments, but also is more efficient than brute force search. Such entropic search techniques promise to greatly benefit autonomous experimental design.
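    The brute-force baseline the algorithm improves on can be sketched directly: score every candidate experiment by the Shannon entropy of its predicted outcome distribution and pick the maximum. (Nested entropy sampling replaces this exhaustive scan with a maintained sample set and a rising threshold; the candidate list below is invented for illustration.)

```python
import math

def shannon_entropy(probs):
    """Shannon entropy of a discrete outcome distribution (nats)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def most_informative_experiment(experiments):
    """Brute-force experimental design: the relevance of a candidate
    experiment is the entropy of its predicted outcome distribution;
    return the candidate whose outcomes are most uncertain."""
    return max(experiments, key=lambda e: shannon_entropy(e["outcome_probs"]))

candidates = [
    {"name": "A", "outcome_probs": [0.95, 0.05]},   # nearly certain: uninformative
    {"name": "B", "outcome_probs": [0.5, 0.5]},     # binary coin flip
    {"name": "C", "outcome_probs": [0.7, 0.2, 0.1]},
]
best = most_informative_experiment(candidates)
```

    Experiment C wins here because spreading probability over three outcomes carries more entropy than an even two-way split; the near-certain experiment A, whose result is already predictable, scores lowest.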

  7. Is Water at the Graphite Interface Vapor-like or Ice-like?

    PubMed

    Qiu, Yuqing; Lupi, Laura; Molinero, Valeria

    2018-04-05

    Graphitic surfaces are the main component of soot, a major constituent of atmospheric aerosols. Experiments indicate that soots of different origins display a wide range of abilities to heterogeneously nucleate ice. The ability of pure graphite to nucleate ice in experiments, however, seems to be almost negligible. Nevertheless, molecular simulations with the monatomic water model mW with water-carbon interactions parameterized to reproduce the experimental contact angle of water on graphite predict that pure graphite nucleates ice. According to classical nucleation theory, the ability of a surface to nucleate ice is controlled by the binding free energy between ice immersed in liquid water and the surface. To establish whether the discrepancy in freezing efficiencies of graphite in mW simulations and experiments arises from the coarse resolution of the model or can be fixed by reparameterization, it is important to elucidate the contributions of the water-graphite, water-ice, and ice-water interfaces to the free energy, enthalpy, and entropy of binding for both water and the model. Here we use thermodynamic analysis and free energy calculations to determine these interfacial properties. We demonstrate that liquid water at the graphite interface is not ice-like or vapor-like: its free energy, entropy, and enthalpy are similar to those of bulk water. The thermodynamics of the water-graphite interface is well reproduced by the mW model. We find that the entropy of binding between graphite and ice is positive and dominated, in both experiments and simulations, by the favorable entropy of reducing the ice-water interface. Our analysis indicates that the discrepancy in freezing efficiencies of graphite in experiments and the simulations with mW arises from the inability of the model to simultaneously reproduce the contact angle of liquid water on graphite and the free energy of the ice-graphite interface.
This transferability issue is intrinsic to the resolution of the model, and arises from its lack of rotational degrees of freedom.

  8. Coarse-grained models using local-density potentials optimized with the relative entropy: Application to implicit solvation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanyal, Tanmoy; Shell, M. Scott, E-mail: shell@engineering.ucsb.edu

    Bottom-up multiscale techniques are frequently used to develop coarse-grained (CG) models for simulations at extended length and time scales but are often limited by a compromise between computational efficiency and accuracy. The conventional approach to CG nonbonded interactions uses pair potentials which, while computationally efficient, can neglect the inherently multibody contributions of the local environment of a site to its energy, due to degrees of freedom that were coarse-grained out. This effect often causes the CG potential to depend strongly on the overall system density, composition, or other properties, which limits its transferability to states other than the one at which it was parameterized. Here, we propose to incorporate multibody effects into CG potentials through additional nonbonded terms, beyond pair interactions, that depend in a mean-field manner on local densities of different atomic species. This approach is analogous to embedded atom and bond-order models that seek to capture multibody electronic effects in metallic systems. We show that the relative entropy coarse-graining framework offers a systematic route to parameterizing such local density potentials. We then characterize this approach in the development of implicit solvation strategies for interactions between model hydrophobes in an aqueous environment.

  9. CALIBRATION OF THE MIXING-LENGTH THEORY FOR CONVECTIVE WHITE DWARF ENVELOPES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tremblay, P.-E.; Ludwig, H.-G.; Freytag, B.

    2015-02-01

    A calibration of the mixing-length parameter in the local mixing-length theory (MLT) is presented for the lower part of the convection zone in pure-hydrogen-atmosphere white dwarfs. The parameterization is performed from a comparison of three-dimensional (3D) CO5BOLD simulations with a grid of one-dimensional (1D) envelopes with a varying mixing-length parameter. In many instances, the 3D simulations are restricted to the upper part of the convection zone. The hydrodynamical calculations suggest, in those cases, that the entropy of the upflows does not change significantly from the bottom of the convection zone to regions immediately below the photosphere. We rely on this asymptotic entropy value, characteristic of the deep and adiabatically stratified layers, to calibrate 1D envelopes. The calibration encompasses the convective hydrogen-line (DA) white dwarfs in the effective temperature range 6000 ≤ T_eff (K) ≤ 15,000 and the surface gravity range 7.0 ≤ log g ≤ 9.0. It is established that the local MLT is unable to reproduce simultaneously the thermodynamical, flux, and dynamical properties of the 3D simulations. We therefore propose three different parameterizations for these quantities. The resulting calibration can be applied to structure and envelope calculations, in particular for pulsation, chemical diffusion, and convective mixing studies. On the other hand, convection has no effect on the white dwarf cooling rates until there is a convective coupling with the degenerate core below T_eff ∼ 5000 K. In this regime, the 1D structures are insensitive to the MLT parameterization and converge to the mean 3D results, hence they remain fully appropriate for age determinations.
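    The entropy-matching step of such a calibration can be sketched as a one-dimensional root find: adjust the mixing-length parameter until the 1D envelope's asymptotic entropy matches the 3D target. The `entropy_1d` relation below is a made-up monotone stand-in for a full envelope integration, used only to show the matching logic:

```python
def calibrate_mixing_length(entropy_1d, s_target, lo=0.1, hi=3.0, tol=1e-8):
    """Bisect on the mixing-length parameter alpha until the 1D
    envelope's asymptotic entropy matches the 3D target value.
    entropy_1d(alpha) is assumed to decrease monotonically with alpha
    (more efficient convection -> smaller entropy jump)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if entropy_1d(mid) > s_target:
            lo = mid          # entropy still too high -> need larger alpha
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical monotone entropy(alpha) relation, not a real envelope code.
entropy_1d = lambda alpha: 10.0 / alpha
alpha = calibrate_mixing_length(entropy_1d, s_target=8.0)   # converges to 1.25
```

    A real calibration would replace `entropy_1d` with a 1D envelope integration and repeat this match across the (T_eff, log g) grid.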

  10. Actual and Idealized Crystal Field Parameterizations for the Uranium Ions in UF₄

    NASA Astrophysics Data System (ADS)

    Gajek, Z.; Mulak, J.; Krupa, J. C.

    1993-12-01

    The crystal field parameters for the actual coordination symmetries of the uranium ions in UF₄, C2 and C1, and for their idealizations to D2, C2v, D4, D4d, and the Archimedean antiprism point symmetries are given. They have been calculated by means of both the perturbative ab initio model and the angular overlap model and are referenced to the recent results fitted by Carnall's group. The equivalency of some different sets of parameters has been verified with the standardization procedure. The adequacy of several idealized approaches has been tested by comparison of the corresponding splitting patterns of the ³H₄ ground state. Our results support the parameterization given by Carnall. Furthermore, the parameterization of the crystal field potential and the splitting diagram for the symmetryless uranium ion U(C1) are given. Having at our disposal the crystal field splittings for the two kinds of uranium ions in UF₄, U(C2) and U(C1), we calculate the model plots of the paramagnetic susceptibility χ(T) and the magnetic entropy associated with the Schottky anomaly ΔS(T) for UF₄.
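    The Schottky-anomaly entropy ΔS(T) mentioned at the end follows from any level scheme via the canonical partition function, S = k_B ln Z + ⟨E⟩/T. A minimal sketch with hypothetical levels (not the actual UF₄ splittings):

```python
import math

K_B = 0.695  # Boltzmann constant in cm^-1 / K, convenient for CF levels

def schottky_entropy(levels_cm, T):
    """Magnetic entropy (in cm^-1 / K) of a set of crystal-field levels,
    given as (energy in cm^-1, degeneracy) pairs, from the canonical
    partition function:  S = k_B ln Z + <E> / T."""
    beta = 1.0 / (K_B * T)
    Z = sum(g * math.exp(-beta * E) for E, g in levels_cm)
    E_mean = sum(g * E * math.exp(-beta * E) for E, g in levels_cm) / Z
    return K_B * math.log(Z) + E_mean / T

levels = [(0.0, 1), (50.0, 1), (120.0, 2)]   # hypothetical splitting pattern
S_low = schottky_entropy(levels, 2.0)        # frozen into the ground level
S_high = schottky_entropy(levels, 300.0)     # approaches k_B * ln(total degeneracy)
```

    The entropy rises from near zero at low temperature toward k_B ln(4) for this four-state example, which is the Schottky step whose shape the fitted splitting diagrams determine.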

  11. Minimal entropy probability paths between genome families.

    PubMed

    Ahlbrandt, Calvin; Benson, Gary; Casey, William

    2004-05-01

    We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical solution, which approximates the true solution in a set of cases called rich paths, where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors; in the case of DNA, N is 4 and the components of the probability vector are the frequencies of occurrence of each of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function as the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0) = a and p(1) = b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method: iterating Newton's method on solutions of a two-point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem, together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided, which works only for "rich" optimal probability vectors.
These methods motivate a definition of an elementary distance function which is easier and faster to calculate, works on non-rich vectors, involves neither variational theory nor differential equations, and is a better approximation of the minimal entropy path distance than the distance ‖b − a‖₂. We compute minimal entropy distance matrices for examples of DNA myostatin genes and amino-acid sequences across several species. Output tree dendrograms for our minimal entropy metric are compared with dendrograms based on BLAST and BLAST identity scores.
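    The functional being minimized can be illustrated without the variational machinery: integrate H(p) along the straight-line path between two probability vectors, weighted by the path's (constant) speed. Because the true metric takes the infimum over all paths, this quadrature only bounds it from above; it is a sketch of the objective, not the authors' boundary-value solver:

```python
import math

def shannon_entropy(p):
    """Shannon entropy of a probability vector (nats)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def straight_line_entropy_cost(a, b, n=1000):
    """Upper bound on the minimal-entropy path distance between
    probability vectors a and b: midpoint-rule integral of H(p(t))
    along the straight-line path p(t) = (1-t) a + t b, times the
    constant speed |b - a|."""
    speed = math.sqrt(sum((bi - ai) ** 2 for ai, bi in zip(a, b)))
    total = 0.0
    for k in range(n):
        t = (k + 0.5) / n
        p = [(1 - t) * ai + t * bi for ai, bi in zip(a, b)]
        total += shannon_entropy(p) / n
    return total * speed

a = [0.7, 0.1, 0.1, 0.1]        # base frequencies (A, C, G, T)
b = [0.25, 0.25, 0.25, 0.25]
d = straight_line_entropy_cost(a, b)
```

    The optimal path would bend to pass through lower-entropy regions of the simplex, so the number returned here is strictly an overestimate of the metric distance.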

  12. Modification and optimization of the united-residue (UNRES) potential-energy function for canonical simulations. I. Temperature dependence of the effective energy function and tests of the optimization method with single training proteins

    PubMed Central

    Liwo, Adam; Khalili, Mey; Czaplewski, Cezary; Kalinowski, Sebastian; Ołdziej, Stanisław; Wachucik, Katarzyna; Scheraga, Harold A.

    2011-01-01

    We report the modification and parameterization of the united-residue (UNRES) force field for energy-based protein-structure prediction and protein-folding simulations. We tested the approach on three training proteins separately: 1E0L (β), 1GAB (α), and 1E0G (α + β). Heretofore, the UNRES force field had been designed and parameterized to locate native-like structures of proteins as global minima of their effective potential-energy surfaces, which largely neglected the conformational entropy because decoys composed of only lowest-energy conformations were used to optimize the force field. Recently, we developed a mesoscopic dynamics procedure for UNRES, and applied it with success to simulate protein folding pathways. However, the force field turned out to be largely biased towards α-helical structures in canonical simulations because the conformational entropy had been neglected in the parameterization. We applied the hierarchical optimization method developed in our earlier work to optimize the force field, in which the conformational space of a training protein is divided into levels each corresponding to a certain degree of native-likeness. The levels are ordered according to increasing native-likeness; level 0 corresponds to structures with no native-like elements and the highest level corresponds to the fully native-like structures. The aim of optimization is to achieve the order of the free energies of levels, decreasing as their native-likeness increases. The procedure is iterative, and decoys of the training protein(s) generated with the energy-function parameters of the preceding iteration are used to optimize the force field in a current iteration. We applied the multiplexing replica exchange molecular dynamics (MREMD) method, recently implemented in UNRES, to generate decoys; with this modification, conformational entropy is taken into account.
Moreover, we optimized the free-energy gaps between levels at temperatures corresponding to a predominance of folded or unfolded structures, as well as to structures at the putative folding-transition temperature, changing the sign of the gaps at the transition temperature. This enabled us to obtain force fields characterized by a single peak in the heat capacity at the transition temperature. Furthermore, we introduced temperature dependence to the UNRES force field; this is consistent with the fact that it is a free-energy and not a potential-energy function. PMID:17201450

  13. Double symbolic joint entropy in nonlinear dynamic complexity analysis

    NASA Astrophysics Data System (ADS)

    Yao, Wenpo; Wang, Jun

    2017-07-01

    Symbolizations, the basis of symbolic dynamic analysis, are classified as global static and local dynamic approaches, which we combine via joint entropy for nonlinear dynamic complexity analysis. Two global static methods (the symbolic transformations of Wessel's symbolic entropy and of base-scale entropy) and two local ones (the symbolizations of permutation entropy and of differential entropy) yield four double symbolic joint entropies that detect complexity accurately in chaotic models, the logistic and Hénon map series. In nonlinear dynamical analysis of different kinds of heart rate variability, heartbeats of the healthy young have higher complexity than those of the healthy elderly, and congestive heart failure (CHF) patients have the lowest joint entropy values. Each individual symbolic entropy is improved by the double symbolic joint entropy, among which the combination of base-scale and differential symbolizations gives the best complexity analysis. Test results show that double symbolic joint entropy is feasible for nonlinear dynamic complexity analysis.
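    The combination idea can be sketched with deliberately minimal symbolizations in place of the four studied variants: a global amplitude quantization paired with a local first-difference sign, scored by their joint Shannon entropy:

```python
import math
from collections import Counter

def global_symbols(x, n_bins=4):
    """Global static symbolization: amplitude quantization over the
    whole series (a simplified stand-in for the base-scale scheme)."""
    lo, hi = min(x), max(x)
    w = (hi - lo) / n_bins or 1.0          # guard against a constant series
    return [min(int((v - lo) / w), n_bins - 1) for v in x]

def local_symbols(x):
    """Local dynamic symbolization: sign of the first difference
    (a simplified stand-in for permutation/differential symbols)."""
    return [0 if b < a else (1 if b == a else 2) for a, b in zip(x, x[1:])]

def double_symbolic_joint_entropy(x):
    """Joint Shannon entropy of the paired (global, local) symbol
    streams."""
    g, loc = global_symbols(x)[1:], local_symbols(x)
    n = len(loc)
    counts = Counter(zip(g, loc))
    return -sum(c / n * math.log(c / n) for c in counts.values())

v, chaotic = 0.4, []
for _ in range(2000):                      # logistic map in its chaotic regime
    v = 4.0 * v * (1.0 - v)
    chaotic.append(v)
flat = [5.0] * 2000                        # a constant signal: zero joint entropy
H_chaos = double_symbolic_joint_entropy(chaotic)
H_flat = double_symbolic_joint_entropy(flat)
```

    A richer mix of (amplitude, direction) pairs yields a larger joint entropy, which is the complexity ordering the abstract reports for young versus elderly versus CHF heartbeat series.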

  14. Handwriting: Feature Correlation Analysis for Biometric Hashes

    NASA Astrophysics Data System (ADS)

    Vielhauer, Claus; Steinmetz, Ralf

    2004-12-01

    In the application domain of electronic commerce, biometric authentication can provide one possible solution for the key management problem. Besides server-based approaches, methods of deriving digital keys directly from biometric measures appear to be advantageous. In this paper, we analyze one of our recently published specific algorithms of this category based on behavioral biometrics of handwriting, the biometric hash. Our interest is to investigate to which degree each of the underlying feature parameters contributes to the overall intrapersonal stability and interpersonal value space. We will briefly discuss related work in feature evaluation and introduce a new methodology based on three components: the intrapersonal scatter (deviation), the interpersonal entropy, and the correlation between both measures. Evaluation of the technique is presented based on two data sets of different size. The method presented will allow determination of effects of parameterization of the biometric system, estimation of value space boundaries, and comparison with other feature selection approaches.

  15. Maximum entropy production allows a simple representation of heterogeneity in semiarid ecosystems.

    PubMed

    Schymanski, Stanislaus J; Kleidon, Axel; Stieglitz, Marc; Narula, Jatin

    2010-05-12

    Feedbacks between water use, biomass and infiltration capacity in semiarid ecosystems have been shown to lead to the spontaneous formation of vegetation patterns in a simple model. The formation of patterns permits the maintenance of larger overall biomass at low rainfall rates compared with homogeneous vegetation. This results in a bias in models run at larger scales that neglect subgrid-scale variability. In the present study, we investigate whether subgrid-scale heterogeneity can be parameterized as the outcome of optimal partitioning between bare soil and vegetated area. We find that a two-box model reproduces the time-averaged biomass of the patterns emerging in a 100 x 100 grid model if the vegetated fraction is optimized for maximum entropy production (MEP). This suggests that the proposed optimality-based representation of subgrid-scale heterogeneity may be generally applicable to different systems and at different scales. The implications for our understanding of self-organized behaviour and its modelling are discussed.
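    The closure can be sketched as a one-parameter optimization: choose the vegetated fraction that maximizes an entropy-production proxy. The proxy below is a deliberately crude placeholder (the study couples biomass, water use and infiltration explicitly); only the optimization pattern is the point:

```python
def entropy_production(f, rainfall):
    """Toy entropy-production proxy for a two-box (bare soil /
    vegetated) tile: transpiration grows with vegetated fraction f,
    but water competition suppresses it at large f.  Illustrative only."""
    return f * (1.0 - f) * rainfall

def optimal_vegetated_fraction(rainfall, n=10001):
    """MEP closure: grid-scan the vegetated fraction in [0, 1] and
    return the value that maximizes the entropy-production proxy."""
    return max((i / (n - 1) for i in range(n)),
               key=lambda f: entropy_production(f, rainfall))

f_star = optimal_vegetated_fraction(rainfall=300.0)
```

    With a realistic proxy, f_star would shift with rainfall, which is how a single optimized fraction can stand in for the resolved subgrid pattern.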

  16. Global sensitivity analysis for fuzzy inputs based on the decomposition of fuzzy output entropy

    NASA Astrophysics Data System (ADS)

    Shi, Yan; Lu, Zhenzhou; Zhou, Yicheng

    2018-06-01

    To analyse the components of fuzzy output entropy, a decomposition method for fuzzy output entropy is first presented. After the decomposition, the total fuzzy output entropy can be expressed as the sum of the component fuzzy entropies contributed by the fuzzy inputs. Based on this decomposition, a new global sensitivity analysis model is established for measuring the effects of the uncertainties of fuzzy inputs on the output. The global sensitivity analysis model not only ranks the importance of the fuzzy inputs but also reflects, to a certain degree, the structural composition of the response function. Several examples illustrate the validity of the proposed global sensitivity analysis, which provides a useful reference for engineering design and the optimization of structural systems.

  17. Information Geometry for Landmark Shape Analysis: Unifying Shape Representation and Deformation

    PubMed Central

    Peter, Adrian M.; Rangarajan, Anand

    2010-01-01

    Shape matching plays a prominent role in the comparison of similar structures. We present a unifying framework for shape matching that uses mixture models to couple both the shape representation and deformation. The theoretical foundation is drawn from information geometry wherein information matrices are used to establish intrinsic distances between parametric densities. When a parameterized probability density function is used to represent a landmark-based shape, the modes of deformation are automatically established through the information matrix of the density. We first show that given two shapes parameterized by Gaussian mixture models (GMMs), the well-known Fisher information matrix of the mixture model is also a Riemannian metric (actually, the Fisher-Rao Riemannian metric) and can therefore be used for computing shape geodesics. The Fisher-Rao metric has the advantage of being an intrinsic metric and invariant to reparameterization. The geodesic—computed using this metric—establishes an intrinsic deformation between the shapes, thus unifying both shape representation and deformation. A fundamental drawback of the Fisher-Rao metric is that it is not available in closed form for the GMM. Consequently, shape comparisons are computationally very expensive. To address this, we develop a new Riemannian metric based on generalized ϕ-entropy measures. In sharp contrast to the Fisher-Rao metric, the new metric is available in closed form. Geodesic computations using the new metric are considerably more efficient. We validate the performance and discriminative capabilities of these new information geometry-based metrics by pairwise matching of corpus callosum shapes. We also study the deformations of fish shapes that have various topological properties. 
A comprehensive comparative analysis is also provided using other landmark-based distances, including the Hausdorff distance, the Procrustes metric, landmark-based diffeomorphisms, and the bending energies of the thin-plate (TPS) and Wendland splines. PMID:19110497

  18. Excess entropy and crystallization in Stillinger-Weber and Lennard-Jones fluids

    NASA Astrophysics Data System (ADS)

    Dhabal, Debdas; Nguyen, Andrew Huy; Singh, Murari; Khatua, Prabir; Molinero, Valeria; Bandyopadhyay, Sanjoy; Chakravarty, Charusita

    2015-10-01

    Molecular dynamics simulations are used to contrast the supercooling and crystallization behaviour of monatomic liquids that exemplify the transition from simple to anomalous, tetrahedral liquids. As examples of simple fluids, we use the Lennard-Jones (LJ) liquid and a pair-dominated Stillinger-Weber liquid (SW16). As examples of tetrahedral, water-like fluids, we use the Stillinger-Weber model with variable tetrahedrality parameterized for germanium (SW20), silicon (SW21), and water (SW23.15 or mW model). The thermodynamic response functions show clear qualitative differences between simple and water-like liquids. For simple liquids, the compressibility and the heat capacity remain small on isobaric cooling. The tetrahedral liquids in contrast show a very sharp rise in these two response functions as the lower limit of liquid-phase stability is reached. While the thermal expansivity decreases with temperature but never crosses zero in simple liquids, in all three tetrahedral liquids at the studied pressure, there is a temperature of maximum density below which thermal expansivity is negative. In contrast to the thermodynamic response functions, the excess entropy on isobaric cooling does not show qualitatively different features for simple and water-like liquids; however, the slope and curvature of the entropy-temperature plots reflect the heat capacity trends. Two trajectory-based computational estimation methods for the entropy and the heat capacity are compared for possible structural insights into supercooling, with the entropy obtained from thermodynamic integration. The two-phase thermodynamic estimator for the excess entropy proves to be fairly accurate in comparison to the excess entropy values obtained by thermodynamic integration, for all five Lennard-Jones and Stillinger-Weber liquids. The entropy estimator based on the multiparticle correlation expansion that accounts for both pair and triplet correlations, denoted S_trip, is also studied.
S_trip is a good entropy estimator for liquids where pair and triplet correlations are important, such as Ge and Si, but loses accuracy for purely pair-dominated liquids, like the LJ fluid, or near the crystallization temperature (T_thr). Since local tetrahedral order is compatible with both liquid and crystalline states, the reorganisation of tetrahedral liquids is accompanied by a clear rise in the pair, triplet, and thermodynamic contributions to the heat capacity, resulting in the heat capacity anomaly. In contrast, the pair-dominated liquids show increasing dominance of triplet correlations on approaching crystallization but no sharp rise in either the pair or thermodynamic heat capacities.
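    The leading (pair) term of the multiparticle correlation expansion used by such estimators is the standard integral S₂/(N k_B) = -2πρ ∫ [g ln g - (g - 1)] r² dr; the triplet term of S_trip adds the analogous three-body integral. A sketch with a schematic g(r) (not a simulated radial distribution function):

```python
import math

def pair_entropy(g_of_r, rho, r_max=10.0, n=10000):
    """Pair term of the multiparticle entropy expansion,
        S2 / (N k_B) = -2*pi*rho * Int_0^inf [g ln g - (g - 1)] r^2 dr,
    evaluated with the midpoint rule on [0, r_max]."""
    dr = r_max / n
    total = 0.0
    for k in range(n):
        r = (k + 0.5) * dr
        g = g_of_r(r)
        integrand = (g * math.log(g) - (g - 1.0)) if g > 0 else 1.0
        total += integrand * r * r * dr
    return -2.0 * math.pi * rho * total

def g_model(r, sigma=1.0):
    """Schematic g(r): excluded core, damped oscillation, -> 1 at large r."""
    if r < 0.95 * sigma:
        return 1e-12
    return 1.0 + 0.6 * math.exp(-(r - sigma)) * math.cos(7.0 * (r - sigma))

S2 = pair_entropy(g_model, rho=0.8)   # negative: correlations order the fluid
```

    Because g ln g - (g - 1) is non-negative, any structure in g(r) drives S₂ below the ideal-gas value of zero; the losses of accuracy noted above come from neglected higher-order terms, not from this integral.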

  19. Develop and Test a Solvent Accessible Surface Area-Based Model in Conformational Entropy Calculations

    PubMed Central

    Wang, Junmei; Hou, Tingjun

    2012-01-01

    It is of great interest in modern drug design to accurately calculate the free energies of protein-ligand or nucleic acid-ligand binding. MM-PBSA (Molecular Mechanics-Poisson Boltzmann Surface Area) and MM-GBSA (Molecular Mechanics-Generalized Born Surface Area) have gained popularity in this field. For both methods, the conformational entropy, which is usually calculated through normal mode analysis (NMA), is needed to calculate the absolute binding free energies. Unfortunately, NMA is computationally demanding and becomes a bottleneck of the MM-PB/GBSA-NMA methods. In this work, we have developed a fast approach to estimate the conformational entropy based upon solvent accessible surface area calculations. In our approach, the conformational entropy of a molecule, S, can be obtained by summing up the contributions of all atoms, whether they are buried or exposed. Each atom has two types of surface areas, solvent accessible surface area (SAS) and buried SAS (BSAS). The two types of surface areas are weighted to estimate the contribution of an atom to S. Atoms having the same atom type share the same weight, and a general parameter k is applied to balance the contributions of the two types of surface areas. This entropy model was parameterized using a large set of small molecules for which the conformational entropies were calculated at the B3LYP/6-31G* level, taking the solvent effect into account. The weighted solvent accessible surface area (WSAS) model was extensively evaluated in three tests. For convenience, TS, the product of temperature T and conformational entropy S, was calculated in these tests; T was always set to 298.15 K throughout the text. First of all, good correlations were achieved between WSAS TS and NMA TS for 44 protein or nucleic acid systems sampled with molecular dynamics simulations (10 snapshots were collected for post-entropy calculations): the mean squared correlation coefficient (R²) was 0.56.
As to the 20 complexes, the TS changes upon binding, TΔS, were also calculated and the mean R2 was 0.67 between NMA and WSAS. In the second test, TS were calculated for 12 proteins decoy sets (each set has 31 conformations) generated by the Rosetta software package. Again, good correlations were achieved for all decoy sets: the mean, maximum, minimum of R2 were 0.73, 0.89 and 0.55, respectively. Finally, binding free energies were calculated for 6 protein systems (the numbers of inhibitors range from 4 to 18) using four scoring functions. Compared to the measured binding free energies, the mean R2 of the six protein systems were 0.51, 0.47, 0.40 and 0.43 for MM-GBSA-WSAS, MM-GBSA-NMA, MM-PBSA-WSAS and MM-PBSA-NMA, respectively. The mean RMS errors of prediction were 1.19, 1.24, 1.41, 1.29 kcal/mol for the four scoring functions, correspondingly. Therefore, the two scoring functions employing WSAS achieved a comparable prediction performance to that of the scoring functions using NMA. It should be emphasized that no minimization was performed prior to the WSAS calculation in the last test. Although WSAS is not as rigorous as physical models such as quasi-harmonic analysis and thermodynamic integration (TI), it is computationally very efficient as only surface area calculation is involved and no structural minimization is required. Moreover, WSAS has achieved a comparable performance to normal mode analysis. We expect that this model could find its applications in the fields like high throughput screening (HTS), molecular docking and rational protein design. In those fields, efficiency is crucial since there are a large number of compounds, docking poses or protein models to be evaluated. A list of acronyms and abbreviations used in this work is provided for quick reference. PMID:22497310
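The additive functional form described in the abstract can be sketched as below. The per-atom-type weights and the balance parameter k are fitted quantities in the paper; the numbers used here are placeholders for illustration, not the published parameter set, and the exact way SAS and BSAS share a per-type weight is an assumption.

```python
# Hypothetical per-atom-type weights (arbitrary units per Å^2);
# NOT the published WSAS parameters.
WEIGHTS = {"C": 0.05, "N": 0.08, "O": 0.07, "H": 0.03}

def wsas_entropy(atoms, k=0.5, weights=WEIGHTS):
    """Conformational entropy S as a weighted sum over all atoms,
    buried or exposed: S = sum_i w(type_i) * (SAS_i + k * BSAS_i).
    The additive-over-atoms structure follows the abstract; the
    combination rule and weight values are illustrative assumptions.
    """
    return sum(weights[t] * (sas + k * bsas) for t, sas, bsas in atoms)

# atoms given as (type, solvent-accessible area, buried area) in Å^2
methane_like = [("C", 10.0, 25.0), ("H", 8.0, 2.0)]
s = wsas_entropy(methane_like, k=0.5)
```

Because the model is a plain weighted sum of precomputed surface areas, its cost is linear in the number of atoms, which is what makes it so much cheaper than NMA.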

  20. A diameter-sensitive flow entropy method for reliability consideration in water distribution system design

    NASA Astrophysics Data System (ADS)

    Liu, Haixing; Savić, Dragan; Kapelan, Zoran; Zhao, Ming; Yuan, Yixing; Zhao, Hongbin

    2014-07-01

Flow entropy is a measure of the uniformity of pipe flows in water distribution systems. By maximizing flow entropy one can identify reliable layouts or connectivity in networks. To overcome the disadvantage that the common definition of flow entropy does not consider the impact of pipe diameter on reliability, an extended definition, termed diameter-sensitive flow entropy, is proposed. This new methodology is then assessed against other reliability methods, including Monte Carlo simulation, a pipe failure probability model, and a surrogate measure (the resilience index) that incorporates water demand and pipe failure uncertainty. The reliability assessment is based on a sample of WDS designs derived from an optimization process for each of two benchmark networks. Correlation analysis is used to evaluate quantitatively the relationship between entropy and reliability, and a comparative analysis between the conventional flow entropy and the new method is conducted. The results demonstrate that the diameter-sensitive flow entropy shows a consistently much stronger correlation with the three reliability measures than simple flow entropy. Therefore, the new flow entropy method can be taken as a better surrogate measure for reliability and could potentially be integrated into the optimal design problem of WDSs. Sensitivity analysis results show that the velocity parameters used in the new flow entropy have no significant impact on the relationship between diameter-sensitive flow entropy and reliability.
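The common (diameter-insensitive) flow entropy that this paper extends is essentially the Shannon entropy of the normalized pipe flows; a minimal sketch follows. The diameter weighting that defines the paper's new measure is not reproduced here.

```python
import math

def flow_entropy(flows):
    """Shannon-style flow entropy: uniformity of the pipe flow
    distribution. `flows` is a list of nonnegative flow rates,
    normalized to fractions of the total flow before computing entropy.
    """
    total = sum(flows)
    fractions = [q / total for q in flows if q > 0]
    return -sum(p * math.log(p) for p in fractions)

# Uniform flows maximize entropy; concentrating flow in one pipe lowers it.
uniform = flow_entropy([10.0, 10.0, 10.0, 10.0])   # equals ln(4)
skewed = flow_entropy([37.0, 1.0, 1.0, 1.0])
```

Maximizing this quantity over feasible designs favors networks whose flow is spread across many paths, which is why it serves as a reliability surrogate.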

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dhabal, Debdas; Chakravarty, Charusita, E-mail: charus@chemistry.iitd.ac.in; Nguyen, Andrew Huy

Molecular dynamics simulations are used to contrast the supercooling and crystallization behaviour of monatomic liquids that exemplify the transition from simple to anomalous, tetrahedral liquids. As examples of simple fluids, we use the Lennard-Jones (LJ) liquid and a pair-dominated Stillinger-Weber liquid (SW{sub 16}). As examples of tetrahedral, water-like fluids, we use the Stillinger-Weber model with variable tetrahedrality parameterized for germanium (SW{sub 20}), silicon (SW{sub 21}), and water (SW{sub 23.15} or mW model). The thermodynamic response functions show clear qualitative differences between simple and water-like liquids. For simple liquids, the compressibility and the heat capacity remain small on isobaric cooling. The tetrahedral liquids, in contrast, show a very sharp rise in these two response functions as the lower limit of liquid-phase stability is reached. While the thermal expansivity decreases with temperature but never crosses zero in simple liquids, in all three tetrahedral liquids at the studied pressure, there is a temperature of maximum density below which thermal expansivity is negative. In contrast to the thermodynamic response functions, the excess entropy on isobaric cooling does not show qualitatively different features for simple and water-like liquids; however, the slope and curvature of the entropy-temperature plots reflect the heat capacity trends. Two trajectory-based computational estimation methods for the entropy and the heat capacity are compared for possible structural insights into supercooling, with the entropy obtained from thermodynamic integration. The two-phase thermodynamic estimator for the excess entropy proves to be fairly accurate in comparison to the excess entropy values obtained by thermodynamic integration, for all five Lennard-Jones and Stillinger-Weber liquids.
The entropy estimator based on the multiparticle correlation expansion that accounts for both pair and triplet correlations, denoted by S{sub trip}, is also studied. S{sub trip} is a good entropy estimator for liquids where pair and triplet correlations are important, such as Ge and Si, but loses accuracy for purely pair-dominated liquids, like the LJ fluid, or near the crystallization temperature (T{sub thr}). Since local tetrahedral order is compatible with both liquid and crystalline states, the reorganisation of tetrahedral liquids is accompanied by a clear rise in the pair, triplet, and thermodynamic contributions to the heat capacity, resulting in the heat capacity anomaly. In contrast, the pair-dominated liquids show increasing dominance of triplet correlations on approaching crystallization but no sharp rise in either the pair or thermodynamic heat capacities.

  2. The performance evaluation model of mining project founded on the weight optimization entropy value method

    NASA Astrophysics Data System (ADS)

    Mao, Chao; Chen, Shou

    2017-01-01

Because the traditional entropy value method still has low evaluation accuracy when assessing the performance of mining projects, a performance evaluation model for mining projects founded on an improved entropy value method is proposed. First, a new weight assignment model is established, founded on compatibility matrix analysis of the analytic hierarchy process (AHP) and the entropy value method: once the compatibility matrix analysis achieves the consistency requirement, any differences between the subjective and objective weights are resolved by moderately adjusting their proportions. On this basis, a fuzzy evaluation matrix is then constructed for the performance evaluation. Simulation experiments show that, compared with the traditional entropy value and compatibility matrix analysis methods, the proposed performance evaluation model based on the improved entropy value method has higher assessment accuracy.

  3. Entropy for Mechanically Vibrating Systems

    NASA Astrophysics Data System (ADS)

    Tufano, Dante

The research contained within this thesis deals with the subject of entropy as defined for and applied to mechanically vibrating systems. This work begins with an overview of entropy as it is understood in the fields of classical thermodynamics, information theory, statistical mechanics, and statistical vibroacoustics. Khinchin's definition of entropy, which is the primary definition used for the work contained in this thesis, is introduced in the context of vibroacoustic systems. The main goal of this research is to establish a mathematical framework for the application of Khinchin's entropy in the field of statistical vibroacoustics by examining the entropy of mechanically vibrating systems. The introduction of this thesis provides an overview of statistical energy analysis (SEA), a modeling approach to vibroacoustics that motivates this work on entropy. The objective of this thesis is given, followed by a discussion of the intellectual merit of this work as well as a literature review of relevant material. Following the introduction, an entropy analysis of systems of coupled oscillators is performed utilizing Khinchin's definition of entropy. This analysis develops the mathematical theory relating to mixing entropy, which is generated by the coupling of vibroacoustic systems. The mixing entropy is shown to provide insight into the qualitative behavior of such systems. Additionally, it is shown that the entropy inequality property of Khinchin's entropy can be reduced to an equality using the mixing entropy concept. This equality can be interpreted as a facet of the second law of thermodynamics for vibroacoustic systems. Following this analysis, an investigation of continuous systems is performed using Khinchin's entropy. It is shown that entropy analyses using Khinchin's entropy are valid for continuous systems that can be decomposed into a finite number of modes.
The results are shown to be analogous to those obtained for simple oscillators, which demonstrates the applicability of entropy-based approaches to real-world systems. Three systems are considered to demonstrate these findings: 1) a rod end-coupled to a simple oscillator, 2) two end-coupled rods, and 3) two end-coupled beams. The aforementioned work utilizes the weak coupling assumption to determine the entropy of composite systems. Following this discussion, a direct method of finding entropy is developed which does not rely on this limiting assumption. The resulting entropy provides a useful benchmark for evaluating the accuracy of the weak coupling approach, and is validated using systems of coupled oscillators. The later chapters of this work discuss Khinchin's entropy as applied to nonlinear and nonconservative systems, respectively. The discussion of entropy for nonlinear systems is motivated by the desire to expand the applicability of SEA techniques beyond the linear regime. The discussion of nonconservative systems is also crucial, since real-world systems interact with their environment, and it is necessary to confirm the validity of an entropy approach for systems that are relevant in the context of SEA. Having developed a mathematical framework for determining entropy under a number of previously unexplored cases, the relationship between thermodynamics and statistical vibroacoustics can be better understood. Specifically, vibroacoustic temperatures can be obtained for systems that are not necessarily linear or weakly coupled. In this way, entropy provides insight into how the power flow proportionality of statistical energy analysis (SEA) can be applied to a broader class of vibroacoustic systems. As such, entropy is a useful tool for both justifying and expanding the foundational results of SEA.

  4. Entropy for the Complexity of Physiological Signal Dynamics.

    PubMed

    Zhang, Xiaohua Douglas

    2017-01-01

Recently, the rapid development of large-scale data storage technologies, mobile network technology, and portable medical devices has made it possible to measure, record, store, and analyze biological dynamics. Portable noninvasive medical devices are crucial for capturing individual characteristics of biological dynamics. Wearable noninvasive medical devices, together with the analysis and management of the related digital medical data, will revolutionize the management and treatment of diseases, leading to the establishment of a new healthcare system. One of the key features that can be extracted from the data obtained by wearable noninvasive medical devices is the complexity of physiological signals, which can be represented by the entropy of the biological dynamics contained in the physiological signals measured by these continuous-monitoring medical devices. Thus, in this chapter I present the major concepts of entropy that are commonly used to measure the complexity of biological dynamics: Shannon entropy, Kolmogorov entropy, Renyi entropy, approximate entropy, sample entropy, and multiscale entropy. I also demonstrate an example of using entropy to characterize the complexity of glucose dynamics.
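Of the entropy concepts listed above, sample entropy admits a compact implementation; a minimal sketch is below. Note that the tolerance r is taken here in the same units as the signal, whereas in practice it is often scaled by the signal's standard deviation.

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D sequence: counts pairs of
    length-m (and length-(m+1)) templates whose Chebyshev distance is
    within tolerance r, excluding self-matches, and returns -ln(A/B).
    """
    n = len(x)

    def count_matches(mm):
        templates = [x[i:i + mm] for i in range(n - mm + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):  # no self-matches
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A strictly repeating signal is highly predictable: SampEn near zero.
periodic = [0.0, 1.0] * 50
se = sample_entropy(periodic)
```

Lower values indicate more regular (less complex) dynamics, which is the property exploited when entropy is used as a physiological complexity marker.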

  5. Time-series analysis of sleep wake stage of rat EEG using time-dependent pattern entropy

    NASA Astrophysics Data System (ADS)

    Ishizaki, Ryuji; Shinba, Toshikazu; Mugishima, Go; Haraguchi, Hikaru; Inoue, Masayoshi

    2008-05-01

    We performed electroencephalography (EEG) for six male Wistar rats to clarify temporal behaviors at different levels of consciousness. Levels were identified both by conventional sleep analysis methods and by our novel entropy method. In our method, time-dependent pattern entropy is introduced, by which EEG is reduced to binary symbolic dynamics and the pattern of symbols in a sliding temporal window is considered. A high correlation was obtained between level of consciousness as measured by the conventional method and mean entropy in our entropy method. Mean entropy was maximal while awake (stage W) and decreased as sleep deepened. These results suggest that time-dependent pattern entropy may offer a promising method for future sleep research.
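The windowed pattern-entropy idea described above can be sketched as follows. The binarization threshold (the signal mean here), the pattern length, and the window size are illustrative choices, not necessarily those used in the study.

```python
import math
from collections import Counter

def pattern_entropy(signal, m=3, window=64):
    """Time-dependent pattern entropy (sketch): the signal is reduced
    to binary symbols (above/below its mean), then the Shannon entropy
    of the m-symbol pattern distribution is computed within each
    non-overlapping temporal window. Returns one value per window.
    """
    mean = sum(signal) / len(signal)
    symbols = [1 if v > mean else 0 for v in signal]
    entropies = []
    for start in range(0, len(symbols) - window + 1, window):
        chunk = symbols[start:start + window]
        patterns = Counter(tuple(chunk[i:i + m]) for i in range(len(chunk) - m + 1))
        total = sum(patterns.values())
        h = -sum((c / total) * math.log2(c / total) for c in patterns.values())
        entropies.append(h)
    return entropies

# A constant signal yields zero pattern entropy in every window;
# a strictly alternating signal has exactly two patterns (1 bit).
flat = pattern_entropy([1.0] * 128)
alternating = pattern_entropy([0.0, 1.0] * 64)
```

Tracking this entropy over successive windows is what allows the measure to follow transitions between sleep stages in time.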

  6. The Radius and Entropy of a Magnetized, Rotating, Fully Convective Star: Analysis with Depth-dependent Mixing Length Theories

    NASA Astrophysics Data System (ADS)

    Ireland, Lewis G.; Browning, Matthew K.

    2018-04-01

    Some low-mass stars appear to have larger radii than predicted by standard 1D structure models; prior work has suggested that inefficient convective heat transport, due to rotation and/or magnetism, may ultimately be responsible. We examine this issue using 1D stellar models constructed using Modules for Experiments in Stellar Astrophysics (MESA). First, we consider standard models that do not explicitly include rotational/magnetic effects, with convective inhibition modeled by decreasing a depth-independent mixing length theory (MLT) parameter α MLT. We provide formulae linking changes in α MLT to changes in the interior specific entropy, and hence to the stellar radius. Next, we modify the MLT formulation in MESA to mimic explicitly the influence of rotation and magnetism, using formulations suggested by Stevenson and MacDonald & Mullan, respectively. We find rapid rotation in these models has a negligible impact on stellar structure, primarily because a star’s adiabat, and hence its radius, is predominantly affected by layers near the surface; convection is rapid and largely uninfluenced by rotation there. Magnetic fields, if they influenced convective transport in the manner described by MacDonald & Mullan, could lead to more noticeable radius inflation. Finally, we show that these non-standard effects on stellar structure can be fabricated using a depth-dependent α MLT: a non-magnetic, non-rotating model can be produced that is virtually indistinguishable from one that explicitly parameterizes rotation and/or magnetism using the two formulations above. We provide formulae linking the radially variable α MLT to these putative MLT reformulations.

  7. EEG entropy measures in anesthesia

    PubMed Central

    Liang, Zhenhu; Wang, Yinghua; Sun, Xue; Li, Duan; Voss, Logan J.; Sleigh, Jamie W.; Hagihira, Satoshi; Li, Xiaoli

    2015-01-01

Highlights: ► Twelve entropy indices were systematically compared in monitoring depth of anesthesia and detecting burst suppression. ► Renyi permutation entropy performed best in tracking EEG changes associated with different anesthesia states. ► Approximate entropy and sample entropy performed best in detecting burst suppression. Objective: Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing the effect of anesthetic drugs is lacking. In this study, we compare the capability of 12 entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP) in anesthesia induced by GABAergic agents. Methods: Twelve indices were investigated, namely Response Entropy (RE) and State Entropy (SE), three wavelet entropy (WE) measures [Shannon WE (SWE), Tsallis WE (TWE), and Renyi WE (RWE)], Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), fuzzy entropy, and three permutation entropy (PE) measures [Shannon PE (SPE), Tsallis PE (TPE), and Renyi PE (RPE)]. Two EEG data sets, from sevoflurane-induced and isoflurane-induced anesthesia respectively, were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability (Pk) analysis were applied. Multifractal detrended fluctuation analysis (MDFA), a non-entropy measure, was compared as well. Results: All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. Three PE measures outperformed the other entropy indices, with less baseline variability, higher coefficient of determination (R2) and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best.
Additionally, these entropy measures showed an advantage in computation efficiency compared with MDFA. Conclusion: Each entropy index has its advantages and disadvantages in estimating DoA. Overall, it is suggested that the RPE index was a superior measure. Investigating the advantages and disadvantages of these entropy indices could help improve current clinical indices for monitoring DoA. PMID:25741277
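Of the twelve indices compared above, Shannon permutation entropy is the simplest to sketch; the Tsallis and Renyi variants differ only in the entropy functional applied to the same ordinal-pattern distribution. A minimal implementation, with illustrative default parameters:

```python
import math
from collections import Counter

def permutation_entropy(x, m=3, normalize=True):
    """Shannon permutation entropy over ordinal patterns of order m.

    Each length-m window is mapped to the permutation that sorts it;
    the entropy of the resulting pattern distribution measures signal
    regularity (0 = fully predictable; 1 = maximally irregular when
    normalized by log2(m!)).
    """
    patterns = Counter(
        tuple(sorted(range(m), key=lambda k: x[i + k]))
        for i in range(len(x) - m + 1)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * math.log2(c / total) for c in patterns.values())
    return h / math.log2(math.factorial(m)) if normalize else h

# A monotone ramp produces a single ordinal pattern: entropy 0.
ramp = list(range(100))
pe = permutation_entropy(ramp)
```

Because only the rank ordering of samples matters, permutation entropy is robust to monotone amplitude distortions, one reason ordinal measures track anesthetic EEG changes well.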

  8. Hindcasting the Madden‐Julian Oscillation With a New Parameterization of Surface Heat Fluxes

    PubMed Central

    Wang, Jingfeng; Lin, Wenshi

    2017-01-01

    Abstract The recently developed maximum entropy production (MEP) model, an alternative parameterization of surface heat fluxes, is incorporated into the Weather Research and Forecasting (WRF) model. A pair of WRF cloud‐resolving experiments (5 km grids) using the bulk transfer model (WRF default) and the MEP model of surface heat fluxes are performed to hindcast the October Madden‐Julian oscillation (MJO) event observed during the 2011 Dynamics of the MJO (DYNAMO) field campaign. The simulated surface latent and sensible heat fluxes in the MEP and bulk transfer model runs are in general consistent with in situ observations from two research vessels. Compared to the bulk transfer model, the convection envelope is strengthened in the MEP run and shows a more coherent propagation over the Maritime Continent. The simulated precipitable water in the MEP run is in closer agreement with the observations. Precipitation in the MEP run is enhanced during the active phase of the MJO with significantly reduced regional dry and wet biases. Large‐scale ocean evaporation is stronger in the MEP run leading to stronger boundary layer moistening to the east of the convection center, which facilitates the eastward propagation of the MJO. PMID:29399269

  9. Refined two-index entropy and multiscale analysis for complex system

    NASA Astrophysics Data System (ADS)

    Bian, Songhan; Shang, Pengjian

    2016-10-01

As a fundamental concept for describing complex systems, entropy measures have been proposed in various forms, such as Boltzmann-Gibbs (BG) entropy, one-index entropy, two-index entropy, sample entropy, and permutation entropy. This paper proposes a new two-index entropy Sq,δ, and we find that the new two-index entropy is applicable to measuring the complexity of a wide range of systems in terms of randomness and fluctuation range. For more complex systems, the value of the two-index entropy is smaller and the correlation between the parameter δ and the entropy Sq,δ is weaker. By combining the refined two-index entropy Sq,δ with the scaling exponent h(δ), this paper analyzes the complexities of simulated series and effectively classifies several financial markets in various regions of the world.
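The paper's two-index Sq,δ is not reproduced here, but the one-index (Tsallis) entropy it generalizes can be sketched, including the q → 1 limit in which the Boltzmann-Gibbs (Shannon) form is recovered:

```python
import math

def tsallis_entropy(p, q):
    """One-index (Tsallis) entropy S_q = (1 - sum_i p_i^q) / (q - 1)
    of a discrete distribution p. As q -> 1 this reduces to the
    Boltzmann-Gibbs entropy -sum_i p_i ln p_i, handled explicitly
    here to avoid division by zero.
    """
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

probs = [0.5, 0.25, 0.25]
bg = tsallis_entropy(probs, 1.0)        # Boltzmann-Gibbs limit
near = tsallis_entropy(probs, 1.0001)   # should approach the limit
```

The second index δ in the paper's Sq,δ adds a further deformation on top of this family; its exact definition is given in the paper and not assumed here.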

  10. [Formula: see text] regularity properties of singular parameterizations in isogeometric analysis.

    PubMed

    Takacs, T; Jüttler, B

    2012-11-01

    Isogeometric analysis (IGA) is a numerical simulation method which is directly based on the NURBS-based representation of CAD models. It exploits the tensor-product structure of 2- or 3-dimensional NURBS objects to parameterize the physical domain. Hence the physical domain is parameterized with respect to a rectangle or to a cube. Consequently, singularly parameterized NURBS surfaces and NURBS volumes are needed in order to represent non-quadrangular or non-hexahedral domains without splitting, thereby producing a very compact and convenient representation. The Galerkin projection introduces finite-dimensional spaces of test functions in the weak formulation of partial differential equations. In particular, the test functions used in isogeometric analysis are obtained by composing the inverse of the domain parameterization with the NURBS basis functions. In the case of singular parameterizations, however, some of the resulting test functions do not necessarily fulfill the required regularity properties. Consequently, numerical methods for the solution of partial differential equations cannot be applied properly. We discuss the regularity properties of the test functions. For one- and two-dimensional domains we consider several important classes of singularities of NURBS parameterizations. For specific cases we derive additional conditions which guarantee the regularity of the test functions. In addition we present a modification scheme for the discretized function space in case of insufficient regularity. It is also shown how these results can be applied for computational domains in higher dimensions that can be parameterized via sweeping.

  11. Thermodynamic integration based on classical atomistic simulations to determine the Gibbs energy of condensed phases: Calculation of the aluminum-zirconium system

    NASA Astrophysics Data System (ADS)

    Harvey, J.-P.; Gheribi, A. E.; Chartrand, P.

    2012-12-01

In this work, an in silico procedure to generate a fully coherent set of thermodynamic properties obtained from classical molecular dynamics (MD) and Monte Carlo (MC) simulations is proposed. The procedure is applied to the Al-Zr system because of its importance in the development of high strength Al-Li alloys and of bulk metallic glasses. Cohesive energies of the studied condensed phases of the Al-Zr system (the liquid phase, the fcc solid solution, and various orthorhombic stoichiometric compounds) are calculated using the modified embedded atom model (MEAM) in the second-nearest-neighbor formalism (2NN). The Al-Zr MEAM-2NN potential is parameterized in this work using ab initio and experimental data found in the literature for the AlZr3-L12 structure, while its predictive ability is confirmed for several other solid structures and for the liquid phase. The thermodynamic integration (TI) method is implemented in a general MC algorithm in order to evaluate the absolute Gibbs energy of the liquid and the fcc solutions. The entropy of mixing calculated from the TI method, combined with the enthalpy of mixing and the heat capacity data generated from MD/MC simulations performed in the isobaric-isothermal/canonical (NPT/NVT) ensembles, is used to parameterize the Gibbs energy function of all the condensed phases in the Al-rich side of the Al-Zr system in a CALculation of PHAse Diagrams (CALPHAD) approach. The modified quasichemical model in the pair approximation (MQMPA) and the cluster variation method (CVM) in the tetrahedron approximation are used to define the Gibbs energy of the liquid and the fcc solid solution, respectively, over their entire range of composition. Thermodynamic and structural data generated from our MD/MC simulations are used as input data to parameterize these thermodynamic models.
A detailed analysis of the validity and transferability of the Al-Zr MEAM-2NN potential is presented throughout our work by comparing the predicted properties obtained from this formalism with available ab initio and experimental data for both liquid and solid phases.

  12. Entropy Production in Collisionless Systems. II. Arbitrary Phase-space Occupation Numbers

    NASA Astrophysics Data System (ADS)

    Barnes, Eric I.; Williams, Liliya L. R.

    2012-04-01

    We present an analysis of two thermodynamic techniques for determining equilibria of self-gravitating systems. One is the Lynden-Bell (LB) entropy maximization analysis that introduced violent relaxation. Since we do not use the Stirling approximation, which is invalid at small occupation numbers, our systems have finite mass, unlike LB's isothermal spheres. (Instead of Stirling, we utilize a very accurate smooth approximation for ln x!.) The second analysis extends entropy production extremization to self-gravitating systems, also without the use of the Stirling approximation. In addition to the LB statistical family characterized by the exclusion principle in phase space, and designed to treat collisionless systems, we also apply the two approaches to the Maxwell-Boltzmann (MB) families, which have no exclusion principle and hence represent collisional systems. We implicitly assume that all of the phase space is equally accessible. We derive entropy production expressions for both families and give the extremum conditions for entropy production. Surprisingly, our analysis indicates that extremizing entropy production rate results in systems that have maximum entropy, in both LB and MB statistics. In other words, both thermodynamic approaches lead to the same equilibrium structures.

  13. Teaching Entropy Analysis in the First-Year High School Course and Beyond

    ERIC Educational Resources Information Center

    Bindel, Thomas H.

    2004-01-01

A new approach is presented that educates and empowers teachers, assists them in incorporating entropy analysis into their curricula, and provides an entropy-analysis unit that can be used in classrooms. The topics teachers can cover, depending on the ability of the students and the comfort level of the teacher, are included.

  14. Exploring stability of entropy analysis for signal with different trends

    NASA Astrophysics Data System (ADS)

    Zhang, Yin; Li, Jin; Wang, Jun

    2017-03-01

Considering the effects of environmental disturbances and instrument systems, actual measured signals always carry trends, which make it difficult to accurately capture signal complexity, so choosing stable and effective analysis methods is very important. In this paper, we applied two entropy measures, base-scale entropy and approximate entropy, to analyze signal complexity, and studied the effect of trends likely to occur in actual signals (linear, periodic, and power-law trends) on ideal signals and on heart rate variability (HRV) signals. The results show that approximate entropy is unstable when different trends are embedded into the signals, so it is not suitable for analyzing signals with trends. The base-scale entropy, however, has preferable stability and accuracy for signals with different trends, and is thus an effective method for analyzing actual signals.
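A minimal illustration of the trend-sensitivity issue: a standard approximate entropy implementation applied to a signal before and after a linear trend is embedded. The signals and parameters are illustrative, not those used in the study.

```python
import math

def approximate_entropy(x, m=2, r=0.2):
    """Approximate entropy ApEn(m, r). Self-matches are included in
    the template counts, which is part of what makes ApEn less stable
    than sample entropy."""
    n = len(x)

    def phi(mm):
        templates = [x[i:i + mm] for i in range(n - mm + 1)]
        total = 0.0
        for t1 in templates:
            c = sum(
                1 for t2 in templates
                if max(abs(a - b) for a, b in zip(t1, t2)) <= r
            )
            total += math.log(c / len(templates))
        return total / len(templates)

    return phi(m) - phi(m + 1)

# Embedding a linear trend into the same underlying signal changes
# the ApEn estimate, illustrating the instability discussed above.
base = [(-1) ** i for i in range(60)]
trended = [v + 0.1 * i for i, v in enumerate(base)]
```

Detrending before analysis, or using a trend-robust measure such as base-scale entropy, avoids this artifact.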

  15. Effect of extreme data loss on heart rate signals quantified by entropy analysis

    NASA Astrophysics Data System (ADS)

    Li, Yu; Wang, Jun; Li, Jin; Liu, Dazhao

    2015-02-01

The phenomenon of data loss always occurs in the analysis of large databases, and maintaining the stability of analysis results in the event of data loss is very important. In this paper, we used a segmentation approach to generate synthetic signals in which data segments are randomly removed from the original signal according to Gaussian and exponential distributions. The logistic map is then used for verification. Finally, two methods of measuring entropy, base-scale entropy and approximate entropy, are comparatively analyzed. Our results show the following: (1) two key parameters, the percentage and the average length of removed data segments, can change the sequence complexity according to logistic-map testing; (2) the calculation results have preferable stability for base-scale entropy analysis, which is not sensitive to data loss; and (3) the loss percentage of HRV signals should be controlled below p = 30%, which can provide useful information in clinical applications.
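The segment-removal scheme can be sketched as below, here with exponentially distributed segment lengths (the Gaussian-length variant is analogous). Details such as how the target loss percentage is enforced are assumptions of this sketch, not taken from the paper.

```python
import random

def remove_segments(signal, loss_fraction, mean_seg_len, rng=None):
    """Randomly delete contiguous segments until `loss_fraction` of
    the samples are gone. Segment lengths are drawn from an
    exponential distribution with mean `mean_seg_len`; segment start
    positions are uniform. Returns the surviving samples in order.
    """
    rng = rng or random.Random(0)  # seeded for reproducibility
    keep = [True] * len(signal)
    target = int(loss_fraction * len(signal))
    removed = 0
    while removed < target:
        length = max(1, int(rng.expovariate(1.0 / mean_seg_len)))
        start = rng.randrange(len(signal))
        for i in range(start, min(start + length, len(signal))):
            if keep[i]:
                keep[i] = False
                removed += 1
            if removed >= target:
                break
    return [v for v, k in zip(signal, keep) if k]

# Remove 30% of a 1000-sample signal in segments of mean length 8.
surviving = remove_segments(list(range(1000)), 0.30, 8)
```

The entropy measures are then computed on `surviving` and compared against the intact signal to quantify how sensitive each measure is to the loss.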

  16. Analysis of interacting entropy-corrected holographic and new agegraphic dark energies

    NASA Astrophysics Data System (ADS)

    Ranjit, Chayan; Debnath, Ujjal

In the present work, we assume a flat FRW universe filled with interacting dark matter and dark energy. For the dark energy, we consider the entropy-corrected holographic dark energy (ECHDE) model and the entropy-corrected new agegraphic dark energy (ECNADE) model. For the entropy-corrected models, we assume logarithmic and power-law corrections. For the ECHDE model, the length scale L is taken to be either the Hubble horizon or the future event horizon. The ωde-ωde‧ analysis for the different horizons is discussed.

  17. Entropy in statistical energy analysis.

    PubMed

    Le Bot, Alain

    2009-03-01

    In this paper, the second principle of thermodynamics is discussed in the framework of statistical energy analysis (SEA). It is shown that the "vibrational entropy" and the "vibrational temperature" of sub-systems only depend on the vibrational energy and the number of resonant modes. A SEA system can be described as a thermodynamic system slightly out of equilibrium. In steady-state condition, the entropy exchanged with exterior by sources and dissipation exactly balances the production of entropy by irreversible processes at interface between SEA sub-systems.

  18. Low Streamflow Forecasting using Minimum Relative Entropy

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2013-12-01

Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation such that the relative entropy of the underlying process is minimized, allowing the time series to be forecasted. Different prior estimates, based on uniform, exponential, and Gaussian assumptions, are used to estimate the spectral density depending on the autocorrelation structure. Seasonal and nonseasonal low streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of low streamflow series with higher resolution than conventional methods. The forecasted streamflow is compared with predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy, and the advantages and disadvantages of each method in forecasting low streamflow are discussed.

  19. Refined generalized multiscale entropy analysis for physiological signals

    NASA Astrophysics Data System (ADS)

    Liu, Yunxiao; Lin, Youfang; Wang, Jing; Shang, Pengjian

    2018-01-01

Multiscale entropy analysis has become a prevalent complexity measurement and has been successfully applied in various fields. However, it only takes into account the information of mean values (the first moment) in the coarse-graining procedure. Generalized multiscale entropy (MSEn), which considers higher moments when coarse-graining a time series, was therefore proposed, and MSEσ2 has been implemented. However, MSEσ2 may sometimes yield an imprecise or undefined estimation of entropy, and the statistical reliability of its sample entropy estimation decreases as the scale factor increases. For this purpose, we developed a refined model, RMSEσ2, to improve MSEσ2. Simulations on both white noise and 1/f noise show that RMSEσ2 provides higher entropy reliability and reduces the occurrence of undefined entropy, making it especially suitable for short time series. In addition, we discuss the effects of outliers, data loss, and other signal-processing issues on RMSEσ2 analysis. We apply the proposed model to evaluate the complexity of heartbeat interval time series derived from healthy young and elderly subjects, patients with congestive heart failure, and patients with atrial fibrillation, respectively, and compare it to several popular complexity metrics. The results demonstrate that RMSEσ2-measured complexity (a) decreases with aging and disease, and (b) gives significant discrimination between different physiological/pathological states, which may facilitate clinical application.
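The variance (second-moment) coarse-graining that distinguishes MSEσ2 from standard mean-based multiscale entropy can be sketched as follows; the refined RMSEσ2 adjustments are not reproduced here.

```python
def coarse_grain_variance(x, scale):
    """Second-moment coarse-graining: each coarse-grained point is the
    variance of a non-overlapping window of length `scale`, rather
    than the window mean used in standard multiscale entropy. An
    entropy measure (e.g. sample entropy) is then applied to the
    coarse-grained series at each scale.
    """
    grained = []
    for start in range(0, len(x) - scale + 1, scale):
        window = x[start:start + scale]
        mu = sum(window) / scale
        grained.append(sum((v - mu) ** 2 for v in window) / scale)
    return grained

# On a constant signal every window has zero variance.
flat = coarse_grain_variance([3.0] * 20, scale=5)
```

As the scale factor grows, each coarse-grained series gets shorter, which is exactly why the reliability of the subsequent sample entropy estimate degrades at large scales.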

  20. Thermodynamic and structural signatures of water-driven methane-methane attraction in coarse-grained mW water.

    PubMed

    Song, Bin; Molinero, Valeria

    2013-08-07

    Hydrophobic interactions are responsible for water-driven processes such as protein folding and self-assembly of biomolecules. Microscopic theories and molecular simulations have been used to study association of a pair of methanes in water, the paradigmatic example of hydrophobic attraction, and determined that entropy is the driving force for the association of the methane pair, while the enthalpy disfavors it. An open question is to what extent coarse-grained water models can still produce correct thermodynamic and structural signatures of hydrophobic interaction. In this work, we investigate the hydrophobic interaction between a methane pair in water at temperatures from 260 to 340 K through molecular dynamics simulations with the coarse-grained monatomic water model mW. We find that the coarse-grained model correctly represents the free energy of association of the methane pair, the temperature dependence of the free energy, and the positive change in entropy and enthalpy upon association. We investigate the relationship between thermodynamic signatures and structural order of water through analysis of the spatial distribution of the density, energy, and tetrahedral order parameter Qt of water. The simulations reveal an enhancement of tetrahedral order in the region between the first and second hydration shells of the methane molecules. The increase in tetrahedral order, however, is far from what would be expected for a clathrate-like or ice-like shell around the solutes. This work shows that the mW water model reproduces the key signatures of hydrophobic interaction without long-range electrostatics or the need to be re-parameterized for different thermodynamic states. These characteristics, and its hundred-fold increase in efficiency with respect to atomistic models, make mW a promising water model for studying water-driven hydrophobic processes in more complex systems.

  1. Use of information entropy measures of sitting postural sway to quantify developmental delay in infants

    PubMed Central

    Deffeyes, Joan E; Harbourne, Regina T; DeJong, Stacey L; Kyvelidou, Anastasia; Stuberg, Wayne A; Stergiou, Nicholas

    2009-01-01

    Background By quantifying the information entropy of postural sway data, the complexity of the postural movement of different populations can be assessed, giving insight into pathologic motor control functioning. Methods In this study, developmental delay of motor control function in infants was assessed by analysis of sitting postural sway data acquired from force plate center of pressure measurements. Two types of entropy measures were used: symbolic entropy, including a new asymmetric symbolic entropy measure, and approximate entropy, a more widely used entropy measure. For each method of analysis, parameters were adjusted to optimize the separation of the results from the infants with delayed development from infants with typical development. Results The method that gave the widest separation between the populations was the asymmetric symbolic entropy method, which we developed by modification of the symbolic entropy algorithm. The approximate entropy algorithm also performed well, using parameters optimized for the infant sitting data. The infants with delayed development were found to have less complex patterns of postural sway in the medial-lateral direction, and were found to have different left-right symmetry in their postural sway, as compared to typically developing infants. Conclusion The results of this study indicate that optimization of the entropy algorithm for infant sitting postural sway data can greatly improve the ability to separate the infants with developmental delay from typically developing infants. PMID:19671183
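    The symbolic entropy family of measures used here reduces the sway signal to a symbol stream and takes the Shannon entropy of short symbol words. A minimal sketch of the idea (binary symbolization about the median; the asymmetric variant developed in the paper, which treats left- and right-of-center sway differently, is not reproduced):

```python
import math
from collections import Counter

def symbolic_entropy(x, word_len=3):
    """Symbolic (word) entropy: binarize around the median, then take the
    Shannon entropy (bits) of overlapping words of length word_len.
    Note: for even-length series this picks the upper median."""
    med = sorted(x)[len(x) // 2]
    symbols = ['1' if v > med else '0' for v in x]
    words = [''.join(symbols[i:i + word_len])
             for i in range(len(symbols) - word_len + 1)]
    counts = Counter(words)
    total = len(words)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

    Lower word entropy indicates more repetitive, less complex sway patterns, which is the direction of the effect reported for infants with delayed development.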

  2. Extensions and applications of a second-order landsurface parameterization

    NASA Technical Reports Server (NTRS)

    Andreou, S. A.; Eagleson, P. S.

    1983-01-01

    Extensions and applications of a second order land surface parameterization, proposed by Andreou and Eagleson are developed. Procedures for evaluating the near surface storage depth used in one cell land surface parameterizations are suggested and tested by using the model. Sensitivity analysis to the key soil parameters is performed. A case study involving comparison with an "exact" numerical model and another simplified parameterization, under very dry climatic conditions and for two different soil types, is also incorporated.

  3. Entropy Analyses of Four Familiar Processes.

    ERIC Educational Resources Information Center

    Craig, Norman C.

    1988-01-01

    Presents entropy analysis of four processes: a chemical reaction, a heat engine, the dissolution of a solid, and osmosis. Discusses entropy, the second law of thermodynamics, and the Gibbs free energy function. (MVL)

  4. Beyond the classical theory of heat conduction: a perspective view of future from entropy

    PubMed Central

    Lai, Xiang; Zhu, Pingan

    2016-01-01

    Energy is conserved by the first law of thermodynamics; its quality degrades constantly due to entropy generation, by the second law of thermodynamics. It is thus important to examine entropy generation, both with regard to how its magnitude can be reduced and with regard to whether it remains bounded as time tends to infinity. This work initiates such an analysis with one-dimensional heat conduction. The work not only offers some fundamental insights into the universe and its future, but also builds up the relation between the second law of thermodynamics and mathematical inequalities by developing the latter, of either new or classical nature. A concise review of entropy is also included, to support the analysis performed here and similar analyses of other processes in the future. PMID:27843400
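    For the one-dimensional setting analyzed here, the entropy generated by steady conduction across a slab can be checked against the entropy-flux balance: the local volumetric generation rate k(dT/dx)²/T² integrates to q(1/T_cold − 1/T_hot) > 0. A worked numerical sketch with assumed property values (k, L, and the boundary temperatures are illustrative, not from the paper):

```python
# Steady 1-D conduction through a slab with a linear temperature profile.
k, L = 1.0, 0.1                   # conductivity [W/(m*K)], thickness [m]
T_hot, T_cold = 400.0, 300.0      # boundary temperatures [K]
q = k * (T_hot - T_cold) / L      # heat flux [W/m^2]

# Integrate the local volumetric generation rate  k * (dT/dx)^2 / T^2.
n = 10000
dx = L / n
dTdx = -(T_hot - T_cold) / L
s_gen = 0.0
for i in range(n):
    T = T_hot + dTdx * (i + 0.5) * dx   # midpoint temperature in cell i
    s_gen += k * dTdx ** 2 / T ** 2 * dx

# Second-law bookkeeping: entropy flux out minus entropy flux in.
s_exact = q * (1.0 / T_cold - 1.0 / T_hot)
```

    The integral reproduces the closed form and is strictly positive, illustrating in the simplest case the boundedness question the paper pursues for time-dependent conduction.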

  5. Analysis of sensitivity to different parameterization schemes for a subtropical cyclone

    NASA Astrophysics Data System (ADS)

    Quitián-Hernández, L.; Fernández-González, S.; González-Alemán, J. J.; Valero, F.; Martín, M. L.

    2018-05-01

    A sensitivity analysis of diverse WRF model physical parameterization schemes is carried out over the lifecycle of a subtropical cyclone (STC). STCs are low-pressure systems that share tropical and extratropical characteristics, with hybrid thermal structures. In October 2014, an STC made landfall in the Canary Islands, causing widespread damage from strong winds and precipitation there. The system began to develop on October 18 and its effects lasted until October 21. Accurate simulation of this type of cyclone continues to be a major challenge because of its rapid intensification and unique characteristics. In the present study, several numerical simulations were performed with the WRF model to assess the sensitivity of the development and intensification of the STC to its various parameterization schemes. The combination of parameterization schemes that best simulated this type of phenomenon was thereby determined. In particular, the parameterization combinations that included the Tiedtke cumulus schemes had the most positive effects on model results. Moreover, concerning STC track validation, optimal results were attained once the STC was fully formed and all convective processes had stabilized. Furthermore, to identify the parameterization schemes that optimally categorize STC structure, a verification using Cyclone Phase Space was carried out. The combination of parameterizations including the Tiedtke cumulus schemes was again the best at categorizing the cyclone's subtropical structure. For strength validation, related atmospheric variables such as wind speed and precipitable water were analyzed. Finally, the effects of using a deterministic or probabilistic approach in simulating intense convective phenomena were evaluated.

  6. Entropy generation across Earth's collisionless bow shock.

    PubMed

    Parks, G K; Lee, E; McCarthy, M; Goldstein, M; Fu, S Y; Cao, J B; Canu, P; Lin, N; Wilber, M; Dandouras, I; Réme, H; Fazakerley, A

    2012-02-10

    Earth's bow shock is a collisionless shock wave, but entropy has never been directly measured across it. The plasma experiments on Cluster and Double Star measure 3D plasma distributions upstream and downstream of the bow shock, allowing calculation of Boltzmann's entropy function H and his famous H theorem, dH/dt≤0. The collisionless Boltzmann (Vlasov) equation predicts that the total entropy does not change if the distribution function across the shock becomes nonthermal, but it allows changes in the entropy density. Here, we present the first direct measurements of entropy density changes across Earth's bow shock and show that the results generally support the Vlasov analysis. These observations are a starting point for a more sophisticated analysis that includes 3D computer modeling of collisionless shocks with input from observed particles, waves, and turbulence.

  7. Entropy generation in biophysical systems

    NASA Astrophysics Data System (ADS)

    Lucia, U.; Maino, G.

    2013-03-01

    Recently, in theoretical biology and biophysical engineering, the entropy production rate has been shown to approach its maximum asymptotically, using probabilities of individual elementary modes distributed in accordance with the Boltzmann distribution. The basis of this approach is the hypothesis that the entropy production rate is maximum at the stationary state. In the present work, this hypothesis is explained and motivated, starting from an entropy generation analysis. The latter quantity is obtained from the entropy balance for open systems, considering the lifetime of the natural real process. The Lagrangian formalism is introduced in order to develop an analytical approach to the thermodynamic analysis of open irreversible systems. The stationary conditions of open systems are thus obtained in relation to entropy generation and the least-action principle. Consequently, the considered hypothesis is proved analytically, and it represents an original basic approach in theoretical and mathematical biology and in biophysical engineering. It is worth remarking that the present results show that entropy generation not only increases but increases as fast as possible.

  8. Extension and Application of High-Speed Digital Imaging Analysis Via Spatiotemporal Correlation and Eigenmode Analysis of Vocal Fold Vibration Before and After Polyp Excision.

    PubMed

    Wang, Jun-Sheng; Olszewski, Emily; Devine, Erin E; Hoffman, Matthew R; Zhang, Yu; Shao, Jun; Jiang, Jack J

    2016-08-01

    To evaluate the spatiotemporal correlation of vocal fold vibration using eigenmode analysis before and after polyp removal, and to explore the potential clinical relevance of spatiotemporal analysis of correlation length and entropy as quantitative voice parameters. We hypothesized that increased order in the vibrating signal after surgical intervention would decrease the eigenmode-based entropy and increase the correlation length. Prospective case series. Forty subjects (23 males, 17 females) with unilateral (n = 24) or bilateral (n = 16) polyps underwent polyp removal. High-speed videoendoscopy was performed preoperatively and 2 weeks postoperatively. Spatiotemporal analysis was performed to determine entropy, a quantification of signal disorder, and correlation length, the size of spatially ordered structure of vocal fold vibration relative to full spatial consistency. The signal analyzed consists of the vibratory pattern in space and time derived from the high-speed video glottal area contour. Entropy decreased (Z = -3.871, P < .001) and correlation length increased (t = -8.913, P < .001) following polyp excision. The intraclass correlation coefficients (ICC) for correlation length and entropy were 0.84 and 0.93. Correlation length and entropy are sensitive to mass lesions. These parameters could potentially be used to augment subjective visualization after polyp excision when evaluating procedural efficacy. © The Author(s) 2016.

  9. A Study of Turkish Chemistry Undergraduates' Understandings of Entropy

    ERIC Educational Resources Information Center

    Sozbilir, Mustafa; Bennett, Judith M.

    2007-01-01

    Entropy is the fundamental concept of chemical thermodynamics that explains the natural tendency of matter and energy in the universe. This analysis presents descriptions of entropy as understood by Turkish chemistry undergraduates.

  10. State fusion entropy for continuous and site-specific analysis of landslide stability changing regularities

    NASA Astrophysics Data System (ADS)

    Liu, Yong; Qin, Zhimeng; Hu, Baodan; Feng, Shuai

    2018-04-01

    Stability analysis is of great significance to landslide hazard prevention, especially analysis of dynamic stability. However, because of unique landslide geological conditions, many existing methods struggle to analyse continuous landslide stability and its changing regularities under a uniform criterion. Based on the relationship between displacement monitoring data, deformation states, and landslide stability, a state fusion entropy method is herein proposed to derive landslide instability through a comprehensive multi-attribute entropy analysis of deformation states, which are defined by a proposed joint clustering method combining K-means and a cloud model. In a detailed case study of the Xintan landslide, cumulative state fusion entropy shows an obvious increasing trend after the landslide entered its accelerative deformation stage, and historical maxima align closely with macroscopic landslide deformation behaviour at key time nodes. Reasonable results are also obtained in applications to several other landslides in the Three Gorges Reservoir region of China. Combined with field surveys, state fusion entropy may serve to assess landslide stability and to judge landslide evolutionary stages.

  11. Effects of Land Surface Heterogeneity on Simulated Boundary-Layer Structures from the LES to the Mesoscale

    NASA Astrophysics Data System (ADS)

    Poll, Stefan; Shrestha, Prabhakar; Simmer, Clemens

    2017-04-01

    Land heterogeneity influences the atmospheric boundary layer (ABL) structure, including organized (secondary) circulations which feed back on land-atmosphere exchange fluxes. The latter effects, especially, cannot be incorporated explicitly in regional and climate models because of their coarse computational grids, and must be parameterized. Current parameterizations lead, however, to uncertainties in modeled surface fluxes and boundary layer evolution, which feed back on cloud initiation and precipitation. This study analyzes the impact of different horizontal grid resolutions on the simulated boundary layer structures in terms of stability, height, and induced secondary circulations. The ICON-LES (Icosahedral Nonhydrostatic model in LES mode), developed by the MPI-M and the German Weather Service (DWD) within the framework of HD(CP)2, is used. ICON is dynamically downscaled through multiple scales of 20 km, 7 km, 2.8 km, 625 m, 312 m, and 156 m grid spacing for several days over Germany and parts of neighboring countries, under different synoptic conditions. We examined the entropy spectrum of the land surface heterogeneity at these grid resolutions for several locations close to measurement sites, such as Lindenberg, Jülich, Cabauw and Melpitz, and studied its influence on the surface fluxes and the evolution of the boundary layer profiles.

  12. Generalized sample entropy analysis for traffic signals based on similarity measure

    NASA Astrophysics Data System (ADS)

    Shang, Du; Xu, Mengjia; Shang, Pengjian

    2017-05-01

    Sample entropy is a prevailing method for quantifying the complexity of a time series. In this paper, a modified method of generalized sample entropy with surrogate data analysis is proposed as a new measure for assessing the complexity of a complex dynamical system such as traffic signals. The method, based on a similarity distance, matches signal patterns in a different way and reveals distinct complexity behaviors. Simulations are conducted on synthetic data and traffic signals to provide a comparative study demonstrating the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. First, it overcomes the limitation relating the dimension parameter to the length of the series. Second, the modified sample entropy functions can quantitatively distinguish time series from different complex systems through the similarity measure.
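    For reference, the classic sample entropy that the modified measure builds on can be sketched as follows (a didactic O(n²) version using an absolute tolerance, whereas r = 0.2·std(x) is the common convention; the paper's similarity-distance variant replaces the Chebyshev pattern match):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -ln(A/B), with B the number of template pairs of
    length m and A of length m+1 within Chebyshev tolerance r."""
    def matches(length):
        n = len(x) - length + 1
        count = 0
        for i in range(n):
            for j in range(i + 1, n):
                if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= r:
                    count += 1
        return count
    b = matches(m)
    a = matches(m + 1)
    return -math.log(a / b)   # undefined if no length-(m+1) matches survive

# Illustration: a strictly alternating series is highly regular, while a
# chaotic logistic-map series is not.
x_regular = [0.0, 1.0] * 50
z, x_chaotic = 0.3, []
for _ in range(200):
    z = 3.99 * z * (1.0 - z)
    x_chaotic.append(z)
```

    The regular series yields an entropy near zero; the chaotic series yields a clearly larger value, which is the separation the traffic-signal analysis exploits.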

  13. Whys and Hows of the Parameterized Interval Analyses: A Guide for the Perplexed

    NASA Astrophysics Data System (ADS)

    Elishakoff, I.

    2013-10-01

    Novel elements of the parameterized interval analysis developed in [1, 2] are emphasized in this response, to Professor E.D. Popova, or possibly to others who may be perplexed by the parameterized interval analysis. It is also shown that the overwhelming majority of comments by Popova [3] are based on a misreading of our paper [1]. Partial responsibility for this misreading can be attributed to the fact that explanations provided in [1] were laconic. These could have been more extensive in view of the novelty of our approach [1, 2]. It is our duty, therefore, to reiterate, in this response, the whys and hows of parameterization of intervals, introduced in [1] to incorporate the possibly available information on dependencies between various intervals describing the problem at hand. This possibility appears to have been discarded by the standard interval analysis, which may, as a result, lead to overdesign, leading to the possible divorce of engineers from the otherwise beautiful interval analysis.

  14. Bivariate Rainfall and Runoff Analysis Using Shannon Entropy Theory

    NASA Astrophysics Data System (ADS)

    Rahimi, A.; Zhang, L.

    2012-12-01

    Rainfall-runoff analysis is a key component of many hydrological and hydraulic designs in which the dependence of rainfall and runoff needs to be studied. Conventional bivariate distributions are often unable to model rainfall-runoff variables because they either constrain the range of the dependence or fix the form of the marginal distributions. This paper therefore presents an approach for deriving an entropy-based joint rainfall-runoff distribution using Shannon entropy theory. The derived distribution can model the full range of dependence and allows different specified marginals. Modeling and estimation proceed as follows: (i) univariate analysis of the marginal distributions, in two steps, (a) using nonparametric statistics to detect modes and the underlying probability density, and (b) fitting appropriate parametric probability density functions; (ii) definition of constraints based on the univariate analysis and the dependence structure; and (iii) derivation and validation of the entropy-based joint distribution. To validate the method, rainfall-runoff data were collected from the small agricultural experimental watersheds located in a semi-arid region near Riesel (Waco), Texas, maintained by the USDA. The univariate analysis shows that the rainfall variables follow the gamma distribution, whereas the runoff variables have a mixed structure and follow a mixed-gamma distribution. With this information, the entropy-based joint distribution is derived using the first moments, the first moments of the log-transformed rainfall and runoff, and the covariance between rainfall and runoff. The results indicate that (1) the derived joint distribution successfully preserves the dependence between rainfall and runoff, and (2) K-S goodness-of-fit tests confirm that the re-derived marginal distributions reveal the underlying univariate probability densities, further assuring that the entropy-based joint rainfall-runoff distribution is satisfactorily derived. Overall, the study shows that Shannon entropy theory can be satisfactorily applied to model the dependence between rainfall and runoff, and that the entropy-based joint distribution can capture dependence structures that conventional bivariate joint distributions cannot.

  15. Sample entropy analysis of cervical neoplasia gene-expression signatures

    PubMed Central

    Botting, Shaleen K; Trzeciakowski, Jerome P; Benoit, Michelle F; Salama, Salama A; Diaz-Arrastia, Concepcion R

    2009-01-01

    Background We introduce approximate entropy as a mathematical method of analysis for microarray data. Approximate entropy is applied here as a method to classify the complex gene-expression patterns resulting from a clinical sample set. Since entropy is a measure of disorder in a system, we reasoned that by choosing genes which display minimum entropy in normal controls and maximum entropy in the cancerous sample set, we would identify the genes that display the greatest variability in the cancerous set. Here we describe a method of utilizing Approximate Sample Entropy (ApSE) analysis to identify genes of interest with the highest probability of producing an accurate, predictive classification model from our data set. Results In developing a diagnostic gene-expression profile for cervical intraepithelial neoplasia (CIN) and squamous cell carcinoma of the cervix, we identified 208 genes which are unchanging in all normal tissue samples, yet exhibit a random pattern indicative of the genetic instability and heterogeneity of malignant cells; this may be measured in terms of the ApSE when compared to normal tissue. We validated 10 of these genes on 10 normal and 20 cancer and CIN3 samples. The predictive value of the sample entropy calculation for these 10 genes of interest is promising (75% sensitivity, 80% specificity for prediction of cervical cancer over CIN3). Conclusion The success of the Approximate Sample Entropy approach in discerning alterations in complexity from a biological system with such a relatively small sample set, and in extracting biologically relevant genes of interest, holds great promise. PMID:19232110

  16. Entropy Analysis of Kinetic Flux Vector Splitting Schemes for the Compressible Euler Equations

    NASA Technical Reports Server (NTRS)

    Shiuhong, Lui; Xu, Jun

    1999-01-01

    Flux Vector Splitting (FVS) schemes are one group of approximate Riemann solvers for the compressible Euler equations. In this paper, the discretized entropy condition of the Kinetic Flux Vector Splitting (KFVS) scheme based on gas-kinetic theory is proved. The proof of the entropy condition involves the difference between the entropy definitions for distinguishable and indistinguishable particles.

  17. Multiscale Shannon entropy and its application in the stock market

    NASA Astrophysics Data System (ADS)

    Gu, Rongbao

    2017-10-01

    In this paper, we perform a multiscale entropy analysis of the Dow Jones Industrial Average index using the Shannon entropy. The stock index exhibits multiscale entropy characteristics caused by noise in the market. The entropy is shown to have significant predictive ability for the stock index over both long and short horizons, and empirical results verify that noise does exist in the market and can affect stock prices. This has important implications for market participants such as noise traders.
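    A minimal sketch of the underlying computation, a plug-in (histogram) Shannon entropy evaluated on window-averaged copies of the series (the bin count and scales are illustrative choices, not the paper's):

```python
import math
import random
from collections import Counter

def shannon_entropy(x, bins=10):
    """Plug-in (histogram) Shannon entropy of a series, in bits."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / bins or 1.0   # fall back to 1.0 for constant series
    counts = Counter(min(int((v - lo) / width), bins - 1) for v in x)
    n = len(x)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def multiscale_shannon(x, scales=(1, 2, 4, 8)):
    """Shannon entropy of the series coarse-grained (window-averaged)
    at each scale factor."""
    out = {}
    for s in scales:
        cg = [sum(x[i:i + s]) / s for i in range(0, len(x) - s + 1, s)]
        out[s] = shannon_entropy(cg)
    return out

# Illustration on a pseudo-random series.
random.seed(1)
series = [random.random() for _ in range(512)]
profile = multiscale_shannon(series)
```

    Applied to a price or return series, the per-scale entropies form the multiscale profile analyzed in the paper.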

  18. Hydraulic Conductivity Estimation using Bayesian Model Averaging and Generalized Parameterization

    NASA Astrophysics Data System (ADS)

    Tsai, F. T.; Li, X.

    2006-12-01

    Non-uniqueness of the parameterization scheme is an inherent problem in groundwater inverse modeling due to limited data. To cope with this non-uniqueness, we introduce a Bayesian Model Averaging (BMA) method to integrate a set of selected parameterization methods. The estimation uncertainty in BMA includes the uncertainty in individual parameterization methods, as the within-parameterization variance, and the uncertainty from using different parameterization methods, as the between-parameterization variance. Moreover, the generalized parameterization (GP) method is considered in the geostatistical framework in this study. The GP method aims to increase the flexibility of parameterization by combining a zonation structure with an interpolation method. The use of BMA with GP avoids over-confidence in a single parameterization method. A normalized least-squares estimation (NLSE) is adopted to calculate the posterior probability for each GP. We employ the adjoint state method for the sensitivity analysis of the weighting coefficients in the GP method; the adjoint state method is also applied to the NLSE problem. The proposed methodology is applied to the Alamitos Barrier Project (ABP) in California, where the spatially distributed hydraulic conductivity is estimated. The optimal weighting coefficients embedded in GP are identified through maximum likelihood estimation (MLE), in which the misfits between the observed and calculated groundwater heads are minimized. The conditional mean and conditional variance of the estimated hydraulic conductivity distribution using BMA are obtained to assess the estimation uncertainty.
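    The BMA variance decomposition described here, within-parameterization plus between-parameterization, can be sketched compactly (an illustrative stand-in: forming weights from a least-squares misfit is only a schematic proxy for the paper's NLSE posterior, and sigma2, sses, and the function name are assumptions for the example):

```python
import math

def bma_combine(means, variances, sses, sigma2=1.0):
    """Combine per-parameterization estimates with weights derived from a
    least-squares misfit (illustrative proxy for an NLSE posterior).

    Returns the BMA mean and the total variance, i.e. the
    within-parameterization plus between-parameterization variance."""
    raw = [math.exp(-0.5 * s / sigma2) for s in sses]
    z = sum(raw)
    w = [r / z for r in raw]                      # posterior model weights
    mean = sum(wi * mi for wi, mi in zip(w, means))
    within = sum(wi * vi for wi, vi in zip(w, variances))
    between = sum(wi * (mi - mean) ** 2 for wi, mi in zip(w, means))
    return mean, within + between
```

    With equal misfits the weights are equal, and any spread among the per-parameterization means shows up entirely as between-parameterization variance.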

  19. Multiwavelet packet entropy and its application in transmission line fault recognition and classification.

    PubMed

    Liu, Zhigang; Han, Zhiwei; Zhang, Yang; Zhang, Qiaoge

    2014-11-01

    Multiwavelets possess better properties than traditional wavelets. Multiwavelet packet transformation has more high-frequency information. Spectral entropy can be applied as an analysis index to the complexity or uncertainty of a signal. This paper tries to define four multiwavelet packet entropies to extract the features of different transmission line faults, and uses a radial basis function (RBF) neural network to recognize and classify 10 fault types of power transmission lines. First, the preprocessing and postprocessing problems of multiwavelets are presented. Shannon entropy and Tsallis entropy are introduced, and their difference is discussed. Second, multiwavelet packet energy entropy, time entropy, Shannon singular entropy, and Tsallis singular entropy are defined as the feature extraction methods of transmission line fault signals. Third, the plan of transmission line fault recognition using multiwavelet packet entropies and an RBF neural network is proposed. Finally, the experimental results show that the plan with the four multiwavelet packet energy entropies defined in this paper achieves better performance in fault recognition. The performance with SA4 (symmetric antisymmetric) multiwavelet packet Tsallis singular entropy is the best among the combinations of different multiwavelet packets and the four multiwavelet packet entropies.

  20. Maximum entropy production principle for geostrophic turbulence

    NASA Astrophysics Data System (ADS)

    Sommeria, J.; Bouchet, F.; Chavanis, P. H.

    2003-04-01

    In 2D turbulence, complex stirring leads to the formation of steady organized states, once fine-scale fluctuations have been filtered out. This self-organization can be explained in terms of statistical equilibrium for vorticity, as the most likely outcome of vorticity parcel rearrangements under the constraints of the conservation laws. A mixing entropy describing the vorticity rearrangements is introduced. Extension to the shallow water system has been proposed by Chavanis P.H. and Sommeria J. (2002), Phys. Rev. E. Generalization to multi-layer geostrophic flows is formally straightforward. Outside equilibrium, eddy fluxes should drive the system toward equilibrium, in the spirit of non-equilibrium linear thermodynamics. This can be formalized in terms of a principle of maximum entropy production (MEP), as shown by Robert and Sommeria (1991), Phys. Rev. Lett. 69. A parameterization of eddy fluxes is then obtained, involving an eddy diffusivity plus a drift term acting at larger scale. These two terms balance each other at equilibrium, resulting in a nontrivial steady flow, which is the mean state of the statistical equilibrium. Applications of this eddy parameterization will be presented, in the context of oceanic circulation and Jupiter's Great Red Spot. Quantitative tests will be discussed, obtained by comparison with direct numerical simulations. Kinetic models, inspired by plasma physics, provide a more precise description of the relaxation toward equilibrium, as shown by Chavanis P.H. 2000 ``Quasilinear theory of the 2D Euler equation'', Phys. Rev. Lett. 84. This approach provides relaxation equations with a form similar to the MEP, but not identical. In conclusion, the MEP provides the right trends of the system, but its precise justification remains elusive.

  1. Entropy of electromyography time series

    NASA Astrophysics Data System (ADS)

    Kaufman, Miron; Zurcher, Ulrich; Sung, Paul S.

    2007-12-01

    A nonlinear analysis based on Renyi entropy is applied to electromyography (EMG) time series from back muscles. The time dependence of the entropy of the EMG signal exhibits a crossover from a subdiffusive regime at short times to a plateau at longer times. We argue that this behavior characterizes complex biological systems. The plateau value of the entropy can be used to differentiate between healthy and low back pain individuals.

  2. Information Entropy Analysis of the H1N1 Genetic Code

    NASA Astrophysics Data System (ADS)

    Martwick, Andy

    2010-03-01

    During the current H1N1 pandemic, viral samples are being obtained from large numbers of infected people world-wide and are being sequenced on the NCBI Influenza Virus Resource Database. The information entropy of the sequences was computed from the probability of occurrence of each nucleotide base at every position of each set of sequences using Shannon's definition of information entropy, H = Σ_b p_b log2(1/p_b), where H is the observed information entropy at each nucleotide position and p_b is the probability of base b among the nucleotides A, C, G, U. The information entropy of the current H1N1 pandemic is compared to reference human and swine H1N1 entropy. As expected, the current H1N1 entropy is in a low-entropy state and has a very large mutation potential. Using the entropy method on mature genes, we can identify low-entropy regions of nucleotides that generally correlate with critical protein function.
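    The per-position computation follows directly from the definition; a short sketch on aligned toy sequences (illustrative only, not the NCBI data):

```python
import math
from collections import Counter

def positional_entropy(seqs):
    """Shannon information entropy H = sum_b p_b * log2(1/p_b), in bits,
    at each position of a set of aligned, equal-length sequences."""
    n = len(seqs)
    out = []
    for pos in range(len(seqs[0])):
        counts = Counter(s[pos] for s in seqs)
        out.append(sum((c / n) * math.log2(n / c) for c in counts.values()))
    return out
```

    A fully conserved position contributes zero entropy, while a position where all four bases are equally likely contributes log2(4) = 2 bits.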

  3. Whole-Lesion Apparent Diffusion Coefficient-Based Entropy-Related Parameters for Characterizing Cervical Cancers: Initial Findings.

    PubMed

    Guan, Yue; Li, Weifeng; Jiang, Zhuoran; Chen, Ying; Liu, Song; He, Jian; Zhou, Zhengyang; Ge, Yun

    2016-12-01

    This study aimed to develop whole-lesion apparent diffusion coefficient (ADC)-based entropy-related parameters of cervical cancer to preliminarily assess intratumoral heterogeneity of this lesion in comparison to adjacent normal cervical tissues. A total of 51 women (mean age, 49 years) with cervical cancers confirmed by biopsy underwent 3-T pelvic diffusion-weighted magnetic resonance imaging with b values of 0 and 800 s/mm2 prospectively. ADC-based entropy-related parameters including first-order entropy and second-order entropies were derived from the whole tumor volume as well as adjacent normal cervical tissues. Intraclass correlation coefficient, Wilcoxon test with Bonferroni correction, Kruskal-Wallis test, and receiver operating characteristic curve were used for statistical analysis. All the parameters showed excellent interobserver agreement (all intraclass correlation coefficients > 0.900). Entropy, entropy(H)0, entropy(H)45, entropy(H)90, entropy(H)135, and entropy(H)mean were significantly higher, whereas entropy(H)range and entropy(H)std were significantly lower in cervical cancers compared to adjacent normal cervical tissues (all P < .0001). Kruskal-Wallis test showed that there were no significant differences among the values of the various second-order entropies, including entropy(H)0, entropy(H)45, entropy(H)90, entropy(H)135, and entropy(H)mean. All second-order entropies had a larger area under the receiver operating characteristic curve than first-order entropy in differentiating cervical cancers from adjacent normal cervical tissues. Further, entropy(H)45, entropy(H)90, entropy(H)135, and entropy(H)mean had the same largest area under the receiver operating characteristic curve of 0.867. Whole-lesion ADC-based entropy-related parameters of cervical cancers were developed successfully, showing initial potential for characterizing intratumoral heterogeneity in comparison to adjacent normal cervical tissues.
Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
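    The first-order entropy above is the Shannon entropy of the ADC histogram, while the directional second-order entropies at 0°, 45°, 90°, and 135° are presumably derived from gray-level co-occurrence matrices (GLCMs) at those offsets (an assumption; the abstract does not name the texture method). A minimal sketch on a toy quantized map:

```python
import math

def glcm_entropy(image, dx, dy):
    """Shannon entropy (bits) of the gray-level co-occurrence
    distribution for pixel pairs at offset (dx, dy)."""
    counts = {}
    rows, cols = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                pair = (image[r][c], image[r2][c2])
                counts[pair] = counts.get(pair, 0) + 1
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# offsets for the four conventional GLCM directions
ANGLES = {0: (1, 0), 45: (1, -1), 90: (0, -1), 135: (-1, -1)}

toy_map = [[0, 0, 1, 1],   # hypothetical quantized ADC values
           [0, 0, 1, 1],
           [0, 2, 2, 2],
           [2, 2, 3, 3]]
per_angle = {a: glcm_entropy(toy_map, dx, dy) for a, (dx, dy) in ANGLES.items()}
mean_h = sum(per_angle.values()) / len(per_angle)  # analogue of entropy(H)mean
```

    A perfectly homogeneous region gives zero co-occurrence entropy, which is why heterogeneous tumor tissue is expected to score higher than more uniform normal tissue.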

  4. Extended statistical entropy analysis as a quantitative management tool for water resource systems

    NASA Astrophysics Data System (ADS)

    Sobantka, Alicja; Rechberger, Helmut

    2010-05-01

    The use of entropy in hydrology and water resources has found a variety of applications. As water resource systems are inherently spatial and complex, a stochastic description of these systems is needed, and entropy theory enables the development of such a description by determining the least-biased probability distributions with limited knowledge and data. Entropy can also serve as a basis for risk and reliability analysis. The relative entropy has been variously interpreted as a measure of freedom of choice, uncertainty and disorder, information content, missing information, or information gain or loss. In the analysis of empirical data, entropy is a further measure of dispersion, an alternative to the variance. As an evaluation tool, statistical entropy analysis (SEA) has been developed by previous workers to quantify the power of a process to concentrate chemical elements. Within this research programme, SEA is to be extended to chemical compounds and tested for its deficits and potentials in systems where water resources play an important role. The extended SEA (eSEA) will be developed first for the nitrogen balance in waste water treatment plants (WWTP). Later applications to the emission of substances to water bodies such as groundwater (e.g. leachate from landfills) will also be possible. By applying eSEA to the nitrogen balance in a WWTP, all nitrogen compounds that may occur during the treatment process are taken into account and quantified in terms of their impact on the environment and human health. It has been shown that entropy-reducing processes are part of modern waste management. Generally, materials management should be performed in a way that avoids a significant rise in entropy. The entropy metric might also be used for benchmarking WWTPs, yielding a measure of their efficiency. By improving and optimizing the efficiency of WWTPs with respect to the state of the art of technology, waste water treatment could become more resource-preserving.
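    The core SEA idea can be illustrated as the Shannon entropy of a substance's distribution across flows; this sketch uses made-up nitrogen loads and omits the compound- and concentration-weighting that the extended method (eSEA) adds:

```python
import math

def statistical_entropy(masses):
    """Shannon entropy (bits) of how a substance is split across flows.
    Zero means fully concentrated in one flow; higher means more dispersed."""
    total = sum(masses)
    return -sum((m / total) * math.log2(m / total) for m in masses if m > 0)

# hypothetical nitrogen loads (kg N): one input flow vs. three output flows
influent = [100.0]                 # all nitrogen concentrated in the influent
outputs = [55.0, 30.0, 15.0]       # effluent, sludge, off-gas (illustrative)
dispersion_rise = statistical_entropy(outputs) - statistical_entropy(influent)
```

    A process whose outputs concentrate the substance (negative rise) would be rated favourably under this metric, in line with the goal of avoiding a significant entropy rise.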

  5. RED: a set of molecular descriptors based on Renyi entropy.

    PubMed

    Delgado-Soler, Laura; Toral, Raul; Tomás, M Santos; Rubio-Martinez, Jaime

    2009-11-01

    New molecular descriptors, RED (Rényi entropy descriptors), based on the generalized entropies introduced by Rényi are presented. Topological descriptors based on molecular features have proven to be useful for describing molecular profiles. The Rényi entropy is used as a variability measure to contract each feature-pair distribution into a component of the descriptor vector. The performance of RED descriptors was tested in the analysis of different sets of molecular distances, virtual screening, and pharmacological profiling. The free parameter of the Rényi entropy has been optimized for all the considered applications.
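    For reference, the Rényi entropy of order α (the free parameter the authors optimize) can be sketched as follows; the distributions below are illustrative, not feature-pair data:

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy of order alpha (nats); alpha = 1 is the Shannon limit."""
    if alpha == 1.0:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return math.log(sum(p ** alpha for p in probs if p > 0)) / (1.0 - alpha)

uniform = [0.25] * 4              # maximally spread distribution
peaked = [0.85, 0.05, 0.05, 0.05] # dominated by one outcome
```

    Larger α weights the dominant probabilities more heavily, which is what makes the order a tunable knob when building descriptors from distributions.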

  6. High-Order Entropy Stable Finite Difference Schemes for Nonlinear Conservation Laws: Finite Domains

    NASA Technical Reports Server (NTRS)

    Fisher, Travis C.; Carpenter, Mark H.

    2013-01-01

    Developing stable and robust high-order finite difference schemes requires mathematical formalism and appropriate methods of analysis. In this work, nonlinear entropy stability is used to derive provably stable high-order finite difference methods with formal boundary closures for conservation laws. Particular emphasis is placed on the entropy stability of the compressible Navier-Stokes equations. A newly derived entropy stable weighted essentially non-oscillatory finite difference method is used to simulate problems with shocks and a conservative, entropy stable, narrow-stencil finite difference approach is used to approximate viscous terms.

  7. Differences between state entropy and bispectral index during analysis of identical electroencephalogram signals: a comparison with two randomised anaesthetic techniques.

    PubMed

    Pilge, Stefanie; Kreuzer, Matthias; Karatchiviev, Veliko; Kochs, Eberhard F; Malcharek, Michael; Schneider, Gerhard

    2015-05-01

    It is claimed that bispectral index (BIS) and state entropy reflect an identical clinical spectrum, the hypnotic component of anaesthesia. So far, it is not known to what extent different devices display similar index values while processing identical electroencephalogram (EEG) signals. To compare BIS and state entropy during analysis of identical EEG data. Inspection of raw EEG input to detect potential causes of erroneous index calculation. Offline re-analysis of EEG data from a randomised, single-centre controlled trial using the Entropy Module and an Aspect A-2000 monitor. Klinikum rechts der Isar, Technische Universität München, Munich. Forty adult patients undergoing elective surgery under general anaesthesia. Blocked randomisation of 20 patients per anaesthetic group (sevoflurane/remifentanil or propofol/remifentanil). Isolated forearm technique for differentiation between consciousness and unconsciousness. Prediction probability (PK) of state entropy to discriminate consciousness from unconsciousness. Correlation and agreement between state entropy and BIS from deep to light hypnosis. Analysis of raw EEG compared with index values that are in conflict with clinical examination, with frequency measures (frequency bands/Spectral Edge Frequency 95) and visual inspection for physiological EEG patterns (e.g. beta or delta arousal), pathophysiological features such as high-frequency signals (electromyogram/high-frequency EEG or eye fluttering/saccades), different types of electro-oculogram or epileptiform EEG and technical artefacts. PK of state entropy was 0.80 and of BIS 0.84; the correlation coefficient of state entropy with BIS was 0.78. Nine percent of BIS and 14% of state entropy values disagreed with clinical examination. Highest incidence of disagreement occurred after state transitions, in particular for state entropy after loss of consciousness during sevoflurane anaesthesia. 
EEG sequences which led to false 'conscious' index values often showed high-frequency signals and eye blinks. High-frequency EEG/electromyogram signals were pooled because a separation into EEG and fast electro-oculogram, for example eye fluttering or saccades, on the basis of a single EEG channel may not be very reliable. These signals led to higher Spectral Edge Frequency 95 and ratio of relative beta and gamma band power than EEG signals, indicating adequate unconscious classification. The frequency of other artefacts that were assignable, for example technical artefacts, movement artefacts, was negligible and they were excluded from analysis. High-frequency signals and eye blinks may account for index values that falsely indicate consciousness. Compared with BIS, state entropy showed more false classifications of the clinical state at transition between consciousness and unconsciousness.

  8. Parameters Selection for Bivariate Multiscale Entropy Analysis of Postural Fluctuations in Fallers and Non-Fallers Older Adults.

    PubMed

    Ramdani, Sofiane; Bonnet, Vincent; Tallon, Guillaume; Lagarde, Julien; Bernard, Pierre Louis; Blain, Hubert

    2016-08-01

    Entropy measures are often used to quantify the regularity of postural sway time series. Recent methodological developments provided both multivariate and multiscale approaches allowing the extraction of complexity features from physiological signals; see "Dynamical complexity of human responses: A multivariate data-adaptive framework," in Bulletin of Polish Academy of Science and Technology, vol. 60, p. 433, 2012. The resulting entropy measures are good candidates for the analysis of bivariate postural sway signals exhibiting nonstationarity and multiscale properties. These methods are dependent on several input parameters such as embedding parameters. Using two data sets collected from institutionalized frail older adults, we numerically investigate the behavior of a recent multivariate and multiscale entropy estimator; see "Multivariate multiscale entropy: A tool for complexity analysis of multichannel data," Physical Review E, vol. 84, p. 061918, 2011. We propose criteria for the selection of the input parameters. Using these optimal parameters, we statistically compare the multivariate and multiscale entropy values of postural sway data of non-faller subjects to those of fallers. These two groups are discriminated by the resulting measures over multiple time scales. We also demonstrate that the typical parameter settings proposed in the literature lead to entropy measures that do not distinguish the two groups. This last result confirms the importance of the selection of appropriate input parameters.

  9. Multiscale permutation entropy analysis of electrocardiogram

    NASA Astrophysics Data System (ADS)

    Liu, Tiebing; Yao, Wenpo; Wu, Min; Shi, Zhaorong; Wang, Jun; Ning, Xinbao

    2017-04-01

    Multiscale permutation entropy (MPE) was applied to ECG feature extraction to provide a comprehensive nonlinear analysis of the ECG. Three kinds of ECG from the PhysioNet database are used in this paper: congestive heart failure (CHF) patients, and healthy young and elderly subjects. We set the embedding dimension to 4, adjust the scale factor from 2 to 100 with a step size of 2, and compare MPE with multiscale entropy (MSE). As the scale factor increases, the MPE complexity of the three ECG signals first decreases and then increases. For scale factors between 10 and 32, the complexities of the three ECG types differ most: the entropy of the elderly subjects is on average 0.146 lower than that of the CHF patients and 0.025 higher than that of the healthy young, in line with normal physiological characteristics. The test results show that MPE can be applied effectively to nonlinear ECG analysis and can effectively distinguish the different ECG signals.
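    The procedure described (coarse-grain the series, then compute the normalized entropy of ordinal patterns with embedding dimension 4) can be sketched as follows; the signal below is synthetic, not PhysioNet data:

```python
import math
import random

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale` (the multiscale step)."""
    return [sum(x[i * scale:(i + 1) * scale]) / scale
            for i in range(len(x) // scale)]

def permutation_entropy(x, m=4):
    """Normalized Shannon entropy of ordinal patterns of length m."""
    counts = {}
    for i in range(len(x) - m + 1):
        w = x[i:i + m]
        pattern = tuple(sorted(range(m), key=w.__getitem__))  # rank order
        counts[pattern] = counts.get(pattern, 0) + 1
    n = sum(counts.values())
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(math.factorial(m))  # 0 = fully ordered, 1 = random

def mpe(x, m=4, scales=range(2, 101, 2)):
    return {s: permutation_entropy(coarse_grain(x, s), m) for s in scales}

random.seed(0)
signal = [0.8 + 0.1 * random.random() for _ in range(4000)]  # synthetic RR-like series
curve = mpe(signal, scales=range(2, 41, 2))
```

    Plotting `curve` against the scale factor gives the complexity-versus-scale profile the abstract compares across subject groups.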

  10. A retrospective analysis of the effect of blood transfusion on cerebral oximetry entropy and acute kidney injury.

    PubMed

    Engoren, Milo; Brown, Russell R; Dubovoy, Anna

    2017-01-01

    Acute anemia is associated with both cerebral dysfunction and acute kidney injury and is often treated with red blood cell transfusion. We sought to determine if blood transfusion changed the cerebral oximetry entropy, a measure of the complexity or irregularity of the oximetry values, and if this change was associated with subsequent acute kidney injury. This was a retrospective, case-control study of patients undergoing cardiac surgery with cardiopulmonary bypass at a tertiary care hospital, comparing those who received a red blood cell transfusion to those who did not. Acute kidney injury was defined as a perioperative increase in serum creatinine by ⩾26.4 μmol/L or by ⩾50% increase. Entropy was measured using approximate entropy, sample entropy, forbidden word entropy and basescale4 entropy in 500-point sets. Forty-four transfused patients were matched to 88 randomly selected non-transfused patients. All measures of entropy had small changes in the transfused group, but increased in the non-transfused group (p<0.05, for all comparisons). Thirty-five of 132 patients (27%) suffered acute kidney injury. Based on preoperative factors, patients who suffered kidney injury were similar to those who did not, including baseline cerebral oximetry levels. After analysis with hierarchical logistic regression, the change in basescale4 entropy (odds ratio = 1.609, 95% confidence interval = 1.057-2.450, p = 0.027) and the interaction between basescale entropy and transfusion were significantly associated with subsequent development of acute kidney injury. The transfusion of red blood cells was associated with a smaller rise in entropy values compared to non-transfused patients, suggesting a change in the regulation of cerebral oxygenation, and these changes in cerebral oxygenation are also associated with acute kidney injury.
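    Of the listed measures, sample entropy is the most standard; a brute-force sketch follows (O(n²), fine for 500-point sets; the tolerance default r = 0.2·SD is a common convention, not necessarily the study's setting):

```python
import math
import random

def _match_pairs(x, m, r):
    """Count template pairs of length m within Chebyshev distance r."""
    n = len(x) - m
    return sum(1
               for i in range(n)
               for j in range(i + 1, n)
               if max(abs(x[i + k] - x[j + k]) for k in range(m)) <= r)

def sample_entropy(x, m=2, r=None):
    """SampEn = -ln(A/B): A, B are match counts at lengths m+1 and m."""
    if r is None:
        mu = sum(x) / len(x)
        r = 0.2 * (sum((v - mu) ** 2 for v in x) / len(x)) ** 0.5
    b = _match_pairs(x, m, r)
    a = _match_pairs(x, m + 1, r)
    return float('inf') if a == 0 or b == 0 else -math.log(a / b)

periodic = [0.0, 1.0] * 100      # perfectly regular signal -> near-zero SampEn
random.seed(1)
noisy = [random.random() for _ in range(200)]  # irregular signal -> higher SampEn
```

    Regular signals score near zero and irregular ones higher, which is the sense in which the complexity of the cerebral oximetry series is compared between transfused and non-transfused groups.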

  11. SUPERMODEL ANALYSIS OF A1246 AND J255: ON THE EVOLUTION OF GALAXY CLUSTERS FROM HIGH TO LOW ENTROPY STATES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fusco-Femiano, R.; Lapi, A., E-mail: roberto.fuscofemiano@iaps.inaf.it

    2015-02-10

    We present an analysis of high-quality X-ray data out to the virial radius for the two galaxy clusters A1246 and GMBCG J255.34805+64.23661 (J255) by means of our entropy-based SuperModel. For A1246 we find that the spherically averaged entropy profile of the intracluster medium (ICM) progressively flattens outward, and that a nonthermal pressure component amounting to ≈20% of the total is required to support hydrostatic equilibrium in the outskirts; there we also estimate a modest value C ≈ 1.6 of the ICM clumping factor. These findings agree with previous analyses of other cool-core, relaxed clusters, and lend further support to the picture by Lapi et al. that relates the entropy flattening, the development of the nonthermal pressure component, and the azimuthal variation of ICM properties to weakening boundary shocks. In this scenario clusters are born in a high-entropy state throughout, and are expected to develop on similar timescales a low-entropy state both at the center due to cooling, and in the outskirts due to weakening shocks. However, the analysis of J255 testifies how such a typical evolutionary course can be interrupted or even reversed by merging, especially at intermediate redshift, as predicted by Cavaliere et al. In fact, a merger has rejuvenated the ICM of this cluster at z ≈ 0.45 by reestablishing a high-entropy state in the outskirts, while leaving intact or erasing only partially the low-entropy, cool core at the center.

  12. ECG contamination of EEG signals: effect on entropy.

    PubMed

    Chakrabarti, Dhritiman; Bansal, Sonia

    2016-02-01

    Entropy™ is a proprietary algorithm which uses spectral entropy analysis of electroencephalographic (EEG) signals to produce indices that are used as a measure of depth of hypnosis. We describe a report of electrocardiographic (ECG) contamination of EEG signals leading to fluctuating erroneous Entropy values. An explanation is provided for the mechanism behind this observation by describing the spread of ECG signals in the head and neck and their influence on EEG/Entropy, correlating the observation with the published Entropy algorithm. While the Entropy algorithm has been well conceived, there are still instances in which it can produce erroneous values. Such erroneous values and their cause may be identified by close scrutiny of the EEG waveform if Entropy values seem out of sync with those expected at given anaesthetic levels.

  13. Entropy change of biological dynamics in COPD.

    PubMed

    Jin, Yu; Chen, Chang; Cao, Zhixin; Sun, Baoqing; Lo, Iek Long; Liu, Tzu-Ming; Zheng, Jun; Sun, Shixue; Shi, Yan; Zhang, Xiaohua Douglas

    2017-01-01

    In this century, the rapid development of large data storage technologies, mobile network technology, and portable medical devices makes it possible to measure, record, store, and track large amounts of data from human physiological signals. Entropy is a key metric for quantifying the irregularity contained in physiological signals. In this review, we focus on how entropy changes in various physiological signals in COPD. Our review concludes that the entropy change depends on the type of physiological signal under investigation. For major physiological signals related to respiratory diseases, such as airflow, heart rate variability, and gait variability, the entropy of a patient with COPD is lower than that of a healthy person. However, in the case of hormone secretion and respiratory sound, the entropy of a patient is higher than that of a healthy person. For the mechanomyogram signal, the entropy increases with the increased severity of COPD. These results should give valuable guidance for the use of entropy for physiological signals measured by wearable medical devices as well as for further research on entropy in COPD.

  14. The Dynameomics Entropy Dictionary: A Large-Scale Assessment of Conformational Entropy across Protein Fold Space.

    PubMed

    Towse, Clare-Louise; Akke, Mikael; Daggett, Valerie

    2017-04-27

    Molecular dynamics (MD) simulations contain considerable information with regard to the motions and fluctuations of a protein, the magnitude of which can be used to estimate conformational entropy. Here we survey conformational entropy across protein fold space using the Dynameomics database, which represents the largest existing data set of protein MD simulations for representatives of essentially all known protein folds. We provide an overview of MD-derived entropies accounting for all possible degrees of dihedral freedom on an unprecedented scale. Although different side chains might be expected to impose varying restrictions on the conformational space that the backbone can sample, we found that the backbone entropy and side chain size are not strictly coupled. An outcome of these analyses is the Dynameomics Entropy Dictionary, the contents of which have been compared with entropies derived by other theoretical approaches and experiment. As might be expected, the conformational entropies scale linearly with the number of residues, demonstrating that conformational entropy is an extensive property of proteins. The calculated conformational entropies of folding agree well with previous estimates. Detailed analysis of specific cases identifies deviations in conformational entropy from the average values that highlight how conformational entropy varies with sequence, secondary structure, and tertiary fold. Notably, α-helices have lower entropy on average than do β-sheets, and both are lower than coil regions.

  15. Parameterization of the Van Hove dynamic self-scattering law S_s(Q, ω)

    NASA Astrophysics Data System (ADS)

    Zetterstrom, P.

    In this paper we present a model of the Van Hove dynamic scattering law S_ME(Q, ω), based on the maximum entropy principle, which is developed here for the first time. The model is intended for use in calculating inelastic corrections to neutron diffraction data. It is constrained by the first and second frequency moments and by detailed balance, but can be extended to an arbitrary number of frequency moments. The second moment can be varied through an effective temperature to account for the kinetic energy of the atoms. The results are compared with a diffusion model of the scattering law. Finally, some calculations of the inelastic self-scattering for a time-of-flight diffractometer are presented, from which we show that the inelastic self-scattering is very sensitive to the details of the dynamic scattering law.

  16. Weighted fractional permutation entropy and fractional sample entropy for nonlinear Potts financial dynamics

    NASA Astrophysics Data System (ADS)

    Xu, Kaixuan; Wang, Jun

    2017-02-01

    In this paper, the recently introduced permutation entropy and sample entropy are extended to the fractional cases: weighted fractional permutation entropy (WFPE) and fractional sample entropy (FSE). The fractional-order generalization of information entropy is utilized in these two complexity approaches to detect the statistical characteristics of fractional-order information in complex systems. Analysis of the proposed methods on synthetic data and real-world data reveals that tuning the fractional order allows higher sensitivity and more accurate characterization of the signal evolution, which is useful in describing the dynamics of complex systems. Moreover, nonlinear complexity behaviors are compared numerically between the return series of the Potts financial model and actual stock markets. The empirical results confirm the feasibility of the proposed model.

  17. Entropy production in a box: Analysis of instabilities in confined hydrothermal systems

    NASA Astrophysics Data System (ADS)

    Börsing, N.; Wellmann, J. F.; Niederau, J.; Regenauer-Lieb, K.

    2017-09-01

    We evaluate if the concept of thermal entropy production can be used as a measure to characterize hydrothermal convection in a confined porous medium as a valuable, thermodynamically motivated addition to the standard Rayleigh number analysis. Entropy production has been used widely in the field of mechanical and chemical engineering as a way to characterize the thermodynamic state and irreversibility of an investigated system. Pioneering studies have since adapted these concepts to natural systems, and we apply this measure here to investigate the specific case of hydrothermal convection in a "box-shaped" confined porous medium, as a simplified analog for, e.g., hydrothermal convection in deep geothermal aquifers. We perform various detailed numerical experiments to assess the response of the convective system to changing boundary conditions or domain aspect ratios, and then determine the resulting entropy production for each experiment. In systems close to the critical Rayleigh number, we derive results that are in accordance to the analytically derived predictions. At higher Rayleigh numbers, however, we observe multiple possible convection modes, and the analysis of the integrated entropy production reveals distinct curves of entropy production that provide an insight into the hydrothermal behavior in the system, both for cases of homogeneous materials, as well as for heterogeneous spatial material distributions. We conclude that the average thermal entropy production characterizes the internal behavior of hydrothermal systems with a meaningful thermodynamic measure, and we expect that it can be useful for the investigation of convection systems in many similar hydrogeological and geophysical settings.

  18. Filter-based multiscale entropy analysis of complex physiological time series.

    PubMed

    Xu, Yuesheng; Zhao, Liang

    2013-08-01

    Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account.
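    The reinterpretation is easy to see in code: the classic MSE coarse-graining equals convolving with a piecewise-constant (box) kernel and downsampling, and swapping in another kernel gives the generalized filter-based variant (a sketch of the filtering viewpoint only; the paper's piecewise linear filters are not reproduced here):

```python
def filter_downsample(x, kernel, step):
    """Convolve x with `kernel` (valid mode), then keep every `step`-th sample."""
    k = len(kernel)
    filtered = [sum(kernel[j] * x[i + j] for j in range(k))
                for i in range(len(x) - k + 1)]
    return filtered[::step]

x = list(range(12))
scale = 3
box = [1.0 / scale] * scale                  # piecewise-constant filter
coarse = filter_downsample(x, box, scale)    # identical to MSE coarse-graining
classic = [sum(x[i * scale:(i + 1) * scale]) / scale
           for i in range(len(x) // scale)]  # direct block averages
```

    An entropy measure (sample entropy, etc.) is then computed blockwise on each filtered component.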

  19. Multi-scale symbolic transfer entropy analysis of EEG

    NASA Astrophysics Data System (ADS)

    Yao, Wenpo; Wang, Jun

    2017-10-01

    From both global and local perspectives, we symbolize two kinds of EEG and analyze their dynamic and asymmetric information using multi-scale transfer entropy. A multi-scale process with scale factors from 1 to 199 in steps of 2 is applied to the EEG of healthy people and of epileptic patients, and the sequences are then symbolized either by permutation with an embedding dimension of 3 or by a global approach. The forward and reverse symbol sequences are taken as the inputs of transfer entropy. The scale-factor intervals over which the two kinds of EEG show satisfactory entropy distinctions are (37, 57) for the permutation approach and (65, 85) for the global approach. The transfer entropies of the healthy and epileptic subjects under permutation symbolization differ most, 0.1137 versus 0.1028, at a scale factor of 67; for the global symbolization, the corresponding values are 0.0641 and 0.0601 at a scale factor of 165. The results show that permutation symbolization, which takes the contribution of local information into account, gives a better distinction and is the more effective choice for our multi-scale transfer entropy analysis of EEG.
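    The final step, transfer entropy between symbol sequences, can be sketched for history length 1 (the study's embedding and scale handling are omitted; the sequences below are synthetic binary symbols, not symbolized EEG):

```python
import math
import random
from collections import Counter

def transfer_entropy(source, target):
    """TE(source -> target) for symbol sequences, history length 1, in bits."""
    n = len(target) - 1
    triples = Counter(zip(target[1:], target[:-1], source[:-1]))
    yx = Counter(zip(target[:-1], source[:-1]))   # joint past (y_t, x_t)
    yy = Counter(zip(target[1:], target[:-1]))    # (y_{t+1}, y_t)
    y = Counter(target[:-1])
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_cond_full = c / yx[(y0, x0)]            # p(y1 | y0, x0)
        p_cond_self = yy[(y1, y0)] / y[y0]        # p(y1 | y0)
        te += (c / n) * math.log2(p_cond_full / p_cond_self)
    return te

random.seed(2)
src = [random.randint(0, 1) for _ in range(1000)]
tgt = [0] + src[:-1]   # target copies the source one step later
```

    Here `transfer_entropy(src, tgt)` comes out near 1 bit while the reverse direction stays near zero, reproducing the directional asymmetry the measure is designed to detect.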

  20. Entropy production and nonlinear Fokker-Planck equations.

    PubMed

    Casas, G A; Nobre, F D; Curado, E M F

    2012-12-01

    The entropy time rate of systems described by nonlinear Fokker-Planck equations--which are directly related to generalized entropic forms--is analyzed. Both entropy production, associated with irreversible processes, and entropy flux from the system to its surroundings are studied. Some examples of known generalized entropic forms are considered, and particularly, the flux and production of the Boltzmann-Gibbs entropy, obtained from the linear Fokker-Planck equation, are recovered as particular cases. Since nonlinear Fokker-Planck equations are appropriate for the dynamical behavior of several physical phenomena in nature, like many within the realm of complex systems, the present analysis should be applicable to irreversible processes in a large class of nonlinear systems, such as those described by Tsallis and Kaniadakis entropies.

  1. Statistical Analysis of Time-Series from Monitoring of Active Volcanic Vents

    NASA Astrophysics Data System (ADS)

    Lachowycz, S.; Cosma, I.; Pyle, D. M.; Mather, T. A.; Rodgers, M.; Varley, N. R.

    2016-12-01

    Despite recent advances in the collection and analysis of time-series from volcano monitoring, and the resulting insights into volcanic processes, challenges remain in forecasting and interpreting activity from near real-time analysis of monitoring data. Statistical methods have potential to characterise the underlying structure and facilitate intercomparison of these time-series, and so inform interpretation of volcanic activity. We explore the utility of multiple statistical techniques that could be widely applicable to monitoring data, including Shannon entropy and detrended fluctuation analysis, by their application to various data streams from volcanic vents during periods of temporally variable activity. Each technique reveals changes through time in the structure of some of the data that were not apparent from conventional analysis. For example, we calculate the Shannon entropy (a measure of the randomness of a signal) of time-series from the recent dome-forming eruptions of Volcán de Colima (Mexico) and Soufrière Hills (Montserrat). The entropy of real-time seismic measurements and the count rate of certain volcano-seismic event types from both volcanoes is found to be temporally variable, with these data generally having higher entropy during periods of lava effusion and/or larger explosions. In some instances, the entropy shifts prior to or coincident with changes in seismic or eruptive activity, some of which were not clearly recognised by real-time monitoring. Comparison with other statistics demonstrates the sensitivity of the entropy to the data distribution, but that it is distinct from conventional statistical measures such as coefficient of variation. We conclude that each analysis technique examined could provide valuable insights for interpretation of diverse monitoring time-series.
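    Shannon entropy over a monitoring stream can be tracked in a moving window, for example over counts of volcano-seismic event types (the event labels below are made up for illustration):

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy (bits) of the empirical symbol distribution."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def rolling_entropy(symbols, window):
    """Entropy of each sliding window, giving an entropy-vs-time trace."""
    return [shannon_entropy(symbols[i:i + window])
            for i in range(len(symbols) - window + 1)]

# hypothetical event-type labels: a quiet period, then varied activity
events = list("AAAAAAAA") + list("ABCDABCD")
trace = rolling_entropy(events, 4)
```

    A rise in the trace flags increasing randomness in the event mix, the kind of shift the authors observe prior to or coincident with changes in eruptive activity.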

  2. Distribution entropy analysis of epileptic EEG signals.

    PubMed

    Li, Peng; Yan, Chang; Karmakar, Chandan; Liu, Changchun

    2015-01-01

    It is an open-ended challenge to accurately detect epileptic seizures through electroencephalogram (EEG) signals. Recently published studies have made elaborate attempts to distinguish between normal and epileptic EEG signals by advanced nonlinear entropy methods, such as the approximate entropy, sample entropy, fuzzy entropy, and permutation entropy. Most recently, a novel distribution entropy (DistEn) has been reported to have superior performance compared with the conventional entropy methods, especially for short data. We thus aimed, in the present study, to show the potential of DistEn in the analysis of epileptic EEG signals. The publicly accessible Bonn database, which consists of normal, interictal, and ictal EEG signals, was used in this study. Three different measurement protocols were set to better understand the performance of DistEn: i) calculate the DistEn of a specific EEG signal using the full recording; ii) calculate the DistEn by averaging the results for all its possible non-overlapping 5 s segments; and iii) calculate it by averaging the DistEn values for all possible non-overlapping segments of 1 s length. Results for all three protocols indicated a statistically significantly increased DistEn for the ictal class compared with both the normal and interictal classes. Besides, the results obtained under the third protocol, which used only very short segments (1 s) of EEG recordings, showed a significantly (p < 0.05) increased DistEn for the interictal class in comparison with the normal class, whereas both analyses using relatively long EEG signals failed to track this difference, possibly due to a nonstationarity effect on the entropy algorithm. The capability of discriminating between the normal and interictal EEG signals is of great clinical relevance since it may provide helpful tools for the detection of a seizure onset. 
Therefore, our study suggests that the DistEn analysis of EEG signals is very promising for clinical and even portable EEG monitoring.
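    DistEn, as described in the literature the study builds on, is the normalized Shannon entropy of the histogram of pairwise distances between embedding vectors; a compact sketch (the bin count and embedding dimension here are common defaults, not necessarily the study's settings):

```python
import math
import random

def dist_entropy(x, m=2, bins=64):
    """Distribution entropy: normalized histogram entropy of all pairwise
    Chebyshev distances between m-dimensional embedding vectors."""
    vecs = [x[i:i + m] for i in range(len(x) - m + 1)]
    dists = [max(abs(a - b) for a, b in zip(u, v))
             for i, u in enumerate(vecs) for v in vecs[i + 1:]]
    lo, hi = min(dists), max(dists)
    if hi == lo:
        return 0.0
    hist = [0] * bins
    for d in dists:
        hist[min(int((d - lo) / (hi - lo) * bins), bins - 1)] += 1
    n = len(dists)
    h = -sum((c / n) * math.log2(c / n) for c in hist if c)
    return h / math.log2(bins)  # normalized to [0, 1]

random.seed(3)
irregular = [random.random() for _ in range(150)]
regular = [0.0, 1.0] * 75
```

    Because it characterizes the full inter-vector distance distribution rather than threshold crossings, DistEn remains informative on very short records, consistent with the 1 s segments used in the third protocol.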

  3. Consistent maximum entropy representations of pipe flow networks

    NASA Astrophysics Data System (ADS)

    Waldrip, Steven H.; Niven, Robert K.; Abel, Markus; Schlegel, Michael

    2017-06-01

    The maximum entropy method is used to predict flows on water distribution networks. This analysis extends the water distribution network formulation of Waldrip et al. (2016) Journal of Hydraulic Engineering (ASCE), by the use of a continuous relative entropy defined on a reduced parameter set. This reduction in the parameters that the entropy is defined over ensures consistency between different representations of the same network. The performance of the proposed reduced parameter method is demonstrated with a one-loop network case study.

  4. Entropy change of biological dynamics in COPD

    PubMed Central

    Cao, Zhixin; Sun, Baoqing; Lo, Iek Long; Liu, Tzu-Ming; Zheng, Jun; Sun, Shixue; Shi, Yan; Zhang, Xiaohua Douglas

    2017-01-01

    In this century, the rapid development of large data storage technologies, mobile network technology, and portable medical devices makes it possible to measure, record, store, and track large amounts of data from human physiological signals. Entropy is a key metric for quantifying the irregularity contained in physiological signals. In this review, we focus on how entropy changes in various physiological signals in COPD. Our review concludes that the entropy change depends on the type of physiological signal under investigation. For major physiological signals related to respiratory diseases, such as airflow, heart rate variability, and gait variability, the entropy of a patient with COPD is lower than that of a healthy person. However, in the case of hormone secretion and respiratory sound, the entropy of a patient is higher than that of a healthy person. For the mechanomyogram signal, the entropy increases with the increased severity of COPD. These results should give valuable guidance for the use of entropy for physiological signals measured by wearable medical devices as well as for further research on entropy in COPD. PMID:29066881

  5. How long the singular value decomposed entropy predicts the stock market? - Evidence from the Dow Jones Industrial Average Index

    NASA Astrophysics Data System (ADS)

    Gu, Rongbao; Shao, Yanmin

    2016-07-01

In this paper, a new concept of multi-scale singular value decomposition entropy based on DCCA cross-correlation analysis is proposed, and its predictive power for the Dow Jones Industrial Average Index is studied. Using Granger causality analysis at different time scales, it is found that the singular value decomposition entropy has predictive power for the Dow Jones Industrial Average Index at horizons of less than one month, but not beyond. This establishes how far ahead the singular value decomposition entropy predicts the stock market, extending the result of Caraiani (2014). The result also reveals an essential characteristic of the stock market as a chaotic dynamic system.
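The basic (single-scale) singular value decomposition entropy underlying this record can be sketched as follows; the embedding dimension and delay are illustrative choices, and the paper's multi-scale, DCCA-based variant is more involved than this.

```python
import numpy as np

def svd_entropy(x, order=4, delay=1, normalize=True):
    """Singular value decomposition entropy of a 1-D time series.

    Embeds the series into a trajectory matrix (rows are delay vectors
    of length `order`), then takes the Shannon entropy of the
    normalized singular-value spectrum.
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    # Delay-embedding (trajectory) matrix, shape (n, order).
    mat = np.array([x[i:i + order * delay:delay] for i in range(n)])
    s = np.linalg.svd(mat, compute_uv=False)
    s = s / s.sum()                       # normalize to a distribution
    h = -np.sum(s * np.log2(s + 1e-12))   # Shannon entropy in bits
    if normalize:
        h /= np.log2(order)               # scale into [0, 1]
    return float(h)
```

A low-dimensional deterministic signal concentrates its singular values in a few directions (low entropy), whereas white noise spreads them nearly uniformly (entropy near 1).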

  6. The maximum entropy production and maximum Shannon information entropy in enzyme kinetics

    NASA Astrophysics Data System (ADS)

    Dobovišek, Andrej; Markovič, Rene; Brumen, Milan; Fajmut, Aleš

    2018-04-01

We demonstrate that the maximum entropy production principle (MEPP) serves as a physical selection principle for the description of the most probable non-equilibrium steady states in simple enzymatic reactions. A theoretical approach is developed that enables maximization of the density of entropy production with respect to the enzyme rate constants for an enzyme reaction in a steady state. Mass and Gibbs free energy conservation are imposed as optimization constraints. The optimal steady-state enzyme rate constants computed in this way also yield the most uniform probability distribution over the enzyme states, which corresponds to the maximal Shannon information entropy. By means of a stability analysis it is also demonstrated that maximal density of entropy production in the enzyme reaction requires a flexible enzyme structure, which enables rapid transitions between different enzyme states. These results are supported by an example in which the density of entropy production and the Shannon information entropy are numerically maximized for the enzyme glucose isomerase.

  7. On entropy, financial markets and minority games

    NASA Astrophysics Data System (ADS)

    Zapart, Christopher A.

    2009-04-01

    The paper builds upon an earlier statistical analysis of financial time series with Shannon information entropy, published in [L. Molgedey, W. Ebeling, Local order, entropy and predictability of financial time series, European Physical Journal B-Condensed Matter and Complex Systems 15/4 (2000) 733-737]. A novel generic procedure is proposed for making multistep-ahead predictions of time series by building a statistical model of entropy. The approach is first demonstrated on the chaotic Mackey-Glass time series and later applied to Japanese Yen/US dollar intraday currency data. The paper also reinterprets Minority Games [E. Moro, The minority game: An introductory guide, Advances in Condensed Matter and Statistical Physics (2004)] within the context of physical entropy, and uses models derived from minority game theory as a tool for measuring the entropy of a model in response to time series. This entropy conditional upon a model is subsequently used in place of information-theoretic entropy in the proposed multistep prediction algorithm.
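The entropy-based predictability analysis of Molgedey and Ebeling that this record builds on rests on the block (word) entropy of a symbolized series. A minimal sketch follows; the sign symbolization, the toy Gaussian "returns", and the word lengths are illustrative assumptions, not the paper's data or exact procedure.

```python
import numpy as np
from collections import Counter

def block_entropy(symbols, n):
    """Shannon entropy (bits) of length-n words in a symbol sequence."""
    words = [tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1)]
    counts = np.array(list(Counter(words).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# Symbolize toy "returns" by sign: 1 for up-moves, 0 for down-moves.
rng = np.random.default_rng(1)
returns = rng.standard_normal(5000)
symbols = (returns > 0).astype(int)

# Conditional entropy h_n = H(n+1) - H(n): the remaining uncertainty of
# the next symbol given the previous n symbols. For an i.i.d. fair coin
# it stays near 1 bit; structure in real data would pull it below 1.
h2 = block_entropy(symbols, 3) - block_entropy(symbols, 2)
```

Low conditional entropy signals local order, and hence predictability, which is the quantity the multistep prediction scheme above tries to exploit.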

  8. Inverting Monotonic Nonlinearities by Entropy Maximization

    PubMed Central

Solé-Casals, Jordi; López-de-Ipiña Pena, Karmele; Caiafa, Cesar F.

    2016-01-01

This paper proposes a new method for blind inversion of a monotonic nonlinear map applied to a sum of random variables. Mixtures of random variables of this kind are found, for example, in source separation and Wiener system inversion problems. The importance of our proposed method lies in the fact that it permits decoupling the estimation of the nonlinear part (nonlinear compensation) from the estimation of the linear one (source separation matrix or deconvolution filter), which can then be solved by any convenient linear algorithm. Our new nonlinear compensation algorithm, the MaxEnt algorithm, generalizes the idea of Gaussianization of the observation by maximizing its entropy instead. We developed two versions of our algorithm, based on either a polynomial or a neural network parameterization of the nonlinear function. We provide a sufficient condition on the nonlinear function and the probability distribution that guarantees that the MaxEnt method succeeds in compensating the distortion. Through an extensive set of simulations, MaxEnt is compared with existing algorithms for blind approximation of nonlinear maps. Experiments show that MaxEnt successfully compensates monotonic distortions, outperforming other methods in terms of the obtained signal-to-noise ratio in many important cases, for example when the number of variables in a mixture is small. Besides its ability to compensate nonlinearities, MaxEnt is very robust, i.e., it shows small variability in its results. PMID:27780261

  9. Inverting Monotonic Nonlinearities by Entropy Maximization.

    PubMed

    Solé-Casals, Jordi; López-de-Ipiña Pena, Karmele; Caiafa, Cesar F

    2016-01-01

This paper proposes a new method for blind inversion of a monotonic nonlinear map applied to a sum of random variables. Mixtures of random variables of this kind are found, for example, in source separation and Wiener system inversion problems. The importance of our proposed method lies in the fact that it permits decoupling the estimation of the nonlinear part (nonlinear compensation) from the estimation of the linear one (source separation matrix or deconvolution filter), which can then be solved by any convenient linear algorithm. Our new nonlinear compensation algorithm, the MaxEnt algorithm, generalizes the idea of Gaussianization of the observation by maximizing its entropy instead. We developed two versions of our algorithm, based on either a polynomial or a neural network parameterization of the nonlinear function. We provide a sufficient condition on the nonlinear function and the probability distribution that guarantees that the MaxEnt method succeeds in compensating the distortion. Through an extensive set of simulations, MaxEnt is compared with existing algorithms for blind approximation of nonlinear maps. Experiments show that MaxEnt successfully compensates monotonic distortions, outperforming other methods in terms of the obtained signal-to-noise ratio in many important cases, for example when the number of variables in a mixture is small. Besides its ability to compensate nonlinearities, MaxEnt is very robust, i.e., it shows small variability in its results.

  10. An Evaluation of Lightning Flash Rate Parameterizations Based on Observations of Colorado Storms during DC3

    NASA Astrophysics Data System (ADS)

    Basarab, B.; Fuchs, B.; Rutledge, S. A.

    2013-12-01

    Predicting lightning activity in thunderstorms is important in order to accurately quantify the production of nitrogen oxides (NOx = NO + NO2) by lightning (LNOx). Lightning is an important global source of NOx, and since NOx is a chemical precursor to ozone, the climatological impacts of LNOx could be significant. Many cloud-resolving models rely on parameterizations to predict lightning and LNOx since the processes leading to charge separation and lightning discharge are not yet fully understood. This study evaluates predicted flash rates based on existing lightning parameterizations against flash rates observed for Colorado storms during the Deep Convective Clouds and Chemistry Experiment (DC3). Evaluating lightning parameterizations against storm observations is a useful way to possibly improve the prediction of flash rates and LNOx in models. Additionally, since convective storms that form in the eastern plains of Colorado can be different thermodynamically and electrically from storms in other regions, it is useful to test existing parameterizations against observations from these storms. We present an analysis of the dynamics, microphysics, and lightning characteristics of two case studies, severe storms that developed on 6 and 7 June 2012. This analysis includes dual-Doppler derived horizontal and vertical velocities, a hydrometeor identification based on polarimetric radar variables using the CSU-CHILL radar, and insight into the charge structure using observations from the northern Colorado Lightning Mapping Array (LMA). Flash rates were inferred from the LMA data using a flash counting algorithm. We have calculated various microphysical and dynamical parameters for these storms that have been used in empirical flash rate parameterizations. In particular, maximum vertical velocity has been used to predict flash rates in some cloud-resolving chemistry simulations. 
We diagnose flash rates for the 6 and 7 June storms using this parameterization and compare to observed flash rates. For the 6 June storm, a preliminary analysis of aircraft observations of storm inflow and outflow is presented in order to place flash rates (and other lightning statistics) in the context of storm chemistry. An approach to a possibly improved LNOx parameterization scheme using different lightning metrics such as flash area will be discussed.

  11. A new fractional snow-covered area parameterization for the Community Land Model and its effect on the surface energy balance

    NASA Astrophysics Data System (ADS)

    Swenson, S. C.; Lawrence, D. M.

    2011-11-01

    One function of the Community Land Model (CLM4) is the determination of surface albedo in the Community Earth System Model (CESM1). Because the typical spatial scales of CESM1 simulations are large compared to the scales of variability of surface properties such as snow cover and vegetation, unresolved surface heterogeneity is parameterized. Fractional snow-covered area, or snow-covered fraction (SCF), within a CLM4 grid cell is parameterized as a function of grid cell mean snow depth and snow density. This parameterization is based on an analysis of monthly averaged SCF and snow depth that showed a seasonal shift in the snow depth-SCF relationship. In this paper, we show that this shift is an artifact of the monthly sampling and that the current parameterization does not reflect the relationship observed between snow depth and SCF at the daily time scale. We demonstrate that the snow depth analysis used in the original study exhibits a bias toward early melt when compared to satellite-observed SCF. This bias results in a tendency to overestimate SCF as a function of snow depth. Using a more consistent, higher spatial and temporal resolution snow depth analysis reveals a clear hysteresis between snow accumulation and melt seasons. Here, a new SCF parameterization based on snow water equivalent is developed to capture the observed seasonal snow depth-SCF evolution. The effects of the new SCF parameterization on the surface energy budget are described. In CLM4, surface energy fluxes are calculated assuming a uniform snow cover. To more realistically simulate environments having patchy snow cover, we modify the model by computing the surface fluxes separately for snow-free and snow-covered fractions of a grid cell. In this configuration, the form of the parameterized snow depth-SCF relationship is shown to greatly affect the surface energy budget. 
The direct exposure of the snow-free surfaces to the atmosphere leads to greater heat loss from the ground during autumn and greater heat gain during spring. The net effect is to reduce annual mean soil temperatures by up to 3°C in snow-affected regions.

  13. Harmonic analysis of electric locomotive and traction power system based on wavelet singular entropy

    NASA Astrophysics Data System (ADS)

    Dun, Xiaohong

    2018-05-01

With the rapid development of high-speed rail and heavy-haul transport, locomotives and traction power systems have become the main harmonic sources in China's power grid. In response, the system's power quality must be monitored, assessed, and managed in a timely manner. Wavelet singular entropy is an organic combination of the wavelet transform, singular value decomposition, and information entropy theory, combining the unique advantages of the three in signal processing: the time-frequency localization of the wavelet transform, the extraction of basic modal characteristics by singular value decomposition, and the quantification of feature data by information entropy. Based on singular value decomposition, the wavelet coefficient matrix obtained from the wavelet transform is decomposed into a series of singular values that reflect the basic characteristics of the original coefficient matrix. The statistical properties of information entropy are then used to analyze the uncertainty of the singular value set, giving a definite measure of the complexity of the original signal. Wavelet singular entropy therefore has good application prospects in fault detection, classification, and protection. A MATLAB simulation shows that wavelet singular entropy is effective for harmonic analysis of the locomotive and traction power system.
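The transform-decompose-quantify pipeline described above can be sketched in a few lines. This is a hedged illustration, not the paper's MATLAB code: it uses the Haar wavelet rather than the smoother wavelets typically preferred for harmonic analysis, and stacking the upsampled detail coefficients into the matrix is one common construction assumed here.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar wavelet transform: (approximation, detail)."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def wavelet_singular_entropy(x, levels=4):
    """Wavelet singular entropy: Haar-decompose to `levels` scales,
    stack the detail coefficients into a matrix, and take the Shannon
    entropy (nats) of its normalized singular values."""
    approx = np.asarray(x, dtype=float)
    n = len(approx) // 2 ** levels * 2 ** levels  # truncate to a multiple
    approx = approx[:n]
    rows = []
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        # Upsample by repetition so every row has equal length.
        rows.append(np.repeat(detail, n // len(detail)))
    mat = np.vstack(rows)
    s = np.linalg.svd(mat, compute_uv=False)
    p = s / s.sum()
    return float(-np.sum(p * np.log(p + 1e-12)))
```

A narrowband signal concentrates its energy in few singular values (low entropy), while a broadband, harmonic-rich or noisy signal spreads it (high entropy), which is what makes the measure useful for detecting distorted grid waveforms.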

  14. Analysis of HD 73045 light curve data

    NASA Astrophysics Data System (ADS)

    Das, Mrinal Kanti; Bhatraju, Naveen Kumar; Joshi, Santosh

    2018-04-01

In this work we analyzed the Kepler light curve data of HD 73045. The raw data were smoothed using standard filters, and the power spectrum was obtained using a fast Fourier transform routine; it shows the presence of more than one period. To account for any non-stationary behavior, we carried out a wavelet analysis to obtain the wavelet power spectrum. In addition, to identify scale-invariant structure, the data were analyzed using multifractal detrended fluctuation analysis. Further, to characterize the diversity of embedded patterns in the HD 73045 flux time series, we computed various entropy-based complexity measures, e.g. sample entropy, spectral entropy, and permutation entropy. The presence of periodic structure in the time series was further analyzed using the visibility network and horizontal visibility network models of the time series. The degree distributions of the two network models confirm such structures.
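Of the complexity measures named in this record, permutation entropy is the simplest to reproduce. A minimal sketch (not the authors' code; the order and delay defaults are common conventions assumed here):

```python
import numpy as np
from math import factorial
from collections import Counter

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Permutation entropy: Shannon entropy of the distribution of
    ordinal patterns (rank orderings) of `order` consecutive samples."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    # Map each window to its ordinal pattern via argsort.
    patterns = [tuple(np.argsort(x[i:i + order * delay:delay]))
                for i in range(n)]
    counts = np.array(list(Counter(patterns).values()), dtype=float)
    p = counts / counts.sum()
    h = -np.sum(p * np.log2(p))
    if normalize:
        h /= np.log2(factorial(order))  # scale into [0, 1]
    return float(h)
```

A monotonic trend produces a single ordinal pattern (entropy 0), while white noise uses all order! patterns almost uniformly (entropy near 1), so intermediate values flag structured variability such as the multi-periodic flux of HD 73045.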

  15. Characterization of time series via Rényi complexity-entropy curves

    NASA Astrophysics Data System (ADS)

    Jauregui, M.; Zunino, L.; Lenzi, E. K.; Mendes, R. S.; Ribeiro, H. V.

    2018-05-01

    One of the most useful tools for distinguishing between chaotic and stochastic time series is the so-called complexity-entropy causality plane. This diagram involves two complexity measures: the Shannon entropy and the statistical complexity. Recently, this idea has been generalized by considering the Tsallis monoparametric generalization of the Shannon entropy, yielding complexity-entropy curves. These curves have proven to enhance the discrimination among different time series related to stochastic and chaotic processes of numerical and experimental nature. Here we further explore these complexity-entropy curves in the context of the Rényi entropy, which is another monoparametric generalization of the Shannon entropy. By combining the Rényi entropy with the proper generalization of the statistical complexity, we associate a parametric curve (the Rényi complexity-entropy curve) with a given time series. We explore this approach in a series of numerical and experimental applications, demonstrating the usefulness of this new technique for time series analysis. We show that the Rényi complexity-entropy curves enable the differentiation among time series of chaotic, stochastic, and periodic nature. In particular, time series of stochastic nature are associated with curves displaying positive curvature in a neighborhood of their initial points, whereas curves related to chaotic phenomena have a negative curvature; finally, periodic time series are represented by vertical straight lines.
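The Rényi family that parameterizes these curves is easy to state concretely. A minimal sketch of the entropy itself (the statistical-complexity half of the plane is omitted here); the curve is traced by sweeping the order alpha for a fixed distribution.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha of a discrete distribution p, in nats.

    H_alpha = log(sum_i p_i^alpha) / (1 - alpha), recovering the
    Shannon entropy in the limit alpha -> 1.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log 0 = 0 by convention
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log(p)))        # Shannon limit
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))
```

For a uniform distribution every alpha gives the same value log k, so the curve degenerates to a point; for non-uniform distributions H_alpha strictly decreases in alpha, and it is this alpha-dependence that the complexity-entropy curve exploits.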

  16. The increase of the functional entropy of the human brain with age.

    PubMed

    Yao, Y; Lu, W L; Xu, B; Li, C B; Lin, C P; Waxman, D; Feng, J F

    2013-10-09

    We use entropy to characterize intrinsic ageing properties of the human brain. Analysis of fMRI data from a large dataset of individuals, using resting state BOLD signals, demonstrated that a functional entropy associated with brain activity increases with age. During an average lifespan, the entropy, which was calculated from a population of individuals, increased by approximately 0.1 bits, due to correlations in BOLD activity becoming more widely distributed. We attribute this to the number of excitatory neurons and the excitatory conductance decreasing with age. Incorporating these properties into a computational model leads to quantitatively similar results to the fMRI data. Our dataset involved males and females and we found significant differences between them. The entropy of males at birth was lower than that of females. However, the entropies of the two sexes increase at different rates, and intersect at approximately 50 years; after this age, males have a larger entropy.
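The notion of a "functional entropy" driven by how widely correlations are distributed can be illustrated as follows. This sketch is an assumption-laden stand-in, not the paper's estimator: it bins the pairwise correlation coefficients of synthetic BOLD-like channels and takes the Shannon entropy of that histogram, with the bin count chosen arbitrarily.

```python
import numpy as np

def functional_entropy(signals, bins=20):
    """Illustrative 'functional entropy': Shannon entropy (bits) of the
    histogram of pairwise correlation coefficients between channels.
    `signals` has shape (n_channels, n_timepoints)."""
    c = np.corrcoef(signals)
    iu = np.triu_indices_from(c, k=1)          # off-diagonal pairs only
    hist, _ = np.histogram(c[iu], bins=bins, range=(-1, 1))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))
```

Independent channels give correlations tightly clustered near zero (low entropy); channels coupled to a common driver with varying strengths spread their correlations over a wide range (high entropy), mirroring the "more widely distributed" correlations described above.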

  17. Increased resting-state brain entropy in Alzheimer's disease.

    PubMed

    Xue, Shao-Wei; Guo, Yonghu

    2018-03-07

    Entropy analysis of resting-state functional MRI (R-fMRI) is a novel approach to characterize brain temporal dynamics and facilitates the identification of abnormal brain activity caused by several disease conditions. However, Alzheimer's disease (AD)-related brain entropy mapping based on R-fMRI has not been assessed. Here, we measured the sample entropy and voxel-wise connectivity of the network degree centrality (DC) of the intrinsic brain activity acquired by R-fMRI in 26 patients with AD and 26 healthy controls. Compared with the controls, AD patients showed increased entropy in the middle temporal gyrus and the precentral gyrus and also showed decreased DC in the precuneus. Moreover, the magnitude of the negative correlation between local brain activity (entropy) and network connectivity (DC) was increased in AD patients in comparison with healthy controls. These findings provide new evidence on AD-related brain entropy alterations.

  18. The Increase of the Functional Entropy of the Human Brain with Age

    PubMed Central

    Yao, Y.; Lu, W. L.; Xu, B.; Li, C. B.; Lin, C. P.; Waxman, D.; Feng, J. F.

    2013-01-01

    We use entropy to characterize intrinsic ageing properties of the human brain. Analysis of fMRI data from a large dataset of individuals, using resting state BOLD signals, demonstrated that a functional entropy associated with brain activity increases with age. During an average lifespan, the entropy, which was calculated from a population of individuals, increased by approximately 0.1 bits, due to correlations in BOLD activity becoming more widely distributed. We attribute this to the number of excitatory neurons and the excitatory conductance decreasing with age. Incorporating these properties into a computational model leads to quantitatively similar results to the fMRI data. Our dataset involved males and females and we found significant differences between them. The entropy of males at birth was lower than that of females. However, the entropies of the two sexes increase at different rates, and intersect at approximately 50 years; after this age, males have a larger entropy. PMID:24103922

  19. A network approach to the geometric structure of shallow cloud fields

    NASA Astrophysics Data System (ADS)

    Glassmeier, F.; Feingold, G.

    2017-12-01

The representation of shallow clouds and their radiative impact is one of the largest challenges for global climate models. While the bulk properties of cloud fields, including effects of organization, are a very active area of research, the potential of the geometric arrangement of cloud fields for the development of new parameterizations has hardly been explored. Self-organized patterns are particularly evident in the cellular structure of Stratocumulus (Sc) clouds so readily visible in satellite imagery. Inspired by similar patterns in biology and physics, we approach pattern formation in Sc fields from the perspective of natural cellular networks. Our network analysis is based on large-eddy simulations of open- and closed-cell Sc cases. We find the network structure to be neither random nor characteristic of natural convection. It is independent of macroscopic cloud-field properties like the Sc regime (open vs. closed) and its typical length scale (boundary layer height). The latter is a consequence of entropy maximization (Lewis's law with parameter 0.16). The cellular pattern is on average hexagonal, with non-six-sided cells occurring according to a neighbor-number distribution with a variance of about 2. Reflecting the continuously renewing dynamics of Sc fields, large (many-sided) cells tend to neighbor small (few-sided) cells (Aboav-Weaire law with parameter 0.9). These macroscopic network properties emerge independently of the Sc regime because the different processes governing the evolution of closed as compared to open cells correspond to topologically equivalent network dynamics. By developing a heuristic model, we show that open- and closed-cell dynamics can both be mimicked by versions of cell division and cell disappearance and are biased towards the expansion of smaller cells. This model offers for the first time a fundamental and universal explanation for the geometric pattern of Sc clouds.
It may contribute to the development of advanced Sc parameterizations. As an outlook, we discuss how a similar network approach can be applied to describe and quantify the geometric structure of shallow cumulus cloud fields.

  20. An alternative expression to the Sackur-Tetrode entropy formula for an ideal gas

    NASA Astrophysics Data System (ADS)

    Nagata, Shoichi

    2018-03-01

An expression for the entropy of a monoatomic classical ideal gas is known as the Sackur-Tetrode equation. This pioneering investigation of about 100 years ago already incorporated quantum considerations. The purpose of this paper is to provide an alternative expression for the entropy in terms of the Heisenberg uncertainty relation. The analysis is made on the basis of fluctuation theory, for a canonical system in thermal equilibrium at temperature T. The new formula indicates manifestly that the entropy of the macroscopic world can be recognized as a measure of uncertainty in the microscopic quantum world. The entropy in the Sackur-Tetrode equation can thus be re-interpreted from a different perspective. The emphasis is on the connection between the entropy and the uncertainty relation in the quantum description.
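For reference, the standard form of the Sackur-Tetrode equation for N monoatomic ideal-gas particles of mass m with internal energy U in volume V is

```latex
S = N k_{\mathrm{B}} \left[ \ln\!\left( \frac{V}{N}
      \left( \frac{4\pi m U}{3 N h^{2}} \right)^{3/2} \right)
      + \frac{5}{2} \right]
```

Substituting U = (3/2) N k_B T recovers the familiar thermal de Broglie wavelength form S = N k_B [ln(V / (N lambda^3)) + 5/2], the expression the paper re-derives from the uncertainty relation.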

  1. Measurement of entanglement entropy in the two-dimensional Potts model using wavelet analysis.

    PubMed

    Tomita, Yusuke

    2018-05-01

A method is introduced to measure the entanglement entropy using a wavelet analysis. Using this method, the two-dimensional Haar wavelet transform of a configuration of Fortuin-Kasteleyn (FK) clusters is performed. The configuration represents a direct snapshot of spin-spin correlations, since the spin degrees of freedom are traced out in the FK representation. A snapshot of FK clusters loses image information at each coarse-graining step of the wavelet transform. It is shown that this loss of image information measures the entanglement entropy in the Potts model.
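The coarse-graining step itself can be sketched concretely. This is a hedged illustration, not the paper's estimator: it performs one level of a 2-D Haar decomposition of a binary snapshot and quantifies the discarded detail as an L2 energy fraction, whereas the paper relates the loss to the entanglement entropy proper.

```python
import numpy as np

def haar2d(img):
    """One level of a 2-D Haar decomposition, normalized so that LL is
    the 2x2 block average. Returns (LL, LH, HL, HH) subbands."""
    img = np.asarray(img, dtype=float)
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0      # coarse-grained snapshot
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def detail_energy_fraction(img):
    """Fraction of the snapshot's L2 energy carried by the detail
    subbands that are discarded in one coarse-graining step."""
    ll, lh, hl, hh = haar2d(img)
    total = sum(np.sum(c ** 2) for c in (ll, lh, hl, hh))
    detail = sum(np.sum(c ** 2) for c in (lh, hl, hh))
    return float(detail / total)
```

A uniform (fully ordered) snapshot loses nothing under coarse-graining, while a random cluster configuration loses a substantial fraction of its information at each step.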

  2. Innovative techniques to analyze time series of geomagnetic activity indices

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Papadimitriou, Constantinos; Daglis, Ioannis A.; Potirakis, Stelios M.; Eftaxias, Konstantinos

    2016-04-01

Magnetic storms are undoubtedly among the most important phenomena in space physics and a central subject of space weather. The non-extensive Tsallis entropy has recently been introduced as an effective complexity measure for the analysis of the geomagnetic activity Dst index. The Tsallis entropy sensitively shows the complexity dissimilarity among different "physiological" (normal) and "pathological" states (intense magnetic storms). More precisely, the Tsallis entropy implies the emergence of two distinct patterns: (i) a pattern associated with intense magnetic storms, characterized by a higher degree of organization, and (ii) a pattern associated with normal periods, characterized by a lower degree of organization. Other entropy measures, such as block entropy, T-complexity, approximate entropy, sample entropy, and fuzzy entropy, verify this result. Importantly, wavelet spectral analysis in terms of the Hurst exponent, H, also shows the existence of two different patterns: (i) a pattern associated with intense magnetic storms, characterized by fractional Brownian persistent behavior, and (ii) a pattern associated with normal periods, characterized by fractional Brownian anti-persistent behavior. Finally, we observe universality in magnetic storm and earthquake dynamics, on the basis of a modified form of the Gutenberg-Richter law for Tsallis statistics. This finding suggests a common approach to the interpretation of both phenomena in terms of the same driving physical mechanism. Signatures of discrete scale invariance in Dst time series further support this proposal.

  3. Radiative entropy generation in a gray absorbing, emitting, and scattering planar medium at radiative equilibrium

    NASA Astrophysics Data System (ADS)

    Sadeghi, Pegah; Safavinejad, Ali

    2017-11-01

Radiative entropy generation through a gray absorbing, emitting, and scattering planar medium at radiative equilibrium with diffuse-gray walls is investigated. The radiative transfer equation and the radiative entropy generation equations are solved using the discrete ordinates method. Components of the radiative entropy generation are considered for two different boundary conditions: both walls at a prescribed temperature, and mixed boundary conditions, in which one wall is at a prescribed temperature and the other at a prescribed heat flux. The effects of wall emissivities, optical thickness, single-scattering albedo, and the anisotropic-scattering factor on entropy generation are investigated in detail. The results reveal that entropy generation in the system arises mainly from irreversible radiative transfer at the wall with the lower temperature. The total entropy generation rate for the system with prescribed wall temperatures increases remarkably as wall emissivity increases; conversely, for the system with mixed boundary conditions, the total entropy generation rate decreases slightly. Furthermore, as the optical thickness increases, the total entropy generation rate decreases remarkably for the system with prescribed wall temperatures; for the system with mixed boundary conditions, however, it increases. Variation of the single-scattering albedo does not considerably affect the total entropy generation rate. This parametric analysis demonstrates that the optical thickness and the wall emissivities have a significant effect on entropy generation in a system at radiative equilibrium. Identifying the parameters that significantly affect radiative entropy generation provides an opportunity to optimize the design or increase the overall performance and efficiency of systems at radiative equilibrium by applying entropy-minimization techniques.

  4. Gender-specific heart rate dynamics in severe intrauterine growth-restricted fetuses.

    PubMed

    Gonçalves, Hernâni; Bernardes, João; Ayres-de-Campos, Diogo

    2013-06-01

Management of intrauterine growth restriction (IUGR) remains a major issue in perinatology. The objective of this paper was the assessment of gender-specific fetal heart rate (FHR) dynamics as a diagnostic tool in severe IUGR. FHR was analyzed in the antepartum period in 15 severe IUGR fetuses and 18 controls, matched for gestational age, in relation to fetal gender, using linear and entropy methods: mean FHR (mFHR); low-frequency (LF), high-frequency (HF), and movement-frequency (MF) components; and approximate, sample, and multiscale entropy. Sensitivities and specificities were estimated using Fisher linear discriminant analysis and the leave-one-out method. Overall, IUGR fetuses presented significantly lower mFHR and entropy compared with controls. However, gender-specific analysis showed that significantly lower mFHR was only evident in IUGR males, and lower entropy only in IUGR females. In addition, lower LF/(MF+HF) was evident in IUGR females compared with controls, but not in males. High sensitivities and specificities were achieved in the detection of FHR recordings related to IUGR male fetuses when gender-specific analysis was performed at gestational ages of less than 34 weeks. Severe IUGR fetuses present gender-specific linear and entropy FHR changes compared with controls, characterized by significantly lower entropy and sympathetic-vagal balance in females than in males. These findings need to be considered in order to achieve better diagnostic results. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Testing a common ice-ocean parameterization with laboratory experiments

    NASA Astrophysics Data System (ADS)

    McConnochie, C. D.; Kerr, R. C.

    2017-07-01

    Numerical models of ice-ocean interactions typically rely upon a parameterization for the transport of heat and salt to the ice face that has not been satisfactorily validated by observational or experimental data. We compare laboratory experiments of ice-saltwater interactions to a common numerical parameterization and find a significant disagreement in the dependence of the melt rate on the fluid velocity. We suggest a resolution to this disagreement based on a theoretical analysis of the boundary layer next to a vertical heated plate, which results in a threshold fluid velocity of approximately 4 cm/s at driving temperatures between 0.5 and 4°C, above which the form of the parameterization should be valid.
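    The heat-budget side of the parameterization being tested can be sketched as follows. This is a schematic, single-equation reduction (it omits the salt budget and freezing-point relation of the full three-equation formulation), and the coefficient values are generic textbook numbers, not those calibrated in the paper:

```python
import numpy as np

def melt_rate(u, t_water, t_interface=0.0,
              cd=2.5e-3, gamma_t=1.1e-2,
              cw=3974.0, rho_w=1027.0,
              latent=3.35e5, rho_i=917.0):
    """Schematic velocity-dependent melt rate (m/s) from the heat budget

        rho_i * L * m = rho_w * c_w * Cd^(1/2) * Gamma_T * U * (T_w - T_i).

    All coefficients are illustrative placeholders.  The paper's point is
    that this linear-in-U form appears to fail below a threshold velocity
    (~4 cm/s at 0.5-4 degC driving temperature), where buoyancy-driven
    convection controls the transfer instead.
    """
    heat_flux = rho_w * cw * np.sqrt(cd) * gamma_t * u * (t_water - t_interface)
    return heat_flux / (rho_i * latent)
```

The linear dependence on `u` is exactly the feature the laboratory experiments call into question at low velocities.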

  6. Entropy and convexity for nonlinear partial differential equations

    PubMed Central

    Ball, John M.; Chen, Gui-Qiang G.

    2013-01-01

    Partial differential equations are ubiquitous in almost all applications of mathematics, where they provide a natural mathematical description of many phenomena involving change in physical, chemical, biological and social processes. The concept of entropy originated in thermodynamics and statistical physics during the nineteenth century to describe the heat exchanges that occur in the thermal processes in a thermodynamic system, while the original notion of convexity is for sets and functions in mathematics. Since then, entropy and convexity have become two of the most important concepts in mathematics. In particular, nonlinear methods via entropy and convexity have been playing an increasingly important role in the analysis of nonlinear partial differential equations in recent decades. This opening article of the Theme Issue is intended to provide an introduction to entropy, convexity and related nonlinear methods for the analysis of nonlinear partial differential equations. We also provide a brief discussion about the content and contributions of the papers that make up this Theme Issue. PMID:24249768

  7. Entropy and convexity for nonlinear partial differential equations.

    PubMed

    Ball, John M; Chen, Gui-Qiang G

    2013-12-28

    Partial differential equations are ubiquitous in almost all applications of mathematics, where they provide a natural mathematical description of many phenomena involving change in physical, chemical, biological and social processes. The concept of entropy originated in thermodynamics and statistical physics during the nineteenth century to describe the heat exchanges that occur in the thermal processes in a thermodynamic system, while the original notion of convexity is for sets and functions in mathematics. Since then, entropy and convexity have become two of the most important concepts in mathematics. In particular, nonlinear methods via entropy and convexity have been playing an increasingly important role in the analysis of nonlinear partial differential equations in recent decades. This opening article of the Theme Issue is intended to provide an introduction to entropy, convexity and related nonlinear methods for the analysis of nonlinear partial differential equations. We also provide a brief discussion about the content and contributions of the papers that make up this Theme Issue.

  8. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    NASA Astrophysics Data System (ADS)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes; however, no single distribution has been accepted as a global standard. Employing entropy theory, this study derived five generalized distributions for frequency analysis that use different kinds of information encoded as constraints. These distributions are the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B) and Halphen type inverse B (Hal-IB) distributions, among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is first derived using entropy theory in this paper. Entropy theory allows the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, indicating that the five generalized distributions fitted the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. Therefore, these generalized distributions are among the best choices for frequency analysis. The entropy-based derivation leads to a new way of performing frequency analysis of hydrometeorological extremes.
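    The mechanism behind such entropy-based derivations can be illustrated on the simplest case, which is not one of the five distributions above: maximizing the Shannon entropy subject only to normalization and a fixed mean on (0, ∞) yields the exponential distribution via Lagrange multipliers,

```latex
\max_f \; H[f] = -\int_0^\infty f(x)\,\ln f(x)\,dx
\quad \text{s.t.} \quad
\int_0^\infty f(x)\,dx = 1, \qquad \int_0^\infty x\,f(x)\,dx = \mu
\;\;\Longrightarrow\;\;
f(x) = e^{-\lambda_0 - \lambda_1 x} = \frac{1}{\mu}\,e^{-x/\mu}.
```

Richer constraint sets built from logarithmic moments such as E[ln X] lead, by the same Lagrange-multiplier procedure, to heavier-tailed families like the generalized gamma and GB2 used in the paper.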

  9. Differentiating benign from malignant mediastinal lymph nodes visible at EBUS using grey-scale textural analysis.

    PubMed

    Edey, Anthony J; Pollentine, Adrian; Doody, Claire; Medford, Andrew R L

    2015-04-01

    Recent data suggest that grey-scale textural analysis of endobronchial ultrasound (EBUS) imaging can differentiate benign from malignant lymphadenopathy. The objective of this study was to evaluate grey-scale textural analysis and examine its clinical utility. Images from 135 consecutive clinically indicated EBUS procedures were evaluated retrospectively using MATLAB software (MathWorks, Natick, MA, USA). Manual node mapping was performed to obtain a region of interest, and grey-scale textural features (range of pixel values and entropy) were analysed. The initial analysis involved 94 subjects, and receiver operating characteristic (ROC) curves were generated. The ROC thresholds were then applied to a second cohort (41 subjects) to validate the earlier findings. A total of 371 images were evaluated. There was no difference in the proportions of malignant disease (56% vs 53%, P = 0.66) between the prediction (group 1) and validation (group 2) sets. There was no difference in the range of pixel values in group 1, but entropy was significantly higher in the malignant group (5.95 vs 5.77, P = 0.03). Higher entropy was seen in adenocarcinoma versus lymphoma (6.00 vs 5.50, P < 0.05). An ROC curve for entropy gave an area under the curve of 0.58, with 51% sensitivity and 71% specificity for entropy greater than 5.94 as a predictor of malignancy. In group 2, the entropy threshold classified only 47% of benign cases and 20% of malignant cases correctly. These findings suggest that EBUS grey-scale textural analysis may not accurately differentiate malignant from benign lymphadenopathy. Further studies are required. © 2015 Asian Pacific Society of Respirology.
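    As an illustration of the two texture features used here, a minimal sketch assuming a simple histogram estimate of entropy (the study's exact grey-level binning is not given in the abstract), together with the sensitivity/specificity computation for a fixed threshold:

```python
import numpy as np

def roi_texture_features(roi, n_bins=64):
    """Range of pixel values and Shannon entropy of the grey-level
    histogram for a region of interest (2-D array of pixel values)."""
    pix = np.asarray(roi, dtype=float).ravel()
    counts, _ = np.histogram(pix, bins=n_bins)
    p = counts[counts > 0] / counts.sum()
    return {"range": float(pix.max() - pix.min()),
            "entropy": float(-np.sum(p * np.log2(p)))}

def sensitivity_specificity(scores, labels, threshold):
    """Sensitivity/specificity of the rule `score > threshold => malignant`
    (labels: 1 = malignant, 0 = benign)."""
    scores = np.asarray(scores)
    labels = np.asarray(labels)
    pred = scores > threshold
    sens = float(np.mean(pred[labels == 1]))   # true-positive rate
    spec = float(np.mean(~pred[labels == 0]))  # true-negative rate
    return sens, spec
```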

  10. Exploration of the Maximum Entropy/Optimal Projection Approach to Control Design Synthesis for Large Space Structures.

    DTIC Science & Technology

    1985-02-01

    Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems, its present stage of development embodies a...Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems. Inspired by Statistical

  11. Enhancement of heat transfer and entropy generation analysis of nanofluids turbulent convection flow in square section tubes

    NASA Astrophysics Data System (ADS)

    Bianco, Vincenzo; Nardini, Sergio; Manca, Oronzio

    2011-12-01

    In this article, developing turbulent forced convection flow of a water-Al2O3 nanofluid in a square tube, subjected to constant and uniform wall heat flux, is numerically investigated. The mixture model is employed to simulate the nanofluid flow, and the investigation is carried out for a particle size of 38 nm. An entropy generation analysis is also proposed in order to find the optimal working condition for the given geometry under the given boundary conditions. A simple analytical procedure is proposed to evaluate the entropy generation, and its results are compared with the numerical calculations, showing very good agreement. A comparison of the resulting Nusselt numbers with experimental correlations available in the literature is also made. To minimize entropy generation, the optimal Reynolds number is determined.

  12. Sample entropy applied to the analysis of synthetic time series and tachograms

    NASA Astrophysics Data System (ADS)

    Muñoz-Diosdado, A.; Gálvez-Coyt, G. G.; Solís-Montufar, E.

    2017-01-01

    Entropy is a method of non-linear analysis that allows an estimate of the irregularity of a system. However, there are different types of computational entropy, which were considered and tested in order to obtain one that yields an index of signal complexity while taking into account the length of the analysed time series, the computational resources demanded by the method, and the accuracy of the calculation. An algorithm for the generation of fractal time series with a prescribed value of the spectral exponent β was used to characterize the different entropy algorithms. We obtained significant variation with series size for most of the algorithms, which could be counterproductive for the study of real signals of different lengths. The chosen method was sample entropy, which shows great independence of series size. With this method, time series of heart interbeat intervals (tachograms) of healthy subjects and patients with congestive heart failure were analysed. Sample entropy was calculated for 24-hour tachograms and for 6-hour subseries corresponding to sleep and wakefulness. The comparison between the two populations shows a significant difference that is accentuated when the patient is sleeping.
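    The sample entropy statistic chosen in this study, SampEn(m, r), can be sketched directly from its definition: the negative log of the ratio of template matches of length m+1 to matches of length m, excluding self-matches. This is a minimal implementation (the tolerance r = 0.2·SD and m = 2 are conventional defaults, not necessarily the paper's settings):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D series.

    Counts pairs of template vectors of length m (and m+1) whose
    Chebyshev distance is below r; self-matches are excluded.
    Assumes the series is long enough that some matches exist.
    """
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    n = len(x)

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(n - length + 1)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist < r))
        return count

    b = count_matches(m)      # matches of length m
    a = count_matches(m + 1)  # matches of length m + 1
    return -np.log(a / b)
```

Regular signals score near zero; irregular signals score higher, which is the property the tachogram comparison exploits.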

  13. Pressure transfer function of a JT15D nozzle due to acoustic and convected entropy fluctuations

    NASA Astrophysics Data System (ADS)

    Miles, J. H.

    An acoustic transmission matrix analysis of sound propagation in a variable area duct with and without flow is extended to include convected entropy fluctuations. The boundary conditions used in the analysis are a transfer function relating entropy and pressure at the nozzle inlet and the nozzle exit impedance. The nozzle pressure transfer function calculated is compared with JT15D turbofan engine nozzle data. The one dimensional theory for sound propagation in a variable area nozzle with flow but without convected entropy is good at the low engine speeds where the nozzle exit Mach number is low (M=0.2) and the duct exit impedance model is good. The effect of convected entropy appears to be so negligible that it is obscured by the inaccuracy of the nozzle exit impedance model, the lack of information on the magnitude of the convected entropy and its phase relationship with the pressure, and the scatter in the data. An improved duct exit impedance model is required at the higher engine speeds where the nozzle exit Mach number is high (M=0.56) and at low frequencies (below 120 Hz).

  14. Entropy production of a Brownian ellipsoid in the overdamped limit.

    PubMed

    Marino, Raffaele; Eichhorn, Ralf; Aurell, Erik

    2016-01-01

    We analyze the translational and rotational motion of an ellipsoidal Brownian particle from the viewpoint of stochastic thermodynamics. The particle's Brownian motion is driven by external forces and torques and takes place in a heterogeneous thermal environment where friction coefficients and (local) temperature depend on space and time. Our analysis of the particle's stochastic thermodynamics is based on the entropy production associated with single-particle trajectories. It is motivated by the recent discovery that the overdamped limit of vanishing inertia effects (as compared to viscous friction) produces a so-called "anomalous" contribution to the entropy production, which has no counterpart in the overdamped approximation obtained by simply discarding inertia effects. Here we show that rotational Brownian motion in the overdamped limit generates an additional contribution to the "anomalous" entropy. We calculate its specific form by performing a systematic singular perturbation analysis for the generating function of the entropy production. As a side result, we also obtain the (well-known) equations of motion in the overdamped limit. We furthermore investigate the effects of particle shape and give explicit expressions for the "anomalous entropy" of prolate and oblate spheroids and of near-spherical Brownian particles.

  15. Predictive Compensator Optimization for Head Tracking Lag in Virtual Environments

    NASA Technical Reports Server (NTRS)

    Adelstein, Barnard D.; Jung, Jae Y.; Ellis, Stephen R.

    2001-01-01

    We examined the perceptual impact of plant noise parameterization for Kalman Filter predictive compensation of time delays intrinsic to head tracked virtual environments (VEs). Subjects were tested in their ability to discriminate between the VE system's minimum latency and conditions in which artificially added latency was then predictively compensated back to the system minimum. Two head tracking predictors were parameterized off-line according to cost functions that minimized prediction errors in (1) rotation, and (2) rotation projected into translational displacement with emphasis on higher frequency human operator noise. These predictors were compared with a parameterization obtained from the VE literature for cost function (1). Results from 12 subjects showed that both parameterization type and amount of compensated latency affected discrimination. Analysis of the head motion used in the parameterizations and the subsequent discriminability results suggest that higher frequency predictor artifacts are contributory cues for discriminating the presence of predictive compensation.
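    A minimal sketch of the kind of predictor being parameterized: a one-axis constant-velocity Kalman filter whose filtered state is extrapolated forward by the latency to be compensated. The process- and measurement-noise magnitudes here are illustrative placeholders; the paper's subject is precisely how the choice of plant-noise parameterization affects the perceptibility of the compensation:

```python
import numpy as np

def predict_orientation(samples, dt=1 / 120, lookahead=0.05, q=50.0, r=1e-4):
    """Filter head-angle samples with a constant-velocity Kalman filter
    and extrapolate each filtered state `lookahead` seconds ahead.

    q scales the plant (process) noise, r the measurement noise; both
    values are arbitrary illustrative choices, not the paper's settings.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],    # process-noise covariance
                      [dt**2 / 2, dt]])
    H = np.array([[1.0, 0.0]])                   # we observe angle only
    R = np.array([[r]])
    xhat = np.array([samples[0], 0.0])           # [angle, angular velocity]
    P = np.eye(2)
    preds = []
    for z in samples:
        # time update
        xhat = F @ xhat
        P = F @ P @ F.T + Q
        # measurement update
        S = H @ P @ H.T + R
        K = (P @ H.T) / S
        xhat = xhat + (K * (z - H @ xhat)).ravel()
        P = (np.eye(2) - K @ H) @ P
        # extrapolate the filtered state to compensate the latency
        preds.append(xhat[0] + lookahead * xhat[1])
    return np.array(preds)
```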

  16. [Evaluation of a simplified index (spectral entropy) about sleep state of electrocardiogram recorded by a simplified polygraph, MemCalc-Makin2].

    PubMed

    Ohisa, Noriko; Ogawa, Hiromasa; Murayama, Nobuki; Yoshida, Katsumi

    2010-02-01

    Polysomnography (PSG) is the gold standard for the diagnosis of sleep apnea hypopnea syndrome (SAHS), but PSG analysis takes time and PSG cannot be performed repeatedly because of the effort and cost involved. Therefore, simplified sleep respiratory disorder indices that reflect the PSG results are needed. The MemCalc method, a combination of the maximum entropy method for spectral analysis and the non-linear least squares method for fitting analysis (Makin2, Suwa Trust, Tokyo, Japan), has recently been developed. Spectral entropy derived by the MemCalc method may be useful for expressing the trend of time-series behavior. Spectral entropy of the ECG calculated with the MemCalc method was evaluated by comparison with PSG results. In 79 obstructive SAHS (OSAHS) patients and 7 control volunteers, the ECG was recorded using MemCalc-Makin2 (GMS) together with PSG recording using Alice IV (Respironics) from 20:00 to 6:00. Spectral entropy of the ECG, calculated every 2 seconds using the MemCalc method, was compared with sleep stages analyzed manually from the PSG recordings. Spectral entropy values were significantly increased in the OSAHS group compared with the controls (-0.473 vs. -0.418, p < 0.05). For an entropy cutoff level of -0.423, sensitivity and specificity for OSAHS were 86.1% and 71.4%, respectively, yielding a receiver operating characteristic curve with an area under the curve of 0.837. The absolute value of entropy was inversely correlated with stage 3 sleep. Spectral entropy calculated with the MemCalc method might thus be a possible index for evaluating the quality of sleep.
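    The entropy step can be illustrated as the normalized Shannon entropy of a power spectral density. Note that this sketch uses a plain periodogram, whereas the paper's MemCalc method uses a maximum entropy spectral estimator; only the entropy computation is illustrated:

```python
import numpy as np

def spectral_entropy(x, fs=1.0):
    """Normalized Shannon entropy of the power spectral density in [0, 1].

    The PSD here is a plain periodogram of the mean-removed signal; a
    narrowband (regular) signal gives a value near 0, broadband noise a
    value near 1.
    """
    x = np.asarray(x, dtype=float) - np.mean(x)
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    h = -np.sum(p * np.log(p))
    return float(h / np.log(len(psd)))  # divide by max entropy to normalize
```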

  17. Investigating dynamical complexity in the magnetosphere using various entropy measures

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Daglis, Ioannis A.; Papadimitriou, Constantinos; Kalimeri, Maria; Anastasiadis, Anastasios; Eftaxias, Konstantinos

    2009-09-01

    The complex system of the Earth's magnetosphere corresponds to an open, spatially extended, nonequilibrium (input-output) dynamical system. The nonextensive Tsallis entropy has recently been introduced as an appropriate information measure to investigate dynamical complexity in the magnetosphere. The method has been employed for analyzing Dst time series and gave promising results, detecting the complexity dissimilarity among different physiological and pathological magnetospheric states (i.e., prestorm activity and intense magnetic storms, respectively). This paper explores the applicability and effectiveness of a variety of computable entropy measures (e.g., block entropy, Kolmogorov entropy, T complexity, and approximate entropy) for investigating dynamical complexity in the magnetosphere. We show that as a magnetic storm approaches there is clear evidence of significantly lower complexity in the magnetosphere. The observed higher degree of organization of the system agrees with that inferred previously from an independent linear fractal spectral analysis based on wavelet transforms. This convergence between nonlinear and linear analyses provides a more reliable detection of the transition from the quiet-time to the storm-time magnetosphere, thus showing evidence that the occurrence of an intense magnetic storm is imminent. More precisely, we claim that our results suggest an important principle: a significant decrease in complexity and an increase in persistence of the Dst time series can be confirmed as a magnetic storm approaches, and these can be used as diagnostic tools for magnetospheric injury (global instability). Overall, approximate entropy and Tsallis entropy yield superior results for detecting dynamical complexity changes in the magnetosphere in comparison with the other entropy measures presented herein. Ultimately, the analysis tools developed in the course of this study for the treatment of the Dst index can be convenient for space weather applications.
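    Of the measures named above, the nonextensive Tsallis entropy has the simplest closed form, S_q = (1 - Σ p_i^q)/(q - 1). A minimal sketch (the entropic index q = 1.8 is an arbitrary illustrative choice, not the value used in the Dst analyses):

```python
import numpy as np

def tsallis_entropy(p, q=1.8):
    """Nonextensive Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1).

    Recovers the Shannon entropy (in nats) in the limit q -> 1.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def symbol_probabilities(x, n_bins=8):
    """Histogram-based probabilities, e.g. for a sliding-window entropy
    scan over a geomagnetic index time series."""
    counts, _ = np.histogram(x, bins=n_bins)
    return counts / counts.sum()
```

Applied in sliding windows over Dst, lower S_q flags the more organized (lower-complexity) storm-time state.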

  18. Nonlinear radiative heat flux and heat source/sink on entropy generation minimization rate

    NASA Astrophysics Data System (ADS)

    Hayat, T.; Khan, M. Waleed Ahmed; Khan, M. Ijaz; Alsaedi, A.

    2018-06-01

    Entropy generation minimization in nonlinear radiative mixed convective flow towards a variable-thickness surface is addressed. Entropy generation analyses for momentum and temperature are carried out. The flow is driven by the stretching velocity of the sheet. Transformations are used to reduce the system of partial differential equations to ordinary differential equations. The total entropy generation rate is determined. Series solutions for the zeroth- and mth-order deformation systems are computed, and the domain of convergence of the obtained solutions is identified. Velocity, temperature and concentration fields are plotted and interpreted. The entropy equation is studied through nonlinear mixed convection and radiative heat flux, and velocity and temperature gradients are discussed through graphs. The main findings are summarized in the final remarks.

  19. Foreign exchange rate entropy evolution during financial crises

    NASA Astrophysics Data System (ADS)

    Stosic, Darko; Stosic, Dusan; Ludermir, Teresa; de Oliveira, Wilson; Stosic, Tatijana

    2016-05-01

    This paper examines the effects of financial crises on foreign exchange (FX) markets, where entropy evolution is measured for different exchange rates, using the time-dependent block entropy method. Empirical results suggest that financial crises are associated with significant increase of exchange rate entropy, reflecting instability in FX market dynamics. In accordance with phenomenological expectations, it is found that FX markets with large liquidity and large trading volume are more inert - they recover quicker from a crisis than markets with small liquidity and small trading volume. Moreover, our numerical analysis shows that periods of economic uncertainty are preceded by periods of low entropy values, which may serve as a tool for anticipating the onset of financial crises.
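    The time-dependent block entropy used here amounts to computing the Shannon entropy of length-n symbol blocks inside sliding windows. A minimal sketch on sign-symbolized returns (the paper's symbolization scheme and window settings are not specified in the abstract):

```python
import numpy as np
from collections import Counter

def block_entropy(symbols, block_len=3):
    """Shannon entropy (bits) of overlapping length-n blocks of a symbol
    series; applied to sliding windows it gives a time-dependent entropy."""
    blocks = [tuple(symbols[i:i + block_len])
              for i in range(len(symbols) - block_len + 1)]
    counts = Counter(blocks)
    total = sum(counts.values())
    p = np.array([c / total for c in counts.values()])
    return float(-np.sum(p * np.log2(p)))

def symbolize_returns(prices):
    """Map log-returns to binary symbols (sign of the return)."""
    r = np.diff(np.log(np.asarray(prices, dtype=float)))
    return (r > 0).astype(int)
```

Rising block entropy of an exchange-rate series then signals the more disordered dynamics the paper associates with crises.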

  20. An Equation for Moist Entropy in a Precipitating and Icy Atmosphere

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Simpson, Joanne; Zeng, Xiping

    2003-01-01

    Moist entropy is nearly conserved in adiabatic motion; it is redistributed rather than created by moist convection. Thus moist entropy and its equation can be used to construct analytical and numerical models of the interaction between tropical convective clouds and large-scale circulations, and an accurate equation for moist entropy is needed for the analysis and modeling of atmospheric convective clouds. On the basis of the consistency between the energy and entropy equations, a complete equation of moist entropy is derived from the energy equation. The equation expresses explicitly the internal and external sources of moist entropy, including those related to the microphysics of clouds and precipitation. In addition, an accurate formula for the surface flux of moist entropy from the underlying surface into the air above is derived. Because moist entropy deals naturally with transitions among the three water phases, it will be used as a prognostic variable in the next generation of cloud-resolving models (e.g., a global cloud-resolving model) for low computational noise. The equation derived in this paper is accurate and complete, providing a theoretical basis for using moist entropy as a prognostic variable in the long-term modeling of clouds and large-scale circulations.
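    The complete equation is derived in the paper itself and is not reproduced in the abstract. For orientation only, a widely used approximate expression for the specific moist entropy of cloud-free air, which neglects the ice and precipitation terms that the paper's full equation includes, is

```latex
s \;\approx\; c_{pd}\,\ln\frac{T}{T_0} \;-\; R_d\,\ln\frac{p_d}{p_0}
\;+\; \frac{L_v\,q_v}{T} \;-\; q_v R_v \ln\mathcal{H},
```

where q_v is the water-vapor mixing ratio, p_d the dry-air partial pressure, and ℋ the relative humidity.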

  1. Characterization of complexity in the electroencephalograph activity of Alzheimer's disease based on fuzzy entropy.

    PubMed

    Cao, Yuzhen; Cai, Lihui; Wang, Jiang; Wang, Ruofan; Yu, Haitao; Cao, Yibin; Liu, Jing

    2015-08-01

    In this paper, experimental neurophysiologic recording and statistical analysis are combined to investigate the nonlinear characteristics and the cognitive function of the brain. Fuzzy approximate entropy and fuzzy sample entropy are applied to characterize the model-based simulated series and electroencephalograph (EEG) series of Alzheimer's disease (AD). The effectiveness and advantages of these two kinds of fuzzy entropy are first verified through the simulated EEG series generated by the alpha rhythm model, including stronger relative consistency and robustness. Furthermore, in order to detect the abnormality of irregularity and chaotic behavior in the AD brain, the complexity features based on these two fuzzy entropies are extracted in the delta, theta, alpha, and beta bands. It is demonstrated that, due to the introduction of fuzzy set theory, the fuzzy entropies could better distinguish the EEG signals of AD patients from those of normal subjects than could the approximate entropy and sample entropy. Moreover, the entropy values of AD are significantly decreased in the alpha band, particularly in the temporal brain region, such as electrodes T3 and T4. In addition, fuzzy sample entropy could achieve larger group differences in different brain regions and a higher average classification accuracy of 88.1% with a support vector machine classifier. The obtained results show that fuzzy sample entropy may be a powerful tool to characterize the complexity abnormalities of AD, which could be helpful in further understanding of the disease.

  2. Characterization of complexity in the electroencephalograph activity of Alzheimer's disease based on fuzzy entropy

    NASA Astrophysics Data System (ADS)

    Cao, Yuzhen; Cai, Lihui; Wang, Jiang; Wang, Ruofan; Yu, Haitao; Cao, Yibin; Liu, Jing

    2015-08-01

    In this paper, experimental neurophysiologic recording and statistical analysis are combined to investigate the nonlinear characteristics and the cognitive function of the brain. Fuzzy approximate entropy and fuzzy sample entropy are applied to characterize the model-based simulated series and electroencephalograph (EEG) series of Alzheimer's disease (AD). The effectiveness and advantages of these two kinds of fuzzy entropy are first verified through the simulated EEG series generated by the alpha rhythm model, including stronger relative consistency and robustness. Furthermore, in order to detect the abnormality of irregularity and chaotic behavior in the AD brain, the complexity features based on these two fuzzy entropies are extracted in the delta, theta, alpha, and beta bands. It is demonstrated that, due to the introduction of fuzzy set theory, the fuzzy entropies could better distinguish the EEG signals of AD patients from those of normal subjects than could the approximate entropy and sample entropy. Moreover, the entropy values of AD are significantly decreased in the alpha band, particularly in the temporal brain region, such as electrodes T3 and T4. In addition, fuzzy sample entropy could achieve larger group differences in different brain regions and a higher average classification accuracy of 88.1% with a support vector machine classifier. The obtained results show that fuzzy sample entropy may be a powerful tool to characterize the complexity abnormalities of AD, which could be helpful in further understanding of the disease.
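    The defining change from sample entropy to fuzzy sample entropy is that the hard match criterion d < r is replaced by a smooth membership grade exp(−(d/r)^n), which is the source of the stronger relative consistency noted above. A minimal sketch under conventional defaults (m = 2, r = 0.2·SD, n = 2; the study's settings may differ):

```python
import numpy as np

def fuzzy_sample_entropy(x, m=2, r=0.2, n=2):
    """Fuzzy sample entropy: like SampEn, but each template pair
    contributes the fuzzy membership exp(-(d/r)^n) instead of a 0/1 match."""
    x = np.asarray(x, dtype=float)
    r = r * x.std()
    N = len(x)

    def mean_similarity(length):
        # baseline-removed templates, as in the fuzzy-entropy literature
        t = np.array([x[i:i + length] for i in range(N - length)])
        t = t - t.mean(axis=1, keepdims=True)
        total, pairs = 0.0, 0
        for i in range(len(t)):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)  # Chebyshev distance
            total += float(np.sum(np.exp(-(d / r) ** n)))
            pairs += len(d)
        return total / pairs

    return -np.log(mean_similarity(m + 1) / mean_similarity(m))
```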

  3. Entropy Filtered Density Function for Large Eddy Simulation of Turbulent Reacting Flows

    NASA Astrophysics Data System (ADS)

    Safari, Mehdi

    Analysis of local entropy generation is an effective means to optimize the performance of energy and combustion systems by minimizing the irreversibilities in transport processes. Large eddy simulation (LES) is employed to describe entropy transport and generation in turbulent reacting flows. The entropy transport equation in LES contains several unclosed terms: the subgrid-scale (SGS) entropy flux and the entropy generation caused by irreversible processes, namely heat conduction, mass diffusion, chemical reaction and viscous dissipation. The SGS effects are taken into account using a novel methodology based on the filtered density function (FDF). This methodology, entitled entropy FDF (En-FDF), is developed and utilized in the form of the joint entropy-velocity-scalar-turbulent frequency FDF and the marginal scalar-entropy FDF, both of which contain the chemical reaction effects in closed form. The former constitutes the most comprehensive form of the En-FDF and provides closure for all the unclosed filtered moments. This methodology is applied for LES of a turbulent shear layer involving transport of passive scalars. Predictions show favorable agreement with data generated by direct numerical simulation (DNS) of the same layer. The marginal En-FDF accounts for entropy generation effects as well as scalar and entropy statistics. This methodology is applied to a turbulent nonpremixed jet flame (Sandia Flame D) and predictions are validated against experimental data. In both flows, the sources of irreversibility are predicted and analyzed.

  4. Constraints to Dark Energy Using PADE Parameterizations

    NASA Astrophysics Data System (ADS)

    Rezaei, M.; Malekjani, M.; Basilakos, S.; Mehrabi, A.; Mota, D. F.

    2017-07-01

    We put constraints on dark energy (DE) properties using PADE parameterization, and compare it to the same constraints using Chevallier-Polarski-Linder (CPL) and ΛCDM, at both the background and the perturbation levels. The DE equation-of-state parameter of the models is derived following the mathematical treatment of PADE expansion. Unlike CPL parameterization, PADE approximation provides different forms of the equation-of-state parameter that avoid a divergence in the far future. Initially we perform a likelihood analysis in order to put constraints on the model parameters using solely background expansion data, and we find that all parameterizations are consistent with each other. Then, combining the expansion and the growth rate data, we test the viability of PADE parameterizations and compare them with the CPL and ΛCDM models, respectively. Specifically, we find that the growth rate of the current PADE parameterizations is lower than in the ΛCDM model at low redshifts, while the differences among the models are negligible at high redshifts. In this context, we provide for the first time a growth index of linear matter perturbations in PADE cosmologies. Considering that DE is homogeneous, we recover the well-known asymptotic value of the growth index, namely γ∞ = 3(w∞ - 1)/(6w∞ - 5), while in the case of clustered DE we obtain γ∞ ≃ 3w∞(3w∞ - 5)/[(6w∞ - 5)(3w∞ - 1)]. Finally, we generalize the growth index analysis to the case where γ is allowed to vary with redshift, and we find that the form of γ(z) in PADE parameterization extends that of the CPL and ΛCDM cosmologies, respectively.

  5. Transfer Entropy as a Log-Likelihood Ratio

    NASA Astrophysics Data System (ADS)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.

  6. Transfer entropy as a log-likelihood ratio.

    PubMed

    Barnett, Lionel; Bossomaier, Terry

    2012-09-28

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
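    For finite-alphabet series, the plug-in transfer entropy mentioned above needs no explicit model. A minimal sketch for binary series with unit history length (the general vector-autoregressive machinery of the paper is not reproduced here); per the paper's result, 2N times the estimate in nats behaves asymptotically as the log-likelihood-ratio statistic for the null of zero transfer:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, base=2):
    """Plug-in transfer entropy TE_{Y->X} for discrete series, history 1:

        TE = sum over (x1, x0, y0) of
             p(x1, x0, y0) * log[ p(x1 | x0, y0) / p(x1 | x0) ],

    i.e. the conditional mutual information I(X_{t+1}; Y_t | X_t) of the
    empirical distribution (hence always non-negative).
    """
    x = np.asarray(x)
    y = np.asarray(y)
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * np.log(p_cond_xy / p_cond_x)
    return float(te / np.log(base))
```

When y carries exact information about x's next step, the estimate approaches H(X_{t+1} | X_t); for independent series it stays near the small positive plug-in bias.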

  7. Effect of Entropy Generation on Wear Mechanics and System Reliability

    NASA Astrophysics Data System (ADS)

    Gidwani, Akshay; James, Siddanth; Jagtap, Sagar; Karthikeyan, Ram; Vincent, S.

    2018-04-01

    Wear is an irreversible phenomenon. Processes such as mutual sliding and rolling between materials involve entropy generation, and these processes are monotonic with respect to time. Entropy generation is quantified using the Degradation Entropy Generation theorem formulated by Michael D. Bryant. The sliding-wear model can be extrapolated to other settings to provide a potential analysis of machine prognostics as well as system and process reliability for various processes, not merely mechanical ones. In other words, using the concepts of entropy generation and wear, one can quantify the reliability of a system with respect to time using a thermodynamic variable, which is the basis of this paper. Thus, in the present investigation, a unique attempt has been made to establish a correlation between entropy, wear and reliability, which can be a useful technique in preventive maintenance.

  8. Germinal center texture entropy as possible indicator of humoral immune response: immunophysiology viewpoint.

    PubMed

    Pantic, Igor; Pantic, Senka

    2012-10-01

    In this article, we present results indicating that spleen germinal center (GC) texture entropy determined by the gray-level co-occurrence matrix (GLCM) method is related to humoral immune response. Spleen tissue was obtained from eight outbred male short-haired guinea pigs previously immunized with sheep red blood cells (SRBC). A total of 312 images from 39 germinal centers (156 GC light zone images and 156 GC dark zone images) were acquired and analyzed by the GLCM method. Angular second moment, contrast, correlation, entropy, and inverse difference moment were calculated for each image. Humoral immune response to SRBC was measured using a T cell-dependent antibody response (TDAR) assay. A statistically highly significant negative correlation was detected between light zone entropy and the number of TDAR plaque-forming cells (r_s = -0.86, p < 0.01): the entropy decreased as the number of plaque-forming cells increased and vice versa. A statistically significant negative correlation was also detected between dark zone entropy values and the number of plaque-forming cells (r_s = -0.69, p < 0.05). Germinal center texture entropy may be a powerful indicator of humoral immune response. This study is one of the first to point out the potential scientific value of GLCM image texture analysis in lymphoid tissue cytoarchitecture evaluation. Lymphoid tissue texture analysis could become an important and affordable addition to conventional immunophysiology techniques.
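    The GLCM entropy feature used in this record can be illustrated with a minimal sketch: tally co-occurring gray-level pairs at a fixed pixel offset, normalize, and take the Shannon entropy. The single offset and tiny test images are illustrative assumptions (the study computed the full GLCM feature set):

```python
from collections import Counter
from math import log2

def glcm_entropy(img, dx=1, dy=0):
    """Shannon entropy (bits) of the normalized gray-level
    co-occurrence matrix of a 2-D image for offset (dx, dy)."""
    pairs = Counter()
    rows, cols = len(img), len(img[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                pairs[(img[r][c], img[r2][c2])] += 1
    total = sum(pairs.values())
    return -sum((n / total) * log2(n / total) for n in pairs.values())
```

    A uniform region gives zero entropy; a 2x2 checkerboard, with two equally likely co-occurrence pairs, gives exactly 1 bit.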

  9. Entropy and generalized least square methods in assessment of the regional value of streamgages

    USGS Publications Warehouse

    Markus, M.; Vernon, Knapp H.; Tasker, Gary D.

    2003-01-01

    The Illinois State Water Survey performed a study to assess the streamgaging network in the State of Illinois. One of the important aspects of the study was to assess the regional value of each station through an assessment of the information transfer among gaging records for low, average, and high flow conditions. This analysis was performed for the main hydrologic regions in the State, and the stations were initially evaluated using a new approach based on entropy analysis. To determine the regional value of each station within a region, several information parameters, including total net information, were defined based on entropy. Stations were ranked based on the total net information. For comparison, the regional value of the same stations was assessed using the generalized least square regression (GLS) method, developed by the US Geological Survey. Finally, a hybrid combination of GLS and entropy was created by including a function of the negative net information as a penalty function in the GLS. The weights of the combined model were determined to maximize the average correlation with the results of GLS and entropy. The entropy and GLS methods were evaluated using the high-flow data from southern Illinois stations. The combined method was compared with the entropy and GLS approaches using the high-flow data from eastern Illinois stations. © 2003 Elsevier B.V. All rights reserved.
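    The entropy-based information transfer between two gaging records can be sketched with discrete mutual information on binned flows. The equal-width binning and function names below are illustrative assumptions, not the study's exact information parameters:

```python
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Shannon entropy (bits) of a discrete sequence."""
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in Counter(symbols).values())

def mutual_information(a, b):
    """I(A;B) = H(A) + H(B) - H(A,B) for paired discrete records."""
    return shannon_entropy(a) + shannon_entropy(b) - shannon_entropy(list(zip(a, b)))

def discretize(flows, n_bins=4):
    """Equal-width binning of a flow record into class indices."""
    lo, hi = min(flows), max(flows)
    width = (hi - lo) / n_bins or 1.0
    return [min(int((q - lo) / width), n_bins - 1) for q in flows]
```

    A station whose record shares high mutual information with its neighbors transfers little unique information to the network; net-information rankings build on pairwise quantities of this kind.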

  10. Driver fatigue detection through multiple entropy fusion analysis in an EEG-based system.

    PubMed

    Min, Jianliang; Wang, Ping; Hu, Jianfeng

    2017-01-01

    Driver fatigue is an important contributor to road accidents, and fatigue detection has major implications for transportation safety. The aim of this research is to analyze the multiple entropy fusion method and evaluate several channel regions to effectively detect a driver's fatigue state based on electroencephalogram (EEG) records. First, we fused multiple entropies, i.e., spectral entropy, approximate entropy, sample entropy and fuzzy entropy, as features compared with autoregressive (AR) modeling by four classifiers. Second, we captured four significant channel regions according to weight-based electrodes via a simplified channel selection method. Finally, the evaluation model for detecting driver fatigue was established with four classifiers based on the EEG data from four channel regions. Twelve healthy subjects performed continuous simulated driving for 1-2 hours with EEG monitoring on a static simulator. The leave-one-out cross-validation approach obtained an accuracy of 98.3%, a sensitivity of 98.3% and a specificity of 98.2%. The experimental results verified the effectiveness of the proposed method, indicating that the multiple entropy fusion features are significant factors for inferring the fatigue state of a driver.
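    Among the fused features in this record, sample entropy admits a compact definition: the negative log of the conditional probability that subsequences matching for m points also match for m+1. A minimal pure-Python sketch of a common textbook variant (tolerance r given in the signal's own units, not the usual 0.2 x SD convention):

```python
from math import log

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of sequence x: -ln(A/B), where B counts template
    pairs matching within r for length m, and A for length m + 1."""
    def count(mm):
        templates = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        total = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):  # exclude self-matches
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    total += 1
        return total
    b, a = count(m), count(m + 1)
    return -log(a / b) if a > 0 and b > 0 else float("inf")
```

    A perfectly regular signal yields a value near zero, consistent with low-complexity (fatigued) EEG producing lower entropy features.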

  11. Investigation of FeNiCrWMn - a new high entropy alloy

    NASA Astrophysics Data System (ADS)

    Buluc, G.; Florea, I.; Bălţătescu, O.; Florea, R. M.; Carcea, I.

    2015-11-01

    The term high-entropy alloys arose from the analysis of multicomponent alloys, which have been produced at an experimental level since 1995, developing a new concept for metallic materials. Recent developments in the field of high-entropy alloys have revealed that they have versatile properties such as ductility, toughness, hardness, and corrosion resistance [1]. It has now been demonstrated that these alloys are feasible to synthesize, process, and analyze, contrary to misconceptions based on traditional experience. Moreover, there are many opportunities in this field for academic studies and industrial applications [1, 2]. As the combinations of composition and process for producing high-entropy alloys are numerous, and each high-entropy alloy has its own microstructure and properties to be identified and understood, the research work is truly limitless. The novelty of these alloys lies in their chemical composition: they have been named high-entropy alloys because their atomic-scale mixing entropies are higher than those of traditional alloys. In this paper, we present the microscopy and the mechanical properties of the high-entropy alloy FeNiCrWMn.

  12. CONNJUR Workflow Builder: A software integration environment for spectral reconstruction

    PubMed Central

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J.C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features that promote collaboration, education, and parameterization, supports non-uniform data sets, and integrates processing with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses. PMID:26066803

  13. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    PubMed

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features that promote collaboration, education, and parameterization, supports non-uniform data sets, and integrates processing with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  14. Psychoacoustic entropy theory and its implications for performance practice

    NASA Astrophysics Data System (ADS)

    Strohman, Gregory J.

    This dissertation attempts to motivate, derive and imply potential uses for a generalized perceptual theory of musical harmony called psychoacoustic entropy theory. This theory treats the human auditory system as a physical system which takes acoustic measurements. As a result, the human auditory system is subject to all the appropriate uncertainties and limitations of other physical measurement systems. This is the theoretic basis for defining psychoacoustic entropy. Psychoacoustic entropy is a numerical quantity which indexes the degree to which the human auditory system perceives instantaneous disorder within a sound pressure wave. Chapter one explains the importance of harmonic analysis as a tool for performance practice. It also outlines the critical limitations of many of the most influential historical approaches to modeling harmonic stability, particularly when compared to available scientific research in psychoacoustics. Rather than analyze a musical excerpt, psychoacoustic entropy is calculated directly from sound pressure waves themselves. This frames psychoacoustic entropy theory in the most general possible terms as a theory of musical harmony, enabling it to be invoked for any perceivable sound. Chapter two provides and examines many widely accepted mathematical models of the acoustics and psychoacoustics of these sound pressure waves. Chapter three introduces entropy as a precise way of measuring perceived uncertainty in sound pressure waves. Entropy is used, in combination with the acoustic and psychoacoustic models introduced in chapter two, to motivate the mathematical formulation of psychoacoustic entropy theory. Chapter four shows how to use psychoacoustic entropy theory to analyze certain types of musical harmony, while chapter five applies the analytical tools developed in chapter four to two short musical excerpts to influence their interpretation.
    Almost every form of harmonic analysis invokes some degree of mathematical reasoning. However, the limited scope of most harmonic systems used for Western common practice music greatly simplifies the necessary level of mathematical detail. Psychoacoustic entropy theory requires a greater degree of mathematical complexity due to its sheer scope as a generalized theory of musical harmony. Fortunately, under specific assumptions the theory can take on vastly simpler forms. Psychoacoustic entropy theory appears to be highly compatible with the latest scientific research in psychoacoustics. However, the theory itself should be regarded as a hypothesis and this dissertation an experiment in progress. The evaluation of psychoacoustic entropy theory as a scientific theory of human sonic perception must wait for more rigorous future research.
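    The idea of an entropy computed directly from a sound pressure wave can be illustrated, loosely and not as the author's formulation, by a spectral entropy: the Shannon entropy of the normalized DFT power spectrum. All names and the threshold below are assumptions of this sketch:

```python
from math import cos, sin, pi, log2

def spectral_entropy(x):
    """Shannon entropy (bits) of the normalized DFT power spectrum
    of a real signal x (DC bin excluded, naive O(n^2) DFT)."""
    n = len(x)
    power = []
    for k in range(1, n // 2 + 1):
        re = sum(x[t] * cos(2 * pi * k * t / n) for t in range(n))
        im = -sum(x[t] * sin(2 * pi * k * t / n) for t in range(n))
        power.append(re * re + im * im)
    tot = sum(power)
    # ignore numerically negligible bins to avoid log of rounding noise
    return -sum((p / tot) * log2(p / tot) for p in power if p > 1e-12 * tot)
```

    A pure tone concentrates power in one bin (entropy near zero); two equal tones give about 1 bit, matching the intuition that more spectrally disordered sounds carry higher entropy.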

  15. Evaluation of wave runup predictions from numerical and parametric models

    USGS Publications Warehouse

    Stockdon, Hilary F.; Thompson, David M.; Plant, Nathaniel G.; Long, Joseph W.

    2014-01-01

    Wave runup during storms is a primary driver of coastal evolution, including shoreline and dune erosion and barrier island overwash. Runup and its components, setup and swash, can be predicted from a parameterized model that was developed by comparing runup observations to offshore wave height, wave period, and local beach slope. Because observations during extreme storms are often unavailable, a numerical model is used to simulate the storm-driven runup to compare to the parameterized model and then develop an approach to improve the accuracy of the parameterization. Numerically simulated and parameterized runup were compared to observations to evaluate model accuracies. The analysis demonstrated that setup was accurately predicted by both the parameterized model and numerical simulations. Infragravity swash heights were most accurately predicted by the parameterized model. The numerical model suffered from bias and gain errors that depended on whether a one-dimensional or two-dimensional spatial domain was used. Nonetheless, all of the predictions were significantly correlated to the observations, implying that the systematic errors can be corrected. The numerical simulations did not resolve the incident-band swash motions, as expected, and the parameterized model performed best at predicting incident-band swash heights. An assimilated prediction using a weighted average of the parameterized model and the numerical simulations resulted in a reduction in prediction error variance. Finally, the numerical simulations were extended to include storm conditions that have not been previously observed. These results indicated that the parameterized predictions of setup may need modification for extreme conditions; numerical simulations can be used to extend the validity of the parameterized predictions of infragravity swash; and numerical simulations systematically underpredict incident swash, which is relatively unimportant under extreme conditions.
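    The parameterized runup model referenced here is commonly quoted in the form of Stockdon et al. (2006); the sketch below assumes that published form (deep-water wave height H0, peak period T, foreshore slope beta_f) with illustrative input values:

```python
from math import pi, sqrt

G = 9.81  # gravitational acceleration, m/s^2

def runup_2pct(h0, t, beta_f):
    """2% exceedance runup R2 = 1.1 * (setup + swash / 2), with setup
    and swash as in the Stockdon et al. (2006) parameterization."""
    l0 = G * t ** 2 / (2 * pi)  # deep-water wavelength
    setup = 0.35 * beta_f * sqrt(h0 * l0)
    swash = sqrt(h0 * l0 * (0.563 * beta_f ** 2 + 0.004))  # incident + infragravity
    return 1.1 * (setup + swash / 2), setup, swash

# Illustrative storm sea state: H0 = 2 m, T = 10 s, foreshore slope 0.1
r2, setup, swash = runup_2pct(h0=2.0, t=10.0, beta_f=0.1)
```

    The assimilated prediction described in the abstract would then weight such a parameterized R2 against a numerically simulated one.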

  16. Evaluation of five dry particle deposition parameterizations for incorporation into atmospheric transport models

    NASA Astrophysics Data System (ADS)

    Khan, Tanvir R.; Perlinger, Judith A.

    2017-10-01

    Despite considerable effort to develop mechanistic dry particle deposition parameterizations for atmospheric transport models, current knowledge has been inadequate to propose quantitative measures of the relative performance of available parameterizations. In this study, we evaluated the performance of five dry particle deposition parameterizations developed by Zhang et al. (2001) (Z01), Petroff and Zhang (2010) (PZ10), Kouznetsov and Sofiev (2012) (KS12), Zhang and He (2014) (ZH14), and Zhang and Shao (2014) (ZS14). The evaluation was performed in three dimensions: model ability to reproduce observed deposition velocities, Vd (accuracy); the influence of imprecision in input parameter values on the modeled Vd (uncertainty); and identification of the most influential parameter(s) (sensitivity). The accuracy of the modeled Vd was evaluated using observations obtained from five land use categories (LUCs): grass, coniferous and deciduous forests, natural water, and ice/snow. To ascertain the uncertainty in modeled Vd and quantify the influence of imprecision in key model input parameters, a Monte Carlo uncertainty analysis was performed. The Sobol' sensitivity analysis was conducted with the objective of determining the parameter ranking from the most to the least influential. Comparing the normalized mean bias factors (indicators of accuracy), we find that the ZH14 parameterization is the most accurate for all LUCs except for coniferous forest, for which it is second most accurate. From Monte Carlo simulations, the estimated mean normalized uncertainties in the modeled Vd obtained for seven particle sizes (ranging from 0.005 to 2.5 µm) for the five LUCs are 17, 12, 13, 16, and 27 % for the Z01, PZ10, KS12, ZH14, and ZS14 parameterizations, respectively. From the Sobol' sensitivity results, we find that the parameter rankings vary by particle size and LUC for a given parameterization.
Overall, for dp = 0.001 to 1.0 µm, friction velocity was one of the three most influential parameters in all parameterizations. For giant particles (dp = 10 µm), relative humidity was the most influential parameter. Because it is the least complex of the five parameterizations, and it has the greatest accuracy and least uncertainty, we propose that the ZH14 parameterization is currently superior for incorporation into atmospheric transport models.
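    The Monte Carlo uncertainty procedure can be sketched generically: perturb the inputs of a deposition-velocity model and report the normalized uncertainty (standard deviation over mean). The toy resistance model and +/-20% parameter ranges below are illustrative assumptions, not any of the five parameterizations evaluated:

```python
import random
from math import sqrt

def toy_vd(u_star, r_s):
    """Toy deposition velocity (m/s): a friction-velocity-driven
    aerodynamic resistance in series with a surface resistance."""
    r_a = 1.0 / (0.4 * u_star)  # toy aerodynamic resistance, s/m
    return 1.0 / (r_a + r_s)

def normalized_uncertainty(n=5000, seed=42):
    """Std/mean of modeled vd when inputs vary uniformly by +/-20%."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        u_star = 0.35 * rng.uniform(0.8, 1.2)  # friction velocity, m/s
        r_s = 120.0 * rng.uniform(0.8, 1.2)    # surface resistance, s/m
        samples.append(toy_vd(u_star, r_s))
    mean = sum(samples) / n
    var = sum((v - mean) ** 2 for v in samples) / n
    return sqrt(var) / mean
```

    The study's reported 12-27% normalized uncertainties are statistics of exactly this kind, computed per parameterization, particle size, and LUC.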

  17. Wavelet Packet Entropy for Heart Murmurs Classification

    PubMed Central

    Safara, Fatemeh; Doraisamy, Shyamala; Azman, Azreen; Jantan, Azrul; Ranga, Sri

    2012-01-01

    Heart murmurs are the first signs of cardiac valve disorders. Several studies have been conducted in recent years to automatically differentiate normal heart sounds from heart sounds with murmurs using various types of audio features. Entropy has been used successfully as a feature to distinguish different heart sounds. In this paper, a new entropy measure, previously introduced to analyze mammograms, is applied to heart sounds, and the feasibility of using it to classify five types of heart sounds and murmurs is shown. Four common murmurs were considered: aortic regurgitation, mitral regurgitation, aortic stenosis, and mitral stenosis. The wavelet packet transform was employed for heart sound analysis, and the entropy was calculated to derive feature vectors. Five types of classification were performed to evaluate the discriminatory power of the generated features. The best results were achieved by BayesNet, with 96.94% accuracy. These promising results substantiate the effectiveness of the proposed wavelet packet entropy for heart sound classification. PMID:23227043
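    A wavelet-packet entropy feature of the general kind used here can be sketched with a Haar basis: decompose the signal into terminal subbands and take the Shannon entropy of the relative subband energies. This is a minimal sketch (Haar filters, signal length a power of two), not the paper's specific entropy measure:

```python
from math import log2, sqrt

def haar_step(x):
    """One level of the Haar transform: approximation and detail."""
    a = [(x[i] + x[i + 1]) / sqrt(2) for i in range(0, len(x), 2)]
    d = [(x[i] - x[i + 1]) / sqrt(2) for i in range(0, len(x), 2)]
    return a, d

def wavelet_packet_entropy(x, levels=2):
    """Shannon entropy (bits) of the relative energies of the terminal
    wavelet-packet subbands after `levels` full decompositions."""
    bands = [x]
    for _ in range(levels):
        nxt = []
        for b in bands:
            a, d = haar_step(b)
            nxt.extend([a, d])
        bands = nxt
    energies = [sum(v * v for v in b) for b in bands]
    tot = sum(energies)
    return -sum((e / tot) * log2(e / tot) for e in energies if e > 0)
```

    A constant signal puts all energy in one subband (entropy 0); an impulse spreads energy across all four level-2 subbands (entropy 2 bits), so the feature separates smooth sounds from broadband murmurs.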

  18. Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions

    NASA Astrophysics Data System (ADS)

    Xie, Jigang; Song, Wenyun

    The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example of risk analysis for Chinese listed firms' mergers and acquisitions is provided to demonstrate the feasibility and practicality of the method.

  19. Understanding Chemical Equilibrium Using Entropy Analysis: The Relationship between ΔS°tot(sys) and the Equilibrium Constant

    ERIC Educational Resources Information Center

    Bindel, Thomas H.

    2010-01-01

    Entropy analyses as a function of the extent of reaction are presented for a number of physicochemical processes, including vaporization of a liquid, dimerization of nitrogen dioxide, and the autoionization of water. Graphs of the total entropy change versus the extent of reaction give a visual representation of chemical equilibrium and the second…

  20. Mood states modulate complexity in heartbeat dynamics: A multiscale entropy analysis

    NASA Astrophysics Data System (ADS)

    Valenza, G.; Nardelli, M.; Bertschy, G.; Lanata, A.; Scilingo, E. P.

    2014-07-01

    This paper demonstrates that heartbeat complex dynamics are modulated by pathological mental states. Multiscale entropy analysis was performed on R-R interval series gathered from the electrocardiograms of eight bipolar patients who exhibited mood states among depression, hypomania, and euthymia, i.e., good affective balance. Three different methodologies for choosing the sample entropy radius value were also compared. We show that the complexity level can be used as a marker of mental state, discriminating among the three pathological mood states and suggesting heartbeat complexity as a more objective clinical biomarker for mental disorders.

  1. Statistical mechanical theory for steady state systems. II. Reciprocal relations and the second entropy.

    PubMed

    Attard, Phil

    2005-04-15

    The concept of second entropy is introduced for the dynamic transitions between macrostates. It is used to develop a theory for fluctuations in velocity, and is exemplified by deriving Onsager reciprocal relations for Brownian motion. The cases of free, driven, and pinned Brownian particles are treated in turn, and Stokes' law is derived. The second entropy analysis is applied to the general case of thermodynamic fluctuations, and the Onsager reciprocal relations for these are derived using the method. The Green-Kubo formulas for the transport coefficients emerge from the analysis, as do Langevin dynamics.
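    In the linear regime the relations this analysis recovers reduce to standard forms; as a brief sketch in conventional notation (not Attard's second-entropy notation), for thermodynamic fluxes J_i conjugate to forces X_j:

```latex
% Linear force-flux relations with Onsager reciprocity
J_i = \sum_j L_{ij}\, X_j , \qquad L_{ij} = L_{ji} .

% Green--Kubo: transport coefficients from equilibrium time correlations
L_{ij} = \frac{1}{k_B} \int_0^{\infty} \langle J_i(0)\, J_j(t) \rangle \, dt .
```

    The second-entropy maximization supplies the variational origin of the first line, while the correlation-function form in the second line is what the abstract refers to as the Green-Kubo formulas.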

  2. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring.

    PubMed

    Su, Cui; Liang, Zhenhu; Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Multiscale permutation entropy (MSPE) has become an interesting tool for exploring neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Six MSPE algorithms, derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE), and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis, were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of tracking the dynamical changes in EEG data and performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among MSPE measures. CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of EEG recordings and statistically distinguish the awake state, unconsciousness, and recovery of consciousness (RoC). Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed the other measures with a faster tracking speed at loss of consciousness. MA-based multiscale permutation entropies thus have potential for on-line anesthesia EEG analysis given their simple computation and sensitivity to drug effect changes, whereas CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales.
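    The building blocks of MSPE can be sketched compactly: an ordinal-pattern (permutation) entropy applied to series decomposed by either coarse-graining (CG) or moving averaging (MA). The sketch below uses the normalized Shannon form only; the Renyi and Tsallis variants differ only in the entropy functional:

```python
from collections import Counter
from math import log, factorial

def permutation_entropy(x, order=3):
    """Normalized Shannon entropy of ordinal patterns of length `order`."""
    pats = Counter()
    for i in range(len(x) - order + 1):
        w = x[i:i + order]
        pats[tuple(sorted(range(order), key=w.__getitem__))] += 1
    n = sum(pats.values())
    h = -sum((c / n) * log(c / n) for c in pats.values())
    return h / log(factorial(order))  # 0 = fully ordered, 1 = all patterns equal

def coarse_grain(x, s):
    """CG decomposition: non-overlapping window means at scale s."""
    return [sum(x[i:i + s]) / s for i in range(0, len(x) - s + 1, s)]

def moving_average(x, s):
    """MA decomposition: overlapping window means at scale s."""
    return [sum(x[i:i + s]) / s for i in range(len(x) - s + 1)]
```

    Note the MA series retains nearly the full sample count at every scale, which is why the abstract finds MA-based measures more robust on short on-line EEG segments than CG-based ones.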

  3. Evaluation of Surface Flux Parameterizations with Long-Term ARM Observations

    DOE PAGES

    Liu, Gang; Liu, Yangang; Endo, Satoshi

    2013-02-01

    Surface momentum, sensible heat, and latent heat fluxes are critical for atmospheric processes such as clouds and precipitation, and are parameterized in a variety of models ranging from cloud-resolving models to large-scale weather and climate models. However, direct evaluation of the parameterization schemes for these surface fluxes is rare due to limited observations. This study takes advantage of the long-term observations of surface fluxes collected at the Southern Great Plains site by the Department of Energy Atmospheric Radiation Measurement program to evaluate the six surface flux parameterization schemes commonly used in the Weather Research and Forecasting (WRF) model and three U.S. general circulation models (GCMs). The unprecedented 7-yr-long measurements by the eddy correlation (EC) and energy balance Bowen ratio (EBBR) methods permit statistical evaluation of all six parameterizations under a variety of stability conditions, diurnal cycles, and seasonal variations. The statistical analyses show that the momentum flux parameterization agrees best with the EC observations, followed by latent heat flux, sensible heat flux, and evaporation ratio/Bowen ratio. The overall performance of the parameterizations depends on atmospheric stability, being best under neutral stratification and deteriorating toward both more stable and more unstable conditions. Further diagnostic analysis reveals that in addition to the parameterization schemes themselves, the discrepancies between observed and parameterized sensible and latent heat fluxes may stem from inadequate use of input variables such as surface temperature, moisture availability, and roughness length. The results demonstrate the need for improving the land surface models and measurements of surface properties, which would permit the evaluation of full land surface models.
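    Surface flux parameterizations of this kind are typically bulk aerodynamic formulas. A generic sketch follows; the constant exchange coefficients and sample inputs are assumptions of the sketch (real schemes make the coefficients stability-dependent, which is exactly the behavior evaluated above):

```python
RHO = 1.2     # air density, kg/m^3
CP = 1004.0   # specific heat of air, J/(kg K)
LV = 2.5e6    # latent heat of vaporization, J/kg

def bulk_fluxes(wind, t_sfc, t_air, q_sfc, q_air,
                c_h=1.5e-3, c_e=1.5e-3, c_d=1.3e-3):
    """Bulk aerodynamic surface fluxes: momentum stress (N/m^2),
    sensible and latent heat (W/m^2), and the Bowen ratio H/LE.
    Assumes a nonzero surface-air moisture difference."""
    tau = RHO * c_d * wind ** 2
    sh = RHO * CP * c_h * wind * (t_sfc - t_air)   # sensible heat
    lh = RHO * LV * c_e * wind * (q_sfc - q_air)   # latent heat
    return tau, sh, lh, sh / lh
```

    With a 5 m/s wind, a 2 K surface-air temperature difference, and a 5 g/kg moisture difference, the latent heat flux dominates and the Bowen ratio is well below one, typical of moist daytime conditions at the Southern Great Plains site.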

  4. Spectral cumulus parameterization based on cloud-resolving model

    NASA Astrophysics Data System (ADS)

    Baba, Yuya

    2018-02-01

    We have developed a spectral cumulus parameterization using a cloud-resolving model. It includes a new parameterization of the entrainment rate, derived from analysis of the cloud properties obtained from the cloud-resolving model simulation and valid for both shallow and deep convection. The new scheme was examined in a single-column model experiment and compared with the existing parameterization of Gregory (2001, Q J R Meteorol Soc 127:53-72) (GR scheme). The results showed that the GR scheme simulated shallower and more diluted convection than the new scheme. To further validate the physical performance of the parameterizations, Atmospheric Model Intercomparison Project (AMIP) experiments were performed, and the results were compared with reanalysis data. The new scheme performed better than the GR scheme in terms of the mean state and variability of the atmospheric circulation: it reduced the positive bias of precipitation in the western Pacific region and the positive bias of outgoing shortwave radiation over the ocean. The new scheme also better reproduced features of convectively coupled equatorial waves and the Madden-Julian oscillation. These improvements derive from the modified entrainment-rate parameterization, which suppresses excessive entrainment and thus an excessive increase of low-level clouds.

  5. Increased temperature and entropy production in cancer: the role of anti-inflammatory drugs.

    PubMed

    Pitt, Michael A

    2015-02-01

    Some cancers have been shown to have a higher temperature than surrounding normal tissue. This higher temperature is due to heat generated internally in the cancer. The higher temperature of cancer (compared to surrounding tissue) enables a thermodynamic analysis to be carried out. Here I show that there is increased entropy production in cancer compared with surrounding tissue. This is termed excess entropy production. The excess entropy production is expressed in terms of heat flow from the cancer to surrounding tissue and enzymic reactions in the cancer and surrounding tissue. The excess entropy production in cancer drives it away from the stationary state that is characterised by minimum entropy production. Treatments that reduce inflammation (and therefore temperature) should drive a cancer towards the stationary state. Anti-inflammatory agents, such as aspirin, other non-steroidal anti-inflammatory drugs, corticosteroids and also thyroxine analogues have been shown (using various criteria) to reduce the progress of cancer.
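    The excess entropy production from the heat-flow term alone follows directly from the two temperatures. A minimal sketch; the numerical inputs (0.1 W of internally generated heat, a 0.5 K tumor temperature excess near body temperature) are illustrative assumptions, not data from the paper:

```python
def entropy_production_rate(q_dot, t_source, t_sink):
    """Entropy production rate (W/K) for steady heat flow q_dot (W)
    from a warmer region at t_source (K) to surroundings at t_sink (K).
    Positive whenever t_source > t_sink, per the second law."""
    return q_dot * (1.0 / t_sink - 1.0 / t_source)

# Tumor at 37.5 C (310.65 K) shedding 0.1 W to tissue at 37.0 C (310.15 K)
sigma = entropy_production_rate(0.1, 310.65, 310.15)
```

    Reducing the temperature excess (e.g., via anti-inflammatory treatment) drives this term toward zero, i.e., toward the minimum-entropy-production stationary state described in the abstract.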

  6. Processing Doppler Lidar and Cloud Radar Observations for Analysis of Convective Mass Flux Parameterizations Using DYNAMO Direct Observations

    DTIC Science & Technology

    2014-09-30

    for Analysis of Convective Mass Flux Parameterizations Using DYNAMO Direct Observations R. Michael Hardesty CIRES/University of Colorado/NOAA 325...the RV-Revell during legs 2 & 3 of the DYNAMO experiment to help characterize vertical transport through the boundary layer and to build statistics...obtained during DYNAMO, and to investigate whether cold pools that emanate from convection organize the interplay between humidity and convection and

  7. Correlation as a Determinant of Configurational Entropy in Supramolecular and Protein Systems

    PubMed Central

    2015-01-01

    For biomolecules in solution, changes in configurational entropy are thought to contribute substantially to the free energies of processes like binding and conformational change. In principle, the configurational entropy can be strongly affected by pairwise and higher-order correlations among conformational degrees of freedom. However, the literature offers mixed perspectives regarding the contributions that changes in correlations make to changes in configurational entropy for such processes. Here we take advantage of powerful techniques for simulation and entropy analysis to carry out rigorous in silico studies of correlation in binding and conformational changes. In particular, we apply information-theoretic expansions of the configurational entropy to well-sampled molecular dynamics simulations of a model host–guest system and the protein bovine pancreatic trypsin inhibitor. The results bear on the interpretation of NMR data, as they indicate that changes in correlation are important determinants of entropy changes for biologically relevant processes and that changes in correlation may either balance or reinforce changes in first-order entropy. The results also highlight the importance of main-chain torsions as contributors to changes in protein configurational entropy. As simulation techniques grow in power, the mathematical techniques used here will offer new opportunities to answer challenging questions about complex molecular systems. PMID:24702693
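    The information-theoretic expansion of configurational entropy used here truncates the joint entropy at pairwise correlations: the sum of first-order (marginal) entropies minus the sum of pairwise mutual informations. A minimal sketch on discretized torsion-like variables (function names and the discrete data are illustrative assumptions):

```python
from collections import Counter
from math import log

def h(cols):
    """Joint Shannon entropy (nats) of one or more discretized columns."""
    n = len(cols[0])
    counts = Counter(zip(*cols))
    return -sum((c / n) * log(c / n) for c in counts.values())

def mie2(cols):
    """Second-order mutual-information expansion of the joint entropy:
    sum of marginal entropies minus sum of pairwise mutual informations."""
    first = sum(h([c]) for c in cols)
    pair_mi = 0.0
    for i in range(len(cols)):
        for j in range(i + 1, len(cols)):
            pair_mi += h([cols[i]]) + h([cols[j]]) - h([cols[i], cols[j]])
    return first - pair_mi
```

    For two variables the expansion is exact, which makes the correction term's role transparent: positive pairwise correlations always lower the entropy below the first-order sum, the balancing effect discussed in the abstract.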

  8. Distance-Based Configurational Entropy of Proteins from Molecular Dynamics Simulations

    PubMed Central

    Fogolari, Federico; Corazza, Alessandra; Fortuna, Sara; Soler, Miguel Angel; VanSchouwen, Bryan; Brancolini, Giorgia; Corni, Stefano; Melacini, Giuseppe; Esposito, Gennaro

    2015-01-01

    Estimation of configurational entropy from molecular dynamics trajectories is a difficult task which is often performed using quasi-harmonic or histogram analysis. An entirely different approach, proposed recently, estimates the local density distribution around each conformational sample by measuring the distance from its nearest neighbors. In this work we show that this theoretically well-grounded method can be easily applied to estimate the entropy from conformational sampling. We consider a set of systems that are representative of important biomolecular processes. In particular: reference entropies for amino acids in unfolded proteins are obtained from a database of residues not participating in secondary structure elements; the conformational entropy of folding of β2-microglobulin is computed from molecular dynamics simulations using reference entropies for the unfolded state; backbone conformational entropy is computed from molecular dynamics simulations of four different states of the EPAC protein and compared with order parameters (often used as a measure of entropy); the conformational and rototranslational entropy of binding is computed from simulations of 20 tripeptides bound to the peptide-binding protein OppA and of β2-microglobulin bound to a citrate-coated gold surface. This work shows the potential of the method in the most representative biological processes involving proteins, and provides a valuable alternative, principally in the cases shown, where other approaches are problematic. PMID:26177039

  9. Distance-Based Configurational Entropy of Proteins from Molecular Dynamics Simulations.

    PubMed

    Fogolari, Federico; Corazza, Alessandra; Fortuna, Sara; Soler, Miguel Angel; VanSchouwen, Bryan; Brancolini, Giorgia; Corni, Stefano; Melacini, Giuseppe; Esposito, Gennaro

    2015-01-01

    Estimation of configurational entropy from molecular dynamics trajectories is a difficult task which is often performed using quasi-harmonic or histogram analysis. An entirely different approach, proposed recently, estimates the local density distribution around each conformational sample by measuring the distance from its nearest neighbors. In this work we show that this theoretically well-grounded method can be easily applied to estimate the entropy from conformational sampling. We consider a set of systems that are representative of important biomolecular processes. In particular: reference entropies for amino acids in unfolded proteins are obtained from a database of residues not participating in secondary structure elements; the conformational entropy of folding of β2-microglobulin is computed from molecular dynamics simulations using reference entropies for the unfolded state; backbone conformational entropy is computed from molecular dynamics simulations of four different states of the EPAC protein and compared with order parameters (often used as a measure of entropy); the conformational and rototranslational entropy of binding is computed from simulations of 20 tripeptides bound to the peptide binding protein OppA and of β2-microglobulin bound to a citrate-coated gold surface. This work shows the potential of the method in the most representative biological processes involving proteins, and provides a valuable alternative, particularly in cases such as those shown, where other approaches are problematic.
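
    The nearest-neighbor idea in the abstract above can be sketched in a few lines. The following is a minimal, illustrative implementation of the k = 1 Kozachenko-Leonenko estimator in one dimension; the papers work in high-dimensional conformational spaces, so the 1-D restriction is an assumption made here purely for brevity:

```python
import math
import random

def digamma_int(n):
    # psi(n) for a positive integer n: -gamma + sum_{k=1}^{n-1} 1/k
    return -0.5772156649015329 + sum(1.0 / k for k in range(1, n))

def knn_entropy_1d(samples):
    """Kozachenko-Leonenko nearest-neighbour (k = 1) estimate of the
    differential entropy of a 1-D sample, in nats."""
    x = sorted(samples)
    n = len(x)
    total = 0.0
    for i in range(n):
        gaps = []
        if i > 0:
            gaps.append(x[i] - x[i - 1])
        if i < n - 1:
            gaps.append(x[i + 1] - x[i])
        r = max(min(gaps), 1e-12)       # nearest-neighbour distance
        total += math.log(2.0 * r)      # 2r = volume of the 1-D ball of radius r
    return digamma_int(n) - digamma_int(1) + total / n

random.seed(0)
uniform_sample = [random.random() for _ in range(2000)]
estimate = knn_entropy_1d(uniform_sample)   # true entropy of U(0, 1) is 0 nats
```

    For 2000 samples drawn from U(0, 1), whose true differential entropy is 0 nats, the estimate lands close to zero, which is the sanity check one would run before applying such an estimator to conformational samples.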

  10. Financial time series analysis based on effective phase transfer entropy

    NASA Astrophysics Data System (ADS)

    Yang, Pengbo; Shang, Pengjian; Lin, Aijing

    2017-02-01

    Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.
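
    Effective phase transfer entropy builds on ordinary transfer entropy. The generic plug-in estimator for symbolized series with history length 1 can be sketched as follows; this is the standard quantity, not the authors' phase-based variant:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE_{X->Y} in bits for symbolic series,
    with history length 1: the conditional mutual information
    I(y_{t+1}; x_t | y_t)."""
    n = len(x) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                       # y_t
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_full = c / pairs_yx[(y0, x0)]             # p(y_{t+1} | y_t, x_t)
        p_self = pairs_yy[(y1, y0)] / singles[y0]   # p(y_{t+1} | y_t)
        te += (c / n) * math.log2(p_full / p_self)
    return te

random.seed(1)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]                   # y copies x with a one-step lag
te_xy = transfer_entropy(x, y)     # close to 1 bit: x drives y
te_yx = transfer_entropy(y, x)     # close to 0 bits: no influence back
```

    With y lagging x by one step, TE from x to y approaches 1 bit while TE from y to x stays near zero; this asymmetry is exactly what transfer entropy is designed to detect.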

  11. Image Analysis Using Quantum Entropy Scale Space and Diffusion Concepts

    DTIC Science & Technology

    2009-11-01

    images using a combination of analytic methods and prototype Matlab and Mathematica programs. We investigated concepts of generalized entropy and...Schmidt strength from quantum logic gate decomposition. This form of entropy gives a measure of the nonlocal content of an entangling logic gate...11 We recall that the Schmidt number is an indicator of entanglement , but not a measure of entanglement . For instance, let us compare

  12. NOTE: Entropy-based automated classification of independent components separated from fMCG

    NASA Astrophysics Data System (ADS)

    Comani, S.; Srinivasan, V.; Alleva, G.; Romani, G. L.

    2007-03-01

    Fetal magnetocardiography (fMCG) is a noninvasive technique suitable for the prenatal diagnosis of the fetal heart function. Reliable fetal cardiac signals can be reconstructed from multi-channel fMCG recordings by means of independent component analysis (ICA). However, the identification of the separated components is usually accomplished by visual inspection. This paper discusses a novel automated system based on entropy estimators, namely approximate entropy (ApEn) and sample entropy (SampEn), for the classification of independent components (ICs). The system was validated on 40 fMCG datasets of normal fetuses with the gestational age ranging from 22 to 37 weeks. Both ApEn and SampEn were able to measure the stability and predictability of the physiological signals separated with ICA, and the entropy values of the three categories were significantly different at p < 0.01. The system performances were compared with those of a method based on the analysis of the time and frequency content of the components. The outcomes of this study showed a superior performance of the entropy-based system, in particular for early gestation, with an overall ICs detection rate of 98.75% and 97.92% for ApEn and SampEn respectively, as against a value of 94.50% obtained with the time-frequency-based system.
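
    Sample entropy, used above to classify the separated components, can be sketched directly from its definition. This is a brute-force O(n²) version; the tolerance r is taken here in absolute data units, whereas in practice it is usually set to roughly 0.2 times the signal's standard deviation:

```python
import math
import random

def sample_entropy(series, m=2, r=0.2):
    """SampEn(m, r) = -ln(A / B), where B counts pairs of length-m templates
    within Chebyshev distance r (self-matches excluded) and A counts the
    same for templates of length m + 1."""
    n = len(series)
    def matches(length):
        count = 0
        for i in range(n - length):
            for j in range(i + 1, n - length):
                if max(abs(series[i + k] - series[j + k]) for k in range(length)) <= r:
                    count += 1
        return count
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float('inf')

random.seed(2)
se_regular = sample_entropy([i % 2 for i in range(200)])          # predictable
se_noisy = sample_entropy([random.random() for _ in range(200)])  # irregular
```

    A perfectly alternating sequence yields SampEn near zero, while uniform noise yields a much larger value, the same stability/predictability contrast the classifier above exploits.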

  13. Dispersion entropy for the analysis of resting-state MEG regularity in Alzheimer's disease.

    PubMed

    Azami, Hamed; Rostaghi, Mostafa; Fernandez, Alberto; Escudero, Javier

    2016-08-01

    Alzheimer's disease (AD) is a progressive degenerative brain disorder affecting memory, thinking, behaviour and emotion. It is the most common form of dementia and a major social problem in western societies. The analysis of brain activity may help to diagnose this disease. Changes in entropy have been reported to be useful for characterizing AD in research studies. We have recently proposed dispersion entropy (DisEn) as a very fast and powerful tool to quantify the irregularity of time series. The aim of this paper is to evaluate the ability of DisEn, in comparison with fuzzy entropy (FuzEn), sample entropy (SampEn), and permutation entropy (PerEn), to discriminate 36 AD patients from 26 elderly control subjects using resting-state magnetoencephalogram (MEG) signals. The results obtained by DisEn, FuzEn, and SampEn, unlike PerEn, show that the AD patients' signals are more regular than the controls' time series. The p-values obtained by the DisEn, FuzEn, SampEn, and PerEn based methods demonstrate the superiority of DisEn over FuzEn, SampEn, and PerEn. Moreover, the computation time for the newly proposed DisEn-based method is noticeably less than for the FuzEn, SampEn, and PerEn based approaches.
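
    Dispersion entropy can be sketched from its published recipe: map the samples to c classes through the normal CDF, count embedded dispersion patterns of length m, and take the Shannon entropy of the pattern distribution. A minimal version with illustrative parameter choices:

```python
import math
import random
from collections import Counter

def dispersion_entropy(series, m=2, c=3):
    """Dispersion entropy: map samples to classes 1..c via the normal CDF,
    count embedded dispersion patterns of length m, and return the Shannon
    entropy of the pattern distribution, normalised by ln(c**m)."""
    n = len(series)
    mu = sum(series) / n
    sd = (sum((v - mu) ** 2 for v in series) / n) ** 0.5 or 1.0
    # normal-CDF mapping to (0, 1), then linear mapping into classes 1..c
    probs = [0.5 * (1.0 + math.erf((v - mu) / (sd * math.sqrt(2)))) for v in series]
    z = [min(c, max(1, round(c * p + 0.5))) for p in probs]
    patterns = Counter(tuple(z[i:i + m]) for i in range(n - m + 1))
    total = sum(patterns.values())
    h = -sum((k / total) * math.log(k / total) for k in patterns.values())
    return h / math.log(c ** m)

random.seed(3)
de_regular = dispersion_entropy([0, 1] * 100)   # alternating: few patterns
de_random = dispersion_entropy([random.gauss(0, 1) for _ in range(1000)])
```

    An alternating series occupies only two of the c**m possible patterns and so scores low, while Gaussian noise spreads nearly uniformly over all patterns and scores close to 1, which is the regularity contrast exploited in the MEG study above.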

  14. Uncertainties in Forecasting Streamflow using Entropy Theory

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential for river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, uncertainties always accompany a forecast, and they may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide a reliable streamflow forecast. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Therefore, streamflow forecasting entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and measure the uncertainties arising during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory for streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy. With information theory, we describe how these uncertainties are transported and aggregated during these processes.

  15. Parameterization Interactions in Global Aquaplanet Simulations

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Ritthik; Bordoni, Simona; Suselj, Kay; Teixeira, João.

    2018-02-01

    Global climate simulations rely on parameterizations of physical processes that have scales smaller than the resolved ones. In the atmosphere, these parameterizations represent moist convection, boundary layer turbulence and convection, cloud microphysics, longwave and shortwave radiation, and the interaction with the land and ocean surface. These parameterizations can generate different climates involving a wide range of interactions among parameterizations and between the parameterizations and the resolved dynamics. To gain a simplified understanding of a subset of these interactions, we perform aquaplanet simulations with the global version of the Weather Research and Forecasting (WRF) model employing a range of moist convection and boundary layer (BL) parameterizations with different properties. Significant differences are noted in the simulated precipitation amounts, their partitioning between convective and large-scale precipitation, as well as in the radiative impacts. These differences arise from the way the subcloud physics interacts with convection, both directly and through various pathways involving the large-scale dynamics, the BL, convection, and clouds. A detailed analysis of the profiles of the different tendencies (from the different physical processes) for both potential temperature and water vapor is performed. While different combinations of convection and BL parameterizations can lead to different climates, a key conclusion of this study is that similar climates can be simulated with model versions that differ in the partitioning of the tendencies: the vertically distributed energy and water balances in the tropics can be obtained with significantly different profiles of large-scale, convection, and cloud microphysics tendencies.

  16. Polarimetric Decomposition Analysis of the Deepwater Horizon Oil Slick Using L-Band UAVSAR Data

    NASA Technical Reports Server (NTRS)

    Jones, Cathleen; Minchew, Brent; Holt, Benjamin

    2011-01-01

    We report here an analysis of the polarization dependence of L-band radar backscatter from the main slick of the Deepwater Horizon oil spill, with specific attention to the utility of polarimetric decomposition analysis for discrimination of oil from clean water and identification of variations in the oil characteristics. For this study we used data collected with the UAVSAR instrument from opposing look directions directly over the main oil slick. We find that both the Cloude-Pottier and Shannon entropy polarimetric decomposition methods offer promise for oil discrimination, with the Shannon entropy method yielding the same information as contained in the Cloude-Pottier entropy and averaged intensity parameters, but with significantly less computational complexity.
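
    The Cloude-Pottier entropy mentioned above is computed from the eigenvalues of the polarimetric coherency matrix; given those eigenvalues, the entropy itself is one line (the eigendecomposition step is omitted here, and the eigenvalue triples below are illustrative, not measured values):

```python
import math

def polarimetric_entropy(eigenvalues):
    """Cloude-Pottier scattering entropy H = -sum_i p_i * log3(p_i), where
    p_i are the normalised eigenvalues of the 3x3 coherency matrix."""
    total = sum(eigenvalues)
    return -sum((lam / total) * math.log(lam / total, 3)
                for lam in eigenvalues if lam > 0)

h_water = polarimetric_entropy([9.0, 0.5, 0.5])  # one dominant mechanism: low H
h_oil = polarimetric_entropy([1.0, 1.0, 1.0])    # fully depolarised: H = 1
```

    A single dominant scattering mechanism, as in Bragg scattering over clean water, gives low entropy; an equal eigenvalue spectrum (fully depolarized return) gives H = 1, which is why entropy separates oil-covered from clean surfaces.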

  17. Separation of Intercepted Multi-Radar Signals Based on Parameterized Time-Frequency Analysis

    NASA Astrophysics Data System (ADS)

    Lu, W. L.; Xie, J. W.; Wang, H. M.; Sheng, C.

    2016-09-01

    Modern radars use complex waveforms to obtain high detection performance and low probabilities of interception and identification. Signals intercepted from multiple radars overlap considerably in both the time and frequency domains and are difficult to separate with primary time parameters. Time-frequency analysis (TFA), as a key signal-processing tool, can provide better insight into the signal than conventional methods. In particular, among the various types of TFA, parameterized time-frequency analysis (PTFA) has shown great potential to investigate the time-frequency features of such non-stationary signals. In this paper, we propose a procedure for PTFA to separate overlapped radar signals; it includes five steps: initiation, parameterized time-frequency analysis, demodulating the signal of interest, adaptive filtering and recovering the signal. The effectiveness of the method was verified with simulated data and an intercepted radar signal received in a microwave laboratory. The results show that the proposed method has good performance and has potential in electronic reconnaissance applications, such as electronic intelligence, electronic warfare support measures, and radar warning.

  18. Modified cross sample entropy and surrogate data analysis method for financial time series

    NASA Astrophysics Data System (ADS)

    Yin, Yi; Shang, Pengjian

    2015-09-01

    For researching multiscale behaviors from the angle of entropy, we propose a modified cross sample entropy (MCSE) and combine it with surrogate data analysis in order to compute entropy differences between the original dynamics and surrogate series (MCSDiff). MCSDiff is applied to simulated signals to show its accuracy and is then employed for the US and Chinese stock markets. We illustrate the presence of multiscale behavior in the MCSDiff results and reveal that there is synchrony contained in the original financial time series and that they have some intrinsic relations, which are destroyed by surrogate data analysis. Furthermore, the multifractal behaviors of cross-correlations between these financial time series are investigated by the multifractal detrended cross-correlation analysis (MF-DCCA) method, since multifractal analysis is a multiscale analysis. We explore the multifractal properties of the cross-correlations between these US and Chinese markets and show the distinctiveness of NQCI and HSI among the markets in their own regions. It can be concluded that the weaker cross-correlation between the US markets provides evidence for a better inner mechanism in the US stock markets than in the Chinese stock markets. Studying the multiscale features and properties of financial time series can provide valuable information for understanding the inner mechanism of financial markets.

  19. Comparing Postural Stability Entropy Analyses to Differentiate Fallers and Non-Fallers

    PubMed Central

    Fino, Peter C.; Mojdehi, Ahmad R.; Adjerid, Khaled; Habibi, Mohammad; Lockhart, Thurmon E.; Ross, Shane D.

    2015-01-01

    The health and financial cost of falls has spurred research to differentiate the characteristics of fallers and non-fallers. Postural stability has received much of the attention with recent studies exploring various measures of entropy. This study compared the discriminatory ability of several entropy methods at differentiating two paradigms in the center-of-pressure (COP) of elderly individuals: 1.) eyes open (EO) versus eyes closed (EC) and 2.) fallers (F) versus non-fallers (NF). Methods were compared using the area under the curve (AUC) of the receiver-operating characteristic (ROC) curves developed from logistic regression models. Overall, multiscale entropy (MSE) and composite multiscale entropy (CompMSE) performed the best with AUCs of 0.71 for EO/EC and 0.77 for F/NF. When methods were combined together to maximize the AUC, the entropy classifier had an AUC of 0.91 for the F/NF comparison. These results suggest researchers and clinicians attempting to create clinical tests to identify fallers should consider a combination of every entropy method when creating a classifying test. Additionally, MSE and CompMSE classifiers using polar coordinate data outperformed rectangular coordinate data, encouraging more research into the most appropriate time series for postural stability entropy analysis. PMID:26464267

  20. Ecosystem functioning and maximum entropy production: a quantitative test of hypotheses.

    PubMed

    Meysman, Filip J R; Bruers, Stijn

    2010-05-12

    The idea that entropy production puts a constraint on ecosystem functioning is quite popular in ecological thermodynamics. Yet, until now, such claims have received little quantitative verification. Here, we examine three 'entropy production' hypotheses that have been put forward in the past. The first states that increased entropy production serves as a fingerprint of living systems. The other two hypotheses invoke stronger constraints. The state selection hypothesis states that when a system can attain multiple steady states, the stable state will show the highest entropy production rate. The gradient response principle requires that when the thermodynamic gradient increases, the system's new stable state should always be accompanied by a higher entropy production rate. We test these three hypotheses by applying them to a set of conventional food web models. Each time, we calculate the entropy production rate associated with the stable state of the ecosystem. This analysis shows that the first hypothesis holds for all the food webs tested: the living state always shows increased entropy production over the abiotic state. In contrast, the state selection and gradient response hypotheses break down when the food web incorporates more than one trophic level, indicating that they are not generally valid.

  1. Comparing Postural Stability Entropy Analyses to Differentiate Fallers and Non-fallers.

    PubMed

    Fino, Peter C; Mojdehi, Ahmad R; Adjerid, Khaled; Habibi, Mohammad; Lockhart, Thurmon E; Ross, Shane D

    2016-05-01

    The health and financial cost of falls has spurred research to differentiate the characteristics of fallers and non-fallers. Postural stability has received much of the attention with recent studies exploring various measures of entropy. This study compared the discriminatory ability of several entropy methods at differentiating two paradigms in the center-of-pressure of elderly individuals: (1) eyes open (EO) vs. eyes closed (EC) and (2) fallers (F) vs. non-fallers (NF). Methods were compared using the area under the curve (AUC) of the receiver-operating characteristic curves developed from logistic regression models. Overall, multiscale entropy (MSE) and composite multiscale entropy (CompMSE) performed the best with AUCs of 0.71 for EO/EC and 0.77 for F/NF. When methods were combined together to maximize the AUC, the entropy classifier had an AUC of 0.91 for the F/NF comparison. These results suggest researchers and clinicians attempting to create clinical tests to identify fallers should consider a combination of every entropy method when creating a classifying test. Additionally, MSE and CompMSE classifiers using polar coordinate data outperformed rectangular coordinate data, encouraging more research into the most appropriate time series for postural stability entropy analysis.
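
    The multiscale and composite multiscale entropies compared above both start from coarse-graining the signal; SampEn is then computed on each coarse-grained series, and CompMSE averages the result over the possible offsets at each scale. A sketch of the coarse-graining step:

```python
def coarse_grain(series, scale, offset=0):
    """One step of multiscale entropy: replace the series by non-overlapping
    means of `scale` consecutive points, starting at `offset`. Composite MSE
    averages an entropy measure over all offsets 0..scale-1."""
    s = series[offset:]
    n = len(s) // scale
    return [sum(s[i * scale:(i + 1) * scale]) / scale for i in range(n)]

grains = coarse_grain([1, 3, 5, 7, 9, 11], 2)       # means of (1,3), (5,7), (9,11)
shifted = coarse_grain([1, 3, 5, 7, 9, 11], 2, 1)   # means of (3,5), (7,9)
```

    Averaging over offsets, as CompMSE does, reduces the variance of the entropy estimate on short recordings, which is one reason it performed well in the comparison above.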

  2. Driver fatigue detection through multiple entropy fusion analysis in an EEG-based system

    PubMed Central

    Min, Jianliang; Wang, Ping

    2017-01-01

    Driver fatigue is an important contributor to road accidents, and fatigue detection has major implications for transportation safety. The aim of this research is to analyze the multiple entropy fusion method and evaluate several channel regions to effectively detect a driver's fatigue state based on electroencephalogram (EEG) records. First, we fused multiple entropies, i.e., spectral entropy, approximate entropy, sample entropy and fuzzy entropy, as features compared with autoregressive (AR) modeling by four classifiers. Second, we captured four significant channel regions according to weight-based electrodes via a simplified channel selection method. Finally, the evaluation model for detecting driver fatigue was established with four classifiers based on the EEG data from four channel regions. Twelve healthy subjects performed continuous simulated driving for 1–2 hours with EEG monitoring on a static simulator. The leave-one-out cross-validation approach obtained an accuracy of 98.3%, a sensitivity of 98.3% and a specificity of 98.2%. The experimental results verified the effectiveness of the proposed method, indicating that the multiple entropy fusion features are significant factors for inferring the fatigue state of a driver. PMID:29220351

  3. Optical display for radar sensing

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Hsu, Charles; Willey, Jefferson; Landa, Joseph; Hsieh, Minder; Larsen, Louis V.; Krzywicki, Alan T.; Tran, Binh Q.; Hoekstra, Philip; Dillard, John T.; Krapels, Keith A.; Wardlaw, Michael; Chu, Kai-Dee

    2015-05-01

    Boltzmann's headstone formula S = kB log W turns out to be the Rosetta stone for translating the hieroglyphics of microwave sensing into a Greek-physics optical display. The LHS is the molecular entropy S, measuring the degree of uniformity of scattering off the sensing cross sections. The RHS is the inverse relationship (equation) predicting the Planck radiation spectral distribution parameterized by the Kelvin temperature T. Use is made of the energy conservation law: the heat capacity change of the reservoir (RV), T ΔS = -ΔE, equals the internal energy change of the black box (bb) subsystem. Moreover, irreversible thermodynamics requires ΔS > 0 for collisional mixing toward the total uniformity of heat death, as asserted by Boltzmann, from which the so-called Maxwell-Boltzmann canonical probability is derived. Given the zero-boundary-condition black box, Planck solved for discrete standing-wave eigenstates (equation). Together with the canonical partition function (equation), an ensemble average over all possible internal energies yielded the celebrated Planck radiation spectrum (equation), where the density of states is (equation). In summary, given the multispectral sensing data (equation), we applied the Lagrange Constraint Neural Network (LCNN) to solve the Blind Sources Separation (BSS) problem for a set of equivalent bb target temperatures. From measurements of specific values, slopes, and shapes we can fit a set of Kelvin temperatures T for each bb target. As a result, we can apply analytical continuation for each entropy source along the temperature-unique Planck spectral curves toward an RGB color-temperature display for any sensing probing frequency.

  4. Idealized modeling of convective organization with changing sea surface temperatures using multiple equilibria in weak temperature gradient simulations

    NASA Astrophysics Data System (ADS)

    Sentić, Stipo; Sessions, Sharon L.

    2017-06-01

    The weak temperature gradient (WTG) approximation is a method of parameterizing the influences of the large scale on local convection in limited domain simulations. WTG simulations exhibit multiple equilibria in precipitation; depending on the initial moisture content, simulations can precipitate or remain dry for otherwise identical boundary conditions. We use a hypothesized analogy between multiple equilibria in precipitation in WTG simulations, and dry and moist regions of organized convection to study tropical convective organization. We find that the range of wind speeds that support multiple equilibria depends on sea surface temperature (SST). Compared to the present SST, low SSTs support a narrower range of multiple equilibria at higher wind speeds. In contrast, high SSTs exhibit a narrower range of multiple equilibria at low wind speeds. This suggests that at high SSTs, organized convection might occur with lower surface forcing. To characterize convection at different SSTs, we analyze the change in relationships between precipitation rate, atmospheric stability, moisture content, and the large-scale transport of moist entropy and moisture with increasing SSTs. We find an increase in large-scale export of moisture and moist entropy from dry simulations with increasing SST, which is consistent with a strengthening of the up-gradient transport of moisture from dry regions to moist regions in organized convection. Furthermore, the changes in diagnostic relationships with SST are consistent with more intense convection in precipitating regions of organized convection for higher SSTs.

  5. Entropy Generation Analysis in Convective Ferromagnetic Nano Blood Flow Through a Composite Stenosed Arteries with Permeable Wall

    NASA Astrophysics Data System (ADS)

    Sher Akbar, Noreen; Wahid Butt, Adil

    2017-05-01

    The study of heat transfer is of significant importance in many biological and biomedical industry problems. This investigation comprises the study of entropy generation analysis of blood flow in arteries with permeable walls. The convection through the flow is studied together with the entropy generation. The governing problem is formulated and solved under the low Reynolds number and long wavelength approximations. Exact analytical solutions have been obtained and are analyzed graphically. It is seen that the temperature for pure water is lower than that for copper water. It gains magnitude with an increase in the slip parameter.

  6. Entropy generation in a mixed convection Poiseulle flow of molybdenum disulphide Jeffrey nanofluid

    NASA Astrophysics Data System (ADS)

    Gul, Aaiza; Khan, Ilyas; Makhanov, Stanislav S.

    2018-06-01

    Entropy analysis in a mixed convection Poiseuille flow of a molybdenum disulphide Jeffrey nanofluid (MDJN) is presented. Mixed convection is caused by the buoyancy force and an external pressure gradient. The problem is formulated in terms of a boundary value problem for a system of partial differential equations. An analytical solution for the velocity and the temperature is obtained using the perturbation technique. Entropy generation has been derived as a function of the velocity and temperature gradients. The solutions are displayed graphically and the relative importance of the input parameters is discussed. A Jeffrey nanofluid (JN) has been compared with a second grade nanofluid (SGN) and a Newtonian nanofluid (NN). It is found that the entropy generation decreases when the temperature increases, whereas increasing the Brinkman number increases entropy generation.

  7. All the entropies on the light-cone

    NASA Astrophysics Data System (ADS)

    Casini, Horacio; Testé, Eduardo; Torroba, Gonzalo

    2018-05-01

    We determine the explicit universal form of the entanglement and Renyi entropies, for regions with arbitrary boundary on a null plane or the light-cone. All the entropies are shown to saturate the strong subadditive inequality. This Renyi Markov property implies that the vacuum behaves like a product state. For the null plane, our analysis applies to general quantum field theories, and we show that the entropies do not depend on the region. For the light-cone, our approach is restricted to conformal field theories. In this case, the construction of the entropies is related to dilaton effective actions in two less dimensions. In particular, the universal logarithmic term in the entanglement entropy arises from a Wess-Zumino anomaly action. We also consider these properties in theories with holographic duals, for which we construct the minimal area surfaces for arbitrary shapes on the light-cone. We recover the Markov property and the universal form of the entropy, and argue that these properties continue to hold upon including stringy and quantum corrections. We end with some remarks on the recently proved entropic a-theorem in four spacetime dimensions.

  8. Noise and complexity in human postural control: interpreting the different estimations of entropy.

    PubMed

    Rhea, Christopher K; Silver, Tobin A; Hong, S Lee; Ryu, Joong Hyun; Studenka, Breanna E; Hughes, Charmayne M L; Haddad, Jeffrey M

    2011-03-17

    Over the last two decades, various measures of entropy have been used to examine the complexity of human postural control. In general, entropy measures provide information regarding the health, stability and adaptability of the postural system that is not captured when using more traditional analytical techniques. The purpose of this study was to examine how noise, sampling frequency and time series length influence various measures of entropy when applied to human center of pressure (CoP) data, as well as to synthetic signals with known properties. Such a comparison is necessary to interpret data between and within studies that use different entropy measures, equipment, sampling frequencies or data collection durations. The complexity of synthetic signals with known properties and standing CoP data was calculated using Approximate Entropy (ApEn), Sample Entropy (SampEn) and Recurrence Quantification Analysis Entropy (RQAEn). All signals were examined at varying sampling frequencies and with varying amounts of added noise. Additionally, an increment time series of the original CoP data was examined to remove long-range correlations. Of the three measures examined, ApEn was the least robust to sampling frequency and noise manipulations. Additionally, increased noise led to an increase in SampEn, but a decrease in RQAEn. Thus, noise can yield inconsistent results between the various entropy measures. Finally, the differences between the entropy measures were minimized in the increment CoP data, suggesting that long-range correlations should be removed from CoP data prior to calculating entropy. The various algorithms typically used to quantify the complexity (entropy) of CoP may yield very different results, particularly when sampling frequency and noise are different. The results of this study are discussed within the context of the neural noise and loss of complexity hypotheses.

  9. Parameterizing the Transport Pathways for Cell Invasion in Complex Scaffold Architectures

    PubMed Central

    Ashworth, Jennifer C.; Mehr, Marco; Buxton, Paul G.; Best, Serena M.

    2016-01-01

    Interconnecting pathways through porous tissue engineering scaffolds play a vital role in determining nutrient supply, cell invasion, and tissue ingrowth. However, the global use of the term “interconnectivity” often fails to describe the transport characteristics of these pathways, giving no clear indication of their potential to support tissue synthesis. This article uses new experimental data to provide a critical analysis of reported methods for the description of scaffold transport pathways, ranging from qualitative image analysis to thorough structural parameterization using X-ray Micro-Computed Tomography. In the collagen scaffolds tested in this study, it was found that the proportion of pore space perceived to be accessible dramatically changed depending on the chosen method of analysis. Measurements of % interconnectivity as defined in this manner varied as a function of direction and connection size, and also showed a dependence on measurement length scale. As an alternative, a method for transport pathway parameterization was investigated, using percolation theory to calculate the diameter of the largest sphere that can travel to infinite distance through a scaffold in a specified direction. As proof of principle, this approach was used to investigate the invasion behavior of primary fibroblasts in response to independent changes in pore wall alignment and pore space accessibility, parameterized using the percolation diameter. The result was that both properties played a distinct role in determining fibroblast invasion efficiency. This example therefore demonstrates the potential of the percolation diameter as a method of transport pathway parameterization, to provide key structural criteria for application-based scaffold design. PMID:26888449
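
    The percolation diameter described above rests on a connectivity test: does the pore space still connect one face of the scaffold to the other after erosion by a sphere of a given diameter? A minimal 2-D sketch of the connectivity test on a binary grid (the paper works on 3-D Micro-CT volumes, and the erosion step is omitted here):

```python
from collections import deque

def percolates(grid):
    """True if the open cells (1s) of a 2-D binary grid connect the top
    row to the bottom row through 4-connected neighbours (BFS flood fill)."""
    rows, cols = len(grid), len(grid[0])
    seen = {(0, c) for c in range(cols) if grid[0][c]}
    queue = deque(seen)
    while queue:
        r, c = queue.popleft()
        if r == rows - 1:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False

open_result = percolates([[1, 0, 0],
                          [1, 0, 0],
                          [1, 1, 0]])     # a continuous channel: percolates
blocked_result = percolates([[1, 0, 1],
                             [0, 0, 0],
                             [1, 0, 1]])  # no path from top to bottom
```

    In the full method, the pore space is eroded by structuring elements of increasing size and this test is repeated; the percolation diameter is the largest size for which the eroded space still percolates in the chosen direction.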

  10. Spatial analysis of cities using Renyi entropy and fractal parameters

    NASA Astrophysics Data System (ADS)

    Chen, Yanguang; Feng, Jian

    2017-12-01

    The spatial distributions of cities fall into two groups: simple distributions with a characteristic scale (e.g. the exponential distribution), and complex distributions without a characteristic scale (e.g. power-law distributions). The latter are scale-free distributions, which can be modeled with fractal geometry; fractal dimension, however, is not suitable for the former. In contrast, spatial entropy can be used to measure any type of urban distribution. This paper is devoted to generalizing multifractal parameters by means of the dual relation between Euclidean and fractal geometries. The main methods are mathematical derivation and empirical analysis, and the theoretical foundation is the finding that the normalized fractal dimension is equal to the normalized entropy. Based on this finding, a set of useful spatial indexes termed dummy multifractal parameters are defined for geographical analysis. These indexes can describe both simple and complex distributions. The dummy multifractal indexes are applied to the population density distribution of Hangzhou, China, and the calculation results reveal the spatio-temporal evolution of Hangzhou's urban morphology. This study indicates that fractal dimension and spatial entropy can be combined to produce a new methodology for spatial analysis of city development.
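
The normalized entropy the paper builds on can be illustrated numerically. Below is a minimal sketch (not the authors' code) of the Rényi entropy of order q for a discrete spatial distribution; dividing by the entropy of a uniform distribution over the same zones gives a normalized index in [0, 1]. The function name and the zone-share example are illustrative assumptions.

```python
import numpy as np

def renyi_entropy(p, q):
    """Renyi entropy of order q (nats) of a discrete distribution p.
    q -> 1 recovers Shannon entropy; q = 0 gives log of the support size."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    p = p / p.sum()
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))           # Shannon limit
    return np.log(np.sum(p ** q)) / (1.0 - q)

# Illustrative use: population shares over N spatial zones. A uniform
# distribution maximizes entropy at log(N), so H_q / log(N) is a
# normalized entropy in [0, 1] for any order q.
N = 64
uniform = np.full(N, 1.0 / N)
concentrated = np.array([0.9] + [0.1 / (N - 1)] * (N - 1))
for q in (0.0, 1.0, 2.0):
    assert renyi_entropy(uniform, q) >= renyi_entropy(concentrated, q)
```

Varying q then traces out a spectrum of generalized entropies, the analogue of the multifractal dimension spectrum the paper generalizes.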

  11. Entropy method of measuring and evaluating periodicity of quasi-periodic trajectories

    NASA Astrophysics Data System (ADS)

    Ni, Yanshuo; Turitsyn, Konstantin; Baoyin, Hexi; Li, Junfeng

    2018-06-01

    This paper presents a method for measuring the periodicity of quasi-periodic trajectories by applying the discrete Fourier transform (DFT) to the trajectories and analyzing the frequency domain with the concept of entropy. Having introduced the concept of entropy, analytical derivation and numerical results indicate that entropies increase as a logarithmic function of time. Periodic trajectories typically have higher entropies, and trajectories with higher entropies exhibit stronger periodicity in their motions. Theoretical differences between two trajectories expressed as summations of trigonometric functions are also derived analytically. Trajectories in the Hénon-Heiles system and the circular restricted three-body problem (CRTBP) are analyzed with the entropy indicator and compared with the orthogonal fast Lyapunov indicator (OFLI). The results show that entropy is a better tool for discriminating periodicity in quasi-periodic trajectories than OFLI and can detect periodicity while excluding the spirals that OFLI misjudges as periodic cases. Finally, trajectories in the vicinity of 243 Ida and 6489 Golevka are considered as examples, and the numerical results verify these conclusions. Some trajectories near asteroids look irregular, but their higher entropy values as analyzed by this method serve as evidence of frequency regularity in three directions. Moreover, these results indicate that applying the DFT to trajectories in the vicinity of irregular small bodies and calculating their entropy in the frequency domain provides a useful quantitative method for evaluating the orderliness of quasi-periodic trajectories' periodicity within a given time interval.
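
The DFT-plus-entropy idea can be sketched with the conventional spectral entropy below. Note one loud caveat: the paper's indicator is constructed so that stronger periodicity yields *higher* values, whereas the textbook definition used here *decreases* as power concentrates in a few frequency bins; this sketch illustrates only the frequency-domain entropy mechanic, not the paper's exact normalization.

```python
import numpy as np

def spectral_entropy(x):
    """Shannon entropy (nats) of the normalized DFT power spectrum.
    Power concentrated in a few frequency bins (strong periodicity)
    gives low values; broadband, irregular motion gives high values."""
    power = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
    p = power / power.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

t = np.arange(1024)
tone = np.sin(2 * np.pi * 8 * t / 1024)                 # exactly periodic
noise = np.random.default_rng(0).standard_normal(1024)  # irregular
assert spectral_entropy(tone) < spectral_entropy(noise)
```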

  12. Local entropy difference upon a substrate binding of a psychrophilic α-amylase and a mesophilic homologue

    NASA Astrophysics Data System (ADS)

    Kosugi, Takahiro; Hayashi, Shigehiko

    2011-01-01

    Psychrophilic α-amylase from the Antarctic bacterium Pseudoalteromonas haloplanktis (AHA) and its mesophilic homologue, porcine pancreatic α-amylase (PPA), are theoretically investigated with molecular dynamics (MD) simulations. We carried out 240-ns MD simulations for four systems, AHA and PPA each with and without the bound substrate, and examined protein conformational entropy changes upon substrate binding. We developed an analysis that decomposes the entropy changes into contributions of individual amino acids and successfully identified protein regions responsible for the entropy changes. The results provide molecular insight into the structural flexibilities of these enzymes in relation to the temperature dependence of their enzymatic activity.

  13. Elastic full-waveform inversion and parameterization analysis applied to walk-away vertical seismic profile data for unconventional (heavy oil) reservoir characterization

    NASA Astrophysics Data System (ADS)

    Pan, Wenyong; Innanen, Kristopher A.; Geng, Yu

    2018-03-01

    Seismic full-waveform inversion (FWI) methods hold strong potential to recover multiple subsurface elastic properties for hydrocarbon reservoir characterization. Simultaneously updating multiple physical parameters introduces the problem of interparameter tradeoff, arising from the covariance between different physical parameters, which increases the nonlinearity and uncertainty of multiparameter FWI. The coupling effects of different physical parameters are significantly influenced by model parameterization and acquisition arrangement. An appropriate choice of model parameterization is critical to successful field data applications of multiparameter FWI. The objective of this paper is to examine the performance of various model parameterizations in isotropic-elastic FWI with a walk-away vertical seismic profile (W-VSP) dataset for unconventional heavy oil reservoir characterization. Six model parameterizations are considered: velocity-density (α, β and ρ′), modulus-density (κ, μ and ρ), Lamé-density (λ, μ′ and ρ‴), impedance-density (I_P, I_S and ρ″), velocity-impedance-I (α′, β′ and I′_P), and velocity-impedance-II (α″, β″ and I′_S). We begin analyzing the interparameter tradeoff by making use of scattering radiation patterns, a common strategy for qualitative parameter resolution analysis. We discuss the advantages and limitations of scattering radiation patterns and recommend that interparameter tradeoffs be evaluated using interparameter contamination kernels, which provide quantitative, second-order measurements of the interparameter contaminations and can be constructed efficiently with an adjoint-state approach. Synthetic W-VSP isotropic-elastic FWI experiments in the time domain verify our conclusions about interparameter tradeoffs for the various model parameterizations.
Density profiles are most strongly influenced by the interparameter contaminations; depending on model parameterization, the inverted density profile can be over-estimated, under-estimated or spatially distorted. Among the six cases, only the velocity-density parameterization provides stable and informative density features not included in the starting model. Field data applications of multicomponent W-VSP isotropic-elastic FWI in the time domain were also carried out. The heavy oil reservoir target zone, characterized by low α-to-β ratios and low Poisson's ratios, can be identified clearly with the inverted isotropic-elastic parameters.

  14. Elastic full-waveform inversion and parameterization analysis applied to walk-away vertical seismic profile data for unconventional (heavy oil) reservoir characterization

    DOE PAGES

    Pan, Wenyong; Innanen, Kristopher A.; Geng, Yu

    2018-03-06

    Seismic full-waveform inversion (FWI) methods hold strong potential to recover multiple subsurface elastic properties for hydrocarbon reservoir characterization. Simultaneously updating multiple physical parameters introduces the problem of interparameter tradeoff, arising from the covariance between different physical parameters, which increases the nonlinearity and uncertainty of multiparameter FWI. The coupling effects of different physical parameters are significantly influenced by model parameterization and acquisition arrangement. An appropriate choice of model parameterization is critical to successful field data applications of multiparameter FWI. The objective of this paper is to examine the performance of various model parameterizations in isotropic-elastic FWI with a walk-away vertical seismic profile (W-VSP) dataset for unconventional heavy oil reservoir characterization. Six model parameterizations are considered: velocity-density (α, β and ρ′), modulus-density (κ, μ and ρ), Lamé-density (λ, μ′ and ρ‴), impedance-density (I_P, I_S and ρ″), velocity-impedance-I (α′, β′ and I′_P), and velocity-impedance-II (α″, β″ and I′_S). We begin analyzing the interparameter tradeoff by making use of scattering radiation patterns, a common strategy for qualitative parameter resolution analysis. We discuss the advantages and limitations of scattering radiation patterns and recommend that interparameter tradeoffs be evaluated using interparameter contamination kernels, which provide quantitative, second-order measurements of the interparameter contaminations and can be constructed efficiently with an adjoint-state approach. Synthetic W-VSP isotropic-elastic FWI experiments in the time domain verify our conclusions about interparameter tradeoffs for the various model parameterizations.
    Density profiles are most strongly influenced by the interparameter contaminations; depending on model parameterization, the inverted density profile can be over-estimated, under-estimated, or spatially distorted. Among the six cases, only the velocity-density parameterization provides stable and informative density features not included in the starting model. Field data applications of multicomponent W-VSP isotropic-elastic FWI in the time domain were also carried out. Finally, the heavy oil reservoir target zone, characterized by low α-to-β ratios and low Poisson's ratios, can be identified clearly with the inverted isotropic-elastic parameters.

  16. Usage of Parameterized Fatigue Spectra and Physics-Based Systems Engineering Models for Wind Turbine Component Sizing: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsons, Taylor; Guo, Yi; Veers, Paul

    Software models that use design-level input variables and physics-based engineering analysis for estimating the mass and geometrical properties of components in large-scale machinery can be very useful for analyzing design trade-offs in complex systems. This study uses DriveSE, an OpenMDAO-based drivetrain model that uses stress and deflection criteria to size drivetrain components within a geared, upwind wind turbine. Because a full lifetime fatigue load spectrum can only be defined using computationally-expensive simulations in programs such as FAST, a parameterized fatigue loads spectrum that depends on wind conditions, rotor diameter, and turbine design life has been implemented. The parameterized fatigue spectrum is only used in this paper to demonstrate the proposed fatigue analysis approach. This paper details a three-part investigation of the parameterized approach and a comparison of the DriveSE model with and without fatigue analysis on the main shaft system. It compares loads from three turbines of varying size and determines if and when fatigue governs drivetrain sizing compared to extreme load-driven design. It also investigates the model's sensitivity to shaft material parameters. The intent of this paper is to demonstrate how fatigue considerations in addition to extreme loads can be brought into a systems engineering optimization.

  17. Entropy Generation and Human Aging: Lifespan Entropy and Effect of Physical Activity Level

    NASA Astrophysics Data System (ADS)

    Silva, Carlos; Annamalai, Kalyan

    2008-06-01

    The first and second laws of thermodynamics were applied to biochemical reactions typical of human metabolism. An open-system model was used for the human body. Energy conservation, availability, and entropy balances were performed to obtain the entropy generated for the main food components. Quantitative results for entropy generation were obtained as a function of age using the databases from the U.S. Food and Nutrition Board (FNB) and Centers for Disease Control and Prevention (CDC), which provide energy requirements and food intake composition as a function of age, weight, and stature. Numerical integration was performed over the human lifespan for different levels of physical activity. Entropy generated over the lifespan of average individuals (natural death) was found to be 11,404 kJ/K per kg of body mass, with a rate of generation three times higher in infants than in the elderly. The entropy generated predicts lifespans of 73.78 and 81.61 years for the average U.S. male and female respectively, values that closely match the average lifespans from statistics (74.63 and 80.36 years). From the analysis of the effect of different activity levels, it is shown that entropy generation increases with physical activity, suggesting that exercise should be kept to a “healthy minimum” if entropy generation is to be minimized.

  18. Analysis of cardiac signals using spatial filling index and time-frequency domain

    PubMed Central

    Faust, Oliver; Acharya U, Rajendra; Krishnan, SM; Min, Lim Choo

    2004-01-01

    Background: Analysis of heart rate variability (HRV) has become a popular noninvasive tool for assessing the activities of the autonomic nervous system (ANS). HRV analysis is based on the concept that fast fluctuations may specifically reflect changes of sympathetic and vagal activity. It shows that the structure generating the signal is not simply linear, but also involves nonlinear contributions. These signals are essentially non-stationary and may contain indicators of current disease, or even warnings about impending diseases. The indicators may be present at all times or may occur at random on the time scale. However, studying and pinpointing abnormalities in voluminous data collected over several hours is strenuous and time consuming. Methods: This paper presents the spatial filling index and time-frequency analysis of the heart rate variability signal for disease identification. Renyi's entropy is evaluated for the signal in the Wigner-Ville and continuous wavelet transform (CWT) domains. Results: Renyi's entropy gives a lower p-value for the scalogram than for the Wigner-Ville distribution, and the contours of the scalogram visually show the features of the diseases. Conclusion: The spatial filling index and Renyi's entropy have distinct regions for various diseases, with an accuracy of more than 95%. PMID:15361254
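
Rényi entropy of a time-frequency distribution can be sketched as follows; this uses scipy's spectrogram as a convenient stand-in for the Wigner-Ville distribution or CWT scalogram used in the paper, and the function name and parameters are illustrative assumptions.

```python
import numpy as np
from scipy.signal import spectrogram

def renyi_tf_entropy(x, fs, alpha=3.0):
    """Renyi entropy of order alpha of a normalized time-frequency
    distribution. Order 3 is a common choice in time-frequency
    analysis; lower values indicate energy concentrated in few
    time-frequency cells."""
    _, _, S = spectrogram(x, fs=fs, nperseg=128)
    p = S / S.sum()
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

fs = 500.0
t = np.arange(0, 4, 1 / fs)
tone = np.sin(2 * np.pi * 50 * t)                         # concentrated
noise = np.random.default_rng(1).standard_normal(t.size)  # spread out
assert renyi_tf_entropy(tone, fs) < renyi_tf_entropy(noise, fs)
```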

  19. Sensitivity of Pacific Cold Tongue and Double-ITCZ Bias to Convective Parameterization

    NASA Astrophysics Data System (ADS)

    Woelfle, M.; Bretherton, C. S.; Pritchard, M. S.; Yu, S.

    2016-12-01

    Many global climate models struggle to accurately simulate annual mean precipitation and sea surface temperature (SST) fields in the tropical Pacific basin. Precipitation biases are dominated by the double intertropical convergence zone (ITCZ) bias, where models exhibit precipitation maxima straddling the equator while only a single Northern Hemispheric maximum exists in observations. The major SST bias is the enhancement of the equatorial cold tongue. A series of coupled model simulations are used to investigate the sensitivity of the bias development to convective parameterization. Model components are initialized independently prior to coupling to allow analysis of the transient response of the system directly following coupling. These experiments show precipitation and SST patterns to be highly sensitive to convective parameterization. Simulations in which the deep convective parameterization is disabled, forcing all convection to be resolved by the shallow convection parameterization, showed a degradation in both the cold tongue and double-ITCZ biases as precipitation becomes focused into off-equatorial regions of local SST maxima. Simulations using superparameterization in place of traditional cloud parameterizations showed a reduced cold tongue bias at the expense of additional precipitation biases. The equatorial SST responses to changes in convective parameterization are driven by changes in near-equatorial zonal wind stress. The sensitivity of convection to SST is important in determining the precipitation and wind stress fields. However, differences in convective momentum transport also play a role. While no significant improvement is seen in these simulations of the double-ITCZ, the system's sensitivity to these changes reaffirms that improved convective parameterizations may provide an avenue for improving simulations of tropical Pacific precipitation and SST.

  20. Histogram analysis parameters of dynamic contrast-enhanced magnetic resonance imaging can predict histopathological findings including proliferation potential, cellularity, and nucleic areas in head and neck squamous cell carcinoma.

    PubMed

    Surov, Alexey; Meyer, Hans Jonas; Leifels, Leonard; Höhn, Anne-Kathrin; Richter, Cindy; Winter, Karsten

    2018-04-20

    Our purpose was to analyze possible associations between histogram analysis parameters of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) and histopathological findings such as proliferation index, cell count, and nucleic areas in head and neck squamous cell carcinoma (HNSCC). 30 patients (mean age 57.0 years) with primary HNSCC were included in the study. In every case, histogram analysis parameters of K trans, V e, and K ep were estimated using MATLAB-based software. Tumor proliferation index, cell count, and nucleic areas were estimated on Ki-67 antigen stained specimens. Spearman's non-parametric rank correlation coefficients were calculated between DCE and the different histopathological parameters. Ki-67 correlated with K trans min (p = -0.386, P = 0.043), K trans skewness (p = 0.382, P = 0.045), V e min (p = -0.473, P = 0.011), V e entropy (p = 0.424, P = 0.025), and K ep entropy (p = 0.464, P = 0.013). Cell count correlated with K trans kurtosis (p = 0.40, P = 0.034) and V e entropy (p = 0.475, P = 0.011). Total nucleic area correlated with V e max (p = 0.386, P = 0.042) and V e entropy (p = 0.411, P = 0.030). In G1/2 tumors, only K trans entropy correlated well with total (p = 0.78, P = 0.013) and average nucleic areas (p = 0.655, P = 0.006). In G3 tumors, Ki-67 correlated with V e min (p = -0.552, P = 0.022) and V e entropy (p = 0.524, P = 0.031). V e max correlated with total nucleic area (p = 0.483, P = 0.049). K ep max correlated with total nucleic area (p = -0.51, P = 0.037), and K ep entropy with Ki-67 (p = 0.567, P = 0.018). We concluded that the histogram-based parameters skewness, kurtosis, and entropy of K trans, V e, and K ep can be used as markers for proliferation activity, cellularity, and nucleic content in HNSCC. Tumor grading significantly influences the associations between perfusion and histopathological parameters.
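
The first-order histogram features involved (min, max, skewness, kurtosis, entropy) can be computed from any voxelwise parameter map. The sketch below is a generic illustration, not the study's MATLAB tool; the function and key names are assumptions.

```python
import numpy as np
from scipy import stats

def histogram_features(param_map, bins=32):
    """First-order histogram features of a perfusion parameter map
    (e.g. voxelwise K trans, V e, or K ep values inside a tumor ROI)."""
    v = np.asarray(param_map, dtype=float).ravel()
    hist, _ = np.histogram(v, bins=bins)
    p = hist[hist > 0] / v.size
    return {
        "min": v.min(),
        "max": v.max(),
        "skewness": stats.skew(v),
        "kurtosis": stats.kurtosis(v),      # excess kurtosis
        "entropy": -np.sum(p * np.log(p)),  # histogram Shannon entropy
    }

rng = np.random.default_rng(0)
feats = histogram_features(rng.lognormal(size=1000))  # right-skewed map
assert feats["skewness"] > 0 and feats["entropy"] > 0
```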

  1. Modeling Information Content Via Dirichlet-Multinomial Regression Analysis.

    PubMed

    Ferrari, Alberto

    2017-01-01

    Shannon entropy is being increasingly used in biomedical research as an index of complexity and information content in sequences of symbols, e.g. languages, amino acid sequences, DNA methylation patterns, and animal vocalizations. Yet the distributional properties of information entropy as a random variable have seldom been the object of study, leading researchers to rely mainly on linear models or simulation-based analytical approaches to assess differences in information content when entropy is measured repeatedly under different experimental conditions. Here a method to perform inference on entropy in such conditions is proposed. Building on results from the field of Bayesian entropy estimation, a symmetric Dirichlet-multinomial regression model, able to deal efficiently with the issue of mean entropy estimation, is formulated. Through a simulation study the model is shown to outperform linear modeling in a vast range of scenarios and to have promising statistical properties. As a practical example, the method is applied to a data set from a real experiment on animal communication.
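
The quantity such a Dirichlet-multinomial model works with can be sketched via the closed-form posterior mean of Shannon entropy under a symmetric Dirichlet prior, a standard result from Bayesian entropy estimation (this is an illustrative sketch, not the author's regression model).

```python
import numpy as np
from scipy.special import digamma

def posterior_mean_entropy(counts, alpha=1.0):
    """Posterior mean Shannon entropy (nats) of a categorical
    distribution given observed counts, under a symmetric
    Dirichlet(alpha) prior. Uses the closed form
    E[H] = psi(A + 1) - sum_i (a_i / A) * psi(a_i + 1),
    where a_i = n_i + alpha and A = sum_i a_i."""
    a = np.asarray(counts, dtype=float) + alpha
    A = a.sum()
    return digamma(A + 1.0) - np.sum((a / A) * digamma(a + 1.0))

# With abundant, balanced counts the estimate approaches log(K).
counts = [1000, 1000, 1000, 1000]
assert abs(posterior_mean_entropy(counts) - np.log(4)) < 0.01
```

Unlike the naive plug-in estimator, this posterior mean stays well behaved for sparse counts, which is what makes it attractive as a regression target.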

  2. On the relation between correlation dimension, approximate entropy and sample entropy parameters, and a fast algorithm for their calculation

    NASA Astrophysics Data System (ADS)

    Zurek, Sebastian; Guzik, Przemyslaw; Pawlak, Sebastian; Kosmider, Marcin; Piskorski, Jaroslaw

    2012-12-01

    We explore the relation between correlation dimension, approximate entropy and sample entropy parameters, which are commonly used in nonlinear systems analysis. Using theoretical considerations we identify the points which are shared by all these complexity algorithms and show explicitly that the above parameters are intimately connected and mutually interdependent. A new geometrical interpretation of sample entropy and correlation dimension is provided and the consequences for the interpretation of sample entropy, its relative consistency and some of the algorithms for parameter selection for this quantity are discussed. To get an exact algorithmic relation between the three parameters we construct a very fast algorithm for simultaneous calculations of the above, which uses the full time series as the source of templates, rather than the usual 10%. This algorithm can be used in medical applications of complexity theory, as it can calculate all three parameters for a realistic recording of 10^4 points within minutes with the use of an average notebook computer.
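
A compact sketch of sample entropy using every point as a template source, as the abstract advocates, is given below. The parameter names and the vectorized pairwise-distance approach are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) of a 1-D series; r is a fraction of the series'
    standard deviation. Uses the full series as the template source."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def match_count(dim):
        # all templates of length `dim`
        tpl = np.lib.stride_tricks.sliding_window_view(x, dim)
        # Chebyshev distance between every pair of templates
        d = np.max(np.abs(tpl[:, None, :] - tpl[None, :, :]), axis=2)
        return np.sum(d <= tol) - len(tpl)   # exclude self-matches

    return -np.log(match_count(m + 1) / match_count(m))

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 400))   # predictable
irregular = rng.standard_normal(400)                # unpredictable
assert sample_entropy(regular) < sample_entropy(irregular)
```

This O(n²) pairwise formulation is simple but memory-hungry; the fast algorithm in the paper achieves the same counts far more efficiently.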

  3. Facial muscle activity, Response Entropy, and State Entropy indices during noxious stimuli in propofol-nitrous oxide or propofol-nitrous oxide-remifentanil anaesthesia without neuromuscular block.

    PubMed

    Aho, A J; Yli-Hankala, A; Lyytikäinen, L-P; Jäntti, V

    2009-02-01

    Entropy is an anaesthetic EEG monitoring method, calculating two numerical parameters: State Entropy (SE, range 0-91) and Response Entropy (RE, range 0-100). Low Entropy numbers indicate unconsciousness. SE uses the frequency range 0.8-32 Hz, representing predominantly the EEG activity. RE is calculated at 0.8-47 Hz, consisting of both EEG and facial EMG. RE-SE difference (RE-SE) can indicate EMG, reflecting nociception. We studied RE-SE and EMG in patients anaesthetized without neuromuscular blockers. Thirty-one women were studied in propofol-nitrous oxide (P) or propofol-nitrous oxide-remifentanil (PR) anaesthesia. Target SE value was 40-60. RE-SE was measured before and after endotracheal intubation, and before and after the commencement of surgery. The spectral content of the signal was analysed off-line. Appearance of EMG on EEG was verified visually. RE, SE, and RE-SE increased during intubation in both groups. Elevated RE was followed by increased SE values in most cases. In these patients, spectral analysis of the signal revealed increased activity starting from low (<20 Hz) frequency area up to the highest measured frequencies. This was associated with appearance of EMG in raw signal. No spectral alterations or EMG were seen in patients with stable Entropy values. Increased RE is followed by increased SE at nociceptive stimuli in patients not receiving neuromuscular blockers. Owing to their overlapping power spectra, the contribution of EMG and EEG cannot be accurately separated with frequency analysis in the range of 10-40 Hz.

  4. Comparison of Texture Analysis Techniques in Both Frequency and Spatial Domains for Cloud Feature Extraction

    DTIC Science & Technology

    1992-01-01

    Texture parameters include entropy, energy, variance, skewness, and kurtosis; these parameters are then used as features. The co-occurrence matrix method is used in this study to derive texture values of entropy, homogeneity, and energy (similar to the GLDV angular second moment statistic). Seven convolution sizes were chosen to derive the texture values of entropy, local homogeneity, and energy.

  5. Practical quality control tools for curves and surfaces

    NASA Technical Reports Server (NTRS)

    Small, Scott G.

    1992-01-01

    Curves and surfaces created by Computer Aided Geometric Design systems in the engineering environment must satisfy two basic quality criteria: the geometric shape must have the desired engineering properties, and the objects must be parameterized in a way which does not cause computational difficulty for geometric processing and engineering analysis. Interactive techniques are described which are in use at Boeing to evaluate the quality of aircraft geometry prior to Computational Fluid Dynamics analysis, including newly developed methods for examining surface parameterization and its effects.

  6. Parameterization and Observability Analysis of Scalable Battery Clusters for Onboard Thermal Management

    DTIC Science & Technology

    2011-12-01

    the designed parameterization scheme and adaptive observer. A cylindrical battery thermal model in Eq. (1) with parameters of an A123 32157 LiFePO4 ...Morcrette, M. and Delacourt, C. (2010) Thermal modeling of a cylindrical LiFePO4/graphite lithium-ion battery. Journal of Power Sources. 195, 2961

  7. Comparison of different objective functions for parameterization of simple respiration models

    Treesearch

    M.T. van Wijk; B. van Putten; D.Y. Hollinger; A.D. Richardson

    2008-01-01

    The eddy covariance measurements of carbon dioxide fluxes collected around the world offer a rich source for detailed data analysis. Simple, aggregated models are attractive tools for gap filling, budget calculation, and upscaling in space and time. Key in the application of these models is their parameterization and a robust estimate of the uncertainty and reliability...

  8. Optimized Kernel Entropy Components.

    PubMed

    Izquierdo-Verdiguier, Emma; Laparra, Valero; Jenssen, Robert; Gomez-Chova, Luis; Camps-Valls, Gustau

    2017-06-01

    This brief addresses two main issues of the standard kernel entropy component analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to a sorting of the kernel eigenvectors by importance measured in entropy instead of variance, as in kernel principal component analysis. In this brief, we propose an extension of the KECA method, named optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by compacting the information into very few features (often just one or two). The proposed method produces features with higher expressive power. In particular, it is based on the independent component analysis framework and introduces an extra rotation to the eigendecomposition, which is optimized via gradient-ascent search. This maximum entropy preservation suggests that OKECA features are more efficient than KECA features for density estimation. In addition, a critical issue in both methods is the selection of the kernel parameter, since it critically affects the resulting performance. Here, we analyze the most common kernel length-scale selection criteria. The results of both methods are illustrated on different synthetic and real problems. Results show that OKECA returns projections with more expressive power than KECA, that the most successful rule for estimating the kernel parameter is based on maximum likelihood, and that OKECA is more robust to the selection of the length-scale parameter in kernel density estimation.
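
The core KECA idea, ranking kernel eigenpairs by their contribution to a Rényi quadratic entropy estimate rather than by eigenvalue, can be sketched as follows. This is a minimal illustration of plain KECA, not the OKECA algorithm with its extra ICA-based rotation; function and parameter names are assumptions.

```python
import numpy as np

def keca(X, n_components=2, sigma=1.0):
    """Kernel entropy component analysis sketch. Eigenpairs of the
    RBF kernel matrix are ranked by lambda_i * (1^T e_i)^2, their
    contribution to the Renyi quadratic entropy estimate, and the
    data are projected onto the top-ranked ones."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    K = np.exp(-sq / (2.0 * sigma ** 2))         # Gaussian kernel matrix
    vals, vecs = np.linalg.eigh(K)               # ascending eigenvalues
    contrib = vals * (vecs.sum(axis=0) ** 2)     # entropy contributions
    idx = np.argsort(contrib)[::-1][:n_components]
    return vecs[:, idx] * np.sqrt(np.abs(vals[idx]))

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
Z = keca(X, n_components=2, sigma=1.0)
assert Z.shape == (50, 2)
```

Note how an eigenpair with a large eigenvalue but an eigenvector summing to nearly zero contributes little entropy and is ranked low, which is exactly where KECA departs from kernel PCA.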

  9. Small-window parametric imaging based on information entropy for ultrasound tissue characterization

    PubMed Central

    Tsui, Po-Hsiang; Chen, Chin-Kuo; Kuo, Wen-Hung; Chang, King-Jen; Fang, Jui; Ma, Hsiang-Yang; Chou, Dean

    2017-01-01

    Constructing ultrasound statistical parametric images by using a sliding window is a widely adopted strategy for characterizing tissues. Deficiency in spatial resolution, the appearance of boundary artifacts, and the prerequisite data distribution limit the practicability of statistical parametric imaging. In this study, small-window entropy parametric imaging was proposed to overcome the above problems. Simulations and measurements of phantoms were executed to acquire backscattered radiofrequency (RF) signals, which were processed to explore the feasibility of small-window entropy imaging in detecting scatterer properties. To validate the ability of entropy imaging in tissue characterization, measurements of benign and malignant breast tumors were conducted (n = 63) to compare performances of conventional statistical parametric (based on Nakagami distribution) and entropy imaging by the receiver operating characteristic (ROC) curve analysis. The simulation and phantom results revealed that entropy images constructed using a small sliding window (side length = 1 pulse length) adequately describe changes in scatterer properties. The area under the ROC for using small-window entropy imaging to classify tumors was 0.89, which was higher than 0.79 obtained using statistical parametric imaging. In particular, boundary artifacts were largely suppressed in the proposed imaging technique. Entropy enables using a small window for implementing ultrasound parametric imaging. PMID:28106118

  10. Small-window parametric imaging based on information entropy for ultrasound tissue characterization

    NASA Astrophysics Data System (ADS)

    Tsui, Po-Hsiang; Chen, Chin-Kuo; Kuo, Wen-Hung; Chang, King-Jen; Fang, Jui; Ma, Hsiang-Yang; Chou, Dean

    2017-01-01

    Constructing ultrasound statistical parametric images by using a sliding window is a widely adopted strategy for characterizing tissues. Deficiency in spatial resolution, the appearance of boundary artifacts, and the prerequisite data distribution limit the practicability of statistical parametric imaging. In this study, small-window entropy parametric imaging was proposed to overcome the above problems. Simulations and measurements of phantoms were executed to acquire backscattered radiofrequency (RF) signals, which were processed to explore the feasibility of small-window entropy imaging in detecting scatterer properties. To validate the ability of entropy imaging in tissue characterization, measurements of benign and malignant breast tumors were conducted (n = 63) to compare performances of conventional statistical parametric (based on Nakagami distribution) and entropy imaging by the receiver operating characteristic (ROC) curve analysis. The simulation and phantom results revealed that entropy images constructed using a small sliding window (side length = 1 pulse length) adequately describe changes in scatterer properties. The area under the ROC for using small-window entropy imaging to classify tumors was 0.89, which was higher than 0.79 obtained using statistical parametric imaging. In particular, boundary artifacts were largely suppressed in the proposed imaging technique. Entropy enables using a small window for implementing ultrasound parametric imaging.

  11. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring

    PubMed Central

    Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Objective Multiscale permutation entropy (MSPE) has become an interesting tool for exploring neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Methods Six MSPE algorithms, derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis, were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, the clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of tracking the dynamical changes in EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among MSPE measures. Results CG-based MSPEs failed in on-line DoA monitoring under multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of EEG recordings and significantly distinguish the awake state, unconsciousness, and the recovery of consciousness (RoC) state. Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed the other measures with a faster tracking speed of the loss of consciousness.
Conclusions MA-based multiscale permutation entropies have potential for on-line anesthesia EEG analysis owing to their simple computation and sensitivity to drug effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales. PMID:27723803
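    The two ingredients combined in the study, a multiscale decomposition and a permutation entropy at each scale, can be sketched as below (Shannon variant with coarse-graining; the MA variant would replace the non-overlapping averages with a moving average, and the Renyi/Tsallis variants change only the entropy functional):

```python
import math
from itertools import permutations

def coarse_grain(x, scale):
    # CG decomposition: non-overlapping averages of length `scale`
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

def permutation_entropy(x, order=3):
    # Shannon permutation entropy, normalized to [0, 1]
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - order + 1):
        pattern = tuple(sorted(range(order), key=lambda k: x[i + k]))
        counts[pattern] += 1
    total = sum(counts.values())
    h = -sum(c / total * math.log(c / total) for c in counts.values() if c)
    return h / math.log(math.factorial(order))

def mspe(x, max_scale=5, order=3):
    # one normalized SPE value per scale
    return [permutation_entropy(coarse_grain(x, s), order)
            for s in range(1, max_scale + 1)]
```

    A strictly monotone signal yields zero permutation entropy (only one ordinal pattern occurs), while irregular signals approach 1 at every scale.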

  12. Extremal entanglement and mixedness in continuous variable systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adesso, Gerardo; Serafini, Alessio; Illuminati, Fabrizio

    2004-08-01

    We investigate the relationship between mixedness and entanglement for Gaussian states of continuous variable systems. We introduce generalized entropies based on Schatten p-norms to quantify the mixedness of a state and derive their explicit expressions in terms of symplectic spectra. We compare the hierarchies of mixedness provided by such measures with the one provided by the purity (defined as tr ρ² for the state ρ) for generic n-mode states. We then review the analysis proving the existence of both maximally and minimally entangled states at given global and marginal purities, with the entanglement quantified by the logarithmic negativity. Based on these results, we extend such an analysis to generalized entropies, introducing and fully characterizing maximally and minimally entangled states for given global and local generalized entropies. We compare the different roles played by the purity and by the generalized p-entropies in quantifying the entanglement and the mixedness of continuous variable systems. We introduce the concept of average logarithmic negativity, showing that it allows a reliable quantitative estimate of continuous variable entanglement by direct measurements of global and marginal generalized p-entropies.

  13. Entangled de Sitter from stringy axionic Bell pair I: an analysis using Bunch-Davies vacuum

    NASA Astrophysics Data System (ADS)

    Choudhury, Sayantan; Panda, Sudhakar

    2018-01-01

    In this work, we study quantum entanglement and compute the entanglement entropy in de Sitter space for a bipartite quantum field theory driven by an axion originating from Type IIB string compactification on a Calabi-Yau threefold (CY^3) in the presence of an NS5 brane. For this computation, we consider a spherical surface S^2, which divides the spatial slice of de Sitter (dS_4) into exterior and interior sub-regions. We also take the initial vacuum to be the Bunch-Davies state. First, we derive the solution of the wave function of the axion in a hyperbolic open chart by constructing a suitable basis for the Bunch-Davies vacuum state using a Bogoliubov transformation. We then derive the expression for the density matrix by tracing over the exterior region. This allows us to compute the entanglement entropy and Rényi entropy in 3+1 dimensions. Furthermore, we quantify the UV-finite contribution of the entanglement entropy, which contains the physics of long-range quantum correlations of our expanding universe. Finally, our analysis complements the necessary condition for generating non-vanishing entanglement entropy in primordial cosmology due to the axion.

  14. Computing algebraic transfer entropy and coupling directions via transcripts

    NASA Astrophysics Data System (ADS)

    Amigó, José M.; Monetti, Roberto; Graff, Beata; Graff, Grzegorz

    2016-11-01

    Most random processes studied in nonlinear time series analysis take values on sets endowed with a group structure, e.g., the real and rational numbers, and the integers. This fact allows one to associate with each pair of group elements a third element, called their transcript, which is defined as the product of the second element in the pair and the inverse of the first one. The transfer entropy of two such processes is called algebraic transfer entropy. It measures the information transferred between two coupled processes whose values belong to a group. In this paper, we show that, subject to one constraint, the algebraic transfer entropy matches the (in general, conditional) mutual information of certain transcripts with one variable less. This property has interesting practical applications, especially to the analysis of short time series. We also derive weak conditions for the 3-dimensional algebraic transfer entropy to yield the same coupling direction as the corresponding mutual information of transcripts. A related issue concerns the use of mutual information of transcripts to determine coupling directions in cases where the conditions just mentioned are not fulfilled. We checked the latter possibility in the lowest-dimensional case with numerical simulations and cardiovascular data, and obtained positive results.
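    For the permutation group under composition, the transcript of an ordered pair (α, β) is the group product β∘α⁻¹, i.e., the unique element that maps the first permutation onto the second. A minimal sketch with permutations in one-line notation (function names are illustrative):

```python
def invert(p):
    # inverse of a permutation given in one-line notation
    inv = [0] * len(p)
    for i, v in enumerate(p):
        inv[v] = i
    return tuple(inv)

def compose(p, q):
    # group operation: (p o q)(x) = p[q[x]]
    return tuple(p[q[x]] for x in range(len(q)))

def transcript(alpha, beta):
    # beta o alpha^{-1}: the unique tau with tau o alpha == beta
    inv_a = invert(alpha)
    return tuple(beta[inv_a[x]] for x in range(len(alpha)))
```

    The defining property, transcript(α, β) ∘ α == β, makes transcripts a natural encoding of the "change" between two symbolic states, which is what the algebraic transfer entropy operates on.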

  15. A contour for the entanglement entropies in harmonic lattices

    NASA Astrophysics Data System (ADS)

    Coser, Andrea; De Nobili, Cristiano; Tonni, Erik

    2017-08-01

    We construct a contour function for the entanglement entropies in generic harmonic lattices. In one spatial dimension, numerical analyses are performed by considering harmonic chains with either periodic or Dirichlet boundary conditions. In the massless regime and for some configurations where the subsystem is a single interval, the numerical results for the contour function are compared to the inverse of the local weight function which multiplies the energy-momentum tensor in the corresponding entanglement Hamiltonian, found through conformal field theory methods, and good agreement is observed. A numerical analysis of the contour function for the entanglement entropy is also performed in a massless harmonic chain for a subsystem made of two disjoint intervals.

  16. Temperature variability analysis using wavelets and multiscale entropy in patients with systemic inflammatory response syndrome, sepsis, and septic shock.

    PubMed

    Papaioannou, Vasilios E; Chouvarda, Ioanna G; Maglaveras, Nikos K; Pneumatikos, Ioannis A

    2012-12-12

    Even though temperature is a continuous quantitative variable, its measurement has been considered a snapshot of a process, indicating whether a patient is febrile or afebrile. Recently, other diagnostic techniques have been proposed for associating different properties of the temperature curve with severity of illness in the Intensive Care Unit (ICU), based on complexity analysis of continuously monitored body temperature. In this study, we tried to assess temperature complexity in patients with systemic inflammation during a suspected ICU-acquired infection, by using wavelet transformation and multiscale entropy of temperature signals, in a cohort of mixed critically ill patients. Twenty-two patients were enrolled in the study. In five, systemic inflammatory response syndrome (SIRS, group 1) developed, 10 had sepsis (group 2), and seven had septic shock (group 3). All temperature curves were studied during the first 24 hours of an inflammatory state. A wavelet transformation was applied, decomposing the signal into different frequency components (scales) that have been found to reflect neurogenic and metabolic inputs on temperature oscillations. Wavelet energy and entropy per scale, associated with complexity in specific frequency bands, and multiscale entropy of the whole signal were calculated. Moreover, a clustering technique and a linear discriminant analysis (LDA) were applied for permitting pattern recognition in data sets and assessing the diagnostic accuracy of different wavelet features among the three classes of patients. Statistically significant differences were found in wavelet entropy between patients with SIRS and groups 2 and 3, and in specific ultradian bands between SIRS and group 3, with decreased entropy in sepsis. Cluster analysis using wavelet features in specific bands revealed concrete clusters closely related with the groups in focus.
LDA after wrapper-based feature selection was able to discriminate SIRS from the two sepsis groups with an accuracy of more than 80%, based on multiparametric patterns of entropy values in the very low frequencies, indicating reduced metabolic inputs on local thermoregulation, probably associated with extensive vasodilatation. We suggest that complexity analysis of temperature signals can assess inherent thermoregulatory dynamics during systemic inflammation and has increased discriminating value in patients with infectious versus noninfectious conditions, probably associated with severity of illness.

  17. Temperature variability analysis using wavelets and multiscale entropy in patients with systemic inflammatory response syndrome, sepsis, and septic shock

    PubMed Central

    2012-01-01

    Background Even though temperature is a continuous quantitative variable, its measurement has been considered a snapshot of a process, indicating whether a patient is febrile or afebrile. Recently, other diagnostic techniques have been proposed for associating different properties of the temperature curve with severity of illness in the Intensive Care Unit (ICU), based on complexity analysis of continuously monitored body temperature. In this study, we tried to assess temperature complexity in patients with systemic inflammation during a suspected ICU-acquired infection, by using wavelet transformation and multiscale entropy of temperature signals, in a cohort of mixed critically ill patients. Methods Twenty-two patients were enrolled in the study. In five, systemic inflammatory response syndrome (SIRS, group 1) developed, 10 had sepsis (group 2), and seven had septic shock (group 3). All temperature curves were studied during the first 24 hours of an inflammatory state. A wavelet transformation was applied, decomposing the signal into different frequency components (scales) that have been found to reflect neurogenic and metabolic inputs on temperature oscillations. Wavelet energy and entropy per scale, associated with complexity in specific frequency bands, and multiscale entropy of the whole signal were calculated. Moreover, a clustering technique and a linear discriminant analysis (LDA) were applied for permitting pattern recognition in data sets and assessing the diagnostic accuracy of different wavelet features among the three classes of patients. Results Statistically significant differences were found in wavelet entropy between patients with SIRS and groups 2 and 3, and in specific ultradian bands between SIRS and group 3, with decreased entropy in sepsis. Cluster analysis using wavelet features in specific bands revealed concrete clusters closely related with the groups in focus.
LDA after wrapper-based feature selection was able to discriminate SIRS from the two sepsis groups with an accuracy of more than 80%, based on multiparametric patterns of entropy values in the very low frequencies, indicating reduced metabolic inputs on local thermoregulation, probably associated with extensive vasodilatation. Conclusions We suggest that complexity analysis of temperature signals can assess inherent thermoregulatory dynamics during systemic inflammation and has increased discriminating value in patients with infectious versus noninfectious conditions, probably associated with severity of illness. PMID:22424316

  18. Event by event analysis and entropy of multiparticle systems

    NASA Astrophysics Data System (ADS)

    Bialas, A.; Czyz, W.

    2000-04-01

    The coincidence method of measuring the entropy of a system, proposed some time ago by Ma, is generalized to include systems out of equilibrium. It is suggested that the method can be adapted to analyze multiparticle states produced in high-energy collisions.
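    Ma's coincidence method estimates entropy from the rate at which independently drawn states of the system coincide: the coincidence probability over sample pairs estimates Σp², so S ≈ -ln(Σp²), which equals ln Ω exactly for a uniform distribution over Ω states. A minimal sketch on a toy uniform system (the 8-state system and sample size are illustrative choices, not from the paper):

```python
import math
import random
from collections import Counter

def coincidence_entropy(samples):
    # Ma's estimator: S ~ -ln(coincidence rate), counting equal pairs of samples
    n = len(samples)
    pairs = n * (n - 1) / 2
    hits = sum(c * (c - 1) / 2 for c in Counter(samples).values())
    return math.log(pairs / hits) if hits else float("inf")

random.seed(0)
omega = 8                                     # uniform toy system: exact entropy ln(8)
samples = [random.randrange(omega) for _ in range(2000)]
S_est = coincidence_entropy(samples)
```

    For a uniform distribution this coincides with the Boltzmann entropy; the generalization the paper develops is how to perform such coincidence counting for systems out of equilibrium.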

  19. Correlation and agreement between the bispectral index vs. state entropy during hypothermic cardio-pulmonary bypass.

    PubMed

    Meybohm, P; Gruenewald, M; Höcker, J; Renner, J; Graesner, J-T; Ilies, C; Scholz, J; Bein, B

    2010-02-01

    The bispectral index (BIS) and spectral entropy enable monitoring of the depth of anaesthesia. Mild hypothermia has been shown to affect the ability of electroencephalography monitors to reflect the anaesthetic drug effect. The purpose of this study was to investigate the effect of hypothermia during cardio-pulmonary bypass on the correlation and agreement between the BIS and entropy variables compared with normothermic conditions. This prospective clinical study included coronary artery bypass grafting patients (n=25), evaluating correlation and agreement (Bland-Altman analysis) between the BIS and both state and response entropy during hypothermic cardio-pulmonary bypass (31-34 degrees C) compared with normothermic conditions (34-37.5 degrees C). Anaesthesia was maintained with propofol and sufentanil and adjusted clinically, while the anaesthetist was blinded to the monitors. The BIS and entropy values decreased during cooling (P<0.05), but the decrease was more pronounced for the entropy variables than for the BIS (P<0.05). The correlation coefficients (bias+/-SD; percentage error) between the BIS and state entropy and between the BIS and response entropy were r(2)=0.56 (1+/-11; 42%) and r(2)=0.58 (-2+/-11; 43%) under normothermic conditions, and r(2)=0.17 (10+/-12; 77%) and r(2)=0.18 (9+/-11; 68%) under hypothermic conditions, respectively. Bias was significantly increased under hypothermic conditions (P<0.001 vs. normothermia). Acceptable agreement was observed between the BIS and entropy variables under normothermic but not under hypothermic conditions. The BIS and entropy variables may therefore not be interchangeable during hypothermic cardio-pulmonary bypass.

  20. Entropy Generation/Availability Energy Loss Analysis Inside MIT Gas Spring and "Two Space" Test Rigs

    NASA Technical Reports Server (NTRS)

    Ebiana, Asuquo B.; Savadekar, Rupesh T.; Patel, Kaushal V.

    2006-01-01

    The results of the entropy generation and availability energy loss analysis under conditions of oscillating pressure and oscillating helium gas flow in two Massachusetts Institute of Technology (MIT) test rigs, piston-cylinder and piston-cylinder-heat exchanger, are presented. Two solution domains, the gas spring (single-space) in the piston-cylinder test rig and the gas spring + heat exchanger (two-space) in the piston-cylinder-heat exchanger test rig, are of interest. The Sage and CFD-ACE+ commercial numerical codes are used to obtain 1-D and 2-D computer models, respectively, of each of the two solution domains and to simulate the oscillating gas flow and heat transfer effects in these domains. Second-law analysis is used to characterize the entropy generation and availability energy losses inside the two solution domains. Internal and external entropy generation and availability energy loss results predicted by Sage and CFD-ACE+ are compared. Thermodynamic loss analysis of simple systems such as the MIT test rigs is often useful for understanding some important features of complex pattern-forming processes in more complex systems like the Stirling engine. This study is aimed at improving numerical codes for the prediction of thermodynamic losses via the development of a loss post-processor. The incorporation of loss post-processors in Stirling engine numerical codes will facilitate Stirling engine performance optimization. Loss analysis using entropy-generation rates due to heat and fluid flow is a relatively new technique for assessing component performance. It offers a deep insight into the flow phenomena, allows a more exact calculation of losses than is possible with traditional means involving the application of loss correlations, and provides an effective tool for improving component and overall system performance.

  1. Time-series analysis of multiple foreign exchange rates using time-dependent pattern entropy

    NASA Astrophysics Data System (ADS)

    Ishizaki, Ryuji; Inoue, Masayoshi

    2018-01-01

    Time-dependent pattern entropy is a method that reduces variations to binary symbolic dynamics and considers the pattern of symbols in a sliding temporal window. We use this method to analyze the instability of daily variations in multiple foreign exchange rates. The time-dependent pattern entropy of 7 foreign exchange rates (AUD/USD, CAD/USD, CHF/USD, EUR/USD, GBP/USD, JPY/USD, and NZD/USD) was found to be high in the long period after the Lehman shock and low in the long period after March 2012. We also compared the correlation matrices between exchange rates in periods of high and low time-dependent pattern entropy.
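    The method can be sketched as follows: daily variations are reduced to binary symbols (here, 1 for a rise and 0 otherwise, an illustrative rule), and the Shannon entropy of length-m symbol patterns is computed in a sliding window (the window length and m below are assumptions, not the paper's settings):

```python
import math

def binarize(series):
    # symbolic dynamics: 1 if the rate rose since the previous day, else 0
    return [1 if b > a else 0 for a, b in zip(series, series[1:])]

def pattern_entropy(symbols, m=3):
    # Shannon entropy (bits) of length-m symbol patterns inside one window
    counts = {}
    for i in range(len(symbols) - m + 1):
        pat = tuple(symbols[i:i + m])
        counts[pat] = counts.get(pat, 0) + 1
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def sliding_pattern_entropy(series, window=60, m=3):
    # time-dependent pattern entropy: one value per window position
    sym = binarize(series)
    return [pattern_entropy(sym[i:i + window], m)
            for i in range(len(sym) - window + 1)]
```

    A strictly trending rate gives zero entropy (one pattern dominates), while an unstable rate populates many patterns and drives the entropy up, which is the instability signal the paper tracks over time.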

  2. Horizon Entropy from Quantum Gravity Condensates.

    PubMed

    Oriti, Daniele; Pranzetti, Daniele; Sindoni, Lorenzo

    2016-05-27

    We construct condensate states encoding the continuum spherically symmetric quantum geometry of a horizon in full quantum gravity, i.e., without any classical symmetry reduction, in the group field theory formalism. Tracing over the bulk degrees of freedom, we show how the resulting reduced density matrix manifestly exhibits a holographic behavior. We derive a complete orthonormal basis of eigenstates for the reduced density matrix of the horizon and use it to compute the horizon entanglement entropy. By imposing consistency with the horizon boundary conditions and semiclassical thermodynamical properties, we recover the Bekenstein-Hawking entropy formula for any value of the Immirzi parameter. Our analysis supports the equivalence between the von Neumann (entanglement) entropy interpretation and the Boltzmann (statistical) one.

  3. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    NASA Astrophysics Data System (ADS)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as a concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize patterns through a dissimilarity matrix based on modified cross-sample entropy, and then provide three-dimensional perceptual maps of the results through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper, namely multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to share similar irregularity than others, and that differences in the stock indices, caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can the time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions: Europe, North America, South America, the Asia-Pacific (with the exception of mainland China), and mainland China together with Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in these experiments than MDSC.

  4. On the relationship between NMR-derived amide order parameters and protein backbone entropy changes

    PubMed Central

    Sharp, Kim A.; O’Brien, Evan; Kasinath, Vignesh; Wand, A. Joshua

    2015-01-01

    Molecular dynamics simulations are used to analyze the relationship between NMR-derived squared generalized order parameters of amide NH groups and backbone entropy. Amide order parameters (O2NH) are largely determined by the secondary structure and average values appear unrelated to the overall flexibility of the protein. However, analysis of the more flexible subset (O2NH < 0.8) shows that these report both on the local flexibility of the protein and on a different component of the conformational entropy than that reported by the side chain methyl axis order parameters, O2axis. A calibration curve for backbone entropy vs. O2NH is developed which accounts for both correlations between amide group motions of different residues, and correlations between backbone and side chain motions. This calibration curve can be used with experimental values of O2NH changes obtained by NMR relaxation measurements to extract backbone entropy changes, e.g. upon ligand binding. In conjunction with our previous calibration for side chain entropy derived from measured O2axis values this provides a prescription for determination of the total protein conformational entropy changes from NMR relaxation measurements. PMID:25739366

  5. Entropy Growth in the Early Universe and Confirmation of Initial Big Bang Conditions

    NASA Astrophysics Data System (ADS)

    Beckwith, Andrew

    2009-09-01

    This paper shows how increased entropy values from an initially low big bang level can be measured experimentally by counting relic gravitons. Furthermore, the physical mechanism of this entropy increase is explained via analogies with early-universe phase transitions. The role of Jack Ng's (2007, 2008a, 2008b) revised infinite quantum statistics in the physics of gravitational wave detection is acknowledged. Ng's infinite quantum statistics can be used to show that ΔS ~ ΔN_gravitons is a starting point for the increasing net cosmological entropy of the universe. Finally, in a nod to similarities with zero-point energy (ZPE) analysis, it is important to note that the resulting ΔS ~ ΔN_gravitons is not 10^88, that in fact it is much lower, allowing for evaluating initial graviton production as an emergent field phenomenon, which may be similar to how ZPE states can be used to extract energy from a vacuum if entropy is not maximized. The rapid increase in entropy so alluded to, without a near-sudden increase to 10^88, may be enough to allow successful modeling of relic graviton production for entropy in a manner similar to ZPE energy extraction from a vacuum state.

  6. On the relationship between NMR-derived amide order parameters and protein backbone entropy changes.

    PubMed

    Sharp, Kim A; O'Brien, Evan; Kasinath, Vignesh; Wand, A Joshua

    2015-05-01

    Molecular dynamics simulations are used to analyze the relationship between NMR-derived squared generalized order parameters of amide NH groups and backbone entropy. Amide order parameters (O2NH) are largely determined by the secondary structure and average values appear unrelated to the overall flexibility of the protein. However, analysis of the more flexible subset (O2NH < 0.8) shows that these report both on the local flexibility of the protein and on a different component of the conformational entropy than that reported by the side chain methyl axis order parameters, O2axis. A calibration curve for backbone entropy vs. O2NH is developed, which accounts for both correlations between amide group motions of different residues, and correlations between backbone and side chain motions. This calibration curve can be used with experimental values of O2NH changes obtained by NMR relaxation measurements to extract backbone entropy changes, for example, upon ligand binding. In conjunction with our previous calibration for side chain entropy derived from measured O2axis values this provides a prescription for determination of the total protein conformational entropy changes from NMR relaxation measurements. © 2015 Wiley Periodicals, Inc.

  7. Thermodynamic contribution of backbone conformational entropy in the binding between SH3 domain and proline-rich motif.

    PubMed

    Zeng, Danyun; Shen, Qingliang; Cho, Jae-Hyun

    2017-02-26

    Biological functions of intrinsically disordered proteins (IDPs) and proteins containing intrinsically disordered regions (IDRs) are often mediated by short linear motifs, like proline-rich motifs (PRMs). Upon binding to their target proteins, IDPs undergo a disorder-to-order transition which is accompanied by a large conformational entropy penalty. Hence, the molecular mechanisms underlying the control of conformational entropy are critical for understanding the binding affinity and selectivity of IDP-mediated protein-protein interactions (PPIs). Here, we investigated the backbone conformational entropy change accompanying binding between the N-terminal SH3 domain (nSH3) of CrkII and a PRM derived from guanine nucleotide exchange factor 1 (C3G). In particular, we focused on the estimation of the conformational entropy change of the disordered PRM upon binding to the nSH3 domain. Quantitative characterization of the conformational dynamics of disordered peptides like PRMs is limited. Hence, we combined various methods, including NMR model-free analysis, δ2D, DynaMine, and structure-based calculation of entropy loss. This study demonstrates that the contribution of the backbone conformational entropy change is significant in PPIs mediated by IDPs/IDRs. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Heat Transfer and Entropy Generation Analysis of an Intermediate Heat Exchanger in ADS

    NASA Astrophysics Data System (ADS)

    Wang, Yongwei; Huai, Xiulan

    2018-04-01

    The intermediate heat exchanger (IHX), used to enhance heat transfer, is an important piece of equipment in the use of nuclear energy. In the present work, the heat transfer and entropy generation of an IHX in the accelerator driven subcritical system (ADS) are investigated experimentally. The variation of the entropy generation number with the performance parameters of the IHX is analyzed, and the effects of the inlet conditions of the IHX on the entropy generation number and heat transfer are discussed. Comparing the results at two working conditions, constant mass flow rate of liquid lead-bismuth eutectic (LBE) and constant mass flow rate of helium gas, the total pumping power tends to decrease with decreasing entropy generation number in both cases, but the variations of the effectiveness, the number of transfer units and the thermal capacity rate ratio are inconsistent and need to be analyzed separately. With increasing inlet mass flow rate or LBE inlet temperature, the entropy generation number increases and the heat transfer is enhanced, while the opposite trend occurs with increasing helium gas inlet temperature. Further study is necessary to obtain the optimized operating parameters of the IHX that minimize entropy generation and enhance heat transfer.
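    For orientation, the entropy generation rate of a two-stream heat exchanger can be sketched with a lumped constant-cp model (an illustrative textbook balance, not the paper's correlations; the temperatures and capacity rates below are arbitrary):

```python
import math

def entropy_generation(mcp_h, Th_in, Th_out, mcp_c, Tc_in):
    # steady state, constant heat capacities, no heat loss to the surroundings;
    # mcp_* are thermal capacity rates (mass flow * cp, W/K), temperatures in K
    q = mcp_h * (Th_in - Th_out)              # duty fixed by the hot stream
    Tc_out = Tc_in + q / mcp_c                # energy balance on the cold stream
    s_gen = (mcp_h * math.log(Th_out / Th_in)
             + mcp_c * math.log(Tc_out / Tc_in))   # entropy balance (W/K)
    return s_gen, Tc_out

# hot stream cooled from 600 K to 500 K against a cold stream entering at 300 K
s_gen, tc_out = entropy_generation(1.0, 600.0, 500.0, 1.0, 300.0)
```

    The entropy generation number discussed in the abstract is typically such an s_gen made dimensionless, e.g. by dividing by the smaller thermal capacity rate; the second law guarantees s_gen > 0 for any finite temperature difference.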

  9. Relativistic three-dimensional Lippmann-Schwinger cross sections for space radiation applications

    NASA Astrophysics Data System (ADS)

    Werneth, C. M.; Xu, X.; Norman, R. B.; Maung, K. M.

    2017-12-01

    Radiation transport codes require accurate nuclear cross sections to compute particle fluences inside shielding materials. The Tripathi semi-empirical reaction cross section, which includes over 60 parameters tuned to nucleon-nucleus (NA) and nucleus-nucleus (AA) data, has been used in many of the world's best-known transport codes. Although this parameterization fits well to reaction cross section data, the predictive capability of any parameterization is questionable when it is used beyond the range of the data to which it was tuned. Using uncertainty analysis, it is shown that a relativistic three-dimensional Lippmann-Schwinger (LS3D) equation model based on Multiple Scattering Theory (MST) that uses 5 parameterizations (3 fundamental parameterizations to nucleon-nucleon (NN) data and 2 nuclear charge density parameterizations) predicts NA and AA reaction cross sections as well as the Tripathi cross section parameterization for reactions in which the kinetic energy of the projectile in the laboratory frame (TLab) is greater than 220 MeV/n. The relativistic LS3D model has the additional advantage of being able to predict highly accurate total and elastic cross sections. Consequently, it is recommended that the relativistic LS3D model be used for space radiation applications in which TLab > 220 MeV/n.

  10. Influence of conversion on the location of points and lines: The change of location entropy and the probability of a vector point inside the converted grid point

    NASA Astrophysics Data System (ADS)

    Chen, Nan

    2018-03-01

    Conversion of points or lines from vector to grid format, or vice versa, is the first operation required for most spatial analyses. Conversion, however, usually causes the location of points or lines to change, which influences the reliability of the results of spatial analysis or even results in analysis errors. The purpose of this paper is to evaluate the change of the location of points and lines during conversion using the concepts of probability and entropy. This paper shows that when a vector point is converted to a grid point, the vector point may be outside or inside the grid point. This paper deduces a formula for computing the probability that the vector point is inside the grid point. It was found that the probability increased with the side length of the grid and with the variances of the coordinates of the vector point. In addition, the location entropy of points and lines is defined in this paper. Formulae for computing the change of the location entropy during conversion are deduced. The probability mentioned above and the change of location entropy may be used to evaluate the location reliability of points and lines in Geographic Information Systems and may be used to choose an appropriate range of the side length of grids before conversion. The results of this study may help scientists and users to avoid mistakes caused by the change of location during conversion, as well as in spatial decision-making and analysis.

  11. Antipsychotics reverse abnormal EEG complexity in drug-naïve schizophrenia: A multiscale entropy analysis

    PubMed Central

    Takahashi, Tetsuya; Cho, Raymond Y.; Mizuno, Tomoyuki; Kikuchi, Mitsuru; Murata, Tetsuhito; Takahashi, Koichi; Wada, Yuji

    2010-01-01

    Multiscale entropy (MSE) analysis is a novel entropy-based approach for measuring dynamical complexity in physiological systems over a range of temporal scales. To evaluate this analytic approach as an aid to elucidating the pathophysiologic mechanisms in schizophrenia, we examined MSE in EEG activity in drug-naïve schizophrenia subjects pre- and post-treatment with antipsychotics in comparison with traditional EEG analysis. We recorded eyes-closed resting state EEG from frontal, temporal, parietal and occipital regions in 22 drug-naïve schizophrenia subjects and 24 age-matched healthy control subjects. Fifteen patients were re-evaluated within 2–8 weeks after the initiation of antipsychotic treatment. For each participant, MSE was calculated on one continuous 60-second epoch for each experimental session. Schizophrenia subjects showed significantly higher complexity at higher time scales (lower frequencies) than healthy controls in fronto-centro-temporal, but not parieto-occipital, regions. Post-treatment, this higher complexity decreased to healthy control subject levels selectively in fronto-central regions, while the increased complexity in temporal sites remained higher. Comparative power analysis identified spectral slowing in frontal regions in pre-treatment schizophrenia subjects, consistent with previous findings, whereas no antipsychotic treatment effect was observed. In summary, multiscale entropy measures identified abnormal dynamical EEG signal complexity in anterior brain areas in schizophrenia that normalized selectively in fronto-central areas with antipsychotic treatment. These findings show that entropy-based analytic methods may serve as a novel approach for characterizing and understanding abnormal cortical dynamics in schizophrenia, and elucidating the therapeutic mechanisms of antipsychotics. PMID:20149880

  12. Entropy is in Flux V3.4

    NASA Astrophysics Data System (ADS)

    Kadanoff, Leo P.

    2017-05-01

    The science of thermodynamics was put together in the Nineteenth Century to describe large systems in equilibrium. One part of thermodynamics defines entropy for equilibrium systems and demands an ever-increasing entropy for non-equilibrium ones. Since thermodynamics does not define entropy out of equilibrium, pure thermodynamics cannot follow the details of how this increase occurs. However, starting with the work of Ludwig Boltzmann in 1872, and continuing to the present day, various models of non-equilibrium behavior have been put together with the specific aim of generalizing the concept of entropy to non-equilibrium situations. This kind of entropy has been termed kinetic entropy to distinguish it from the thermodynamic variety. Knowledge of kinetic entropy started from Boltzmann's insight about his equation for the time dependence of gaseous systems. In this paper, his result is stated as a definition of kinetic entropy in terms of a local equation for the entropy density. This definition is then applied to Landau's theory of the Fermi liquid thereby giving the kinetic entropy within that theory. The dynamics of many condensed matter systems including Fermi liquids, low temperature superfluids, and ordinary metals lend themselves to the definition of kinetic entropy. In fact, entropy has been defined and used for a wide variety of situations in which a condensed matter system has been allowed to relax for a sufficient period so that the very most rapid fluctuations have been ironed out. One of the broadest applications of non-equilibrium analysis considers quantum degenerate systems using Martin-Schwinger Green's functions (Phys Rev 115:1342-1373, 1959) as generalized Wigner functions, g^<(p, ω, R, T) and g^>(p, ω, R, T). This paper describes once again how the quantum kinetic equations for these functions give locally defined conservation laws for mass, momentum, and energy.
In local thermodynamic equilibrium, this kinetic theory enables a reasonable definition of the density of kinetic entropy. However, when the system is outside of local equilibrium, this definition fails. It is speculated that quantum entanglement is the source of this failure.
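    Boltzmann's insight referenced above is conventionally expressed as a local balance law for an entropy density. A standard textbook form (notation assumed here, not drawn from the paper itself) is:

```latex
s(\mathbf{r},t) = -k_B \int d^3p \; f(\mathbf{p},\mathbf{r},t)\,\bigl[\ln f(\mathbf{p},\mathbf{r},t) - 1\bigr],
\qquad
\partial_t\, s + \nabla\cdot \mathbf{j}_s = \sigma \ge 0 ,
```

    with j_s the entropy current and σ the local entropy production; the H-theorem is the statement that σ ≥ 0, with equality in (local) equilibrium.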

  13. A software platform for statistical evaluation of patient respiratory patterns in radiation therapy.

    PubMed

    Dunn, Leon; Kenny, John

    2017-10-01

    The aim of this work was to design and evaluate a software tool for analysis of a patient's respiration, with the goal of optimizing the effectiveness of motion management techniques during radiotherapy imaging and treatment. A software tool was developed to analyse patient respiratory data files (.vxp files) created by the Varian Real-Time Position Management (RPM) system. The software, called RespAnalysis, was created in MATLAB and provides four modules, one each for determining respiration characteristics, providing breathing coaching (biofeedback training), comparing pre- and post-training characteristics and performing a fraction-by-fraction assessment. The modules analyse respiratory traces to determine signal characteristics and specifically use a Sample Entropy algorithm as the key means to quantify breathing irregularity. Simulated respiratory signals, as well as 91 patient RPM traces, were analysed with RespAnalysis to test the viability of using Sample Entropy for predicting breathing regularity. Retrospective assessment of patient data demonstrated that the Sample Entropy metric was a predictor of periodic irregularity in respiration data; however, it was found to be insensitive to amplitude variation. Additional waveform statistics assessing the distribution of signal amplitudes over time, coupled with the Sample Entropy method, were found to be useful in assessing breathing regularity. The RespAnalysis software tool presented in this work uses the Sample Entropy method to analyse patient respiratory data recorded for motion management purposes in radiation therapy. This is applicable during treatment simulation and during subsequent treatment fractions, providing a way to quantify breathing irregularity, as well as assess the need for breathing coaching.
It was demonstrated that the Sample Entropy metric was correlated to the irregularity of the patient's respiratory motion in terms of periodicity, whilst other metrics, such as percentage deviation of inhale/exhale peak positions provided insight into respiratory amplitude regularity. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
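    The Sample Entropy algorithm central to this record is standard and compact. A minimal reference sketch (Richman-Moorman style, not the RespAnalysis implementation itself):

```python
import math
import random

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample Entropy, SampEn(m, r) = -ln(A/B): B counts pairs of length-m
    templates within Chebyshev distance r, A the same for length m+1;
    self-matches are excluded. r is set to r_frac times the signal SD."""
    n = len(x)
    mean = sum(x) / n
    r = r_frac * math.sqrt(sum((v - mean) ** 2 for v in x) / n)

    def matches(length):
        hits = 0
        for i in range(n - m):            # same template count for both lengths
            for j in range(i + 1, n - m):
                if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= r:
                    hits += 1
        return hits

    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a and b else float("inf")

random.seed(1)
se_periodic = sample_entropy([math.sin(2 * math.pi * t / 20) for t in range(200)])
se_noisy = sample_entropy([random.random() for _ in range(200)])
```

    A regular (periodic) trace yields a low value and an irregular one a high value, which is exactly the property the paper exploits to flag irregular breathing; as the paper notes, the metric alone is blind to amplitude drift.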

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Zhiming; Radboud University/NIKHEF, NL-6525 ED Nijmegen

    We report on an entropy analysis using Ma's coincidence method on π+p and K+p collisions at √s = 22 GeV. A scaling law and additivity properties of Rényi entropies and their charged-particle multiplicity dependence are investigated. The results are compared with those from the PYTHIA Monte Carlo model.
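    For reference, the Rényi entropies whose scaling and additivity are studied here are defined directly from a probability distribution. A definition-level sketch (the paper estimates these from coincidence counts via Ma's method rather than from known probabilities):

```python
import math

def renyi_entropy(p, q):
    """Renyi entropy H_q = ln(sum_i p_i^q) / (1 - q); the Shannon
    entropy is recovered in the limit q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

h2_uniform = renyi_entropy([0.25] * 4, 2.0)   # uniform: H_q = ln 4 for all q
```

    H_q is non-increasing in q, and all orders coincide for a uniform distribution, which is what makes the q-dependence a useful probe of multiplicity fluctuations.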

  15. Mathematical model for thermal and entropy analysis of thermal solar collectors by using Maxwell nanofluids with slip conditions, thermal radiation and variable thermal conductivity

    NASA Astrophysics Data System (ADS)

    Aziz, Asim; Jamshed, Wasim; Aziz, Taha

    2018-04-01

    In the present research a simplified mathematical model for solar thermal collectors is considered in the form of a non-uniform unsteady stretching surface. The non-Newtonian Maxwell nanofluid model is utilized for the working fluid along with slip and convective boundary conditions, and a comprehensive analysis of entropy generation in the system is also presented. The effects of thermal radiation and variable thermal conductivity are also included in the present model. The mathematical formulation is carried out through a boundary layer approach and the numerical computations are carried out for Cu-water and TiO2-water nanofluids. Results are presented for the velocity, temperature and entropy generation profiles, skin friction coefficient and Nusselt number. The discussion is concluded on the effect of various governing parameters on the motion, temperature variation, entropy generation, velocity gradient and the rate of heat transfer at the boundary.

  16. Refined composite multivariate generalized multiscale fuzzy entropy: A tool for complexity analysis of multichannel signals

    NASA Astrophysics Data System (ADS)

    Azami, Hamed; Escudero, Javier

    2017-01-01

    Multiscale entropy (MSE) is an appealing tool to characterize the complexity of time series over multiple temporal scales. Recent developments in the field have tried to extend the MSE technique in different ways. Building on these trends, we propose the so-called refined composite multivariate multiscale fuzzy entropy (RCmvMFE) whose coarse-graining step uses variance (RCmvMFEσ2) or mean (RCmvMFEμ). We investigate the behavior of these multivariate methods on multichannel white Gaussian and 1/f noise signals, and two publicly available biomedical recordings. Our simulations demonstrate that RCmvMFEσ2 and RCmvMFEμ lead to more stable results and are less sensitive to the signals' length in comparison with the other existing multivariate multiscale entropy-based methods. The classification results also show that using both the variance and mean in the coarse-graining step offers complexity profiles with complementary information for biomedical signal analysis. We also made freely available all the Matlab codes used in this paper.
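    The mean- vs. variance-based coarse-graining contrasted in this record is simple to state. A univariate sketch (our reading of the two variants; the paper's method is multivariate, adds the "refined composite" averaging over scale offsets, and then applies fuzzy entropy to each coarse-grained series):

```python
def coarse_grain(x, scale, stat="mean"):
    """Non-overlapping coarse-graining used by multiscale entropy.
    stat='mean' is the classical Costa scheme; stat='var' keeps the
    within-window variance instead, so higher-frequency variability
    survives at coarse scales."""
    out = []
    for start in range(0, len(x) - scale + 1, scale):
        w = x[start:start + scale]
        mu = sum(w) / scale
        out.append(mu if stat == "mean" else sum((v - mu) ** 2 for v in w) / scale)
    return out
```

    For example, a window of [1, 3] contributes 2.0 under the mean scheme but 1.0 under the variance scheme; the two coarse-grained series therefore carry complementary information, as the abstract argues.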

  17. A study of entropy/clarity of genetic sequences using metric spaces and fuzzy sets.

    PubMed

    Georgiou, D N; Karakasidis, T E; Nieto, Juan J; Torres, A

    2010-11-07

    The study of genetic sequences is of great importance in biology and medicine. Sequence analysis and taxonomy are two major fields of application of bioinformatics. In the present paper we extend the notion of entropy and clarity to the use of different metrics and apply them in the case of the Fuzzy Polynucleotide Space (FPS). Applications of these notions to selected polynucleotides and complete genomes, both in the I(12×k) space and using their representation in FPS, are presented. Our results show that the values of fuzzy entropy/clarity are indicative of the degree of complexity necessary for the description of the polynucleotides in the FPS, although in the latter case the interpretation is slightly different than in the case of the I(12×k) hypercube. Fuzzy entropy/clarity along with the use of appropriate metrics can contribute to sequence analysis and taxonomy. Copyright © 2010 Elsevier Ltd. All rights reserved.
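    The fuzzy entropy underlying this line of work is usually the De Luca-Termini functional over membership degrees. An illustrative sketch (the paper's entropy/clarity definitions over the fuzzy polynucleotide space may differ in metric and normalization):

```python
import math

def fuzzy_entropy(mu):
    """De Luca-Termini entropy of a fuzzy set given membership degrees
    mu_i in [0, 1]: each element contributes a binary-entropy term,
    maximal at mu = 0.5 (maximum fuzziness) and zero at mu in {0, 1}
    (a crisp set)."""
    h = 0.0
    for m in mu:
        if 0.0 < m < 1.0:
            h -= m * math.log(m) + (1.0 - m) * math.log(1.0 - m)
    return h
```

    Crisp sequences thus have zero fuzzy entropy, and entropy grows as membership values approach 0.5, which is the sense in which the measure quantifies descriptive complexity.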

  18. A Survey of Shape Parameterization Techniques

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1999-01-01

    This paper provides a survey of shape parameterization techniques for multidisciplinary optimization and highlights some emerging ideas. The survey focuses on the suitability of available techniques for complex configurations, with suitability criteria based on the efficiency, effectiveness, ease of implementation, and availability of analytical sensitivities for geometry and grids. The paper also contains a section on field grid regeneration, grid deformation, and sensitivity analysis techniques.
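    One family of techniques covered by surveys of this kind is analytic shape functions such as the Hicks-Henne bump, which perturbs an airfoil surface with a smooth, locally supported basis function. A minimal sketch (parameter names are ours; the survey discusses many alternatives such as B-splines, CST and free-form deformation):

```python
import math

def hicks_henne(x, amplitude, peak_loc, width):
    """Hicks-Henne bump function on x in [0, 1]: a sine bump raised to
    the 'width' power whose maximum 'amplitude' occurs at x = peak_loc.
    The exponent m places the sine's peak at the requested location."""
    m = math.log(0.5) / math.log(peak_loc)
    return amplitude * math.sin(math.pi * x ** m) ** width

bump_at_peak = hicks_henne(0.3, 0.01, 0.3, 2.0)   # equals the amplitude
```

    Because each bump has a handful of intuitive parameters and analytic derivatives, the parameterization yields the cheap sensitivities that the survey's suitability criteria emphasize.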

  19. A Robust Parameterization of Human Gait Patterns Across Phase-Shifting Perturbations

    PubMed Central

    Villarreal, Dario J.; Poonawala, Hasan A.; Gregg, Robert D.

    2016-01-01

    The phase of human gait is difficult to quantify accurately in the presence of disturbances. In contrast, recent bipedal robots use time-independent controllers relying on a mechanical phase variable to synchronize joint patterns through the gait cycle. This concept has inspired studies to determine if human joint patterns can also be parameterized by a mechanical variable. Although many phase variable candidates have been proposed, it remains unclear which, if any, provide a robust representation of phase for human gait analysis or control. In this paper we analytically derive an ideal phase variable (the hip phase angle) that is provably monotonic and bounded throughout the gait cycle. To examine the robustness of this phase variable, ten able-bodied human subjects walked over a platform that randomly applied phase-shifting perturbations to the stance leg. A statistical analysis found the correlations between nominal and perturbed joint trajectories to be significantly greater when parameterized by the hip phase angle (0.95+) than by time or a different phase variable. The hip phase angle also best parameterized the transient errors about the nominal periodic orbit. Finally, interlimb phasing was best explained by local (ipsilateral) hip phase angles that are synchronized during the double-support period. PMID:27187967
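    The idea of a phase variable that advances monotonically through the cycle can be sketched from a signal's phase portrait. A generic illustration (the paper's hip phase angle is built from the hip angle and its integral with subject-specific scaling; the derivative-based construction and normalizations below are our simplifying assumptions):

```python
import math

def phase_angle(x):
    """Monotonic phase from the phase portrait of a roughly periodic
    signal: the angle of (signal, -derivative), each normalized to
    comparable amplitude, then unwrapped across 2*pi jumps."""
    n = len(x)
    mean = sum(x) / n
    xc = [v - mean for v in x]
    dx = [(xc[i + 1] - xc[i - 1]) / 2.0 for i in range(1, n - 1)]  # central diff
    ax = max(abs(v) for v in xc[1:-1])
    ad = max(abs(v) for v in dx)
    raw = [math.atan2(-dx[i] / ad, xc[i + 1] / ax) for i in range(n - 2)]
    phi, prev = [raw[0]], raw[0]
    for p in raw[1:]:                       # unwrap so phase keeps increasing
        d = p - prev
        while d > math.pi:
            d -= 2.0 * math.pi
        while d < -math.pi:
            d += 2.0 * math.pi
        phi.append(phi[-1] + d)
        prev = p
    return phi

gait_like = [math.sin(2.0 * math.pi * t / 50.0) for t in range(150)]
phi = phase_angle(gait_like)
```

    For a clean periodic signal the phase is strictly increasing, which is the monotonicity and boundedness property the paper proves analytically for the hip phase angle.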

  20. Generalized permutation entropy analysis based on the two-index entropic form S_{q,δ}

    NASA Astrophysics Data System (ADS)

    Xu, Mengjia; Shang, Pengjian

    2015-05-01

    Permutation entropy (PE) is a novel measure to quantify the complexity of nonlinear time series. In this paper, we propose a generalized permutation entropy (PE_{q,δ}) based on the recently postulated entropic form S_{q,δ}, which was proposed as a unification of the well-known S_q of nonextensive statistical mechanics and S_δ, a possibly appropriate candidate for the black-hole entropy. We find that PE_{q,δ} with appropriate parameters can amplify minor changes and trends in complexity in comparison to PE. Experiments with this generalized permutation entropy method are performed on both synthetic and stock data, demonstrating its power. Results show that PE_{q,δ} is an exponential function of q whose exponent k(δ) is a constant once δ is determined. Some discussion about k(δ) is provided. In addition, we find some interesting power-law behavior.
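    The classical permutation entropy that PE_{q,δ} generalizes is easy to state: count ordinal patterns and take the Shannon entropy of their distribution. A minimal Bandt-Pompe sketch (the paper replaces the Shannon functional below with the two-index form S_{q,δ}):

```python
import math
import random

def permutation_entropy(x, m=3):
    """Permutation entropy of order m, normalized to [0, 1]: the Shannon
    entropy of the distribution of ordinal (rank) patterns of length m,
    divided by its maximum log(m!)."""
    counts = {}
    for i in range(len(x) - m + 1):
        pattern = tuple(sorted(range(m), key=lambda k: x[i + k]))  # argsort
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(m))

random.seed(0)
pe_trend = permutation_entropy(list(range(100)))                  # one pattern
pe_noise = permutation_entropy([random.random() for _ in range(500)])
```

    A monotone series uses a single ordinal pattern and scores zero, while white noise populates all m! patterns nearly uniformly and scores close to one.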

  1. MoNbTaV Medium-Entropy Alloy

    DOE PAGES

    Yao, Hongwei; Qiao, Jun -Wei; Gao, Michael; ...

    2016-05-19

    Guided by CALPHAD (Calculation of Phase Diagrams) modeling, the refractory medium-entropy alloy MoNbTaV was synthesized by vacuum arc melting under a high-purity argon atmosphere. A body-centered cubic solid solution phase was experimentally confirmed in the as-cast ingot using X-ray diffraction and scanning electron microscopy. The measured lattice parameter of the alloy (3.208 Å) obeys the rule of mixtures (ROM), but the Vickers microhardness (4.95 GPa) and the yield strength (1.5 GPa) are about 4.5 and 4.6 times those estimated from the ROM, respectively. A simple solid-solution strengthening model predicts a yield strength of approximately 1.5 GPa. In conclusion, thermodynamic analysis shows that the total entropy of the alloy is more than three times the configurational entropy at room temperature, and the entropy of mixing exhibits a small negative departure from ideal mixing.

  2. Domain-averaged snow depth over complex terrain from flat field measurements

    NASA Astrophysics Data System (ADS)

    Helbig, Nora; van Herwijnen, Alec

    2017-04-01

    Snow depth is an important parameter for a variety of coarse-scale models and applications, such as hydrological forecasting. Since high-resolution snow cover models are computationally expensive, simplified snow models are often used. Ground-measured snow depth at individual stations provides an opportunity for snow depth data assimilation to improve coarse-scale model forecasts. Snow depth is, however, commonly recorded at so-called flat fields, often in large measurement networks. While these ground measurement networks provide a wealth of information, various studies have questioned the representativity of such flat field snow depth measurements for the surrounding topography. We developed two parameterizations to compute domain-averaged snow depth for coarse model grid cells over complex topography using easy-to-derive topographic parameters. To derive the two parameterizations we performed a scale-dependent analysis for domain sizes ranging from 50 m to 3 km, using highly resolved snow depth maps at the peak of winter from two distinct climatic regions in Switzerland and the Spanish Pyrenees. The first, simpler parameterization uses a commonly applied linear lapse rate. For the second parameterization, we first removed the obvious elevation gradient in mean snow depth, which revealed an additional correlation with the subgrid sky view factor. We evaluated the domain-averaged snow depth derived with both parameterizations from nearby flat field measurements against the domain-averaged highly resolved snow depth. This revealed an overall improved performance for the parameterization combining a power-law elevation trend scaled with the subgrid parameterized sky view factor. We therefore suggest that this parameterization could be used to assimilate flat field snow depth into coarse-scale snow model frameworks in order to improve coarse-scale snow depth estimates over complex topography.

  3. Noise and Complexity in Human Postural Control: Interpreting the Different Estimations of Entropy

    PubMed Central

    Rhea, Christopher K.; Silver, Tobin A.; Hong, S. Lee; Ryu, Joong Hyun; Studenka, Breanna E.; Hughes, Charmayne M. L.; Haddad, Jeffrey M.

    2011-01-01

    Background Over the last two decades, various measures of entropy have been used to examine the complexity of human postural control. In general, entropy measures provide information regarding the health, stability and adaptability of the postural system that is not captured when using more traditional analytical techniques. The purpose of this study was to examine how noise, sampling frequency and time series length influence various measures of entropy when applied to human center of pressure (CoP) data, as well as in synthetic signals with known properties. Such a comparison is necessary to interpret data between and within studies that use different entropy measures, equipment, sampling frequencies or data collection durations. Methods and Findings The complexity of synthetic signals with known properties and standing CoP data was calculated using Approximate Entropy (ApEn), Sample Entropy (SampEn) and Recurrence Quantification Analysis Entropy (RQAEn). All signals were examined at varying sampling frequencies and with varying amounts of added noise. Additionally, an increment time series of the original CoP data was examined to remove long-range correlations. Of the three measures examined, ApEn was the least robust to sampling frequency and noise manipulations. Additionally, increased noise led to an increase in SampEn, but a decrease in RQAEn. Thus, noise can yield inconsistent results between the various entropy measures. Finally, the differences between the entropy measures were minimized in the increment CoP data, suggesting that long-range correlations should be removed from CoP data prior to calculating entropy. Conclusions The various algorithms typically used to quantify the complexity (entropy) of CoP may yield very different results, particularly when sampling frequency and noise are different. The results of this study are discussed within the context of the neural noise and loss of complexity hypotheses. PMID:21437281

  4. An entropy-based analysis of lane changing behavior: An interactive approach.

    PubMed

    Kosun, Caglar; Ozdemir, Serhan

    2017-05-19

    As a novelty, this article proposes the nonadditive entropy framework for the description of driver behaviors during lane changing. The authors also state that this entropy framework governs the lane changing behavior in traffic flow in accordance with the long-range vehicular interactions and traffic safety. The nonadditive entropy framework is the new generalized theory of thermostatistical mechanics. Vehicular interactions during lane changing are considered within this framework. The interactive approach for the lane changing behavior of the drivers is presented in the traffic flow scenarios given in the article. According to the traffic flow scenarios, 4 categories of traffic flow and driver behaviors are obtained. Through the scenarios, comparative analyses of nonadditive and additive entropy domains are also provided. Two quadrants of the categories belong to the nonadditive entropy domain; the rest fall into the additive entropy domain. Driving behaviors are extracted and the scenarios depict that nonadditivity matches safe driving well, whereas additivity corresponds to unsafe driving. Furthermore, the cooperative traffic system is considered in nonadditivity where the long-range interactions are present. However, the uncooperative traffic system falls into the additivity domain. The analyses also state that there would be possible traffic flow transitions among the quadrants. This article shows that lane changing behavior could be generalized as nonadditive, with additivity as a special case, based on the given traffic conditions. The nearest and close neighbor models are well within the conventional additive entropy framework. In this article, both the long-range vehicular interactions and safe driving behavior in traffic are handled in the nonadditive entropy domain.
It is also inferred that the Tsallis entropy region would correspond to mandatory lane changing behavior, whereas additive and either the extensive or nonextensive entropy region would match discretionary lane changing behavior. This article states that driver behaviors would be in the nonadditive entropy domain to provide a safe traffic stream and hence with vehicle accident prevention in mind.

  5. The impact of structural uncertainty on cost-effectiveness models for adjuvant endocrine breast cancer treatments: the need for disease-specific model standardization and improved guidance.

    PubMed

    Frederix, Gerardus W J; van Hasselt, Johan G C; Schellens, Jan H M; Hövels, Anke M; Raaijmakers, Jan A M; Huitema, Alwin D R; Severens, Johan L

    2014-01-01

    Structural uncertainty relates to differences in model structure and parameterization. For many published health economic analyses in oncology, substantial differences in model structure exist, leading to differences in analysis outcomes and potentially impacting decision-making processes. The objectives of this analysis were (1) to identify differences in model structure and parameterization for cost-effectiveness analyses (CEAs) comparing tamoxifen and anastrozole for adjuvant breast cancer (ABC) treatment; and (2) to quantify the impact of these differences on analysis outcome metrics. The analysis consisted of four steps: (1) review of the literature for identification of eligible CEAs; (2) definition and implementation of a base model structure, which included the core structural components for all identified CEAs; (3) definition and implementation of changes or additions in the base model structure or parameterization; and (4) quantification of the impact of changes in model structure or parameterizations on the analysis outcome metrics life-years gained (LYG), incremental costs (IC) and the incremental cost-effectiveness ratio (ICER). Eleven CEA analyses comparing anastrozole and tamoxifen as ABC treatment were identified. The base model consisted of the following health states: (1) on treatment; (2) off treatment; (3) local recurrence; (4) metastatic disease; (5) death due to breast cancer; and (6) death due to other causes. The base model estimates of anastrozole versus tamoxifen for the LYG, IC and ICER were 0.263 years, €3,647 and €13,868/LYG, respectively. In the published models that were evaluated, differences in model structure included the addition of different recurrence health states, and associated transition rates were identified. Differences in parameterization were related to the incidences of recurrence, local recurrence to metastatic disease, and metastatic disease to death.
The separate impact of these model components on the LYG ranged from 0.207 to 0.356 years, while incremental costs ranged from €3,490 to €3,714 and ICERs ranged from €9,804/LYG to €17,966/LYG. When we re-analyzed the published CEAs in our framework by including their respective model properties, the LYG ranged from 0.207 to 0.383 years, IC ranged from €3,556 to €3,731 and ICERs ranged from €9,683/LYG to €17,570/LYG. Differences in model structure and parameterization lead to substantial differences in analysis outcome metrics. This analysis supports the need for more guidance regarding structural uncertainty and the use of standardized disease-specific models for health economic analyses of adjuvant endocrine breast cancer therapies. The developed approach in the current analysis could potentially serve as a template for further evaluations of structural uncertainty and development of disease-specific models.
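    The base model described here is a Markov cohort model, and its life-years-gained calculation can be sketched compactly. A toy version with merged death states and purely illustrative transition probabilities (none of the numbers below come from the reviewed CEAs):

```python
def life_years(trans, cycles=20):
    """Markov cohort simulation over simplified health states:
    0 on-treatment, 1 off-treatment, 2 local recurrence, 3 metastatic,
    4 dead (both death states merged).  Returns undiscounted life-years
    accumulated over the given number of yearly cycles."""
    dist = [1.0, 0.0, 0.0, 0.0, 0.0]
    ly = 0.0
    for _ in range(cycles):
        ly += sum(dist[:4])                           # alive states contribute
        dist = [sum(dist[i] * trans[i][j] for i in range(5)) for j in range(5)]
    return ly

tamoxifen = [
    [0.75, 0.15, 0.06, 0.02, 0.02],
    [0.00, 0.90, 0.05, 0.02, 0.03],
    [0.00, 0.00, 0.70, 0.20, 0.10],
    [0.00, 0.00, 0.00, 0.60, 0.40],
    [0.00, 0.00, 0.00, 0.00, 1.00],
]
# anastrozole arm: identical except for lower (hypothetical) recurrence rates
anastrozole = [
    [0.78, 0.15, 0.03, 0.02, 0.02],
    [0.00, 0.92, 0.03, 0.02, 0.03],
    [0.00, 0.00, 0.70, 0.20, 0.10],
    [0.00, 0.00, 0.00, 0.60, 0.40],
    [0.00, 0.00, 0.00, 0.00, 1.00],
]
ly_tam, ly_ana = life_years(tamoxifen), life_years(anastrozole)
# ICER would then be (cost_ana - cost_tam) / (ly_ana - ly_tam)
```

    The structural-uncertainty point of the paper is visible even in this toy: adding or removing a recurrence state, or perturbing the recurrence rows, changes LYG and hence the ICER.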

  6. Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series

    PubMed Central

    Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael

    2014-01-01

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. Especially the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of processes to allow pooling of observations over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the computationally most heavy aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data.
While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and artificial systems. PMID:25068489
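    The quantity being estimated can be illustrated with a discrete plug-in estimator. A toy sketch with history length 1, i.e. TE(X→Y) = I(Y_{t+1}; X_t | Y_t) (the paper itself uses nearest-neighbor estimators on continuous, trial-based ensembles, which this does not reproduce):

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(X -> Y) in bits for discrete series,
    with one step of history: average log-ratio of p(y_{t+1}|y_t, x_t)
    to p(y_{t+1}|y_t), estimated from joint counts."""
    n = len(y) - 1
    c_xyz = Counter(zip(y[1:], y[:-1], x[:-1]))
    c_yx = Counter(zip(y[:-1], x[:-1]))
    c_yy = Counter(zip(y[1:], y[:-1]))
    c_y = Counter(y[:-1])
    te = 0.0
    for (y1, yt, xt), c in c_xyz.items():
        te += (c / n) * math.log2(c * c_y[yt] / (c_yx[yt, xt] * c_yy[y1, yt]))
    return te

random.seed(42)
x = [random.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]                      # y copies x with a one-step lag
te_coupled = transfer_entropy(x, y)   # ~1 bit: x fully determines y's future
te_reverse = transfer_entropy(y, x)   # ~0: no information flows y -> x
```

    The asymmetry between the two directions is what makes transfer entropy a directed measure, unlike correlation or mutual information.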

  7. Perceptual suppression revealed by adaptive multi-scale entropy analysis of local field potential in monkey visual cortex.

    PubMed

    Hu, Meng; Liang, Hualou

    2013-04-01

    Generalized flash suppression (GFS), in which a salient visual stimulus can be rendered invisible despite continuous retinal input, provides a rare opportunity to directly study the neural mechanism of visual perception. Previous work based on linear methods, such as spectral analysis, on local field potential (LFP) during GFS has shown that the LFP power at distinctive frequency bands are differentially modulated by perceptual suppression. Yet, the linear method alone may be insufficient for the full assessment of neural dynamic due to the fundamentally nonlinear nature of neural signals. In this study, we set forth to analyze the LFP data collected from multiple visual areas in V1, V2 and V4 of macaque monkeys while performing the GFS task using a nonlinear method - adaptive multi-scale entropy (AME) - to reveal the neural dynamic of perceptual suppression. In addition, we propose a new cross-entropy measure at multiple scales, namely adaptive multi-scale cross-entropy (AMCE), to assess the nonlinear functional connectivity between two cortical areas. We show that: (1) multi-scale entropy exhibits percept-related changes in all three areas, with higher entropy observed during perceptual suppression; (2) the magnitude of the perception-related entropy changes increases systematically over successive hierarchical stages (i.e. from lower areas V1 to V2, up to higher area V4); and (3) cross-entropy between any two cortical areas reveals higher degree of asynchrony or dissimilarity during perceptual suppression, indicating a decreased functional connectivity between cortical areas. These results, taken together, suggest that perceptual suppression is related to a reduced functional connectivity and increased uncertainty of neural responses, and the modulation of perceptual suppression is more effective at higher visual cortical areas. AME is demonstrated to be a useful technique in revealing the underlying dynamic of nonlinear/nonstationary neural signal.

  8. Impacts of spectral nudging on the simulated surface air temperature in summer compared with the selection of shortwave radiation and land surface model physics parameterization in a high-resolution regional atmospheric model

    NASA Astrophysics Data System (ADS)

    Park, Jun; Hwang, Seung-On

    2017-11-01

    The impact of a spectral nudging technique for the dynamical downscaling of the summer surface air temperature in a high-resolution regional atmospheric model is assessed. The performance of this technique is measured by comparing 16 analysis-driven simulation sets of physical parameterization combinations of two shortwave radiation and four land surface model schemes of the model, which are known to be crucial for the simulation of the surface air temperature. It is found that the application of spectral nudging to the outermost domain has a greater impact on the regional climate than any combination of shortwave radiation and land surface model physics schemes. The optimal choice of the two model physics parameterizations is helpful for obtaining more realistic spatiotemporal distributions of land surface variables such as the surface air temperature, precipitation, and surface fluxes. However, employing spectral nudging adds more value to the results; the improvement is greater than that from using sophisticated shortwave radiation and land surface model physical parameterizations. This result indicates that spectral nudging applied to the outermost domain provides a more accurate lateral boundary condition to the innermost domain when forced by analysis data, by securing consistency with the large-scale forcing over the regional domain. This in turn helps the two physical parameterizations produce small-scale features closer to the observed values, leading to a better representation of the surface air temperature in a high-resolution downscaled climate.
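    The core of spectral nudging is to relax only the large-scale (low-wavenumber) components of a model field toward the driving analysis while leaving small scales free. A 1-D toy sketch (operational spectral nudging acts on 2-D or 3-D prognostic fields inside the regional model; the cutoff and relaxation handling here are simplified assumptions):

```python
import cmath
import math

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(c):
    n = len(c)
    return [sum(c[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

def spectral_nudge(model, analysis, k_cut, alpha):
    """Blend the Fourier coefficients of the model field toward the
    analysis for wavenumbers <= k_cut (relaxation strength alpha);
    higher wavenumbers keep the model's own small-scale structure."""
    cm, ca = dft(model), dft(analysis)
    n = len(model)
    nudged = [cm[k] + alpha * (ca[k] - cm[k])
              if min(k, n - k) <= k_cut else cm[k] for k in range(n)]
    return idft(nudged)

model = [math.sin(2.0 * math.pi * 3 * t / 8) for t in range(8)]  # small scale
analysis = [1.0] * 8                                             # large scale
nudged = spectral_nudge(model, analysis, k_cut=1, alpha=1.0)
```

    In this example the model's high-wavenumber wave survives untouched while the large-scale mean is pulled to the analysis value, which is exactly the consistency-with-large-scale-forcing property the abstract credits for the improved downscaling.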

  9. Optimization of pressure gauge locations for water distribution systems using entropy theory.

    PubMed

    Yoo, Do Guen; Chang, Dong Eil; Jun, Hwandon; Kim, Joong Hoon

    2012-12-01

    It is essential to select the optimal pressure gauge location for effective management and maintenance of water distribution systems. This study proposes an objective, quantified standard for selecting the optimal pressure gauge location by defining, using entropy theory, the pressure change at other nodes that results from a demand change at a specific node. Two cases of demand change are considered: one in which demand at all nodes shows peak load via a peak factor, and one comprising normally distributed demand changes whose average is the base demand. The actual pressure change pattern is determined using the emitter function of EPANET to reflect the pressure change that occurs in practice at each node. The optimal pressure gauge location is determined by prioritizing the node that exchanges the largest amount of information with the whole system, both given (giving entropy) and received (receiving entropy), according to the entropy standard. The suggested model is applied to one virtual and one real pipe network, and the optimal pressure gauge location combination is calculated through a sensitivity analysis based on the study results. These results support two conclusions. First, the installation priority of pressure gauges in water distribution networks can be determined by a more objective standard based on entropy theory. Second, the model can serve as an efficient decision-making guide for gauge installation in water distribution systems.
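    A toy sketch of the giving/receiving-entropy ranking described above, assuming a hypothetical response matrix `delta` (the EPANET emitter-based pressure computation is replaced by made-up numbers):

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in nats) of a discrete probability distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def gauge_priority(delta):
    """Rank candidate gauge nodes by total entropy.

    delta[i][j] is a hypothetical magnitude of the pressure change at
    node j caused by a demand change at node i.  The 'giving' entropy
    of node i is the entropy of its normalized row; the 'receiving'
    entropy is that of its normalized column.  Nodes are ranked by the
    sum (a simplified reading of the abstract's entropy standard).
    """
    n = len(delta)
    scores = []
    for i in range(n):
        row = delta[i]
        col = [delta[k][i] for k in range(n)]
        give = shannon_entropy([v / sum(row) for v in row]) if sum(row) else 0.0
        recv = shannon_entropy([v / sum(col) for v in col]) if sum(col) else 0.0
        scores.append(give + recv)
    return sorted(range(n), key=lambda i: -scores[i])
```

A node whose demand changes spread information uniformly across the network (and which responds to many other nodes) receives the highest priority.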

  10. Music viewed by its entropy content: A novel window for comparative analysis.

    PubMed

    Febres, Gerardo; Jaffe, Klaus

    2017-01-01

    Polyphonic music files were analyzed using the set of symbols that produced the Minimal Entropy Description, which we call the Fundamental Scale. This allowed us to create a novel space to represent music pieces by developing: (a) a method to adjust a textual description from its original scale of observation to an arbitrarily selected scale, (b) a method to model the structure of any textual description based on the shape of the symbol frequency profiles, and (c) the concept of higher order entropy as the entropy associated with the deviations of a frequency-ranked symbol profile from a perfect Zipfian profile. We call this diversity index the '2nd Order Entropy'. Applying these methods to a variety of musical pieces showed how the space of 'symbolic specific diversity-entropy' and that of '2nd order entropy' captures characteristics that are unique to each music type, style, composer and genre. Some clustering of these properties around each musical category is shown. These methods allow us to visualize a historic trajectory of academic music across this space, from medieval to contemporary academic music. We show that the description of musical structures using entropy, symbol frequency profiles and specific symbolic diversity allows us to characterize traditional and popular expressions of music. These classification techniques promise to be useful in other disciplines for pattern recognition and machine learning.
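    The core quantities above, a frequency-ranked symbol profile, its Shannon entropy, and its deviation from a perfect Zipfian profile, can be sketched as follows (fixed-length text chunks stand in for the paper's fundamental-scale symbols, a simplifying assumption):

```python
import math
from collections import Counter

def rank_profile(text, symbol_len=1):
    """Frequency-ranked symbol profile: relative frequencies of
    fixed-length chunks, sorted in descending order."""
    chunks = [text[i:i + symbol_len]
              for i in range(0, len(text) - symbol_len + 1, symbol_len)]
    counts = sorted(Counter(chunks).values(), reverse=True)
    total = sum(counts)
    return [c / total for c in counts]

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(pi * math.log(pi, 2) for pi in p if pi > 0)

def zipf_deviation(profile):
    """Mean absolute deviation of a frequency-ranked profile from the
    normalized ideal Zipf profile 1/r of the same length - a crude
    proxy for the paper's notion of higher-order structure."""
    n = len(profile)
    z = [1 / r for r in range(1, n + 1)]
    s = sum(z)
    z = [v / s for v in z]
    return sum(abs(a - b) for a, b in zip(profile, z)) / n
```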

  11. Safety Assessment of Dangerous Goods Transport Enterprise Based on the Relative Entropy Aggregation in Group Decision Making Model

    PubMed Central

    Wu, Jun; Li, Chengbing; Huo, Yueying

    2014-01-01

    Safety of dangerous goods transport is directly related to the operational safety of dangerous goods transport enterprises. Addressing the high accident rate and large potential harm in dangerous goods logistics, this paper casts the group decision making problem, based on the idea of integration and coordination, as a multiagent multiobjective group decision making problem; a two-stage decision model is established and applied to the safety assessment of dangerous goods transport enterprises. First, a dynamic multivalue background and entropy theory are used to build the first-level multiobjective decision model. Second, the experts are weighted according to the principle of clustering analysis, and this weighting is combined with relative entropy theory to establish a second-stage aggregation optimization model for group decision making, whose solution is then discussed. After investigation and analysis, a safety evaluation index system for dangerous goods transport enterprises is established. Finally, a case analysis of five dangerous goods transport enterprises in the Inner Mongolia Autonomous Region validates the feasibility and effectiveness of this model for assessing dangerous goods transport enterprises, providing a vital decision making basis for their evaluation. PMID:25477954

  12. Safety assessment of dangerous goods transport enterprise based on the relative entropy aggregation in group decision making model.

    PubMed

    Wu, Jun; Li, Chengbing; Huo, Yueying

    2014-01-01

    Safety of dangerous goods transport is directly related to the operational safety of dangerous goods transport enterprises. Addressing the high accident rate and large potential harm in dangerous goods logistics, this paper casts the group decision making problem, based on the idea of integration and coordination, as a multiagent multiobjective group decision making problem; a two-stage decision model is established and applied to the safety assessment of dangerous goods transport enterprises. First, a dynamic multivalue background and entropy theory are used to build the first-level multiobjective decision model. Second, the experts are weighted according to the principle of clustering analysis, and this weighting is combined with relative entropy theory to establish a second-stage aggregation optimization model for group decision making, whose solution is then discussed. After investigation and analysis, a safety evaluation index system for dangerous goods transport enterprises is established. Finally, a case analysis of five dangerous goods transport enterprises in the Inner Mongolia Autonomous Region validates the feasibility and effectiveness of this model for assessing dangerous goods transport enterprises, providing a vital decision making basis for their evaluation.
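    One standard building block behind relative-entropy aggregation can be illustrated as follows: the distribution minimizing the total relative entropy to a set of expert distributions is their normalized geometric mean. This is a textbook result, not the paper's full secondary model:

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) in nats (q strictly positive)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def consensus(dists):
    """Normalized geometric mean of expert distributions: the q that
    minimizes sum_k D(q || p_k).  Entries must be strictly positive."""
    n = len(dists[0])
    g = [math.exp(sum(math.log(d[i]) for d in dists) / len(dists))
         for i in range(n)]
    s = sum(g)
    return [v / s for v in g]
```

Expert weights (as obtained from the clustering step in the abstract) would enter as weighted exponents in the geometric mean; the unweighted case above is the simplest instance.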

  13. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    USGS Publications Warehouse

    Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
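    The variational argument behind the exponential velocity distribution can be sketched in generic notation (the symbols here are illustrative, not the authors'): maximize the information entropy subject to normalization and a fixed mean,

```latex
\max_{f}\ -\int_0^\infty f(u)\,\ln f(u)\,du
\quad\text{subject to}\quad
\int_0^\infty f(u)\,du = 1,
\qquad
\int_0^\infty u\,f(u)\,du = \bar{u}.
```

    Stationarity of the Lagrangian gives $f(u) = e^{-1-\lambda_0-\lambda_1 u}$, and the two constraints fix the multipliers, leaving the exponential $f(u) = \bar{u}^{-1}e^{-u/\bar{u}}$. Constraining instead the mean magnitude of a sign-symmetric variable (such as acceleration) yields the Laplace form by the same argument.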

  14. Entropy Generation Minimization in Dimethyl Ether Synthesis: A Case Study

    NASA Astrophysics Data System (ADS)

    Kingston, Diego; Razzitte, Adrián César

    2018-04-01

    Entropy generation minimization is a method that helps improve the efficiency of real processes and devices. In this article, we study the entropy production (due to chemical reactions, heat exchange and friction) in a conventional reactor that synthesizes dimethyl ether and minimize it by modifying different operating variables of the reactor, such as composition, temperature and pressure, while aiming at a fixed production of dimethyl ether. Our results indicate that it is possible to reduce the entropy production rate by nearly 70 % and that, by changing only the inlet composition, it is possible to cut it by nearly 40 %, though this comes at the expense of greater dissipation due to heat transfer. We also study the alternative of coupling the reactor with another, where dehydrogenation of methylcyclohexane takes place. In that case, entropy generation can be reduced by 54 %, when pressure, temperature and inlet molar flows are varied. These examples show that entropy generation analysis can be a valuable tool in engineering design and applications aiming at process intensification and efficient operation of plant equipment.

  15. Entropy information of heart rate variability and its power spectrum during day and night

    NASA Astrophysics Data System (ADS)

    Jin, Li; Jun, Wang

    2013-07-01

    Physiologic systems generate complex fluctuations in their output signals that reflect the underlying dynamics. We employed the base-scale entropy method and power spectral analysis to study 24-hour heart rate variability (HRV) signals. The results show that profound circadian-, age- and pathology-dependent changes are accompanied by changes in base-scale entropy and in the power spectral distribution. Moreover, the base-scale entropy changes reflect the corresponding changes in autonomic nerve outflow. With the suppression of vagal tone and the dominance of sympathetic tone in congestive heart failure (CHF) subjects, there is more variability in the data fluctuation mode, so the higher base-scale entropy belongs to the CHF subjects. With the decrease of sympathetic tone and respiratory sinus arrhythmia (RSA) becoming more pronounced with slower breathing during sleep, the base-scale entropy drops in CHF subjects. The HRV series of the two healthy groups show the same diurnal/nocturnal trend as the CHF series. The fluctuation dynamics trend of the data in all three groups can be described as an “HF effect”.

  16. The importance of parameterization when simulating the hydrologic response of vegetative land-cover change

    NASA Astrophysics Data System (ADS)

    White, Jeremy; Stengel, Victoria; Rendon, Samuel; Banta, John

    2017-08-01

    Computer models of hydrologic systems are frequently used to investigate the hydrologic response of land-cover change. If the modeling results are used to inform resource-management decisions, then providing robust estimates of uncertainty in the simulated response is an important consideration. Here we examine the importance of parameterization, a necessarily subjective process, on uncertainty estimates of the simulated hydrologic response of land-cover change. Specifically, we applied the soil water assessment tool (SWAT) model to a 1.4 km2 watershed in southern Texas to investigate the simulated hydrologic response of brush management (the mechanical removal of woody plants), a discrete land-cover change. The watershed was instrumented before and after brush-management activities were undertaken, and estimates of precipitation, streamflow, and evapotranspiration (ET) are available; these data were used to condition and verify the model. The role of parameterization in brush-management simulation was evaluated by constructing two models, one with 12 adjustable parameters (reduced parameterization) and one with 1305 adjustable parameters (full parameterization). Both models were subjected to global sensitivity analysis as well as Monte Carlo and generalized likelihood uncertainty estimation (GLUE) conditioning to identify important model inputs and to estimate uncertainty in several quantities of interest related to brush management. Many realizations from both parameterizations were identified as behavioral in that they reproduce daily mean streamflow acceptably well according to Nash-Sutcliffe model efficiency coefficient, percent bias, and coefficient of determination. 
However, the total volumetric ET difference resulting from simulated brush management remains highly uncertain after conditioning to daily mean streamflow, indicating that streamflow data alone are not sufficient to inform the model inputs that influence the simulated outcomes of brush management the most. Additionally, the reduced-parameterization model grossly underestimates uncertainty in the total volumetric ET difference compared to the full-parameterization model; total volumetric ET difference is a primary metric for evaluating the outcomes of brush management. The failure of the reduced-parameterization model to provide robust uncertainty estimates demonstrates the importance of parameterization when attempting to quantify uncertainty in land-cover change simulations.

  17. The importance of parameterization when simulating the hydrologic response of vegetative land-cover change

    USGS Publications Warehouse

    White, Jeremy; Stengel, Victoria G.; Rendon, Samuel H.; Banta, John

    2017-01-01

    Computer models of hydrologic systems are frequently used to investigate the hydrologic response of land-cover change. If the modeling results are used to inform resource-management decisions, then providing robust estimates of uncertainty in the simulated response is an important consideration. Here we examine the importance of parameterization, a necessarily subjective process, on uncertainty estimates of the simulated hydrologic response of land-cover change. Specifically, we applied the soil water assessment tool (SWAT) model to a 1.4 km2 watershed in southern Texas to investigate the simulated hydrologic response of brush management (the mechanical removal of woody plants), a discrete land-cover change. The watershed was instrumented before and after brush-management activities were undertaken, and estimates of precipitation, streamflow, and evapotranspiration (ET) are available; these data were used to condition and verify the model. The role of parameterization in brush-management simulation was evaluated by constructing two models, one with 12 adjustable parameters (reduced parameterization) and one with 1305 adjustable parameters (full parameterization). Both models were subjected to global sensitivity analysis as well as Monte Carlo and generalized likelihood uncertainty estimation (GLUE) conditioning to identify important model inputs and to estimate uncertainty in several quantities of interest related to brush management. Many realizations from both parameterizations were identified as behavioral in that they reproduce daily mean streamflow acceptably well according to Nash–Sutcliffe model efficiency coefficient, percent bias, and coefficient of determination. 
However, the total volumetric ET difference resulting from simulated brush management remains highly uncertain after conditioning to daily mean streamflow, indicating that streamflow data alone are not sufficient to inform the model inputs that influence the simulated outcomes of brush management the most. Additionally, the reduced-parameterization model grossly underestimates uncertainty in the total volumetric ET difference compared to the full-parameterization model; total volumetric ET difference is a primary metric for evaluating the outcomes of brush management. The failure of the reduced-parameterization model to provide robust uncertainty estimates demonstrates the importance of parameterization when attempting to quantify uncertainty in land-cover change simulations.

  18. Characterizing Brain Structures and Remodeling after TBI Based on Information Content, Diffusion Entropy

    PubMed Central

    Fozouni, Niloufar; Chopp, Michael; Nejad-Davarani, Siamak P.; Zhang, Zheng Gang; Lehman, Norman L.; Gu, Steven; Ueno, Yuji; Lu, Mei; Ding, Guangliang; Li, Lian; Hu, Jiani; Bagher-Ebadian, Hassan; Hearshen, David; Jiang, Quan

    2013-01-01

    Background To overcome the limitations of conventional diffusion tensor magnetic resonance imaging resulting from the assumption of a Gaussian diffusion model for characterizing voxels containing multiple axonal orientations, Shannon's entropy was employed to evaluate white matter structure in the human brain and brain remodeling after traumatic brain injury (TBI) in a rat. Methods Thirteen healthy subjects were investigated using a Q-ball based DTI data sampling scheme. FA and entropy values were measured in white matter bundles, white matter fiber crossing areas, different gray matter (GM) regions and cerebrospinal fluid (CSF). Axonal densities from the same regions of interest (ROIs) were evaluated in Bielschowsky and Luxol fast blue stained autopsy (n = 30) brain sections by light microscopy. As a case demonstration, a Wistar rat subjected to TBI and treated with bone marrow stromal cells (MSC) 1 week after TBI was employed to illustrate the superior ability of entropy over FA in detecting reorganized crossing axonal bundles, as confirmed by histological analysis with Bielschowsky and Luxol fast blue staining. Results Unlike FA, entropy was less affected by axonal orientation and more affected by axonal density. A significant agreement (r = 0.91) was detected between entropy values from the in vivo human brain and axonal density measured post mortem in the same brain structures. The MSC-treated TBI rat demonstrated that the entropy approach is superior to FA in detecting axonal remodeling after injury. Compared with FA, entropy detected new axonal remodeling regions with crossing axons, confirmed with immunohistological staining. Conclusions Entropy measurement is more effective than FA in distinguishing axonal remodeling after injury. Entropy is also more sensitive to axonal density than to axonal orientation, and thus may provide a more accurate reflection of axonal changes that occur in neurological injury and disease.
PMID:24143186

  19. Characterizing brain structures and remodeling after TBI based on information content, diffusion entropy.

    PubMed

    Fozouni, Niloufar; Chopp, Michael; Nejad-Davarani, Siamak P; Zhang, Zheng Gang; Lehman, Norman L; Gu, Steven; Ueno, Yuji; Lu, Mei; Ding, Guangliang; Li, Lian; Hu, Jiani; Bagher-Ebadian, Hassan; Hearshen, David; Jiang, Quan

    2013-01-01

    To overcome the limitations of conventional diffusion tensor magnetic resonance imaging resulting from the assumption of a Gaussian diffusion model for characterizing voxels containing multiple axonal orientations, Shannon's entropy was employed to evaluate white matter structure in the human brain and brain remodeling after traumatic brain injury (TBI) in a rat. Thirteen healthy subjects were investigated using a Q-ball based DTI data sampling scheme. FA and entropy values were measured in white matter bundles, white matter fiber crossing areas, different gray matter (GM) regions and cerebrospinal fluid (CSF). Axonal densities from the same regions of interest (ROIs) were evaluated in Bielschowsky and Luxol fast blue stained autopsy (n = 30) brain sections by light microscopy. As a case demonstration, a Wistar rat subjected to TBI and treated with bone marrow stromal cells (MSC) 1 week after TBI was employed to illustrate the superior ability of entropy over FA in detecting reorganized crossing axonal bundles, as confirmed by histological analysis with Bielschowsky and Luxol fast blue staining. Unlike FA, entropy was less affected by axonal orientation and more affected by axonal density. A significant agreement (r = 0.91) was detected between entropy values from the in vivo human brain and axonal density measured post mortem in the same brain structures. The MSC-treated TBI rat demonstrated that the entropy approach is superior to FA in detecting axonal remodeling after injury. Compared with FA, entropy detected new axonal remodeling regions with crossing axons, confirmed with immunohistological staining. Entropy measurement is more effective than FA in distinguishing axonal remodeling after injury. Entropy is also more sensitive to axonal density than to axonal orientation, and thus may provide a more accurate reflection of axonal changes that occur in neurological injury and disease.

  20. Spectral Entropy Can Predict Changes of Working Memory Performance Reduced by Short-Time Training in the Delayed-Match-to-Sample Task

    PubMed Central

    Tian, Yin; Zhang, Huiling; Xu, Wei; Zhang, Haiyong; Yang, Li; Zheng, Shuxing; Shi, Yupan

    2017-01-01

    Spectral entropy, generated by applying the Shannon entropy concept to the power distribution of the Fourier-transformed electroencephalogram (EEG), was utilized to measure the uniformity of the power spectral density underlying the EEG when subjects performed working memory tasks twice, i.e., before and after training. According to Signed Residual Time (SRT) scores based on a response speed and accuracy trade-off, 20 subjects were divided into two groups, namely high-performance and low-performance groups, to undertake working memory (WM) tasks. We found that spectral entropy derived from the retention period of WM on channel FC4 exhibited a high correlation with SRT scores. To this end, spectral entropy was used in a support vector machine classifier with a linear kernel to differentiate the two groups. Receiver operating characteristic analysis and leave-one-out cross-validation (LOOCV) demonstrated that the averaged classification accuracy (CA) was 90.0 and 92.5% for intra-session and inter-session, respectively, indicating that spectral entropy could be used to distinguish these two WM performance groups successfully. Furthermore, a support vector regression prediction model with a radial basis function kernel and the root-mean-square error of prediction revealed that spectral entropy could be utilized to predict SRT scores on individual WM performance. After testing the changes in SRT scores and spectral entropy for each subject following short-time training, we found that 16 of 20 subjects’ SRT scores clearly improved after training, and 15 of 20 subjects’ SRT scores changed consistently with spectral entropy before and after training. The findings reveal that spectral entropy could be a promising indicator for predicting an individual’s WM changes with training and further suggest a novel WM application for brain–computer interfaces. PMID:28912701
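    Spectral entropy as described, Shannon entropy applied to the normalized power distribution of the Fourier-transformed signal, can be sketched as follows (a plain DFT sketch; the study's windowing, channel selection, and band limits are omitted):

```python
import cmath
import math

def spectral_entropy(x):
    """Normalized spectral entropy: Shannon entropy of the power
    distribution over positive-frequency DFT bins (DC excluded),
    divided by log2 of the bin count so the result lies in [0, 1];
    1 corresponds to a flat, white-noise-like spectrum."""
    n = len(x)
    power = []
    for k in range(1, n // 2 + 1):
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        power.append(abs(s) ** 2)
    total = sum(power)
    p = [v / total for v in power]
    h = -sum(pi * math.log(pi, 2) for pi in p if pi > 0)
    return h / math.log(len(p), 2)
```

A pure sinusoid concentrates power in one bin and scores near 0; broadband noise spreads power across bins and scores near 1.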

  1. [Specific features in realization of the principle of minimum energy dissipation during individual development].

    PubMed

    Zotin, A A

    2012-01-01

    Realization of the principle of minimum energy dissipation (Prigogine's theorem) during individual development has been analyzed. This analysis has suggested the following reformulation of this principle for living objects: when environmental conditions are constant, the living system evolves to a current steady state in such a way that the difference between entropy production and entropy flow (psi(u) function) is positive and constantly decreases near the steady state, approaching zero. In turn, the current steady state tends to a final steady state in such a way that the difference between the specific entropy productions in an organism and its environment tends to be minimal. In general, individual development completely agrees with the law of entropy increase (second law of thermodynamics).

  2. Sort entropy-based analysis of EEG during anesthesia

    NASA Astrophysics Data System (ADS)

    Ma, Liang; Huang, Wei-Zhi

    2010-08-01

    Monitoring anesthetic depth is an essential procedure during surgical operations, and judging and controlling the depth of anesthesia remains a pressing clinical issue. In this paper, collected EEG signals are processed with sort entropy. The signal response of the cerebral cortex surface is determined for patients at different stages of anesthesia, and the EEG is simulated and analyzed with a fast sort-entropy algorithm. The results show that phasic changes in the EEG are detected accurately, and that sort entropy has better noise immunity than approximate entropy when analyzing EEG under anesthesia. In addition, the sort entropy algorithm requires less computation time, offering high efficiency and strong resistance to interference.
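    The abstract's "sort entropy" appears to correspond to what is elsewhere called permutation entropy, i.e., the Shannon entropy of the ordinal (sorting) patterns of short signal segments; a minimal sketch under that assumption:

```python
import math

def permutation_entropy(x, order=3):
    """Permutation ('sort') entropy: Shannon entropy of the
    distribution of ordinal patterns of length `order`, normalized
    by log2(order!) so the result lies in [0, 1]."""
    counts = {}
    for i in range(len(x) - order + 1):
        # The ordinal pattern is the argsort of the segment.
        pattern = tuple(sorted(range(order), key=lambda k: x[i + k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log(c / total, 2) for c in counts.values())
    return h / math.log(math.factorial(order), 2)
```

Because it depends only on order relations, this measure is robust to monotone amplitude distortions, which is consistent with the noise-immunity claim in the abstract.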

  3. Alterations of thalassemic erythrocytes detected by wavelet entropy

    NASA Astrophysics Data System (ADS)

    Korol, A. M.; Rasia, R. J.; Rosso, O. A.

    2007-02-01

    A quantitative analysis of erythrocyte deformation under shear stress (the viscoelastic properties), observed in healthy donors as well as thalassemic patients, is made by means of the normalized total wavelet entropy (NTWS). The results suggest that the NTWS quantifier could be useful for characterizing pathological disturbances to guide clinical treatment.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boroun, G. R., E-mail: grboroun@gmail.com, E-mail: boroun@razi.ac.ir; Zarrin, S.

    We derive a general scheme for the evolution of the nonsinglet structure function at leading order (LO) and next-to-leading order (NLO) by using the Laplace-transform technique. Results for the nonsinglet structure function are compared with the MSTW2008, GRV, and CKMT parameterizations and also with EMC experimental data in the LO and NLO analyses. The results are in good agreement with the experimental data and other parameterizations in the small- and large-x regions.

  5. Quantum Non-thermal Effect from Black Holes Surrounded by Quintessence

    NASA Astrophysics Data System (ADS)

    Gong, Tian-Xi; Wang, Yong-Jiu

    2009-11-01

    We present a short and direct derivation of Hawking radiation as a tunneling process across the horizon and compute the tunneling probability. Considering self-gravitation and energy conservation, we use the Keski-Vakkuri, Kraus, and Wilczek (KKW) analysis to compute the temperature and entropy of black holes surrounded by quintessence, and find that they differ from the Hawking temperature and the Bekenstein-Hawking entropy. This result offers a possible mechanism for addressing the information loss paradox, because the spectrum is not purely thermal.

  6. Time-series analysis of foreign exchange rates using time-dependent pattern entropy

    NASA Astrophysics Data System (ADS)

    Ishizaki, Ryuji; Inoue, Masayoshi

    2013-08-01

    Time-dependent pattern entropy is a method that reduces variations to binary symbolic dynamics and considers the pattern of symbols in a sliding temporal window. We use this method to analyze the instability of daily variations in foreign exchange rates, in particular, the dollar-yen rate. The time-dependent pattern entropy of the dollar-yen rate was found to be high in the following periods: before and after the turning points of the yen from strong to weak or from weak to strong, and the period after the Lehman shock.
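    The method as described, reducing variations to binary symbolic dynamics and computing the entropy of symbol patterns in a sliding temporal window, can be sketched as follows (the word length and window size are arbitrary choices here, not the authors' settings):

```python
import math
from collections import Counter

def pattern_entropy(series, word=3, window=50):
    """Time-dependent pattern entropy (simplified sketch): map each
    variation to a binary symbol (1 = rise, 0 = fall or tie), form
    overlapping words of length `word`, and compute the Shannon
    entropy (bits) of the word distribution inside each sliding
    window.  Returns one entropy value per window position."""
    symbols = [1 if b > a else 0 for a, b in zip(series, series[1:])]
    words = [tuple(symbols[i:i + word]) for i in range(len(symbols) - word + 1)]
    out = []
    for start in range(len(words) - window + 1):
        counts = Counter(words[start:start + window])
        total = sum(counts.values())
        h = -sum(c / total * math.log(c / total, 2) for c in counts.values())
        out.append(h)
    return out
```

Periods when the rate's variation pattern becomes less predictable, such as around strong/weak turning points in the abstract, would show up as elevated values in the returned series.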

  7. New paradigm for task switching strategies while performing multiple tasks: entropy and symbolic dynamics analysis of voluntary patterns.

    PubMed

    Guastello, Stephen J; Gorin, Hillary; Huschen, Samuel; Peters, Natalie E; Fabisch, Megan; Poston, Kirsten

    2012-10-01

    It has become well established in laboratory experiments that switching tasks, perhaps due to interruptions at work, incurs a cost in the response time to complete the next task. Conditions are also known that exaggerate or lessen the switching costs. Although switching costs can contribute to fatigue, task switching can also be an adaptive response to fatigue. The present study introduces a new research paradigm for studying the emergence of voluntary task switching regimes, the self-organizing processes therein, and the possibly conflicting roles of switching costs and minimum entropy. Fifty-four undergraduates performed 7 different computer-based cognitive tasks, producing sets of 49 responses under instructional conditions requiring task quotas or no quotas. The sequences of task choices were analyzed using orbital decomposition to extract pattern types and lengths, which were then classified and compared with regard to Shannon entropy, topological entropy, number of task switches involved, and overall performance. Results indicated that similar but different patterns were generated under the two instructional conditions, and better performance was associated with lower topological entropy. Both entropy metrics were associated with the amount of voluntary task switching. Future research should explore conditions affecting the trade-off between switching costs and entropy, levels of automaticity between task elements, and the role of voluntary switching regimes in fatigue.

  8. Resting state fMRI entropy probes complexity of brain activity in adults with ADHD.

    PubMed

    Sokunbi, Moses O; Fung, Wilson; Sawlani, Vijay; Choppin, Sabine; Linden, David E J; Thome, Johannes

    2013-12-30

    In patients with attention deficit hyperactivity disorder (ADHD), quantitative neuroimaging techniques have revealed abnormalities in various brain regions, including the frontal cortex, striatum, cerebellum, and occipital cortex. Nonlinear signal processing techniques such as sample entropy have been used to probe the regularity of brain magnetoencephalography signals in patients with ADHD. In the present study, we extend this technique to analyse the complex output patterns of the four-dimensional resting state functional magnetic resonance imaging signals in adult patients with ADHD. After adjusting for the effect of age, we found whole brain entropy differences (P=0.002) between groups and a negative correlation (r=-0.45) between symptom scores and mean whole brain entropy values, indicating lower complexity in patients. In the regional analysis, patients showed reduced entropy in frontal and occipital regions bilaterally and a significant negative correlation between the symptom scores and the entropy maps at a family-wise error corrected cluster level of P<0.05 (P=0.001, initial threshold). Our findings support the hypothesis of abnormal frontal-striatal-cerebellar circuits in ADHD and the suggestion that sample entropy is a useful tool for revealing abnormalities in the brain dynamics of patients with psychiatric disorders.

  9. Video and accelerometer-based motion analysis for automated surgical skills assessment.

    PubMed

    Zia, Aneeq; Sharma, Yachna; Bettadapura, Vinay; Sarin, Eric L; Essa, Irfan

    2018-03-01

    Basic surgical skills of suturing and knot tying are an essential part of medical training. Having an automated system for surgical skills assessment could help save experts time and improve training efficiency. There have been some recent attempts at automated surgical skills assessment using either video analysis or acceleration data. In this paper, we present a novel approach for automated assessment of OSATS-like surgical skills and provide an analysis of different features on multi-modal data (video and accelerometer data). We conduct a large study of basic surgical skill assessment on a dataset containing video and accelerometer data for suturing and knot-tying tasks. We introduce "entropy-based" features, approximate entropy and cross-approximate entropy, which quantify the amount of predictability and regularity of fluctuations in time series data. The proposed features are compared to the existing methods of Sequential Motion Texture, Discrete Cosine Transform and Discrete Fourier Transform for surgical skills assessment. We report the average performance of the different features across all applicable OSATS-like criteria for suturing and knot-tying tasks. Our analysis shows that the proposed entropy-based features outperform previous state-of-the-art methods using video data, achieving average classification accuracies of 95.1 and 92.2% for suturing and knot tying, respectively. For accelerometer data, our method performs better for suturing, achieving 86.8% average accuracy. We also show that fusion of video and acceleration features can improve overall performance for skill assessment. Automated surgical skills assessment can be achieved with high accuracy using the proposed entropy features. Such a system can significantly improve the efficiency of surgical training in medical schools and teaching hospitals.
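    Approximate entropy, one of the "entropy-based" features introduced above, quantifies regularity as the drop in average log match frequency from length-m to length-(m+1) templates; a minimal sketch (the parameter values are illustrative, not those used in the study):

```python
import math

def approx_entropy(x, m=2, r=0.2):
    """Approximate entropy ApEn(m, r) = phi(m) - phi(m + 1), where
    phi(m) is the average log fraction of length-m templates lying
    within absolute tolerance r of each template (self-matches
    included, which distinguishes ApEn from sample entropy)."""
    n = len(x)
    def phi(mm):
        total = 0.0
        for i in range(n - mm + 1):
            c = sum(
                1 for j in range(n - mm + 1)
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r
            )
            total += math.log(c / (n - mm + 1))
        return total / (n - mm + 1)
    return phi(m) - phi(m + 1)
```

Smooth, repetitive motion traces yield low ApEn, while erratic ones yield higher values, which is the intuition behind using it as a skill feature.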

  10. Comparison of Ice Cloud Particle Sizes Retrieved from Satellite Data with Those Derived from In Situ Measurements

    NASA Technical Reports Server (NTRS)

    Han, Qingyuan; Rossow, William B.; Chou, Joyce; Welch, Ronald M.

    1997-01-01

    Cloud microphysical parameterizations have attracted a great deal of attention in recent years due to their effect on cloud radiative properties and cloud-related hydrological processes in large-scale models. The parameterization of cirrus particle size has been demonstrated to be an indispensable component of climate feedback analysis. Therefore, global-scale, long-term observations of cirrus particle sizes are required both as a basis for and as a validation of parameterizations for climate models. While there is a global-scale, long-term survey of water cloud droplet sizes (Han et al.), there is no comparable study for cirrus ice crystals. This study is an effort to supply such a data set.

  11. Sensitivity of single column model simulations of Arctic springtime clouds to different cloud cover and mixed phase cloud parameterizations

    NASA Astrophysics Data System (ADS)

    Zhang, Junhua; Lohmann, Ulrike

    2003-08-01

    The single column model of the Canadian Centre for Climate Modelling and Analysis (CCCma) climate model is used to simulate Arctic spring cloud properties observed during the Surface Heat Budget of the Arctic Ocean (SHEBA) experiment. The model is driven by European Centre for Medium-Range Weather Forecasts (ECMWF) reanalysis data constrained by rawinsonde observations. Five cloud parameterizations, including three statistical and two explicit schemes, are compared, and the sensitivity to mixed phase cloud parameterizations is studied. Using the original mixed phase cloud parameterization of the model, the statistical cloud schemes produce more cloud cover, cloud water, and precipitation than the explicit schemes and in general agree better with observations. The mixed phase cloud parameterization from ECMWF decreases the initial saturation specific humidity threshold of cloud formation. This improves the simulated cloud cover in the explicit schemes and reduces the difference between the different cloud schemes. On the other hand, because the ECMWF mixed phase cloud scheme does not consider the Bergeron-Findeisen process, fewer ice crystals are formed. This leads to a higher liquid water path and less precipitation than what was observed.

  12. Effective Atomic Number, Mass Attenuation Coefficient Parameterization, and Implications for High-Energy X-Ray Cargo Inspection Systems

    NASA Astrophysics Data System (ADS)

    Langeveld, Willem G. J.

    The most widely used technology for the non-intrusive active inspection of cargo containers and trucks is x-ray radiography at high energies (4-9 MeV). Technologies such as dual-energy imaging, spectroscopy, and statistical waveform analysis can be used to estimate the effective atomic number (Zeff) of the cargo from the x-ray transmission data, because the mass attenuation coefficient depends on energy as well as atomic number Z. The estimated effective atomic number, Zeff, of the cargo then leads to improved detection capability of contraband and threats, including special nuclear materials (SNM) and shielding. In this context, the exact meaning of effective atomic number (for mixtures and compounds) is generally not well-defined. Physics-based parameterizations of the mass attenuation coefficient have been given in the past, but usually for a limited low-energy range. Definitions of Zeff have been based, in part, on such parameterizations. Here, we give an improved parameterization at low energies (20-1000 keV) which leads to a well-defined Zeff. We then extend this parameterization up to energies relevant for cargo inspection (10 MeV), and examine what happens to the Zeff definition at these higher energies.

  13. 2D Affine and Projective Shape Analysis.

    PubMed

    Bryner, Darshan; Klassen, Eric; Huiling Le; Srivastava, Anuj

    2014-05-01

    Current techniques for shape analysis tend to seek invariance to similarity transformations (rotation, translation, and scale), but certain imaging situations require invariance to larger groups, such as the affine or projective groups. Here we present a general Riemannian framework for shape analysis of planar objects where metrics and related quantities are invariant to the affine and projective groups. Highlighting two possibilities for representing object boundaries, ordered points (or landmarks) and parameterized curves, we study different combinations of these representations (points and curves) and transformations (affine and projective). Specifically, we provide solutions to three out of four situations and develop algorithms for computing geodesics and intrinsic sample statistics, leading up to Gaussian-type statistical models, and for classifying test shapes using such models learned from training data. In the case of parameterized curves, we also achieve the desired goal of invariance to re-parameterizations. The geodesics are constructed by particularizing the path-straightening algorithm to the geometries of the current manifolds and are used, in turn, to compute shape statistics and Gaussian-type shape models. We demonstrate these ideas using a number of examples from shape and activity recognition.

  14. An investigation of combustion and entropy noise

    NASA Technical Reports Server (NTRS)

    Strahle, W. C.

    1977-01-01

    The relative importance of entropy and direct combustion noise in turbopropulsion systems and the parameters upon which these noise sources depend were studied. Theory and experiment were employed to determine that at least with the apparatus used here, entropy noise can dominate combustion noise if there is a sufficient pressure gradient terminating the combustor. Measurements included combustor interior fluctuating pressure, near and far field fluctuating pressure, and combustor exit plane fluctuating temperatures, as well as mean pressures and temperatures. Analysis techniques included spectral, cross-correlation, cross power spectra, and ordinary and partial coherence analysis. Also conducted were combustor liner modification experiments to investigate the origin of the frequency content of combustion noise. Techniques were developed to extract nonpropagational pseudo-sound and the heat release fluctuation spectra from the data.

  15. Molecular Dynamics Simulation of Surface Tension of NaCl Aqueous Solution at 298.15K: from Diluted to Highly Supersaturated Concentrations

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoxiang; Chen, Chuchu; Poeschl, Ulrich; Su, Hang; Cheng, Yafang

    2017-04-01

    Sodium chloride (NaCl) is one of the key components of atmospheric aerosol particles. The concentration-dependent surface tension of aqueous NaCl solution is essential to determine the equilibrium between NaCl solution droplets and water vapor, which is important for aerosol-cloud interactions and aerosol climate effects. Although supersaturated NaCl droplets are widely found under atmospheric conditions, experimentally determined concentration dependencies of surface tension are limited to the range up to saturation due to technical difficulties, i.e., heterogeneous nucleation, since nearly all surface tension measurement techniques require contact between the sensor and the solution surface. In this study, the surface tension of aqueous NaCl solution with solute mass fraction from 0 to 1 was calculated using molecular dynamics (MD) simulation. The surface tension increases monotonically and nearly linearly when the mass fraction of NaCl (xNaCl) is lower than 0.265 (the saturation point), which follows theoretical predictions (e.g., E-AIM, SP parameterization, and PK parameterization). Upon entering the supersaturated concentration range, the calculated surface tension starts to deviate from the near-linear extrapolation and adopts a slightly higher rate of increase until xNaCl of 0.35. We found that the increase in these two phases (xNaCl up to 0.35) is mainly driven by the increase of excess surface enthalpy as the solution becomes concentrated. After that, the surface tension remains almost unchanged until xNaCl of 0.52. This plateau is supported by results from experiment-based Differential Koehler Analyses. The stable surface tension in this concentration range is attributed to simultaneous changes of surface excess enthalpy and entropy of similar degree. When the NaCl solution becomes more concentrated than xNaCl of 0.52, the simulated surface tension again increases, at an even faster rate, and shows a tendency to ultimately approach the surface tension of molten NaCl at 298.15 K (approximately 148.4 mN/m by MD simulation). Energetic analyses imply that this fast increase is still primarily an excess-surface-enthalpy-driven process, although a concurrent fluctuation of excess surface entropy is also expected, on a much smaller scale. Our results unfold the global landscape of the concentration dependence of the surface tension of aqueous NaCl solution and its driving forces: a water surface tension dominated regime (xNaCl from 0 to 0.35), a transition regime (xNaCl from 0.35 to 0.52), and a molten NaCl surface tension dominated regime (xNaCl beyond 0.52).

  16. Extending Differential Fault Analysis to Dynamic S-Box Advanced Encryption Standard Implementations

    DTIC Science & Technology

    2014-09-18

    entropy. At the same time, researchers strive to enhance AES and mitigate these growing threats. This paper researches the extension of existing...the algorithm or use side channels to reduce entropy, such as Differential Fault Analysis (DFA). At the same time, continuing research strives to...the state matrix. The S-box is an 8-bit 16x16 table built from an affine transformation on multiplicative inverses which guarantees full permutation (S

  17. Evaluation of Extratropical Cyclone Precipitation in the North Atlantic Basin: An analysis of ERA-Interim, WRF, and two CMIP5 models.

    PubMed

    Booth, James F; Naud, Catherine M; Willison, Jeff

    2018-03-01

    The representation of extratropical cyclone (ETC) precipitation in general circulation models (GCMs) and the Weather Research and Forecasting (WRF) model is analyzed. This work considers the link between ETC precipitation and dynamical strength and tests whether parameterized convection affects this link for ETCs in the North Atlantic Basin. Lagrangian cyclone tracks of ETCs in the ERA-Interim reanalysis (ERAI), the GISS and GFDL CMIP5 models, and WRF at two horizontal resolutions are utilized in a compositing analysis. The 20-km resolution WRF model generates stronger ETCs based on surface wind speed and cyclone precipitation. The GCMs and ERAI generate similar composite means and distributions for cyclone precipitation rates, but the GCMs generate weaker cyclone surface winds than ERAI. The amount of cyclone precipitation generated by the convection scheme differs significantly across the datasets, with GISS generating the most, followed by ERAI and then GFDL. The models and reanalysis generate relatively more parameterized convective precipitation when the total cyclone-averaged precipitation is smaller. This is partially due to the contribution of parameterized convective precipitation occurring more often late in the ETC life cycle. For reanalysis and models, precipitation increases with both cyclone moisture and surface wind speed, and this holds whether or not the contribution from the parameterized convection scheme is large. This work shows that these different models generate similar total ETC precipitation despite large differences in the parameterized convection, and these differences do not cause unexpected behavior in ETC precipitation sensitivity to cyclone moisture or surface wind speed.

  18. Signatures of Solvation Thermodynamics in Spectra of Intermolecular Vibrations

    PubMed Central

    2017-01-01

    This study explores the thermodynamic and vibrational properties of water in the three-dimensional environment of solvated ions and small molecules using molecular simulations. The spectrum of intermolecular vibrations in liquid solvents provides detailed information on the shape of the local potential energy surface, which in turn determines local thermodynamic properties such as the entropy. Here, we extract this information using a spatially resolved extension of the two-phase thermodynamics method to estimate hydration water entropies based on the local vibrational density of states (3D-2PT). Combined with an analysis of solute–water and water–water interaction energies, this allows us to resolve local contributions to the solvation enthalpy, entropy, and free energy. We use this approach to study effects of ions on their surrounding water hydrogen bond network, its spectrum of intermolecular vibrations, and resulting thermodynamic properties. In the three-dimensional environment of polar and nonpolar functional groups of molecular solutes, we identify distinct hydration water species and classify them by their characteristic vibrational density of states and molecular entropies. In each case, we are able to assign variations in local hydration water entropies to specific changes in the spectrum of intermolecular vibrations. This provides an important link for the thermodynamic interpretation of vibrational spectra that are accessible to far-infrared absorption and Raman spectroscopy experiments. Our analysis provides unique microscopic details regarding the hydration of hydrophobic and hydrophilic functional groups, which enable us to identify interactions and molecular degrees of freedom that determine relevant contributions to the solvation entropy and consequently the free energy. PMID:28783431

  19. From Random Walks to Brownian Motion, from Diffusion to Entropy: Statistical Principles in Introductory Physics

    NASA Astrophysics Data System (ADS)

    Reeves, Mark

    2014-03-01

    Entropy changes underlie the physics that dominates biological interactions. Indeed, introductory biology courses often begin with an exploration of the qualities of water that are important to living systems. However, one idea that is not explicitly addressed in most introductory physics or biology textbooks is the dominant contribution of entropy in driving important biological processes towards equilibrium. From diffusion to cell-membrane formation, to electrostatic binding in protein folding, to the functioning of nerve cells, entropic effects often act to counterbalance deterministic forces such as electrostatic attraction and, in so doing, allow for effective molecular signaling. A small group of biology, biophysics and computer science faculty have worked together for the past five years to develop curricular modules (based on SCALEUP pedagogy) that enable students to create models of stochastic and deterministic processes. Our students are first-year engineering and science students in the calculus-based physics course and they are not expected to know biology beyond the high-school level. In our class, they learn to reduce seemingly complex biological processes and structures to tractable models that include deterministic processes and simple probabilistic inference. The students test these models in simulations and in laboratory experiments that are biologically relevant. The students are challenged to bridge the gap between statistical parameterization of their data (mean and standard deviation) and simple model-building by inference. This allows the students to quantitatively describe realistic cellular processes such as diffusion, ionic transport, and ligand-receptor binding. Moreover, the students confront "random" forces and traditional forces in problems, simulations, and in laboratory exploration throughout the year-long course as they move from traditional kinematics through thermodynamics to electrostatic interactions.
This talk will present a number of these exercises, with particular focus on the hands-on experiments done by the students, and will give examples of the tangible material that our students work with throughout the two-semester sequence of their course on introductory physics with a bio focus. Supported by NSF DUE.

  20. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.

    PubMed

    Marken, John P; Halleran, Andrew D; Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C; Golino, Caroline A; Kemper, Peter; Saha, Margaret S

    2016-01-01

    Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, an easily implementable calcium time series analysis method that represents the observed calcium activity as a realization of a Markov process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our measure and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors, which displays irregular calcium activity, and a dataset from murine synaptic neurons, which displays activity time series that are well described by visually distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.
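
    The core of such a measure can be sketched as the entropy rate of a first-order Markov chain fitted to a discretized trace. The quantile binning and the state count below are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def markov_entropy(series, n_states=4):
    """Entropy rate (bits/step) of a first-order Markov chain fitted to a
    discretized series. Binning and state count are illustrative choices."""
    x = np.asarray(series, dtype=float)
    # discretize into n_states symbols via quantile bins
    edges = np.quantile(x, np.linspace(0, 1, n_states + 1)[1:-1])
    s = np.digitize(x, edges)
    # empirical transition counts between consecutive states
    T = np.zeros((n_states, n_states))
    for a, b in zip(s[:-1], s[1:]):
        T[a, b] += 1
    pi = T.sum(axis=1) / T.sum()                              # state frequencies
    P = T / np.clip(T.sum(axis=1, keepdims=True), 1, None)    # row-stochastic matrix
    with np.errstate(divide="ignore", invalid="ignore"):
        H_rows = np.where(P > 0, -P * np.log2(P), 0.0).sum(axis=1)
    return float(pi @ H_rows)
```

    A perfectly periodic series has fully predictable transitions (entropy rate near 0), while an irregular series approaches the maximum of log2(n_states) bits per step.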

  1. On the Relationship between Observed NLDN Lightning ...

    EPA Pesticide Factsheets

    Lightning-produced nitrogen oxides (NOX = NO + NO2) in the middle and upper troposphere play an essential role in the production of ozone (O3) and influence the oxidizing capacity of the troposphere. Despite much effort in both observing and modeling lightning NOX during the past decade, considerable uncertainties still exist in the quantification of lightning NOX production and distribution in the troposphere. It is even more challenging for regional chemistry and transport models to accurately parameterize lightning NOX production and distribution in time and space. The Community Multiscale Air Quality Model (CMAQ) parameterizes lightning NO emissions using local scaling factors adjusted by the convective precipitation rate predicted by the upstream meteorological model; the adjustment is based on observed lightning strikes from the National Lightning Detection Network (NLDN). For this parameterization to be valid, an a priori reasonable relationship between the observed lightning strikes and the modeled convective precipitation rates must exist. In this study, we will present an analysis leveraging the observed NLDN lightning strikes and CMAQ model simulations over the continental United States for a time period spanning over a decade. Based on the analysis, a new parameterization scheme for lightning NOX will be proposed and the results will be evaluated. The proposed scheme will be beneficial to modeling exercises where the obs

  2. Regional Sustainable Development Analysis Based on Information Entropy-Sichuan Province as an Example.

    PubMed

    Liang, Xuedong; Si, Dongyang; Zhang, Xinli

    2017-10-13

    In line with a scientific development perspective, sustainable development needs to consider regional development, economic and social development, and the harmonious development of society and nature, but regional sustainable development is often difficult to quantify. Through an analysis of the structure and functions of a regional system, this paper establishes an evaluation index system, which includes an economic subsystem, an ecological environmental subsystem, and a social subsystem, to study regional sustainable development capacity. A sustainable development capacity measurement model for Sichuan Province was established by applying the information entropy calculation principle and the Brusselator principle. Each subsystem and its entropy change in each calendar year in Sichuan Province were analyzed to evaluate Sichuan Province's sustainable development capacity. It was found that the established model could effectively show actual changes in sustainable development levels through the entropy changes of the system; at the same time, the model clearly demonstrates how the forty-six indicators from the three subsystems impact regional sustainable development, addressing a gap in sustainable development research.
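
    Index systems of this kind typically weight indicators by information entropy: an indicator whose values vary little across years carries little information and receives a small weight. A generic entropy-weight sketch (rows = years, columns = indicators; not necessarily the authors' exact normalization):

```python
import numpy as np

def entropy_weights(X):
    """Entropy-weight method: weight indicator columns by how far their
    information entropy falls below the maximum. Generic sketch only."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0, keepdims=True)          # column-wise proportions
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        E = np.where(P > 0, -P * np.log(P), 0.0).sum(axis=0) / np.log(n)
    d = 1.0 - E                                   # divergence of each indicator
    return d / d.sum()                            # weights sum to 1
```

    A constant indicator has entropy 1 and weight 0; a strongly varying indicator dominates the composite score.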

  3. Deep learning for classification of islanding and grid disturbance based on multi-resolution singular spectrum entropy

    NASA Astrophysics Data System (ADS)

    Li, Tie; He, Xiaoyang; Tang, Junci; Zeng, Hui; Zhou, Chunying; Zhang, Nan; Liu, Hui; Lu, Zhuoxin; Kong, Xiangrui; Yan, Zheng

    2018-02-01

    Because islanding is easily confused with grid disturbances, an island detection device may misjudge events, with the consequence of taking photovoltaic systems out of service. The detection device must therefore be able to distinguish islanding from grid disturbance. In this paper, the concept of deep learning is introduced into the classification of islanding and grid disturbance for the first time. A novel deep learning framework is proposed to detect and classify islanding or grid disturbance. The framework is a hybrid of wavelet transformation, multi-resolution singular spectrum entropy, and a deep learning architecture. As a signal processing step after wavelet transformation, multi-resolution singular spectrum entropy combines multi-resolution analysis and spectrum analysis, with entropy as the output, from which we can extract the intrinsic features that differ between islanding and grid disturbance. With the features extracted, deep learning is utilized to classify islanding and grid disturbance. Simulation results indicate that the method achieves its goal with high accuracy, so that photovoltaic systems mistakenly withdrawing from the power grid can be avoided.
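
    The singular spectrum entropy at the heart of the feature extraction can be sketched as follows: embed the signal in a trajectory matrix of lagged windows, take its singular values, normalize them into a distribution, and compute the Shannon entropy. The wavelet stage that makes it "multi-resolution" is omitted here, and the window length is an illustrative choice.

```python
import numpy as np

def singular_spectrum_entropy(x, window=32):
    """Shannon entropy of the normalized singular values of the trajectory
    matrix; low for narrowband signals, high for broadband disturbances."""
    x = np.asarray(x, dtype=float)
    # Hankel-style trajectory matrix of overlapping lagged windows
    M = np.array([x[i:i + window] for i in range(len(x) - window + 1)])
    s = np.linalg.svd(M, compute_uv=False)
    p = s / s.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())
```

    A pure sinusoid concentrates its energy in two singular values (entropy near log 2), while broadband noise spreads it across all of them, which is the separability the classifier exploits.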

  4. Investigation of phase stability of novel equiatomic FeCoNiCuZn based-high entropy alloy prepared by mechanical alloying

    NASA Astrophysics Data System (ADS)

    Soni, Vinay Kumar; Sanyal, S.; Sinha, S. K.

    2018-05-01

    The present work reports the structural and phase stability analysis of equiatomic FeCoNiCuZn high entropy alloy (HEA) systems prepared by the mechanical alloying (MA) method. In this research effort, some 1287 alloy combinations were extensively studied to arrive at the most favourable combination. The FeCoNiCuZn based alloy system was selected on the basis of physicochemical parameters such as enthalpy of mixing (ΔHmix), entropy of mixing (ΔSmix), atomic size difference (ΔX) and valence electron concentration (VEC), such that it fulfils the formation criteria of a stable multi-component high entropy alloy system. In this context, we have investigated the effect of novel alloying additions with regard to microstructure and phase formation. XRD plots of the MA samples show the formation of a stable solid solution with an FCC (Face-Centred Cubic) structure after 20 hr of milling time and no indication of any amorphous or intermetallic phase formation. Our results are in good agreement with the calculations and analysis done on the basis of physicochemical parameters during selection of the constituent elements of the HEA.

  5. Enhanced automatic artifact detection based on independent component analysis and Renyi's entropy.

    PubMed

    Mammone, Nadia; Morabito, Francesco Carlo

    2008-09-01

    Artifacts are disturbances that may occur during signal acquisition and may affect subsequent processing. The aim of this paper is to propose a technique for automatically detecting artifacts in electroencephalographic (EEG) recordings. In particular, a technique based both on Independent Component Analysis (ICA) to extract artifactual signals and on Renyi's entropy to automatically detect them is presented. This technique is compared to the widely known approach based on ICA and the joint use of kurtosis and Shannon's entropy. The novel processing technique is shown to detect on average 92.6% of the artifactual signals, against an average of 68.7% for the previous technique, on the available database studied. Moreover, Renyi's entropy is shown to be able to detect muscle and very-low-frequency activity as well as to discriminate them from other kinds of artifacts. In order to achieve an efficient rejection of the artifacts while minimizing information loss, future efforts will be devoted to the improvement of blind artifact separation from the EEG in order to ensure a very efficient isolation of the artifactual activity from signals deriving from other brain tasks.
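
    An order-α Rényi entropy over an amplitude histogram, of the kind used to score candidate components, can be sketched as follows. The histogram estimator and bin count are illustrative assumptions, and the ICA decomposition step is omitted.

```python
import numpy as np

def renyi_entropy(signal, alpha=2, bins=64):
    """Order-alpha Renyi entropy of a signal's amplitude histogram (nats)."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts[counts > 0] / counts.sum()
    if alpha == 1:
        return float(-(p * np.log(p)).sum())          # Shannon limit
    return float(np.log((p ** alpha).sum()) / (1 - alpha))
```

    Spiky, heavy-tailed components (typical of artifacts) concentrate the histogram in a few bins and therefore score a low Rényi entropy, which is what makes thresholding on it viable.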

  6. Numerical study focusing on the entropy analysis of MHD squeezing flow of a nanofluid model using Cattaneo–Christov theory

    NASA Astrophysics Data System (ADS)

    Akmal, N.; Sagheer, M.; Hussain, S.

    2018-05-01

    The present study gives an account of the heat transfer characteristics of the squeezing flow of a nanofluid between two flat plates, with the upper plate moving vertically and the lower one in the horizontal direction. The Tiwari and Das nanofluid model has been utilized to give a comparative analysis of the heat transfer in Cu-water and Al2O3-water nanofluids with entropy generation. The modeling is carried out with consideration of Lorentz forces to observe the effect of the magnetic field on the flow. The Joule heating effect is included to discuss heat dissipation in the fluid and its effect on the entropy of the system. The nondimensional ordinary differential equations are solved using the Keller box method, and the numerical results are presented in graphs and tables. An interesting observation is that more entropy is generated near the lower plate than at the upper plate. Also, the heat transfer rate is found to be higher for the Cu nanoparticles in comparison with the Al2O3 nanoparticles.

  7. On the limits of probabilistic forecasting in nonlinear time series analysis II: Differential entropy.

    PubMed

    Amigó, José M; Hirata, Yoshito; Aihara, Kazuyuki

    2017-08-01

    In a previous paper, the authors studied the limits of probabilistic prediction in nonlinear time series analysis in a perfect model scenario, i.e., in the ideal case that the uncertainty of an otherwise deterministic model is due only to the finite precision of the observations. The model consisted of the symbolic dynamics of a measure-preserving transformation with respect to a finite partition of the state space, and the quality of the predictions was measured by the so-called ignorance score, which is a conditional entropy. In practice, though, partitions are dispensed with by considering numerical and experimental data to be continuous, which prompts us to trade off in this paper the Shannon entropy for the differential entropy. Despite technical differences, we show that the core of the previous results also holds in this extended scenario for sufficiently high precision. The corresponding imperfect model scenario is revisited as well because it is relevant for applications. The theoretical part and its application to probabilistic forecasting are illustrated with numerical simulations and a new prediction algorithm.
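
    The Shannon-to-differential-entropy trade-off rests on the standard identity that, for bin width Δ, the entropy of the discretized variable satisfies H(X_Δ) ≈ h(X) − log Δ. A small numerical sketch (illustrative, not from the paper): a histogram estimate of the differential entropy of a standard normal converges to the closed-form value 0.5·log(2πe) ≈ 1.419 nats.

```python
import numpy as np

def gaussian_diff_entropy(sigma):
    """Closed-form differential entropy of N(mu, sigma^2), in nats."""
    return 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)

def hist_diff_entropy(x, bins=64):
    """Histogram estimate: discrete entropy plus log of the bin width."""
    counts, edges = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    width = edges[1] - edges[0]
    return float(-(p * np.log(p)).sum() + np.log(width))
```
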

  8. Towards an information geometric characterization/classification of complex systems. I. Use of generalized entropies

    NASA Astrophysics Data System (ADS)

    Ghikas, Demetris P. K.; Oikonomou, Fotios D.

    2018-04-01

    Using the generalized entropies which depend on two parameters, we propose a set of quantitative characteristics derived from the Information Geometry based on these entropies. Our aim, at this stage, is to first construct some fundamental geometric objects which will be used in the development of our geometrical framework. We first establish the existence of a two-parameter family of probability distributions. Then, using this family, we derive the associated metric and state a generalized Cramer-Rao inequality. This gives a first two-parameter classification of complex systems. Finally, computing the scalar curvature of the information manifold, we obtain a further discrimination of the corresponding classes. Our analysis is based on the two-parameter family of generalized entropies of Hanel and Thurner (2011).

  9. Path length entropy analysis of diastolic heart sounds.

    PubMed

    Griffel, Benjamin; Zia, Mohammad K; Fridman, Vladamir; Saponieri, Cesare; Semmlow, John L

    2013-09-01

    Early detection of coronary artery disease (CAD) using the acoustic approach, a noninvasive and cost-effective method, would greatly improve the outcome of CAD patients. To detect CAD, we analyze diastolic sounds for possible CAD murmurs. We observed diastolic sounds to exhibit 1/f structure and developed a new method, path length entropy (PLE) and a scaled version (SPLE), to characterize this structure to improve CAD detection. We compare SPLE results to Hurst exponent, Sample entropy and Multiscale entropy for distinguishing between normal and CAD patients. SPLE achieved a sensitivity-specificity of 80%-81%, the best of the tested methods. However, PLE and SPLE are not sufficient to prove nonlinearity, and evaluation using surrogate data suggests that our cardiovascular sound recordings do not contain significant nonlinear properties.

  11. Analysis of the Influence of Complexity and Entropy of Odorant on Fractal Dynamics and Entropy of EEG Signal.

    PubMed

    Namazi, Hamidreza; Akrami, Amin; Nazeri, Sina; Kulish, Vladimir V

    2016-01-01

    An important challenge in brain research is to determine the relation between the features of olfactory stimuli and the electroencephalogram (EEG) signal. Yet, no relation between the structures of olfactory stimuli and the EEG signal has previously been discovered. This study investigates the relation between the structure of the EEG signal and that of the olfactory stimulus (odorant). We show that the complexity of the EEG signal is coupled with the molecular complexity of the odorant: a more structurally complex odorant causes a less fractal EEG signal. Also, an odorant with higher entropy causes the EEG signal to have lower approximate entropy. The method discussed here can be applied to and investigated in patients with brain diseases for rehabilitation purposes.

  12. Entanglement entropy and correlations in loop quantum gravity

    NASA Astrophysics Data System (ADS)

    Feller, Alexandre; Livine, Etera R.

    2018-02-01

    Black hole entropy is one of the few windows into the quantum aspects of gravitation, and its study over the years has highlighted the holographic nature of gravity. At the non-perturbative level in quantum gravity, promising explanations are being explored in terms of the entanglement entropy between regions of space. In the context of loop quantum gravity, this translates into an analysis of the correlations between the regions of the spin network states defining the quantum state of the geometry of space. In this paper, we explore a class of states, motivated by results in condensed matter physics, satisfying an area law for entanglement entropy and having non-trivial correlations. We highlight that entanglement comes from holonomy operators acting on loops crossing the boundary of the region.

  13. An application of sample entropy to precipitation in Paraíba State, Brazil

    NASA Astrophysics Data System (ADS)

    Xavier, Sílvio Fernando Alves; da Silva Jale, Jader; Stosic, Tatijana; dos Santos, Carlos Antonio Costa; Singh, Vijay P.

    2018-05-01

    The climate system is a complex, non-linear system. To describe the complex characteristics of precipitation series in Paraíba State, Brazil, we apply sample entropy, an entropy-based algorithm, to evaluate the complexity of those series. Sixty-nine meteorological stations are distributed over four macroregions: Zona da Mata, Agreste, Borborema, and Sertão. The results of the analysis show that the complexity of monthly average precipitation differs among the macroregions. Sample entropy is able to reflect the dynamic change of precipitation series, providing a new way to investigate the complexity of hydrological series. This complexity exhibits areal variation across local water resource systems, which can inform the utilization and development of water resources in dry areas.
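
As a rough illustration of the algorithm this record applies to precipitation series, here is a minimal sample-entropy sketch (Richman & Moorman's definition, Chebyshev distance, self-matches excluded). The defaults m=2 and r=0.2·SD are common conventions, not necessarily the study's choices:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -ln(A/B), where B and A are the numbers of
    template pairs matching within tolerance at lengths m and m+1;
    self-matches are excluded (Richman & Moorman, 2000)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    tol = r * x.std()

    def matches(m):
        t = np.array([x[i:i + m] for i in range(n - m)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        iu = np.triu_indices(len(t), k=1)  # distinct pairs only
        return np.sum(d[iu] <= tol)

    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

Higher values indicate a more irregular series, so drier, more erratic rainfall regimes would be expected to score higher than regular seasonal ones.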

  14. Analysis of the Influence of Complexity and Entropy of Odorant on Fractal Dynamics and Entropy of EEG Signal

    PubMed Central

    Akrami, Amin; Nazeri, Sina

    2016-01-01

    An important challenge in brain research is to determine the relation between the features of olfactory stimuli and the electroencephalogram (EEG) signal. Yet no relation between the structures of olfactory stimuli and the EEG signal has been established. This study investigates the relation between the structure of the EEG signal and that of the olfactory stimulus (odorant). We show that the complexity of the EEG signal is coupled with the molecular complexity of the odorant: a more structurally complex odorant elicits a less fractal EEG signal. Likewise, an odorant with higher entropy causes the EEG signal to have lower approximate entropy. The method discussed here can be applied and investigated for rehabilitation purposes in patients with brain diseases. PMID:27699169

  15. Evaluation of spectral entropy to measure anaesthetic depth and antinociception in sevoflurane-anaesthetised Beagle dogs.

    PubMed

    Morgaz, Juan; Granados, María del Mar; Domínguez, Juan Manuel; Navarrete, Rocío; Fernández, Andrés; Galán, Alba; Muñoz, Pilar; Gómez-Villamandos, Rafael J

    2011-06-01

    The use of spectral entropy to determine anaesthetic depth and antinociception was evaluated in sevoflurane-anaesthetised Beagle dogs. Dogs were anaesthetised at each of five multiples of their individual minimum alveolar concentration (MAC; 0.75, 1, 1.25, 1.5 and 1.75 MAC), and response entropy (RE), state entropy (SE), the RE-SE difference, burst suppression rate (BSR) and cardiorespiratory parameters were recorded before and after a painful stimulus. RE, SE and the RE-SE difference did not change significantly after the stimuli. The correlation between MAC and the entropy parameters was weak, but these values increased when the 1.75 MAC results were excluded from the analysis. BSR differed from zero at 1.5 and 1.75 MAC. It was concluded that RE and the RE-SE difference were not adequate indicators of antinociception, and that SE and RE were unable to detect deep planes of anaesthesia in dogs, although both distinguished the awake and unconscious states.

  16. Quantitative EEG analysis of the maturational changes associated with childhood absence epilepsy

    NASA Astrophysics Data System (ADS)

    Rosso, O. A.; Hyslop, W.; Gerlach, R.; Smith, R. L. L.; Rostas, J. A. P.; Hunter, M.

    2005-10-01

    This study aimed to examine the background electroencephalography (EEG) of children with childhood absence epilepsy, a condition whose presentation has strong developmental links. EEG hallmarks of absence seizure activity are widely accepted, and it is recognized that the bulk of the inter-ictal EEG in this group is normal to the naked eye. This multidisciplinary study used the normalized total wavelet entropy (NTWS) (Signal Processing 83 (2003) 1275) to examine the background EEG of patients demonstrating absence seizure activity and compared it with that of children without absence epilepsy. This quantity measures the degree of order in a system, with higher entropy indicating a more disordered (chaotic) system. Results were subjected to further statistical significance analyses. Entropy values were calculated for patients versus controls. For all channels combined, patients with absence epilepsy showed statistically significantly lower entropy values than controls. The size of the difference in entropy values was not uniform, with certain EEG electrodes consistently showing greater differences than others.
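
The cited NTWS is built on a full wavelet decomposition. As an illustrative sketch only, the following computes the normalized Shannon entropy of relative wavelet energies using a plain Haar transform; the study's actual wavelet basis, level count, and normalization may differ:

```python
import numpy as np

def haar_detail_energies(x, levels):
    """Energy of the Haar wavelet detail coefficients at each
    decomposition level (final approximation energy is ignored)."""
    x = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        if len(x) % 2:          # trim to an even length
            x = x[:-1]
        approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
        energies.append(np.sum(detail ** 2))
        x = approx
    return np.array(energies)

def normalized_wavelet_entropy(x, levels=5):
    """Shannon entropy of the relative wavelet energies, normalized
    by ln(levels): 0 = energy in a single band, 1 = spread evenly."""
    e = haar_detail_energies(x, levels)
    p = e / e.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(levels)
```

A signal whose energy is confined to one frequency band scores near 0, while a broadband (disordered) signal scores closer to 1, matching the abstract's interpretation of entropy as degree of disorder.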

  17. A Discrete Constraint for Entropy Conservation and Sound Waves in Cloud-Resolving Modeling

    NASA Technical Reports Server (NTRS)

    Zeng, Xi-Ping; Tao, Wei-Kuo; Simpson, Joanne

    2003-01-01

    Ideal cloud-resolving models accumulate little error. When their domain is large enough to accommodate synoptic large-scale circulations, they can be used to simulate the interaction between convective clouds and the large-scale circulations. This paper sets up a framework for such models, using moist entropy as a prognostic variable and employing conservative numerical schemes. The models accumulate no errors in thermodynamic variables when they comply with a discrete constraint on entropy conservation and sound waves. Equivalently, the discrete constraint is related to the correct representation of the large-scale convergence and advection of moist entropy. Since air density is involved in both entropy conservation and sound waves, the challenge is how to compute sound waves efficiently under the constraint. To address this challenge, a compensation method is introduced on the basis of a reference isothermal atmosphere whose governing equations are solved analytically. Stability analysis and numerical experiments show that the method allows the models to integrate efficiently with a large time step.

  18. Toward Improved Parameterization of a Meso-Scale Hydrologic Model in a Discontinuous Permafrost, Boreal Forest Ecosystem

    NASA Astrophysics Data System (ADS)

    Endalamaw, A. M.; Bolton, W. R.; Young, J. M.; Morton, D.; Hinzman, L. D.

    2013-12-01

    The sub-arctic environment can be characterized as being located in the zone of discontinuous permafrost. Although the distribution of permafrost is site-specific, it dominates many hydrologic and ecologic responses and functions, including vegetation distribution, stream flow, soil moisture, and storage processes. In this region, the boundaries that separate the major ecosystem types (deciduous-dominated and coniferous-dominated ecosystems) as well as permafrost (permafrost versus non-permafrost) occur over very short spatial scales. One of the goals of this research project is to improve the parameterization of meso-scale hydrologic models in this environment. Using the Caribou-Poker Creeks Research Watershed (CPCRW) as the test area, simulations of the headwater catchments of varying permafrost and vegetation distributions were performed. CPCRW, located approximately 50 km northeast of Fairbanks, Alaska, lies within the zone of discontinuous permafrost and the boreal forest ecosystem. The Variable Infiltration Capacity (VIC) model was selected as the hydrologic model. In CPCRW, permafrost and coniferous vegetation are generally found on north-facing slopes and valley bottoms, while permafrost-free soils and deciduous vegetation are generally found on south-facing slopes. In this study, hydrologic simulations using fine-scale vegetation and soil parameterizations, based upon slope and aspect analysis at a 50 meter resolution, were conducted. Simulations were also conducted using downscaled vegetation from the Scenarios Network for Alaska and Arctic Planning (SNAP) (1 km resolution) and soil data sets from the Food and Agriculture Organization (FAO) (approximately 9 km resolution).
    Preliminary simulation results show that soil and vegetation parameterizations based upon fine-scale slope/aspect analysis increase the R2 values (0.5 to 0.65 in the high-permafrost (53%) basin; 0.43 to 0.56 in the low-permafrost (2%) basin) relative to parameterizations based on coarse-scale data. These results suggest that fine-resolution parameterizations can improve meso-scale hydrological modeling in this region.

  19. The effect of dexmedetomidine continuous infusion as an adjuvant to general anesthesia on sevoflurane requirements: A study based on entropy analysis.

    PubMed

    Patel, Chirag Ramanlal; Engineer, Smita R; Shah, Bharat J; Madhu, S

    2013-07-01

    Dexmedetomidine, an α2 agonist used as an adjuvant in general anesthesia, has anesthetic- and analgesic-sparing properties. The aim was to evaluate the effect of a continuous infusion of dexmedetomidine alone, without opioids, on the requirement of sevoflurane during general anesthesia, with continuous monitoring of the depth of anesthesia by entropy analysis. Sixty patients were randomly divided into two groups of 30 each. In group A, fentanyl 2 mcg/kg was given, while in group B, dexmedetomidine was given intravenously as a loading dose of 1 mcg/kg over 10 min prior to induction. After induction with thiopentone, group B received dexmedetomidine as an infusion at a dose of 0.2-0.8 mcg/kg. Sevoflurane was used as the inhalation agent in both groups. Hemodynamic variables, sevoflurane inspired fraction (FIsevo), sevoflurane expired fraction (ETsevo), and entropy (response entropy and state entropy) were continuously recorded. Statistical analysis was done by the unpaired Student's t-test and the Chi-square test for continuous and categorical variables, respectively. A P-value < 0.05 was considered significant. The use of dexmedetomidine with sevoflurane was associated with a statistically significant decrease in ETsevo at 5 minutes post-intubation (1.49 ± 0.11) and 60 minutes post-intubation (1.11 ± 0.28) as compared with group A (1.73 ± 0.30 at 5 minutes; 1.68 ± 0.50 at 60 minutes). There was an average 21.5% decrease in ETsevo in group B as compared with group A. Dexmedetomidine, as an adjuvant in general anesthesia, decreases the requirement of sevoflurane for maintaining an adequate depth of anesthesia.

  20. Ovarian Cancer Differential Interactome and Network Entropy Analysis Reveal New Candidate Biomarkers.

    PubMed

    Ayyildiz, Dilara; Gov, Esra; Sinha, Raghu; Arga, Kazim Yalcin

    2017-05-01

    Ovarian cancer is one of the most common cancers and has a high mortality rate due to insidious symptoms and a lack of robust diagnostics. A hitherto understudied concept in cancer pathogenesis may offer new avenues for innovation in ovarian cancer biomarker development. Cancer cells are characterized by an increase in network entropy, and several studies have exploited this concept to identify disease-associated gene and protein modules. We report in this study the changes in protein-protein interactions (PPIs) in ovarian cancer within a differential network (interactome) analysis framework utilizing the entropy concept and gene expression data. A compendium of six transcriptome datasets that included 140 samples from laser-microdissected epithelial cells of ovarian cancer patients and 51 samples from a healthy population was obtained from Gene Expression Omnibus, and the high-confidence human protein interactome (31,465 interactions among 10,681 proteins) was used. The uncertainties of the up- or downregulation of PPIs in ovarian cancer were estimated through an entropy formulation utilizing combined expression levels of genes, and the interacting protein pairs with minimum uncertainty were identified. We identified 105 proteins with differential PPI patterns scattered across 11 modules, each indicating significantly affected biological pathways in ovarian cancer such as DNA repair, cell proliferation-related mechanisms, nucleoplasmic translocation of the estrogen receptor, extracellular matrix degradation, and the inflammation response. In conclusion, we suggest several PPIs as biomarker candidates for ovarian cancer and discuss their biological implications as potential molecular targets for pharmaceutical development. In addition, network entropy analysis is a concept that deserves greater research attention for diagnostic innovation in oncology and tumor pathogenesis.

  1. A duality principle for the multi-block entanglement entropy of free fermion systems.

    PubMed

    Carrasco, J A; Finkel, F; González-López, A; Tempesta, P

    2017-09-11

    The analysis of the entanglement entropy of a subsystem of a one-dimensional quantum system is a powerful tool for unravelling its critical nature. For instance, the scaling behaviour of the entanglement entropy determines the central charge of the associated Virasoro algebra. For a free fermion system, the entanglement entropy depends essentially on two sets, namely the set A of sites of the subsystem considered and the set K of excited momentum modes. In this work we make use of a general duality principle establishing the invariance of the entanglement entropy under exchange of the sets A and K to tackle complex problems by studying their dual counterparts. The duality principle is also a key ingredient in the formulation of a novel conjecture for the asymptotic behavior of the entanglement entropy of a free fermion system in the general case in which both sets A and K consist of an arbitrary number of blocks. We have verified that this conjecture reproduces the numerical results with excellent precision for all the configurations analyzed. We have also applied the conjecture to deduce several asymptotic formulas for the mutual and r-partite information generalizing the known ones for the single block case.

  2. Power-law scaling for macroscopic entropy and microscopic complexity: Evidence from human movement and posture

    NASA Astrophysics Data System (ADS)

    Hong, S. Lee; Bodfish, James W.; Newell, Karl M.

    2006-03-01

    We investigated the relationship between macroscopic entropy and microscopic complexity of the dynamics of body rocking and sitting still across adults with stereotyped movement disorder and mental retardation (profound and severe) against controls matched for age, height, and weight. This analysis was performed through the examination of center of pressure (COP) motion on the mediolateral (side-to-side) and anteroposterior (fore-aft) dimensions and the entropy of the relative phase between the two dimensions of motion. Intentional body rocking and stereotypical body rocking possessed similar slopes for their respective frequency spectra, but differences were revealed during maintenance of sitting postures. The dynamics of sitting in the control group produced lower spectral slopes and higher complexity (approximate entropy). In the controls, the higher complexity found on each dimension of motion was related to a weaker coupling between dimensions. Information entropy of the relative phase between the two dimensions of COP motion and irregularity (complexity) of their respective motions fitted a power-law function, revealing a relationship between macroscopic entropy and microscopic complexity across both groups and behaviors. This power-law relation affords the postulation that the organization of movement and posture dynamics occurs as a fractal process.

  3. Loss of conformational entropy in protein folding calculated using realistic ensembles and its implications for NMR-based calculations

    PubMed Central

    Baxa, Michael C.; Haddadian, Esmael J.; Jumper, John M.; Freed, Karl F.; Sosnick, Tobin R.

    2014-01-01

    The loss of conformational entropy is a major contribution to the thermodynamics of protein folding. However, accurate determination of this quantity has proven challenging. We calculate this loss using molecular dynamics simulations of both the native protein and a realistic denatured state ensemble. For ubiquitin, the total change in entropy is TΔS_Total = 1.4 kcal·mol⁻¹ per residue at 300 K, with only 20% from the loss of side-chain entropy. Our analysis exhibits mixed agreement with prior studies because of the use of more accurate ensembles and contributions from correlated motions. Buried side chains lose only a factor of 1.4 in the number of conformations available per rotamer upon folding (Ω_U/Ω_N). The entropy loss for helical and sheet residues differs due to the smaller motions of helical residues (TΔS_helix−sheet = 0.5 kcal·mol⁻¹), a property not fully reflected in the amide N-H and carbonyl C=O bond NMR order parameters. The results have implications for the thermodynamics of folding and binding, including estimates of solvent ordering and microscopic entropies obtained from NMR. PMID:25313044

  4. Automatic event detection in low SNR microseismic signals based on multi-scale permutation entropy and a support vector machine

    NASA Astrophysics Data System (ADS)

    Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming

    2017-07-01

    Microseismic monitoring is an effective means of providing early warning of rock or coal dynamical disasters, and its first step is microseismic event detection, although low-SNR microseismic signals often cannot be detected effectively by routine methods. To solve this problem, this paper combines permutation entropy and a support vector machine to detect low-SNR microseismic events. First, a signal-feature extraction method based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal's permutation entropy. Second, a detection model for low-SNR microseismic events based on the least squares support vector machine is built by computing multi-scale permutation entropy for the collected vibration signals and constructing a feature vector set. Finally, a comparative analysis of the microseismic events and noise signals in the experiment proves that the different characteristics of the two can be fully expressed using multi-scale permutation entropy. The detection model combined with the support vector machine, which features high classification accuracy and fast real-time algorithms, can meet the requirements of online, real-time extraction of microseismic events.
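
The multi-scale permutation entropy feature described above can be sketched as follows: Bandt & Pompe ordinal patterns combined with non-overlapping-mean coarse-graining. The order, delay, and scale defaults below are illustrative, as the record does not specify the authors' parameter choices:

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy (Bandt & Pompe, 2002): Shannon
    entropy of the ordinal-pattern distribution, divided by ln(order!)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        # rank order of each length-`order` window is its ordinal pattern
        pattern = tuple(np.argsort(x[i:i + order * delay:delay]))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values())) / n
    return -np.sum(p * np.log(p)) / np.log(factorial(order))

def multiscale_permutation_entropy(x, scales=5, order=3):
    """Permutation entropy of coarse-grained (non-overlapping mean)
    versions of the signal, one value per scale factor."""
    x = np.asarray(x, dtype=float)
    pe = []
    for s in range(1, scales + 1):
        g = x[:len(x) // s * s].reshape(-1, s).mean(axis=1)
        pe.append(permutation_entropy(g, order))
    return np.array(pe)
```

The vector returned by `multiscale_permutation_entropy` is the kind of feature vector that could be fed to an SVM-style classifier, as the abstract describes.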

  5. Comparison of Different Features and Classifiers for Driver Fatigue Detection Based on a Single EEG Channel

    PubMed Central

    2017-01-01

    Driver fatigue has become an important factor in traffic accidents worldwide, and effective detection of driver fatigue has major significance for public health. The proposed method employs entropy measures for feature extraction from a single electroencephalogram (EEG) channel. Four entropy measures, sample entropy (SE), fuzzy entropy (FE), approximate entropy (AE), and spectral entropy (PE), were deployed for the analysis of the original EEG signal and compared across ten state-of-the-art classifiers. Results indicate that optimal single-channel performance is achieved using a combination of channel CP4, feature FE, and the Random Forest (RF) classifier. The highest accuracy reaches 96.6%, which meets the needs of real applications. The best combination of channel, feature, and classifier is subject-specific. In this work, the accuracy of FE as the feature is far greater than that of the other features. The accuracy using the RF classifier is the best, while that of the SVM classifier with a linear kernel is the worst. Channel selection has a large impact on accuracy, and the performance of the various channels differs considerably. PMID:28255330

  6. Comparison of Different Features and Classifiers for Driver Fatigue Detection Based on a Single EEG Channel.

    PubMed

    Hu, Jianfeng

    2017-01-01

    Driver fatigue has become an important factor in traffic accidents worldwide, and effective detection of driver fatigue has major significance for public health. The proposed method employs entropy measures for feature extraction from a single electroencephalogram (EEG) channel. Four entropy measures, sample entropy (SE), fuzzy entropy (FE), approximate entropy (AE), and spectral entropy (PE), were deployed for the analysis of the original EEG signal and compared across ten state-of-the-art classifiers. Results indicate that optimal single-channel performance is achieved using a combination of channel CP4, feature FE, and the Random Forest (RF) classifier. The highest accuracy reaches 96.6%, which meets the needs of real applications. The best combination of channel, feature, and classifier is subject-specific. In this work, the accuracy of FE as the feature is far greater than that of the other features. The accuracy using the RF classifier is the best, while that of the SVM classifier with a linear kernel is the worst. Channel selection has a large impact on accuracy, and the performance of the various channels differs considerably.
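
Of the four measures this record compares, spectral entropy is the simplest to state. The following is a generic sketch (PSD treated as a probability distribution, normalized by the log of the bin count), not necessarily the estimator used in the study:

```python
import numpy as np

def spectral_entropy(x):
    """Normalized spectral entropy: the power spectral density is
    treated as a probability distribution over frequency bins and
    its Shannon entropy is divided by ln(number of bins)."""
    x = np.asarray(x, dtype=float)
    psd = np.abs(np.fft.rfft(x - x.mean())) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(len(psd))
```

A narrowband signal (power in one bin) scores near 0; broadband noise scores near 1, so the measure tracks how spread out the signal's power is across frequencies.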

  7. Studying the dynamics of interbeat interval time series of healthy and congestive heart failure subjects using scale based symbolic entropy analysis

    PubMed Central

    Awan, Imtiaz; Aziz, Wajid; Habib, Nazneen; Alowibdi, Jalal S.; Saeed, Sharjil; Nadeem, Malik Sajjad Ahmed; Shah, Syed Ahsin Ali

    2018-01-01

    Considerable interest has been devoted to developing a deeper understanding of the dynamics of healthy biological systems and how these dynamics are affected by aging and disease. Entropy-based complexity measures have been widely used for quantifying the dynamics of physical and biological systems. These techniques have provided valuable information leading to a fuller understanding of the dynamics of these systems and of the underlying stimuli responsible for anomalous behavior. Single-scale traditional entropy measures have yielded contradictory results about the dynamics of real-world time series from healthy and pathological subjects. The multiscale entropy (MSE) algorithm was introduced for a precise description of the complexity of biological signals and has been used in numerous fields since its inception. The original MSE quantifies the complexity of coarse-grained time series using sample entropy. It may be unreliable for short signals because the length of the coarse-grained time series decreases with increasing scale factor τ; however, MSE works well for long signals. To overcome this drawback, various variants of the method have been proposed for evaluating complexity efficiently. In this study, we propose multiscale normalized corrected Shannon entropy (MNCSE), in which, instead of sample entropy, the symbolic entropy measure NCSE is used as the entropy estimate. The results of the study are compared with traditional MSE. The effectiveness of the proposed approach is demonstrated using noise signals as well as interbeat interval signals from healthy and pathological subjects. The preliminary results indicate that MNCSE values are more stable and reliable than the original MSE values, and that MNCSE-based features lead to higher classification accuracies than MSE-based features. PMID:29771977
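
The coarse-graining step, whose shrinking output length the abstract identifies as the weakness of the original MSE for short recordings, can be written in a few lines (a generic sketch, not the authors' MNCSE code):

```python
import numpy as np

def coarse_grain(x, tau):
    """MSE coarse-graining: non-overlapping averages of length tau.
    The output has only len(x) // tau points, which is why entropy
    estimates at large tau become unreliable for short recordings."""
    x = np.asarray(x, dtype=float)
    n = len(x) // tau
    return x[:n * tau].reshape(n, tau).mean(axis=1)
```

For example, a 1000-beat interbeat series at scale τ = 20 leaves only 50 points on which to estimate entropy, which motivates the more stable symbolic estimator the study proposes.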

  8. 15N backbone dynamics of the S-peptide from ribonuclease A in its free and S-protein bound forms: toward a site-specific analysis of entropy changes upon folding.

    PubMed Central

    Alexandrescu, A. T.; Rathgeb-Szabo, K.; Rumpel, K.; Jahnke, W.; Schulthess, T.; Kammerer, R. A.

    1998-01-01

    Backbone 15N relaxation parameters (R1, R2, 1H-15N NOE) have been measured for a 22-residue recombinant variant of the S-peptide in its free and S-protein bound forms. NMR relaxation data were analyzed using the "model-free" approach (Lipari & Szabo, 1982). Order parameters obtained from "model-free" simulations were used to calculate 1H-15N bond vector entropies using a recently described method (Yang & Kay, 1996), in which the form of the probability density function for bond vector fluctuations is derived from a diffusion-in-a-cone motional model. The average change in 1H-15N bond vector entropies for residues T3-S15, which become ordered upon binding of the S-peptide to the S-protein, is -12.6 ± 1.4 J/(mol·residue·K). 15N relaxation data suggest a gradient of decreasing entropy values moving from the termini toward the center of the free peptide. The difference between the entropies of the terminal and central residues is about -12 J/(mol·residue·K), a value comparable to the average entropy change per residue upon complex formation. Similar entropy gradients are evident in NMR relaxation studies of other denatured proteins. Taken together, these observations suggest that denatured proteins may contain entropic contributions from non-local interactions. Consequently, calculations that model the entropy of a residue in a denatured protein as that of a residue in a di- or tri-peptide might overestimate the magnitude of entropy changes upon folding. PMID:9521116

  9. Analysis of the phase transition in the two-dimensional Ising ferromagnet using a Lempel-Ziv string-parsing scheme and black-box data-compression utilities

    NASA Astrophysics Data System (ADS)

    Melchert, O.; Hartmann, A. K.

    2015-02-01

    In this work we consider information-theoretic observables to analyze short symbolic sequences, comprising time series that represent the orientation of a single spin in a two-dimensional (2D) Ising ferromagnet on a square lattice of size L² = 128², for different system temperatures T. The latter were chosen from an interval enclosing the critical point Tc of the model. At small temperatures the sequences are thus very regular; at high temperatures they are maximally random. In the vicinity of the critical point, nontrivial, long-range correlations appear. Here we implement estimators for the entropy rate, excess entropy (i.e., "complexity"), and multi-information. First, we implement a Lempel-Ziv string-parsing scheme, providing seemingly elaborate entropy rate and multi-information estimates and an approximate estimator for the excess entropy. Furthermore, we apply easy-to-use black-box data-compression utilities, providing approximate estimators only. For comparison and to yield results for benchmarking purposes, we also implement the information-theoretic observables based on the well-established M-block Shannon entropy, which is more tedious to apply than the first two "algorithmic" entropy estimation procedures. To test how well the potential of such data-compression techniques can be exploited, we aim at detecting the critical point of the 2D Ising ferromagnet. Among the above observables, the multi-information, which is known to exhibit an isolated peak at the critical point, is very easy to replicate by means of both efficient algorithmic entropy estimation procedures. Finally, we assess how well the various algorithmic entropy estimates compare with the more conventional block entropy estimates and illustrate a simple modification that yields enhanced results.
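
A black-box compression estimator of the kind the abstract benchmarks can be sketched with zlib. This is an illustrative upper-bound estimator for a binary spin time series mapped to 0/1 bytes, not the authors' exact utilities or Lempel-Ziv parsing scheme:

```python
import zlib

def compression_entropy_rate(symbols, level=9):
    """Rough upper bound on the entropy rate (bits per symbol) of a
    symbolic sequence, estimated from the output size of a black-box
    compressor (DEFLATE via zlib). The compressed size of an empty
    input is subtracted to reduce the constant header overhead."""
    data = bytes(symbols)  # symbols assumed to be small ints (0..255)
    overhead = len(zlib.compress(b"", level))
    n_bits = 8 * (len(zlib.compress(data, level)) - overhead)
    return max(n_bits, 0) / len(symbols)
```

A low-temperature (nearly constant) spin sequence compresses to almost nothing, giving a rate near 0, while a high-temperature (random) sequence resists compression, giving a rate near 1 bit per spin plus compressor overhead.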

  10. Studying the dynamics of interbeat interval time series of healthy and congestive heart failure subjects using scale based symbolic entropy analysis.

    PubMed

    Awan, Imtiaz; Aziz, Wajid; Shah, Imran Hussain; Habib, Nazneen; Alowibdi, Jalal S; Saeed, Sharjil; Nadeem, Malik Sajjad Ahmed; Shah, Syed Ahsin Ali

    2018-01-01

    Considerable interest has been devoted to developing a deeper understanding of the dynamics of healthy biological systems and how these dynamics are affected by aging and disease. Entropy-based complexity measures have been widely used for quantifying the dynamics of physical and biological systems. These techniques have provided valuable information leading to a fuller understanding of the dynamics of these systems and of the underlying stimuli responsible for anomalous behavior. Single-scale traditional entropy measures have yielded contradictory results about the dynamics of real-world time series from healthy and pathological subjects. The multiscale entropy (MSE) algorithm was introduced for a precise description of the complexity of biological signals and has been used in numerous fields since its inception. The original MSE quantifies the complexity of coarse-grained time series using sample entropy. It may be unreliable for short signals because the length of the coarse-grained time series decreases with increasing scale factor τ; however, MSE works well for long signals. To overcome this drawback, various variants of the method have been proposed for evaluating complexity efficiently. In this study, we propose multiscale normalized corrected Shannon entropy (MNCSE), in which, instead of sample entropy, the symbolic entropy measure NCSE is used as the entropy estimate. The results of the study are compared with traditional MSE. The effectiveness of the proposed approach is demonstrated using noise signals as well as interbeat interval signals from healthy and pathological subjects. The preliminary results indicate that MNCSE values are more stable and reliable than the original MSE values, and that MNCSE-based features lead to higher classification accuracies than MSE-based features.

  11. It is not the entropy you produce, rather, how you produce it

    PubMed Central

    Volk, Tyler; Pauluis, Olivier

    2010-01-01

    The principle of maximum entropy production (MEP) seeks to better understand a large variety of the Earth's environmental and ecological systems by postulating that processes far from thermodynamic equilibrium will ‘adapt to steady states at which they dissipate energy and produce entropy at the maximum possible rate’. Our aim in this ‘outside view’, invited by Axel Kleidon, is to focus on what we think is an outstanding challenge for MEP and for irreversible thermodynamics in general: making specific predictions about the relative contribution of individual processes to entropy production. Using studies that compared entropy production in the atmosphere of a dry versus humid Earth, we show that two systems might have the same entropy production rate but very different internal dynamics of dissipation. Using the results of several of the papers in this special issue and a thought experiment, we show that components of life-containing systems can evolve to either lower or raise the entropy production rate. Our analysis makes explicit fundamental questions for MEP that should be brought into focus: can MEP predict not just the overall state of entropy production of a system but also the details of the sub-systems of dissipaters within the system? Which fluxes of the system are those that are most likely to be maximized? How it is possible for MEP theory to be so domain-neutral that it can claim to apply equally to both purely physical–chemical systems and also systems governed by the ‘laws’ of biological evolution? We conclude that the principle of MEP needs to take on the issue of exactly how entropy is produced. PMID:20368249

  12. Automated EEG entropy measurements in coma, vegetative state/unresponsive wakefulness syndrome and minimally conscious state

    PubMed Central

    Gosseries, Olivia; Schnakers, Caroline; Ledoux, Didier; Vanhaudenhuyse, Audrey; Bruno, Marie-Aurélie; Demertzi, Athéna; Noirhomme, Quentin; Lehembre, Rémy; Damas, Pierre; Goldman, Serge; Peeters, Erika; Moonen, Gustave; Laureys, Steven

    Summary Monitoring the level of consciousness in brain-injured patients with disorders of consciousness is crucial as it provides diagnostic and prognostic information. Behavioral assessment remains the gold standard for assessing consciousness but previous studies have shown a high rate of misdiagnosis. This study aimed to investigate the usefulness of electroencephalography (EEG) entropy measurements in differentiating unconscious (coma or vegetative) from minimally conscious patients. Left fronto-temporal EEG recordings (10-minute resting state epochs) were prospectively obtained in 56 patients and 16 age-matched healthy volunteers. Patients were assessed in the acute (≤1 month post-injury; n=29) or chronic (>1 month post-injury; n=27) stage. The etiology was traumatic in 23 patients. Automated online EEG entropy calculations (providing an arbitrary value ranging from 0 to 91) were compared with behavioral assessments (Coma Recovery Scale-Revised) and outcome. EEG entropy correlated with Coma Recovery Scale total scores (r=0.49). Mean EEG entropy values were higher in minimally conscious (73±19; mean and standard deviation) than in vegetative/unresponsive wakefulness syndrome patients (45±28). Receiver operating characteristic analysis revealed an entropy cut-off value of 52 differentiating acute unconscious from minimally conscious patients (sensitivity 89% and specificity 90%). In chronic patients, entropy measurements offered no reliable diagnostic information. EEG entropy measurements did not allow prediction of outcome. User-independent time-frequency balanced spectral EEG entropy measurements seem to constitute an interesting diagnostic – albeit not prognostic – tool for assessing neural network complexity in disorders of consciousness in the acute setting. 
Future studies are needed before using this tool in routine clinical practice, and these should seek to improve automated EEG quantification paradigms in order to reduce the remaining false negative and false positive findings. PMID:21693085

  13. Predicting depressed patients with suicidal ideation from ECG recordings.

    PubMed

    Khandoker, A H; Luthra, V; Abouallaban, Y; Saha, S; Ahmed, K I; Mostafa, R; Chowdhury, N; Jelinek, H F

    2017-05-01

    Globally, suicidal behavior is the third most common cause of death among patients with major depressive disorder (MDD). This study presents multi-lag tone-entropy (T-E) analysis of heart rate variability (HRV) as a screening tool for identifying MDD patients with suicidal ideation. Sixty-one ECG recordings (10 min) were acquired and analyzed from control subjects (29 CONT), 16 MDD subjects with (MDDSI+) and 16 without suicidal ideation (MDDSI-). After ECG preprocessing, tone and entropy values were calculated for multiple lags (m: 1-10). The MDDSI+ group was found to have a higher mean tone value compared to that of the MDDSI- group for lags 1-8, whereas the mean entropy value was lower in MDDSI+ than that in the CONT group at all lags (1-10). Leave-one-out cross-validation tests, using a classification and regression tree (CART), obtained 94.83 % accuracy in predicting MDDSI+ subjects by using a combination of tone and entropy values at all lags together with demographic factors (age, BMI and waist circumference), outperforming time- and frequency-domain HRV analysis. The results of this pilot study demonstrate the usefulness of multi-lag T-E analysis in identifying MDD patients with suicidal ideation and highlight the change in autonomic nervous system modulation of the heart rate associated with depression and suicidal ideation.
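
    The multi-lag tone-entropy computation can be sketched from the standard tone-entropy definition: a percentage index PI of lagged RR differences, whose mean is the tone and whose histogram Shannon entropy is the entropy. This is a generic sketch, not the authors' code; the 1% histogram bin width is our assumption:

```python
import numpy as np

def tone_entropy(rr, lag=1, bin_width=1.0):
    """Tone and entropy of an RR-interval series at a given lag.

    PI(n) = 100 * (RR(n) - RR(n+lag)) / RR(n)  (percentage index);
    tone is the mean PI, entropy is the Shannon entropy (bits) of the
    PI histogram. The 1% bin width is an assumption of this sketch.
    """
    rr = np.asarray(rr, dtype=float)
    pi = 100.0 * (rr[:-lag] - rr[lag:]) / rr[:-lag]
    tone = pi.mean()
    # Histogram of PI values on a fixed-width grid covering their range.
    edges = np.arange(np.floor(pi.min()), np.ceil(pi.max()) + bin_width, bin_width)
    counts, _ = np.histogram(pi, bins=edges)
    p = counts[counts > 0] / counts.sum()
    entropy = -np.sum(p * np.log2(p))
    return tone, entropy
```

    The multi-lag analysis of the abstract then amounts to evaluating `tone_entropy(rr, lag=m)` for m = 1..10 and feeding the resulting values to a classifier.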

  14. A Novel Shape Parameterization Approach

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1999-01-01

    This paper presents a novel parameterization approach for complex shapes suitable for a multidisciplinary design optimization application. The approach consists of two basic concepts: (1) parameterizing the shape perturbations rather than the geometry itself and (2) performing the shape deformation by means of the soft objects animation algorithms used in computer graphics. Because the formulation presented in this paper is independent of grid topology, we can treat computational fluid dynamics and finite element grids in a similar manner. The proposed approach is simple, compact, and efficient. Also, the analytical sensitivity derivatives are easily computed for use in a gradient-based optimization. This algorithm is suitable for low-fidelity (e.g., linear aerodynamics and equivalent laminated plate structures) and high-fidelity analysis tools (e.g., nonlinear computational fluid dynamics and detailed finite element modeling). This paper contains the implementation details of parameterizing for planform, twist, dihedral, thickness, and camber. The results are presented for a multidisciplinary design optimization application consisting of nonlinear computational fluid dynamics, detailed computational structural mechanics, performance, and a simple propulsion module.

  15. A subdivision-based parametric deformable model for surface extraction and statistical shape modeling of the knee cartilages

    NASA Astrophysics Data System (ADS)

    Fripp, Jurgen; Crozier, Stuart; Warfield, Simon K.; Ourselin, Sébastien

    2006-03-01

    Subdivision surfaces and parameterization are desirable for many algorithms that are commonly used in Medical Image Analysis. However, extracting an accurate surface and parameterization can be difficult for many anatomical objects of interest, due to noisy segmentations and the inherent variability of the object. The thin cartilages of the knee are an example of this, especially after damage is incurred from injuries or conditions like osteoarthritis. As a result, the cartilages can have different topologies or exist in multiple pieces. In this paper we present a topology preserving (genus 0) subdivision-based parametric deformable model that is used to extract the surfaces of the patella and tibial cartilages in the knee. These surfaces have minimal thickness in areas without cartilage. The algorithm inherently incorporates several desirable properties, including: shape based interpolation, sub-division remeshing and parameterization. To illustrate the usefulness of this approach, the surfaces and parameterizations of the patella cartilage are used to generate a 3D statistical shape model.

  16. Error in Radar-Derived Soil Moisture due to Roughness Parameterization: An Analysis Based on Synthetical Surface Profiles

    PubMed Central

    Lievens, Hans; Vernieuwe, Hilde; Álvarez-Mozos, Jesús; De Baets, Bernard; Verhoest, Niko E.C.

    2009-01-01

    In the past decades, many studies on soil moisture retrieval from SAR demonstrated a poor correlation between the top layer soil moisture content and observed backscatter coefficients, which mainly has been attributed to difficulties involved in the parameterization of surface roughness. The present paper describes a theoretical study, performed on synthetical surface profiles, which investigates how errors on roughness parameters are introduced by standard measurement techniques, and how they will propagate through the commonly used Integral Equation Model (IEM) into a corresponding soil moisture retrieval error for some of the currently most used SAR configurations. Key aspects influencing the error on the roughness parameterization and consequently on soil moisture retrieval are: the length of the surface profile, the number of profile measurements, the horizontal and vertical accuracy of profile measurements and the removal of trends along profiles. Moreover, it is found that soil moisture retrieval with C-band configuration generally is less sensitive to inaccuracies in roughness parameterization than retrieval with L-band configuration. PMID:22399956

  17. A Review on the Nonlinear Dynamical System Analysis of Electrocardiogram Signal

    PubMed Central

    Mohapatra, Biswajit

    2018-01-01

    Electrocardiogram (ECG) signal analysis has received special attention of the researchers in the recent past because of its ability to divulge crucial information about the electrophysiology of the heart and the autonomic nervous system activity in a noninvasive manner. Analysis of the ECG signals has been explored using both linear and nonlinear methods. However, the nonlinear methods of ECG signal analysis are gaining popularity because of their robustness in feature extraction and classification. The current study presents a review of the nonlinear signal analysis methods, namely, reconstructed phase space analysis, Lyapunov exponents, correlation dimension, detrended fluctuation analysis (DFA), recurrence plot, Poincaré plot, approximate entropy, and sample entropy along with their recent applications in the ECG signal analysis. PMID:29854361

  18. A Review on the Nonlinear Dynamical System Analysis of Electrocardiogram Signal.

    PubMed

    Nayak, Suraj K; Bit, Arindam; Dey, Anilesh; Mohapatra, Biswajit; Pal, Kunal

    2018-01-01

    Electrocardiogram (ECG) signal analysis has received special attention of the researchers in the recent past because of its ability to divulge crucial information about the electrophysiology of the heart and the autonomic nervous system activity in a noninvasive manner. Analysis of the ECG signals has been explored using both linear and nonlinear methods. However, the nonlinear methods of ECG signal analysis are gaining popularity because of their robustness in feature extraction and classification. The current study presents a review of the nonlinear signal analysis methods, namely, reconstructed phase space analysis, Lyapunov exponents, correlation dimension, detrended fluctuation analysis (DFA), recurrence plot, Poincaré plot, approximate entropy, and sample entropy along with their recent applications in the ECG signal analysis.

  19. Permutation entropy and statistical complexity analysis of turbulence in laboratory plasmas and the solar wind.

    PubMed

    Weck, P J; Schaffner, D A; Brown, M R; Wicks, R T

    2015-02-01

    The Bandt-Pompe permutation entropy and the Jensen-Shannon statistical complexity are used to analyze fluctuating time series of three different turbulent plasmas: the magnetohydrodynamic (MHD) turbulence in the plasma wind tunnel of the Swarthmore Spheromak Experiment (SSX), drift-wave turbulence of ion saturation current fluctuations in the edge of the Large Plasma Device (LAPD), and fully developed turbulent magnetic fluctuations of the solar wind taken from the Wind spacecraft. The entropy and complexity values are presented as coordinates on the CH plane for comparison among the different plasma environments and other fluctuation models. The solar wind is found to have the highest permutation entropy and lowest statistical complexity of the three data sets analyzed. Both laboratory data sets have larger values of statistical complexity, suggesting that these systems have fewer degrees of freedom in their fluctuations, with SSX magnetic fluctuations having slightly less complexity than the LAPD edge I(sat). The CH plane coordinates are compared to the shape and distribution of a spectral decomposition of the wave forms. These results suggest that fully developed turbulence (solar wind) occupies the lower-right region of the CH plane, and that other plasma systems considered to be turbulent have less permutation entropy and more statistical complexity. This paper presents use of this statistical analysis tool on solar wind plasma, as well as on an MHD turbulent experimental plasma.
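
    The CH-plane coordinates used above can be sketched from the standard Bandt-Pompe construction: the normalized permutation entropy H and the Jensen-Shannon statistical complexity C of the ordinal-pattern distribution. This is a generic sketch of the published definitions, not the authors' analysis code; function names and defaults are our own:

```python
import numpy as np
from itertools import permutations
from math import factorial, log

def ordinal_distribution(x, d=5, tau=1):
    """Bandt-Pompe probabilities of the d! ordinal patterns (embedding delay tau)."""
    x = np.asarray(x, dtype=float)
    patterns = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - (d - 1) * tau):
        window = x[i:i + (d - 1) * tau + 1:tau]
        patterns[tuple(np.argsort(window).tolist())] += 1
    counts = np.array(list(patterns.values()), dtype=float)
    return counts / counts.sum()

def ch_plane(x, d=5, tau=1):
    """Normalized permutation entropy H and Jensen-Shannon complexity C."""
    p = ordinal_distribution(x, d, tau)
    n = factorial(d)
    nz = p[p > 0]
    s_p = -np.sum(nz * np.log(nz))        # Shannon entropy of P (nats)
    h = s_p / log(n)                      # normalized entropy in [0, 1]
    u = np.full(n, 1.0 / n)               # uniform reference distribution
    m = 0.5 * (p + u)
    s_m = -np.sum(m * np.log(m))          # entropy of the mixture (all m > 0)
    jsd = s_m - 0.5 * s_p - 0.5 * log(n)  # Jensen-Shannon divergence to uniform
    # Normalization constant: maximum JSD, attained by a delta distribution.
    q_max = -0.5 * ((n + 1) / n * log(n + 1) - 2 * log(2 * n) + log(n))
    c = (jsd / q_max) * h
    return h, c
```

    Plotting (H, C) pairs for each data set reproduces the CH-plane comparison described in the abstract: stochastic signals cluster at high H and low C, while structured dynamics move up and to the left.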

  20. Entropy-Based Adaptive Nuclear Texture Features are Independent Prognostic Markers in a Total Population of Uterine Sarcomas

    PubMed Central

    Nielsen, Birgitte; Hveem, Tarjei Sveinsgjerd; Kildal, Wanja; Abeler, Vera M; Kristensen, Gunnar B; Albregtsen, Fritz; Danielsen, Håvard E; Rohde, Gustavo K

    2015-01-01

    Nuclear texture analysis measures the spatial arrangement of the pixel gray levels in a digitized microscopic nuclear image and is a promising quantitative tool for prognosis of cancer. The aim of this study was to evaluate the prognostic value of entropy-based adaptive nuclear texture features in a total population of 354 uterine sarcomas. Isolated nuclei (monolayers) were prepared from 50 µm tissue sections and stained with Feulgen-Schiff. Local gray level entropy was measured within small windows of each nuclear image and stored in gray level entropy matrices, and two superior adaptive texture features were calculated from each matrix. The 5-year crude survival was significantly higher (P < 0.001) for patients with high texture feature values (72%) than for patients with low feature values (36%). When combining DNA ploidy classification (diploid/nondiploid) and texture (high/low feature value), the patients could be stratified into three risk groups with 5-year crude survival of 77, 57, and 34% (Hazard Ratios (HR) of 1, 2.3, and 4.1, P < 0.001). Entropy-based adaptive nuclear texture was an independent prognostic marker for crude survival in multivariate analysis including relevant clinicopathological features (HR = 2.1, P = 0.001), and should therefore be considered as a potential prognostic marker in uterine sarcomas. © The Authors. Published 2014 International Society for Advancement of Cytometry PMID:25483227
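
    The local gray level entropy measurement described above can be illustrated with a simple sliding-window computation. This sketches only the local-entropy step, not the paper's gray level entropy matrices or adaptive feature extraction; window size and quantization level are our assumptions:

```python
import numpy as np

def local_entropy_map(img, win=9, levels=16):
    """Shannon entropy (bits) of the gray-level histogram in each win x win window."""
    img = np.asarray(img)
    # Quantize to a small number of gray levels so the window histogram is populated.
    q = (img.astype(float) / (img.max() + 1e-12) * (levels - 1)).astype(int)
    h, w = q.shape
    half = win // 2
    out = np.zeros((h, w))
    for i in range(half, h - half):
        for j in range(half, w - half):
            patch = q[i - half:i + half + 1, j - half:j + half + 1]
            counts = np.bincount(patch.ravel(), minlength=levels)
            p = counts[counts > 0] / counts.sum()
            out[i, j] = -np.sum(p * np.log2(p))
    return out
```

    Flat nuclear regions yield entropy near zero while heterogeneous chromatin texture yields high local entropy, which is the contrast the prognostic features build on.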

  1. An Information Transmission Measure for the Analysis of Effective Connectivity among Cortical Neurons

    PubMed Central

    Law, Andrew J.; Sharma, Gaurav; Schieber, Marc H.

    2014-01-01

    We present a methodology for detecting effective connections between simultaneously recorded neurons using an information transmission measure to identify the presence and direction of information flow from one neuron to another. Using simulated and experimentally-measured data, we evaluate the performance of our proposed method and compare it to the traditional transfer entropy approach. In simulations, our measure of information transmission outperforms transfer entropy in identifying the effective connectivity structure of a neuron ensemble. For experimentally recorded data, where ground truth is unavailable, the proposed method also yields a more plausible connectivity structure than transfer entropy. PMID:21096617

  2. Minimum entropy density method for the time series analysis

    NASA Astrophysics Data System (ADS)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

    The entropy density is an intuitive and powerful concept for studying the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, defined as the scale at which the uncertainty is minimized and hence the underlying pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard and Poor’s 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.

  3. A volumetric conformal mapping approach for clustering white matter fibers in the brain

    PubMed Central

    Gupta, Vikash; Prasad, Gautam; Thompson, Paul

    2017-01-01

    The human brain may be considered as a genus-0 shape, topologically equivalent to a sphere. Various methods have been used in the past to transform the brain surface to that of a sphere using the harmonic energy minimization methods employed for cortical surface matching. However, very few methods have studied volumetric parameterization of the brain using a spherical embedding. Volumetric parameterization is typically used for complicated geometric problems like shape matching, morphing and isogeometric analysis. Using conformal mapping techniques, we can establish a bijective mapping between the brain and the topologically equivalent sphere. Our hypothesis is that shape analysis problems are simplified when the shape is defined in an intrinsic coordinate system. Our goal is to establish such a coordinate system for the brain. The efficacy of the method is demonstrated with a white matter clustering problem. Initial results show promise for future investigation of this parameterization technique and its application to other problems related to computational anatomy, like registration and segmentation. PMID:29177252

  4. Entropy Information of Cardiorespiratory Dynamics in Neonates during Sleep.

    PubMed

    Lucchini, Maristella; Pini, Nicolò; Fifer, William P; Burtchen, Nina; Signorini, Maria G

    2017-05-01

    Sleep is a central activity in human adults and characterizes most of the newborn infant life. During sleep, autonomic control acts to modulate heart rate variability (HRV) and respiration. Mechanisms underlying cardiorespiratory interactions in different sleep states have been studied but are not yet fully understood. Signal processing approaches have focused on cardiorespiratory analysis to elucidate this co-regulation. This manuscript proposes to analyze heart rate (HR), respiratory variability and their interrelationship in newborn infants to characterize cardiorespiratory interactions in different sleep states (active vs. quiet). We are searching for indices that could detect regulation alteration or malfunction, potentially leading to infant distress. We have analyzed inter-beat (RR) interval series and respiration in a population of 151 newborns, and followed up with 33 at 1 month of age. RR interval series were obtained by recognizing peaks of the QRS complex in the electrocardiogram (ECG), corresponding to the ventricles depolarization. Univariate time domain, frequency domain and entropy measures were applied. In addition, Transfer Entropy was considered as a bivariate approach able to quantify the bidirectional information flow from one signal (respiration) to another (RR series). Results confirm the validity of the proposed approach. Overall, HRV is higher in active sleep, while high frequency (HF) power characterizes more quiet sleep. Entropy analysis provides higher indices for SampEn and Quadratic Sample entropy (QSE) in quiet sleep. Transfer Entropy values were higher in quiet sleep and point to a major influence of respiration on the RR series. At 1 month of age, time domain parameters show an increase in HR and a decrease in variability. No entropy differences were found across ages. The parameters employed in this study help to quantify the potential for infants to adapt their cardiorespiratory responses as they mature. 
Thus, they could be useful as early markers of risk for infant cardiorespiratory vulnerabilities.
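
    The bivariate Transfer Entropy quantifying directed information flow from respiration to the RR series can be sketched with a simple plug-in estimator. This is a generic illustration of transfer entropy, not the estimator used in the study; the history length of 1 and the equal-frequency binning are our assumptions:

```python
import numpy as np
from collections import Counter
from math import log2

def transfer_entropy(source, target, bins=4):
    """TE (bits) from source to target with history length 1,
    estimated from equal-frequency (quantile) binned signals."""
    def discretize(x):
        x = np.asarray(x, dtype=float)
        edges = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
        return np.digitize(x, edges)

    s, t = discretize(source), discretize(target)
    trip = Counter(zip(t[1:], t[:-1], s[:-1]))  # (x_{t+1}, x_t, y_t)
    pair_xy = Counter(zip(t[:-1], s[:-1]))      # (x_t, y_t)
    pair_xx = Counter(zip(t[1:], t[:-1]))       # (x_{t+1}, x_t)
    hist_x = Counter(t[:-1])                    # x_t
    n = len(t) - 1
    te = 0.0
    for (x1, x0, y0), c in trip.items():
        p_joint = c / n
        p_cond_xy = c / pair_xy[(x0, y0)]       # p(x_{t+1} | x_t, y_t)
        p_cond_x = pair_xx[(x1, x0)] / hist_x[x0]  # p(x_{t+1} | x_t)
        te += p_joint * log2(p_cond_xy / p_cond_x)
    return te
```

    Comparing `transfer_entropy(resp, rr)` against `transfer_entropy(rr, resp)` gives the kind of directional asymmetry the abstract reports, with the dominant flow from respiration to the RR series.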

  5. Respiration-Averaged CT for Attenuation Correction of PET Images – Impact on PET Texture Features in Non-Small Cell Lung Cancer Patients

    PubMed Central

    Cheng, Nai-Ming; Fang, Yu-Hua Dean; Tsan, Din-Li

    2016-01-01

    Purpose We compared attenuation correction of PET images with helical CT (PET/HCT) and respiration-averaged CT (PET/ACT) in patients with non-small-cell lung cancer (NSCLC) with the goal of investigating the impact of respiration-averaged CT on 18F FDG PET texture parameters. Materials and Methods A total of 56 patients were enrolled. Tumors were segmented on pretreatment PET images using the adaptive threshold. Twelve different texture parameters were computed: standard uptake value (SUV) entropy, uniformity, entropy, dissimilarity, homogeneity, coarseness, busyness, contrast, complexity, grey-level nonuniformity, zone-size nonuniformity, and high grey-level large zone emphasis. Comparisons of PET/HCT and PET/ACT were performed using Wilcoxon signed-rank tests, intraclass correlation coefficients, and Bland-Altman analysis. Receiver operating characteristic (ROC) curves as well as univariate and multivariate Cox regression analyses were used to identify the parameters significantly associated with disease-specific survival (DSS). A fixed threshold at 45% of the maximum SUV (T45) was used for validation. Results SUV maximum and total lesion glycolysis (TLG) were significantly higher in PET/ACT. However, texture parameters obtained with PET/ACT and PET/HCT showed a high degree of agreement. The lowest levels of variation between the two modalities were observed for SUV entropy (9.7%) and entropy (9.8%). SUV entropy, entropy, and coarseness from both PET/ACT and PET/HCT were significantly associated with DSS. Validation analyses using T45 confirmed the usefulness of SUV entropy and entropy in both PET/HCT and PET/ACT for the prediction of DSS, but only coarseness from PET/ACT achieved the statistical significance threshold. Conclusions Our results indicate that 1) texture parameters from PET/ACT are clinically useful in the prediction of survival in NSCLC patients and 2) SUV entropy and entropy are robust to attenuation correction methods. PMID:26930211

  6. Using multi-scale entropy and principal component analysis to monitor gears degradation via the motor current signature analysis

    NASA Astrophysics Data System (ADS)

    Aouabdi, Salim; Taibi, Mahmoud; Bouras, Slimane; Boutasseta, Nadir

    2017-06-01

    This paper describes an approach for identifying localized gear tooth defects, such as pitting, using phase currents measured from an induction machine driving the gearbox. A new anomaly-detection tool is introduced, based on the multi-scale entropy (MSE) algorithm with SampEn, which allows correlations in signals to be identified over multiple time scales. Motor current signature analysis (MCSA) is applied in conjunction with principal component analysis (PCA), and observed values are compared with those predicted from a model built using nominally healthy data. The simulation results show that the proposed method is able to detect gear tooth pitting in current signals.

  7. Discovering the Thermodynamics of Simultaneous Equilibria: An Entropy Analysis Activity Involving Consecutive Equilibria

    ERIC Educational Resources Information Center

    Bindel, Thomas H.

    2007-01-01

    An activity is presented in which the thermodynamics of simultaneous, consecutive equilibria are explored. The activity is appropriate for second-year high school or AP chemistry. Students discover that a reactant-favored (entropy-diminishing or endergonic) reaction can be caused to happen if it is coupled with a product-favored reaction of…

  8. Carnot to Clausius: Caloric to Entropy

    ERIC Educational Resources Information Center

    Newburgh, Ronald

    2009-01-01

    This paper discusses how the Carnot engine led to the formulation of the second law of thermodynamics and entropy. The operation of the engine is analysed both in terms of heat as the caloric fluid and heat as a form of energy. A keystone of Carnot's thinking was the absolute conservation of caloric. Although the Carnot analysis was partly…

  9. A parameterized logarithmic image processing method with Laplacian of Gaussian filtering for lung nodule enhancement in chest radiographs.

    PubMed

    Chen, Sheng; Yao, Liping; Chen, Bao

    2016-11-01

    The enhancement of lung nodules in chest radiographs (CXRs) plays an important role in the manual as well as computer-aided detection (CADe) of lung cancer. In this paper, we propose a parameterized logarithmic image processing (PLIP) method combined with a Laplacian of Gaussian (LoG) filter to enhance lung nodules in CXRs. We first applied several LoG filters with varying parameters to an original CXR to enhance the nodule-like structures as well as the edges in the image. We then applied the PLIP model, which can enhance lung nodule images with high contrast and is beneficial in extracting effective features for nodule detection in the CADe scheme. Our method combines the advantages of both the PLIP algorithm and the LoG algorithm, enhancing lung nodules in chest radiographs with high contrast. To test our nodule enhancement method, we evaluated a CADe scheme with relatively high performance in nodule detection, using a publicly available database containing 140 nodules in 140 CXRs enhanced through our method. The CADe scheme attained sensitivities of 81 and 70 % at an average of 5.0 false positives (FP) and 2.0 FP per image, respectively, in a leave-one-out cross-validation test. By contrast, the CADe scheme based on the original images recorded sensitivities of 77 and 63 % at 5.0 FP and 2.0 FP, respectively. We introduced a measurement of enhancement by entropy evaluation to objectively assess our method. Experimental results show that the proposed method achieves effective enhancement of lung nodules in CXRs for both radiologists and CADe schemes.

  10. NEW EQUATIONS OF STATE IN SIMULATIONS OF CORE-COLLAPSE SUPERNOVAE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hempel, M.; Liebendoerfer, M.; Fischer, T.

    2012-03-20

    We discuss three new equations of state (EOS) in core-collapse supernova simulations. The new EOS are based on the nuclear statistical equilibrium model of Hempel and Schaffner-Bielich (HS), which includes excluded volume effects and relativistic mean-field (RMF) interactions. We consider the RMF parameterizations TM1, TMA, and FSUgold. These EOS are implemented into our spherically symmetric core-collapse supernova model, which is based on general relativistic radiation hydrodynamics and three-flavor Boltzmann neutrino transport. The results obtained for the new EOS are compared with the widely used EOS of H. Shen et al. and Lattimer and Swesty. The systematic comparison shows that the model description of inhomogeneous nuclear matter is as important as the parameterization of the nuclear interactions for the supernova dynamics and the neutrino signal. Furthermore, several new aspects of nuclear physics are investigated: the HS EOS contains distributions of nuclei, including nuclear shell effects. The appearance of light nuclei, e.g., deuterium and tritium, is also explored, which can become as abundant as alphas and free protons. In addition, we investigate the black hole formation in failed core-collapse supernovae, which is mainly determined by the high-density EOS. We find that temperature effects lead to a systematically faster collapse for the non-relativistic LS EOS in comparison with the RMF EOS. We deduce a new correlation for the time until black hole formation, which allows the determination of the maximum mass of proto-neutron stars, if the neutrino signal from such a failed supernova would be measured in the future. This would give a constraint for the nuclear EOS at finite entropy, complementary to observations of cold neutron stars.

  11. Multifractal characteristics of multiparticle production in heavy-ion collisions at SPS energies

    NASA Astrophysics Data System (ADS)

    Khan, Shaista; Ahmad, Shakeel

    Entropy, dimensions, and other multifractal characteristics of the multiplicity distributions of relativistic charged hadrons produced in ion-ion collisions at SPS energies are investigated. The analysis of the experimental data is carried out in terms of the phase-space bin-size dependence of multiplicity distributions, following Takagi’s approach. Another method is also followed to study the multifractality which is not related to the bin width and (or) the detector resolution, but instead involves the multiplicity distribution of charged particles in full phase space in terms of information entropy and its generalization, Rényi’s order-q information entropy. The findings reveal the presence of multifractal structure, a remarkable property of the fluctuations. Nearly constant values of the multifractal specific heat “c” estimated by the two different methods of analysis indicate that the parameter “c” may be used as a universal characteristic of particle production in high-energy collisions. The results obtained from the analysis of the experimental data agree well with the predictions of the Monte Carlo model AMPT.
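
    The Rényi order-q generalization of the information entropy used in the second method has a compact closed form, H_q = ln(Σ p_i^q)/(1-q), reducing to the Shannon entropy as q → 1. A minimal sketch (the function name is ours):

```python
import numpy as np

def renyi_entropy(p, q):
    """Rényi entropy of order q (nats); q -> 1 recovers the Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0] / p.sum()          # normalize and drop empty bins
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))  # Shannon limit
    return np.log(np.sum(p ** q)) / (1.0 - q)
```

    Scanning q over a range of orders for the measured multiplicity distribution gives the spectrum of generalized entropies from which multifractal characteristics such as the specific heat are extracted.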

  12. Reduced Data Dualscale Entropy Analysis of HRV Signals for Improved Congestive Heart Failure Detection

    NASA Astrophysics Data System (ADS)

    Kuntamalla, Srinivas; Lekkala, Ram Gopal Reddy

    2014-10-01

    Heart rate variability (HRV) is an important dynamic variable of the cardiovascular system, which operates on multiple time scales. In this study, Multiscale entropy (MSE) analysis is applied to HRV signals taken from Physiobank to discriminate Congestive Heart Failure (CHF) patients from healthy young and elderly subjects. The discrimination power of the MSE method decreases as the amount of data is reduced, and the lowest amount of data at which there is a clear discrimination between CHF and normal subjects is found to be 4000 samples. Further, this method failed to discriminate CHF from healthy elderly subjects. In view of this, the Reduced Data Dualscale Entropy Analysis method is proposed to reduce the data size required (to as low as 500 samples) for clearly discriminating the CHF patients from young and elderly subjects with only two scales. Further, an easy-to-interpret index is derived using this new approach for the diagnosis of CHF. This index shows 100 % accuracy and correlates well with the pathophysiology of heart failure.

  13. Memory and betweenness preference in temporal networks induced from time series

    NASA Astrophysics Data System (ADS)

    Weng, Tongfeng; Zhang, Jie; Small, Michael; Zheng, Rui; Hui, Pan

    2017-02-01

    We construct temporal networks from time series via unfolding the temporal information into an additional topological dimension of the networks. Thus, we are able to introduce memory entropy analysis to unravel the memory effect within the considered signal. We find distinct patterns in the entropy growth rate of the aggregate network at different memory scales for time series with different dynamics ranging from white noise, 1/f noise, autoregressive process, periodic to chaotic dynamics. Interestingly, for a chaotic time series, an exponential scaling emerges in the memory entropy analysis. We demonstrate that the memory exponent can successfully characterize bifurcation phenomenon, and differentiate the human cardiac system in healthy and pathological states. Moreover, we show that the betweenness preference analysis of these temporal networks can further characterize dynamical systems and separate distinct electrocardiogram recordings. Our work explores the memory effect and betweenness preference in temporal networks constructed from time series data, providing a new perspective to understand the underlying dynamical systems.

  14. Analysis of the effect of repeated-pulse transcranial magnetic stimulation at the Guangming point on electroencephalograms.

    PubMed

    Zhang, Xin; Fu, Lingdi; Geng, Yuehua; Zhai, Xiang; Liu, Yanhua

    2014-03-01

    Here, we administered repeated-pulse transcranial magnetic stimulation to healthy people at the left Guangming (GB37) and a mock point, and calculated the sample entropy of electroencephalo-gram signals using nonlinear dynamics. Additionally, we compared electroencephalogram sample entropy of signals in response to visual stimulation before, during, and after repeated-pulse tran-scranial magnetic stimulation at the Guangming. Results showed that electroencephalogram sample entropy at left (F3) and right (FP2) frontal electrodes were significantly different depending on where the magnetic stimulation was administered. Additionally, compared with the mock point, electroencephalogram sample entropy was higher after stimulating the Guangming point. When visual stimulation at Guangming was given before repeated-pulse transcranial magnetic stimula-tion, significant differences in sample entropy were found at five electrodes (C3, Cz, C4, P3, T8) in parietal cortex, the central gyrus, and the right temporal region compared with when it was given after repeated-pulse transcranial magnetic stimulation, indicating that repeated-pulse transcranial magnetic stimulation at Guangming can affect visual function. Analysis of electroencephalogram revealed that when visual stimulation preceded repeated pulse transcranial magnetic stimulation, sample entropy values were higher at the C3, C4, and P3 electrodes and lower at the Cz and T8 electrodes than visual stimulation followed preceded repeated pulse transcranial magnetic stimula-tion. 
The findings indicate that repeated-pulse transcranial magnetic stimulation at the Guangming point evokes different patterns of electroencephalogram signals than repeated-pulse transcranial magnetic stimulation at other nearby points on the body surface, and that repeated-pulse transcranial magnetic stimulation at the Guangming point is associated with changes in the complexity of visually evoked electroencephalogram signals in parietal regions, the central gyrus, and temporal regions.
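
    Sample entropy, the measure used in the study above, can be computed directly from a time series. The following is a minimal NumPy sketch of the standard SampEn(m, r) definition (template length m, Chebyshev tolerance r, self-matches excluded), offered as an illustration rather than the authors' actual analysis code:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that two
    sequences similar for m points remain similar at m + 1 points.
    Self-matches are excluded; r defaults to 0.2 * std(x)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()

    def match_count(mm):
        # All templates of length mm, compared with Chebyshev distance.
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(t) - 1):
            count += np.sum(np.max(np.abs(t[i + 1:] - t[i]), axis=1) < r)
        return count

    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 else float("inf")

rng = np.random.default_rng(0)
noise = rng.standard_normal(1000)
tone = np.sin(np.linspace(0, 40 * np.pi, 1000))
# A regular signal is more predictable, hence lower sample entropy.
print(sample_entropy(tone), sample_entropy(noise))
```

    Lower values indicate a more regular, predictable signal, which is why a stimulation-induced increase in sample entropy is read as increased signal complexity.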

  15. Heat transfer enhancement and entropy generation analysis of Al2O3-water nanofluid in an alternating oval cross-section tube using two-phase mixture model under turbulent flow

    NASA Astrophysics Data System (ADS)

    Najafi Khaboshan, Hasan; Nazif, Hamid Reza

    2018-04-01

    Heat transfer and turbulent flow of an Al2O3-water nanofluid within an alternating oval cross-section tube are numerically simulated using the Eulerian-Eulerian two-phase mixture model. The primary goal of the present study is to investigate the effects of nanoparticle volume fraction, nanoparticle diameter, and different inlet velocities on the heat transfer, pressure drop, and entropy generation characteristics of the alternating oval cross-section tube. For validation, the numerical results were compared with experimental data. A constant-temperature boundary condition was imposed on the tube wall. In addition, the thermal-hydraulic performance and entropy generation characteristics of the alternating oval cross-section tube and a circular tube were compared for the same fluids. The results show that the heat transfer coefficient and pressure drop of the alternating oval cross-section tube are higher than those of the base tube for the same fluids. Both parameters also increase when Al2O3 nanoparticles are added to the water, at any inlet velocity and for both tubes. Furthermore, compared with the base fluid, the heat transfer enhancement of the nanofluid exceeds the increase in its friction factor under the same inlet boundary conditions. The entropy generation analysis shows that the total entropy generation increases with increasing nanoparticle volume fraction and decreasing nanoparticle diameter. Thermal entropy generation is the main contributor to irreversibility, and the Bejan number increases slightly with nanoparticle diameter. Finally, at any given inlet velocity, the frictional irreversibility grows with increasing nanoparticle volume fraction.
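
    The split between thermal and frictional irreversibility reported above is conventionally summarized by the Bejan number. As a hedged sketch (the property values below are arbitrary water-like stand-ins, not the paper's data), the local volumetric entropy generation terms and the Bejan number can be computed as:

```python
def entropy_generation(k, mu, T, grad_T, phi):
    """Local volumetric entropy generation in convective flow:
    thermal part k * |grad T|^2 / T^2 and frictional part mu * Phi / T,
    where Phi is the viscous dissipation function. Returns both parts
    and the Bejan number Be = S_thermal / S_total."""
    s_thermal = k * grad_T ** 2 / T ** 2
    s_frictional = mu * phi / T
    return s_thermal, s_frictional, s_thermal / (s_thermal + s_frictional)

# Illustrative values only: k in W/(m K), mu in Pa s, T in K.
s_th, s_fr, be = entropy_generation(k=0.6, mu=1.0e-3, T=300.0,
                                    grad_T=500.0, phi=1.0e4)
```

    A Bejan number near 1 means heat transfer dominates the irreversibility, matching the paper's finding that thermal entropy generation is the main contributor.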

  16. SPICODYN: A Toolbox for the Analysis of Neuronal Network Dynamics and Connectivity from Multi-Site Spike Signal Recordings.

    PubMed

    Pastore, Vito Paolo; Godjoski, Aleksandar; Martinoia, Sergio; Massobrio, Paolo

    2018-01-01

    We implemented an automated and efficient open-source software package for the analysis of multi-site neuronal spike signals. The package, named SPICODYN, has been developed as a standalone Windows GUI application in C# with Microsoft Visual Studio, targeting the .NET Framework 4.5. Accepted input data formats are HDF5, level 5 MAT, and text files containing recorded or generated spike-train time series. SPICODYN processes such electrophysiological signals with a focus on spiking and bursting dynamics and functional-effective connectivity analysis. In particular, for inferring network connectivity, a new implementation of the transfer entropy method is presented that deals with multiple time delays (temporal extension) and with multiple binary patterns (high-order extension). SPICODYN is specifically tailored to process data coming from different Multi-Electrode Array setups, guaranteeing automated processing in those specific cases. The optimized implementation of the Delayed Transfer Entropy and High-Order Transfer Entropy algorithms allows accurate and rapid analysis of multiple spike trains from thousands of electrodes.
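
    The temporal extension mentioned above evaluates transfer entropy at multiple candidate delays. The sketch below is a minimal, generic implementation of delayed transfer entropy for binary spike trains with one-sample histories; it illustrates the idea only and is not SPICODYN's optimized C# code:

```python
import numpy as np
from collections import Counter

def delayed_transfer_entropy(x, y, delay=1):
    """TE(X -> Y) at a given delay, for binary spike trains with one-sample
    histories: sum over p(y1, y0, x0) * log2[ p(y1 | y0, x0) / p(y1 | y0) ],
    where y1 = y_{t+1}, y0 = y_t, and x0 = x_{t-delay}."""
    x, y = np.asarray(x, dtype=int), np.asarray(y, dtype=int)
    triples = list(zip(y[delay + 1:], y[delay:-1], x[:-(delay + 1)]))
    n = len(triples)
    n_abc = Counter(triples)
    n_ab = Counter((a, b) for a, b, _ in triples)   # (y1, y0)
    n_bc = Counter((b, c) for _, b, c in triples)   # (y0, x0)
    n_b = Counter(b for _, b, _ in triples)         # (y0,)
    te = 0.0
    for (a, b, c), k in n_abc.items():
        te += k / n * np.log2((k / n_bc[(b, c)]) / (n_ab[(a, b)] / n_b[b]))
    return te

rng = np.random.default_rng(1)
x = (rng.random(5000) < 0.5).astype(int)
y = np.concatenate(([0, 0, 0], x[:-3]))  # y copies x three samples later
```

    Scanning `delay` and taking the argmax recovers the interaction lag; for the synthetic pair above, TE peaks at delay = 2 because y_{t+1} = x_{t-2}.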

  17. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series

    PubMed Central

    Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C.; Golino, Caroline A.; Kemper, Peter; Saha, Margaret S.

    2016-01-01

    Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, which is an easily implementable calcium time series analysis method which represents the observed calcium activity as a realization of a Markov Process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our and other commonly used calcium analysis methods on a dataset from Xenopus laevis neural progenitors which displays irregular calcium activity and a dataset from murine synaptic neurons which displays activity time series that are well-described by visually-distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features. PMID:27977764
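
    The Markovian Entropy measure represents activity as a realization of a Markov process and scores the predictability of its state transitions. As a hedged, simplified sketch (uniform amplitude binning and a first-order chain; the published measure's discretization details may differ), an entropy-rate estimate can be computed as:

```python
import numpy as np

def markov_entropy(signal, n_states=4):
    """Entropy rate of a first-order Markov chain fitted to a discretized
    signal: H = -sum_i pi_i sum_j P_ij log2 P_ij (bits per step).
    Low values mean highly predictable transitions."""
    s = np.asarray(signal, dtype=float)
    edges = np.linspace(s.min(), s.max(), n_states + 1)[1:-1]
    states = np.digitize(s, edges)            # amplitude bins as states
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    pi = counts.sum(axis=1) / counts.sum()    # empirical state occupancy
    h = 0.0
    for i in range(n_states):
        row = counts[i].sum()
        for j in range(n_states):
            if counts[i, j] > 0:
                p = counts[i, j] / row
                h -= pi[i] * p * np.log2(p)
    return h

periodic = np.tile([0.0, 1.0, 2.0, 3.0], 200)    # deterministic cycling
noisy = np.random.default_rng(2).random(5000)     # unpredictable trace
```

    A spiking or oscillating trace yields low values (transitions are predictable), while irregular activity of the kind seen in neural progenitors yields values closer to the maximum of log2(n_states) bits.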

  18. Assessment of risk of femoral neck fracture with radiographic texture parameters: a retrospective study.

    PubMed

    Thevenot, Jérôme; Hirvasniemi, Jukka; Pulkkinen, Pasi; Määttä, Mikko; Korpelainen, Raija; Saarakkala, Simo; Jämsä, Timo

    2014-07-01

    To investigate whether femoral neck fracture can be predicted retrospectively on the basis of clinical radiographs by using the combined analysis of bone geometry, textural analysis of trabecular bone, and bone mineral density (BMD). Formal ethics committee approval was obtained for the study, and all participants gave informed written consent. Pelvic radiographs and proximal femur BMD measurements were obtained in 53 women aged 79-82 years in 2006. By 2012, 10 of these patients had experienced a low-impact femoral neck fracture. A Laplacian-based semiautomatic custom algorithm was applied to the radiographs to calculate the texture parameters along the trabecular fibers in the lower neck area for all subjects. Intra- and interobserver reproducibility was calculated by using the root mean square average coefficient of variation to evaluate the robustness of the method. The best predictors of hip fracture were entropy (P = .007; reproducibility coefficient of variation < 1%), the neck-shaft angle (NSA) (P = .017), and the BMD (P = .13). For prediction of fracture, the area under the receiver operating characteristic curve was 0.753 for entropy, 0.608 for femoral neck BMD, and 0.698 for NSA. The area increased to 0.816 when entropy and NSA were combined and to 0.902 when entropy, NSA, and BMD were combined. Textural analysis of pelvic radiographs enables discrimination of patients at risk for femoral neck fracture, and our results show the potential of this conventional imaging method to yield better prediction than that achieved with dual-energy x-ray absorptiometry-based BMD. The combination of the entropy parameter with NSA and BMD can further enhance predictive accuracy. © RSNA, 2014.
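
    The entropy texture parameter used above is, in essence, the Shannon entropy of a gray-level distribution. The sketch below shows only this first-order version on a rectangular patch; the study's Laplacian-based algorithm, which samples texture along the trabecular fibers, is not reproduced here:

```python
import numpy as np

def texture_entropy(patch, levels=32):
    """First-order texture entropy: Shannon entropy (bits) of the gray-level
    histogram of an image region. Heterogeneous texture -> higher entropy."""
    hist, _ = np.histogram(np.asarray(patch, dtype=float), bins=levels)
    p = hist[hist > 0] / hist.sum()
    return float(-np.sum(p * np.log2(p)))

flat = np.full((32, 32), 0.5)                     # homogeneous region
speckle = np.random.default_rng(3).random((32, 32))  # heterogeneous region
```

    Homogeneous bone texture gives low entropy; disrupted, heterogeneous trabecular structure gives high entropy, which is the direction of the fracture-risk association reported above.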

  19. The effect of dexmedetomidine continuous infusion as an adjuvant to general anesthesia on sevoflurane requirements: A study based on entropy analysis

    PubMed Central

    Patel, Chirag Ramanlal; Engineer, Smita R; Shah, Bharat J; Madhu, S

    2013-01-01

    Background: Dexmedetomidine, an α2-agonist used as an adjuvant in general anesthesia, has anesthetic- and analgesic-sparing properties. Aims: To evaluate the effect of continuous infusion of dexmedetomidine alone, without opioids, on the requirement of sevoflurane during general anesthesia, with continuous monitoring of the depth of anesthesia by entropy analysis. Materials and Methods: Sixty patients were randomly divided into 2 groups of 30 each. In group A, fentanyl 2 mcg/kg was given, while in group B, dexmedetomidine was given intravenously as a loading dose of 1 mcg/kg over 10 min prior to induction. After induction with thiopentone, group B received dexmedetomidine as an infusion at a dose of 0.2-0.8 mcg/kg. Sevoflurane was used as the inhalation agent in both groups. Hemodynamic variables, sevoflurane inspired fraction (FIsevo), sevoflurane expired fraction (ETsevo), and entropy (response entropy and state entropy) were continuously recorded. Statistical analysis was done by the unpaired Student's t-test and Chi-square test for continuous and categorical variables, respectively. A P-value < 0.05 was considered significant. Results: The use of dexmedetomidine with sevoflurane was associated with a statistically significant decrease in ETsevo at 5 minutes post-intubation (1.49 ± 0.11) and 60 minutes post-intubation (1.11 ± 0.28) as compared with group A [1.73 ± 0.30 (5 minutes); 1.68 ± 0.50 (60 minutes)]. There was an average 21.5% decrease in ETsevo in group B as compared with group A. Conclusions: Dexmedetomidine, as an adjuvant in general anesthesia, decreases the requirement of sevoflurane for maintaining an adequate depth of anesthesia. PMID:24106354

  20. Identification of genome regions determining semen quality in Holstein-Friesian bulls using information theory.

    PubMed

    Borowska, Alicja; Szwaczkowski, Tomasz; Kamiński, Stanisław; Hering, Dorota M; Kordan, Władysław; Lecewicz, Marek

    2018-05-01

    Use of information theory can be an alternative statistical approach to detect genome regions and candidate genes that are associated with livestock traits. The aim of this study was to verify the validity of SNP effects on some semen quality variables of bulls using entropy analysis. Records from 288 Holstein-Friesian bulls from one AI station were included. The following semen quality variables were analyzed: CASA kinematic variables of sperm (total motility, average path velocity, straight line velocity, curvilinear velocity, amplitude of lateral head displacement, beat cross frequency, straightness, linearity), sperm membrane integrity (plasmalemma, mitochondrial function), and sperm ATP content. Molecular data included 48,192 SNPs. After filtering (call rate = 0.95 and MAF = 0.05), 34,794 SNPs were included in the entropy analysis. The entropy and conditional entropy were estimated for each SNP; the conditional entropy quantifies the uncertainty about the values of the variable that remains given knowledge of the SNP. The most informative SNPs for each variable were determined. The computations were performed using the R statistical package. A majority of the loci had relatively small contributions. The most informative SNPs for all variables were mainly located on chromosomes 3, 4, 5, and 16. The results indicate that important genome regions and candidate genes that determine semen quality variables in bulls are located on a number of chromosomes. Some detected clusters of SNPs were located in RNA (U6 and 5S_rRNA) regions for all the variables analyzed. Associations between the PARK2 and GALNT13 genes and some semen characteristics were also detected. Copyright © 2018 Elsevier B.V. All rights reserved.
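
    The entropy and conditional entropy used to rank SNPs above can be illustrated on categorical data. The sketch below (with a hypothetical binary trait and 0/1/2-coded genotypes, not the study's data; the actual analysis used R) shows how an informative SNP drives H(trait | SNP) toward zero:

```python
import numpy as np
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of a discrete sample."""
    n = len(values)
    return -sum(c / n * np.log2(c / n) for c in Counter(values).values())

def conditional_entropy(trait, genotype):
    """H(trait | SNP): uncertainty about the trait class remaining once the
    genotype is known. Informative SNPs give low conditional entropy."""
    n = len(trait)
    h = 0.0
    for g, c in Counter(genotype).items():
        sub = [t for t, s in zip(trait, genotype) if s == g]
        h += c / n * entropy(sub)
    return h

trait = [0, 1] * 50
informative = trait[:]             # genotype perfectly predicts the trait
uninformative = [0, 0, 1, 1] * 25  # genotype independent of the trait
```

    Ranking SNPs by the gap H(trait) − H(trait | SNP), i.e. the mutual information, identifies the most informative loci.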

  1. Strong parameterization and coordination encirclements of graph of Penrose tiling vertices

    NASA Astrophysics Data System (ADS)

    Shutov, A. V.; Maleev, A. V.

    2017-07-01

    The coordination encirclements in a graph of Penrose tiling vertices have been investigated based on an analysis of vertex parameters. A strong parameterization of these vertices is developed in the form of a tiling of the parameter set into regions corresponding to different first coordination encirclements of vertices. An algorithm is proposed for constructing tilings of the parameter set that determine different order-n coordination encirclements in a graph of Penrose tiling vertices.

  2. Interpreting activity in H(2)O-H(2)SO(4) binary nucleation.

    PubMed

    Bein, Keith J; Wexler, Anthony S

    2007-09-28

    Sulfuric acid-water nucleation is thought to be a key atmospheric mechanism for forming new condensation nuclei. In earlier literature, measurements of sulfuric acid activity were interpreted as the total (monomer plus hydrate) concentration above solution. Due to recent reinterpretations, most literature values for H(2)SO(4) activity are thought to represent the number density of monomers. Based on this reinterpretation, the current work uses the most recent models of H(2)O-H(2)SO(4) binary nucleation along with perturbation analyses to predict a decrease in critical cluster mole fraction, increase in critical cluster diameter, and orders of magnitude decrease in nucleation rate. Nucleation rate parameterizations available in the literature, however, give opposite trends. To resolve these discrepancies, nucleation rates were calculated for both interpretations of H(2)SO(4) activity and directly compared to the available parameterizations as well as the perturbation analysis. Results were in excellent agreement with older parameterizations that assumed H(2)SO(4) activity represents the total concentration and duplicated the predicted trends from the perturbation analysis, but differed by orders of magnitude from more recent parameterizations that assume H(2)SO(4) activity represents only the monomer. Comparison with experimental measurements available in the literature revealed that the calculations of the current work assuming a(a) represents the total concentration are most frequently in agreement with observations.

  3. Improved Overpressure Recording and Modeling for Near-Surface Explosion Forensics

    NASA Astrophysics Data System (ADS)

    Kim, K.; Schnurr, J.; Garces, M. A.; Rodgers, A. J.

    2017-12-01

    The accurate recording and analysis of air-blast acoustic waveforms is a key component of the forensic analysis of explosive events. Smartphone apps can enhance traditional technologies by providing scalable, cost-effective ubiquitous sensor solutions for monitoring blasts, undeclared activities, and inaccessible facilities. During a series of near-surface chemical high explosive tests, iPhone 6's running the RedVox infrasound recorder app were co-located with high-fidelity Hyperion overpressure sensors, allowing for direct comparison of the resolution and frequency content of the devices. Data from the traditional sensors is used to characterize blast signatures and to determine relative iPhone microphone amplitude and phase responses. A Wiener filter based source deconvolution method is applied, using a parameterized source function estimated from traditional overpressure sensor data, to estimate system responses. In addition, progress on a new parameterized air-blast model is presented. The model is based on the analysis of a large set of overpressure waveforms from several surface explosion test series. An appropriate functional form with parameters determined empirically from modern air-blast and acoustic data will allow for better parameterization of signals and the improved characterization of explosive sources.
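
    The Wiener-filter deconvolution step described above can be sketched in a few lines. The example below uses synthetic stand-in waveforms (a Gaussian source pulse and an exponentially decaying response; none of this is the test-series data), assuming the system response is known:

```python
import numpy as np

def wiener_deconvolve(observed, impulse_response, noise_power=1e-2):
    """Frequency-domain Wiener deconvolution: estimate the source s from
    observed = h * s + noise via S = conj(H) Y / (|H|^2 + noise_power)."""
    n = len(observed)
    H = np.fft.rfft(impulse_response, n)
    Y = np.fft.rfft(observed, n)
    return np.fft.irfft(np.conj(H) * Y / (np.abs(H) ** 2 + noise_power), n)

# Synthetic stand-ins: a blast-like pulse seen through a decaying response.
t = np.linspace(0.0, 1.0, 512)
source = np.exp(-200.0 * (t - 0.2) ** 2)
response = np.exp(-20.0 * t)
observed = np.convolve(source, response)[:512]
recovered = wiener_deconvolve(observed, response, noise_power=1e-4)
```

    The `noise_power` term regularizes frequencies where |H| is small; in the forensic setting described above the roles are reversed, with a parameterized source model used to estimate the unknown smartphone system response.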

  4. How hidden are hidden processes? A primer on crypticity and entropy convergence

    NASA Astrophysics Data System (ADS)

    Mahoney, John R.; Ellison, Christopher J.; James, Ryan G.; Crutchfield, James P.

    2011-09-01

    We investigate a stationary process's crypticity—a measure of the difference between its hidden state information and its observed information—using the causal states of computational mechanics. Here, we motivate crypticity and cryptic order as physically meaningful quantities that monitor how hidden a hidden process is. This is done by recasting previous results on the convergence of block entropy and block-state entropy in a geometric setting, one that is more intuitive and that leads to a number of new results. For example, we connect crypticity to how an observer synchronizes to a process. We show that the block-causal-state entropy is a convex function of block length. We give a complete analysis of spin chains. We present a classification scheme that surveys stationary processes in terms of their possible cryptic and Markov orders. We illustrate related entropy convergence behaviors using a new form of foliated information diagram. Finally, along the way, we provide a variety of interpretations of crypticity and cryptic order to establish their naturalness and pervasiveness. This is also a first step in developing applications in spatially extended and network dynamical systems.
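
    The block-entropy convergence underlying this analysis is easy to compute for symbol sequences. The sketch below estimates the entropy rate via h(L) = H(L) − H(L−1) for two simple processes; it illustrates the convergence behavior only, not the causal-state machinery of computational mechanics:

```python
import numpy as np
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy H(L), in bits, of length-L blocks of a symbol sequence."""
    blocks = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    n = sum(blocks.values())
    return -sum(c / n * np.log2(c / n) for c in blocks.values())

# Entropy-rate estimates h(L) = H(L) - H(L-1) converge from above to the
# true rate; how (and how slowly) they converge is what crypticity-style
# analyses of hidden processes examine.
rng = np.random.default_rng(0)
coin = rng.integers(0, 2, 20000)        # i.i.d. fair coin: rate = 1 bit
period2 = np.tile([0, 1], 10000)        # period-2 sequence: rate = 0
h_coin = [block_entropy(coin, L) - block_entropy(coin, L - 1) for L in (2, 3, 4)]
h_per = [block_entropy(period2, L) - block_entropy(period2, L - 1) for L in (2, 3, 4)]
```

    For the coin the estimates sit at 1 bit for every L, while for the periodic sequence they vanish immediately; cryptic processes are those whose observed block entropies converge more slowly than their hidden-state information would suggest.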

  5. Design of new face-centered cubic high entropy alloys by thermodynamic calculation

    NASA Astrophysics Data System (ADS)

    Choi, Won-Mi; Jung, Seungmun; Jo, Yong Hee; Lee, Sunghak; Lee, Byeong-Joo

    2017-09-01

    A new face-centered cubic (fcc) high entropy alloy system with non-equiatomic compositions has been designed using a CALculation of PHAse Diagram (CALPHAD)-type thermodynamic calculation technique. The new alloy system is based on the representative fcc high entropy alloy, the Cantor alloy (an equiatomic Co-Cr-Fe-Mn-Ni five-component alloy), but fully or partly replaces the cobalt with vanadium at non-equiatomic compositions. Alloy compositions expected to have an fcc single-phase structure between 700 °C and the melting temperature are proposed. All the proposed alloys are experimentally confirmed, through X-ray diffraction analysis, to retain the fcc single phase under materials-processing conditions (> 800 °C). It is shown that there are more opportunities to find fcc single-phase high entropy alloys if attention is paid to non-equiatomic composition regions, and that CALPHAD thermodynamic calculation can be an efficient tool for this purpose. An alloy design technique based on thermodynamic calculation is demonstrated, and the applicability and limitations of the approach as a design tool for high entropy alloys are discussed.
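
    The "high entropy" in such alloys refers to the ideal configurational entropy of mixing, which the equiatomic Cantor composition maximizes. As a quick illustration (the non-equiatomic composition below is hypothetical, not one of the paper's proposed alloys; the 1.5R threshold is a common convention, not a law):

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def mixing_entropy(x):
    """Ideal configurational entropy of mixing, dS_mix = -R * sum(x_i ln x_i).
    Maximized by the equiatomic composition (R ln n for n components)."""
    x = np.asarray(x, dtype=float)
    return float(-R * np.sum(x * np.log(x)))

cantor = mixing_entropy([0.2] * 5)                        # equiatomic, R ln 5
non_equiatomic = mixing_entropy([0.25, 0.25, 0.25, 0.15, 0.10])
```

    The point of the paper is precisely that compositions with a somewhat lower mixing entropy than the equiatomic maximum can still stabilize an fcc single phase, widening the design space.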

  6. Optimality and inference in hydrology from entropy production considerations: synthetic hillslope numerical experiments

    NASA Astrophysics Data System (ADS)

    Kollet, S. J.

    2015-05-01

    In this study, entropy production optimization and inference principles are applied to a synthetic semi-arid hillslope in high-resolution, physics-based simulations. The results suggest that entropy or power is indeed maximized, because of the strong nonlinearity of variably saturated flow and competing processes related to soil moisture fluxes, the depletion of gradients, and the movement of a free water table. Thus, it appears that the maximum entropy production (MEP) principle may indeed be applicable to hydrologic systems. In this application, the free water table constitutes an important degree of freedom in the optimization of entropy production and may also relate the theory to actual observations. In an ensuing analysis, an attempt is made to transfer the complex, "microscopic" hillslope model into a macroscopic model of reduced complexity, using the MEP principle as an inference tool to obtain effective conductance coefficients and forces/gradients. The results demonstrate a new approach to applying MEP to hydrologic systems and may form the basis for fruitful discussions and research in the future.

  7. Competition between Homophily and Information Entropy Maximization in Social Networks

    PubMed Central

    Zhao, Jichang; Liang, Xiao; Xu, Ke

    2015-01-01

    In social networks, it is conventionally thought that two individuals with more overlapping friends tend to establish a new friendship, which can be stated as homophily breeding new connections, while the recent hypothesis of maximum information entropy has been proposed as a possible origin of effective navigation in small-world networks. We find, through both theoretical and experimental analysis, that there is a competition between information entropy maximization and homophily in local structure. This competition suggests that a newly built relationship between two individuals with more common friends yields less information entropy gain for them. We demonstrate that both assumptions coexist in the evolution of the social network: the rule of maximum information entropy produces weak ties in the network, while the law of homophily makes the network highly clustered locally, giving individuals strong and trusted ties. A toy model is also presented to demonstrate the competition and evaluate the roles of the different rules in the evolution of real networks. Our findings could shed light on social network modeling from a new perspective. PMID:26334994

  8. Regional tectonic analysis of Venus equatorial highlands and comparison with Earth-based Magellan radar images

    NASA Technical Reports Server (NTRS)

    Williams, David R.; Wetherill, George

    1993-01-01

    Research on regional tectonic analysis of Venus equatorial highlands and comparison with earth-based and Magellan radar images is presented. Over the past two years, the tectonic analysis of Venus performed centered on global properties of the planet, in order to understand fundamental aspects of the dynamics of the mantle and lithosphere of Venus. These include studies pertaining to the original constitutive and thermal character of the planet, as well as the evolution of Venus through time, and the present day tectonics. Parameterized convection models of the Earth and Venus were developed. The parameterized convection code was reformulated to model Venus with an initially hydrous mantle to determine how the cold-trap could affect the evolution of the planet.

  9. Optimization of Analytical Potentials for Coarse-Grained Biopolymer Models.

    PubMed

    Mereghetti, Paolo; Maccari, Giuseppe; Spampinato, Giulia Lia Beatrice; Tozzini, Valentina

    2016-08-25

    The increasing trend in the recent literature on coarse-grained (CG) models testifies to their impact in the study of complex systems. However, the CG model landscape is variegated: even at a given resolution level, the force fields are very heterogeneous and are optimized with very different parameterization procedures. Along the road toward standardization of CG models for biopolymers, here we describe a strategy to aid the building and optimization of statistics-based analytical force fields, and its implementation in the software package AsParaGS (Assisted Parameterization platform for coarse Grained modelS). Our method is based on the use and optimization of analytical potentials, fitted by targeting the statistical distributions of internal variables through a combination of algorithms (relative-entropy-driven stochastic exploration of the parameter space and iterative Boltzmann inversion). This allows designing a custom model that endows the force field terms with a physically sound meaning. Furthermore, the level of transferability and accuracy can be tuned through the choice of the statistical data set composition. The method, illustrated by means of applications to helical polypeptides, also involves the analysis of two- and three-variable distributions and allows handling issues related to force field term correlations. AsParaGS is interfaced with general-purpose molecular dynamics codes and currently implements the "minimalist" subclass of CG models (i.e., one bead per amino acid, Cα based). Extensions to nucleic acids and different levels of coarse graining are in progress.
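
    One of the two optimization algorithms combined above, iterative Boltzmann inversion, has a very compact update rule. The sketch below shows a single generic IBI step on a tabulated pair potential (this is the textbook form of the method, not AsParaGS internals; the damping factor is a common stabilization heuristic):

```python
import numpy as np

def ibi_step(V, g_model, g_target, kT=1.0, damping=0.2):
    """One iterative Boltzmann inversion update of a tabulated pair potential:
    V_{n+1}(r) = V_n(r) + damping * kT * ln(g_model(r) / g_target(r)).
    Where the model over-structures (g_model > g_target) the potential is
    made more repulsive, and vice versa."""
    V = np.asarray(V, dtype=float).copy()
    mask = (g_model > 0) & (g_target > 0)
    V[mask] += damping * kT * np.log(g_model[mask] / g_target[mask])
    return V

# Three illustrative radial bins of a pair correlation function g(r).
g_model = np.array([1.4, 1.0, 0.7])
g_target = np.array([1.0, 1.0, 1.0])
V_new = ibi_step(np.zeros(3), g_model, g_target)
```

    Iterating this step drives the model's internal-variable distributions toward the targets, which is the same objective the relative-entropy search pursues stochastically.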

  10. Classifying the Quantum Phases of Matter

    DTIC Science & Technology

    2015-01-01

    Kim related entanglement entropy to topological storage of quantum information [8]. Michalakis et al. showed that a particle-like excitation spectrum...Perturbative analysis of topological entanglement entropy from conditional independence, Phys. Rev. B 86, 254116 (2012), arXiv:1210.2360. [3] I. Kim...symmetries or long-range entanglement ), (2) elucidating the properties of three-dimensional quantum codes (in particular those which admit no string-like

  11. An analytical coarse-graining method which preserves the free energy, structural correlations, and thermodynamic state of polymer melts from the atomistic to the mesoscale.

    PubMed

    McCarty, J; Clark, A J; Copperman, J; Guenza, M G

    2014-05-28

    Structural and thermodynamic consistency of coarse-graining models across multiple length scales is essential for the predictive role of multi-scale modeling and molecular dynamic simulations that use mesoscale descriptions. Our approach is a coarse-grained model based on integral equation theory, which can represent polymer chains at variable levels of chemical details. The model is analytical and depends on molecular and thermodynamic parameters of the system under study, as well as on the direct correlation function in the k → 0 limit, c0. A numerical solution to the PRISM integral equations is used to determine c0, by adjusting the value of the effective hard sphere diameter, dHS, to agree with the predicted equation of state. This single quantity parameterizes the coarse-grained potential, which is used to perform mesoscale simulations that are directly compared with atomistic-level simulations of the same system. We test our coarse-graining formalism by comparing structural correlations, isothermal compressibility, equation of state, Helmholtz and Gibbs free energies, and potential energy and entropy using both united atom and coarse-grained descriptions. We find quantitative agreement between the analytical formalism for the thermodynamic properties, and the results of Molecular Dynamics simulations, independent of the chosen level of representation. In the mesoscale description, the potential energy of the soft-particle interaction becomes a free energy in the coarse-grained coordinates which preserves the excess free energy from an ideal gas across all levels of description. The structural consistency between the united-atom and mesoscale descriptions means the relative entropy between descriptions has been minimized without any variational optimization parameters. The approach is general and applicable to any polymeric system in different thermodynamic conditions.

  12. Entropy Analysis in Mixed Convection MHD flow of Nanofluid over a Non-linear Stretching Sheet

    NASA Astrophysics Data System (ADS)

    Matin, Meisam Habibi; Nobari, Mohammad Reza Heirani; Jahangiri, Pouyan

    This article deals with a numerical study of entropy analysis in mixed convection MHD flow of a nanofluid over a non-linear stretching sheet, taking into account the effects of viscous dissipation and a variable magnetic field. The nanofluid is made of nanoparticles such as SiO2 with pure water as the base fluid. To analyze the problem, the boundary layer equations are first transformed into non-linear ordinary differential equations using a similarity transformation. The resulting equations are then solved numerically using the Keller-Box scheme based on the implicit finite-difference method. The effects of the non-dimensional governing parameters, such as the magnetic parameter, nanoparticle volume fraction, and the Nusselt, Richardson, Eckert, Hartmann, Brinkman, Reynolds, and entropy generation numbers, are investigated in detail. The results indicate that adding nanoparticles to the base fluid reduces the shear forces and decreases the stretching-sheet heat transfer coefficient, while decreasing the magnetic parameter and increasing the Eckert number improve the heat transfer rate. Furthermore, the surface acts as a strong source of irreversibility, owing to the higher entropy generation number near the surface.

  13. Entropy Methods For Univariate Distributions in Decision Analysis

    NASA Astrophysics Data System (ADS)

    Abbas, Ali E.

    2003-03-01

    One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice, however, decision makers cannot readily provide moments for the distribution, and are much more comfortable providing information about its fractiles or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds, and provide an interpretation for the shape of the fractile-constrained maximum entropy distribution (FMED). We also discuss the problems with the FMED, namely that it is discontinuous and flat over each fractile interval. We present a heuristic approximation to a distribution when, in addition to its fractiles, we also know it is continuous, and work through full examples to illustrate the approach.
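
    The FMED has a simple closed form: between consecutive fractiles the maximum entropy density is constant, spreading each interval's probability mass uniformly over its width. A minimal sketch, with a purely hypothetical elicitation (quartiles at 5, 10, 20 on support [0, 40]):

```python
def fmed_density(fractiles):
    """Maximum entropy density subject to fractile constraints (FMED):
    piecewise constant, with density = probability mass / interval width
    on each interval. `fractiles` is a list of (value, cumulative
    probability) pairs including the support endpoints at 0 and 1."""
    return [(x0, x1, (p1 - p0) / (x1 - x0))
            for (x0, p0), (x1, p1) in zip(fractiles, fractiles[1:])]

# Hypothetical elicited quartiles, not from the paper.
pieces = fmed_density([(0, 0.0), (5, 0.25), (10, 0.5), (20, 0.75), (40, 1.0)])
```

    The density jumps at each fractile (here from 0.05 to 0.025 to 0.0125 as the intervals widen), which is exactly the discontinuity and flatness the paper's heuristic continuous approximation is designed to smooth out.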

  14. Multiscale Symbolic Phase Transfer Entropy in Financial Time Series Classification

    NASA Astrophysics Data System (ADS)

    Zhang, Ningning; Lin, Aijing; Shang, Pengjian

    We address the challenge of classifying financial time series via a newly proposed multiscale symbolic phase transfer entropy (MSPTE). Using the MSPTE method, we succeed in quantifying the strength and direction of information flow between financial systems and in classifying financial time series: stock indices from Europe, America, and China over the period 2006-2016, and stocks from the banking, aviation, and pharmaceutical industries over the period 2007-2016. The MSPTE analysis shows that the value of the symbolic phase transfer entropy (SPTE) among stocks decreases with increasing scale factor. It is demonstrated that the MSPTE method can well divide stocks into groups by area and industry, and that the MSPTE analysis quantifies the similarity among stock markets. The SPTE between two stocks from the same area is far smaller than the SPTE between stocks from different areas. The results also indicate that the four stocks from America and Europe have a relatively high degree of similarity, and that the stocks of the banking and pharmaceutical industries have higher similarity for CA. It is worth mentioning that the pharmaceutical industry has a weaker particular market mechanism than the banking and aviation industries.

  15. Entanglement entropy in Galilean conformal field theories and flat holography.

    PubMed

    Bagchi, Arjun; Basu, Rudranil; Grumiller, Daniel; Riegler, Max

    2015-03-20

    We present the analytical calculation of entanglement entropy for a class of two-dimensional field theories governed by the symmetries of the Galilean conformal algebra, thus providing a rare example of such an exact computation. These field theories are the putative holographic duals to theories of gravity in three-dimensional asymptotically flat spacetimes. We provide a check of our field theory answers by an analysis of geodesics. We also exploit the Chern-Simons formulation of three-dimensional gravity and adapt recent proposals of calculating entanglement entropy by Wilson lines in this context to find an independent confirmation of our results from holography.

  16. Entropy bounds, acceleration radiation, and the generalized second law

    NASA Astrophysics Data System (ADS)

    Unruh, William G.; Wald, Robert M.

    1983-05-01

    We calculate the net change in generalized entropy occurring when one attempts to empty the contents of a thin box into a black hole in the manner proposed recently by Bekenstein. The case of a "thick" box also is treated. It is shown that, as in our previous analysis, the effects of acceleration radiation prevent a violation of the generalized second law of thermodynamics. Thus, in this example, the validity of the generalized second law is shown to rest only on the validity of the ordinary second law and the existence of acceleration radiation. No additional assumptions concerning entropy bounds on the contents of the box need to be made.

  17. Entropy in sound and vibration: towards a new paradigm.

    PubMed

    Le Bot, A

    2017-01-01

    This paper describes a discussion of the method and status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of energy exchange analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of the information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart.

  18. Measuring the uncertainty of coupling

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaojun; Shang, Pengjian

    2015-06-01

    A new information-theoretic measure, called coupling entropy, is proposed here to detect the causal links in complex systems by taking into account the inner composition alignment of temporal structure. It is a permutation-based asymmetric association measure to infer the uncertainty of coupling between two time series. The coupling entropy is found to be effective in the analysis of Hénon maps, where different noises are added to test its accuracy and sensitivity. The coupling entropy is also applied to analyze the relationship between unemployment rate and CPI change in the U.S., where the CPI change turns out to be the driving variable while the unemployment rate is the responding one.

  19. Comparison of hemodynamic effects of intravenous etomidate versus propofol during induction and intubation using entropy guided hypnosis levels.

    PubMed

    Shah, Shagun Bhatia; Chowdhury, Itee; Bhargava, Ajay Kumar; Sabbharwal, Bhawnish

    2015-01-01

    This study aimed to compare the hemodynamic responses during induction and intubation between propofol and etomidate using entropy guided hypnosis. Sixty ASA I & II patients in the age group 20-60 years, scheduled for modified radical mastectomy, were randomly allocated into two groups based on the induction agent, etomidate or propofol. Both groups received intravenous midazolam 0.03 mg kg(-1) and fentanyl 2 μg kg(-1) as premedication. After induction with the assigned agent titrated to entropy 40, vecuronium 0.1 mg kg(-1) was administered for neuromuscular blockade. Heart rate, systolic, diastolic and mean arterial pressures, response entropy [RE] and state entropy [SE] were recorded at baseline, at induction and up to three minutes post intubation. Data were analysed in SPSS (version 12.0) using the paired and unpaired Student's t-tests for equality of means. Etomidate provided hemodynamic stability without the requirement of any rescue drug in 96.6% of patients, whereas the rescue drug ephedrine was required in 36.6% of patients in the propofol group. Reduced induction doses, 0.15 mg kg(-1) for etomidate and 0.98 mg kg(-1) for propofol, sufficed to give an adequate anaesthetic depth based on entropy. Etomidate provides more hemodynamic stability than propofol during induction and intubation. Reduced induction doses of etomidate and propofol titrated to entropy translated into increased hemodynamic stability for both drugs and sufficed to give an adequate anaesthetic depth.

  1. Identification of breathing cracks in a beam structure with entropy

    NASA Astrophysics Data System (ADS)

    Wimarshana, Buddhi; Wu, Nan; Wu, Christine

    2016-04-01

    A cantilever beam with a breathing crack is studied to detect and evaluate the crack using entropy measures. Closed cracks in engineering structures add complexity to their vibration responses through the weak bi-linearity imposed by the crack-breathing phenomenon. Entropy is a measure of system complexity and has the potential to quantify this complexity. The weak bi-linearity in vibration signals can be amplified using wavelet transformation to increase the sensitivity of the measurements. A mathematical model of a harmonically excited, unit-length steel cantilever beam with a breathing crack located near the fixed end is established, and an iterative numerical method is applied to generate accurate time-domain dynamic responses. The bi-linearity in the time-domain signals due to crack breathing is first amplified by wavelet transformation, and the resulting complexity is then quantified using sample entropy to detect the possible crack and estimate its depth. The method is capable of identifying crack depths even at very early stages of 3%, with entropy values increasing by more than 10% compared with the healthy beam. The current study extends entropy-based damage detection from rotary machines to structural analysis and takes a step toward high-sensitivity structural health monitoring by combining wavelet transformation with entropy calculations. The proposed technique can also be applied to other types of structures, such as plates and shells.
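
    Sample entropy, the irregularity measure used in this entry, has a compact reference form. A plain O(n²) sketch, with the tolerance r taken as an absolute value (it is commonly scaled by the signal's standard deviation); this is not the authors' implementation.

```python
import random
from math import log

def _match_pairs(x, m, r):
    """Count template pairs of length m within Chebyshev distance r (no self-matches)."""
    n = len(x)
    return sum(1
               for i in range(n - m + 1)
               for j in range(i + 1, n - m + 1)
               if max(abs(x[i + k] - x[j + k]) for k in range(m)) <= r)

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A / B), with B matches at length m and A at length m + 1."""
    b = _match_pairs(x, m, r)
    a = _match_pairs(x, m + 1, r)
    return -log(a / b) if a > 0 and b > 0 else float("inf")

# demo: a periodic signal is almost perfectly predictable, noise is not
random.seed(2)
regular = [0.0, 1.0] * 100
noisy = [random.random() for _ in range(200)]
```

    Here sample_entropy(regular) is near zero, while sample_entropy(noisy) is on the order of 1.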

  2. Multiscale entropy analysis of heart rate variability in heart failure, hypertensive, and sinoaortic-denervated rats: classical and refined approaches.

    PubMed

    Silva, Luiz Eduardo Virgilio; Lataro, Renata Maria; Castania, Jaci Airton; da Silva, Carlos Alberto Aguiar; Valencia, Jose Fernando; Murta, Luiz Otavio; Salgado, Helio Cesar; Fazan, Rubens; Porta, Alberto

    2016-07-01

    The analysis of heart rate variability (HRV) by nonlinear methods has been gaining increasing interest due to their ability to quantify the complexity of cardiovascular regulation. In this study, multiscale entropy (MSE) and refined MSE (RMSE) were applied to track the complexity of HRV as a function of time scale in three pathological conscious animal models: rats with heart failure (HF), spontaneously hypertensive rats (SHR), and rats with sinoaortic denervation (SAD). Results showed that HF did not change HRV complexity, although there was a tendency toward decreased entropy in HF animals. On the other hand, the SHR group was characterized by reduced complexity at long time scales, whereas SAD animals exhibited smaller short- and long-term irregularity. We propose that short time scales (1 to 4), accounting for fast oscillations, are more related to vagal and respiratory control, whereas long time scales (5 to 20), accounting for slow oscillations, are more related to sympathetic control. The increased sympathetic modulation is probably the main reason for the lower entropy observed at high scales for both the SHR and SAD groups, acting as a negative factor for cardiovascular complexity. This study highlights the contribution of multiscale complexity analysis of HRV for understanding the physiological mechanisms involved in cardiovascular regulation. Copyright © 2016 the American Physiological Society.
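
    The multiscale step is a coarse-graining of the series followed by an entropy estimate at each scale. In this sketch the per-scale measure is permutation entropy rather than the sample entropy used in the paper, purely to keep the example short; the scales and embedding dimension are arbitrary choices.

```python
import math
import random
from collections import Counter

def permutation_entropy(x, m=3):
    """Normalized Shannon entropy of ordinal-pattern frequencies, in [0, 1]."""
    pats = Counter(tuple(sorted(range(m), key=lambda k: x[i + k]))
                   for i in range(len(x) - m + 1))
    n = sum(pats.values())
    h = -sum(c / n * math.log(c / n, 2) for c in pats.values())
    return h / math.log(math.factorial(m), 2)

def coarse_grain(x, scale):
    """Non-overlapping averages of `scale` consecutive points."""
    usable = len(x) - len(x) % scale
    return [sum(x[i:i + scale]) / scale for i in range(0, usable, scale)]

def multiscale_entropy(x, scales=range(1, 6)):
    """Entropy of the coarse-grained series at each scale factor."""
    return [permutation_entropy(coarse_grain(x, s)) for s in scales]

random.seed(3)
noise = [random.random() for _ in range(1000)]
slow = [math.sin(2 * math.pi * i / 100) for i in range(1000)]
```

    White noise stays near the maximum at scale 1, while the slow oscillation scores far lower.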

  3. Information theory analysis of Australian humpback whale song.

    PubMed

    Miksis-Olds, Jennifer L; Buck, John R; Noad, Michael J; Cato, Douglas H; Stokes, M Dale

    2008-10-01

    Songs produced by migrating whales were recorded off the coast of Queensland, Australia, over six consecutive weeks in 2003. Forty-eight independent song sessions were analyzed using information theory techniques. The average length of the songs estimated by correlation analysis was approximately 100 units, with song sessions lasting from 300 to over 3100 units. Song entropy, a measure of structural constraints, was estimated using three different methodologies: (1) the independently identically distributed model, (2) a first-order Markov model, and (3) the nonparametric sliding window match length (SWML) method, as described by Suzuki et al. [(2006). "Information entropy of humpback whale song," J. Acoust. Soc. Am. 119, 1849-1866]. The analysis finds that the song sequences of migrating Australian whales are consistent with the hierarchical structure proposed by Payne and McVay [(1971). "Songs of humpback whales," Science 173, 587-597], and recently supported mathematically by Suzuki et al. (2006) for singers on the Hawaiian breeding grounds. Both the SWML entropy estimates and the song lengths for the Australian singers in 2003 were lower than those reported by Suzuki et al. (2006) for Hawaiian whales in 1976-1978; however, song redundancy did not differ between these two populations separated spatially and temporally. The average total information in the sequence of units in Australian song was approximately 35 bits/song. Aberrant songs (8%) yielded entropies similar to those of typical songs.
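
    The first two entropy estimators mentioned, the IID model and the first-order Markov model, reduce to frequency counts over the sequence of song units. A sketch on made-up symbol sequences (not whale-song data):

```python
from collections import Counter
from math import log

def iid_entropy(seq):
    """Zeroth-order estimate: Shannon entropy of unit frequencies, in bits."""
    n = len(seq)
    return -sum(c / n * log(c / n, 2) for c in Counter(seq).values())

def markov_entropy(seq):
    """First-order estimate: empirical conditional entropy H(X_t | X_{t-1})."""
    bigrams = Counter(zip(seq, seq[1:]))
    firsts = Counter(seq[:-1])
    n = len(seq) - 1
    return -sum(c / n * log(c / firsts[a], 2) for (a, b), c in bigrams.items())

song = list("abcabcabcabc")      # rigidly ordered sequence of units
shuffled = list("aabccbacbbca")  # same unit frequencies, order destroyed
```

    Both sequences have identical zeroth-order entropy, but the Markov estimate drops to zero for the ordered one: structural constraints show up as the gap between the two estimates.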

  4. Saccadic entropy of head impulses in acute unilateral vestibular loss.

    PubMed

    Hsieh, Li-Chun; Lin, Hung-Ching; Lee, Guo-She

    2017-10-01

    To evaluate the complexity of the vestibulo-ocular reflex (VOR) in patients with acute unilateral vestibular loss (AUVL) via entropy analysis of head impulses. The horizontal head impulse test (HIT) with high-velocity alternating directions was used to evaluate 12 participants with AUVL and 16 healthy volunteers. Wireless electro-oculography and electronic gyrometry were used to acquire eye positional signals and head velocity signals. The eye velocity signals were then obtained through differentiation and band-pass filtering. The approximate entropy of eye velocity relative to head velocity (R_ApEn) was used to evaluate the chaotic properties. VOR gain, gain asymmetry ratio, and R_ApEn asymmetry ratio were also used to compare the groups. For the lesion-side HIT of the patient group, the mean VOR gain was significantly lower and the mean R_ApEn significantly greater compared with both the nonlesion-side HIT and healthy controls (p < 0.01, one-way analysis of variance). Both the R_ApEn asymmetry ratio and the gain asymmetry ratio of the AUVL group were significantly greater than those of the control group (p < 0.05, independent-sample t test). Entropy and gain analysis of the HIT using a wireless electro-oculography system can detect the VOR dysfunction of AUVL and may become an effective method for evaluating vestibular disorders. Copyright © 2017. Published by Elsevier B.V.
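
    Approximate entropy, the quantity behind R_ApEn, differs from sample entropy in that self-matches are counted and log match frequencies are averaged. A generic sketch; how the eye-to-head ratio is formed is not specified in the abstract, so only the ApEn core is shown.

```python
import random
from math import log

def apen(x, m=2, r=0.2):
    """ApEn(m, r) = phi(m) - phi(m+1); unlike SampEn, self-matches count."""
    n = len(x)
    def phi(mm):
        templ = [x[i:i + mm] for i in range(n - mm + 1)]
        total = 0.0
        for a in templ:
            c = sum(1 for b in templ
                    if max(abs(u - v) for u, v in zip(a, b)) <= r)
            total += log(c / len(templ))
        return total / len(templ)
    return phi(m) - phi(m + 1)

random.seed(4)
regular = [0.0, 1.0] * 100                        # perfectly predictable
irregular = [random.random() for _ in range(200)] # noisy
```

    Here apen(regular) is close to zero and apen(irregular) is much larger; an eye-to-head regularity ratio would divide two such values.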

  5. The technique of entropy optimization in motor current signature analysis and its application in the fault diagnosis of gear transmission

    NASA Astrophysics Data System (ADS)

    Chen, Xiaoguang; Liang, Lin; Liu, Fei; Xu, Guanghua; Luo, Ailing; Zhang, Sicong

    2012-05-01

    Nowadays, Motor Current Signature Analysis (MCSA) is widely used in the fault diagnosis and condition monitoring of machine tools. However, because the current signal has a low SNR (signal-to-noise ratio), it is difficult to identify the feature frequencies of machine tools from a complex current spectrum, in which the feature frequencies are often dense and overlapping, using traditional signal-processing methods such as the FFT. In the study of MCSA it is found that entropy, which is associated with the probability distribution of any random variable, is important for frequency identification, and it therefore plays an important role in signal processing. To solve the problem that the feature frequencies are difficult to identify, an entropy optimization technique based on the motor current signal is presented in this paper for extracting the typical feature frequencies of machine tools; it can effectively suppress disturbances. Simulated current signals were generated in MATLAB, and a current signal was obtained from a complex gearbox of an iron works in Luxembourg. In diagnosis, MCSA is combined with entropy optimization. Both simulated and experimental results show that this technique is efficient, accurate and reliable enough to extract the feature frequencies of the current signal, providing a new strategy for the fault diagnosis and condition monitoring of machine tools.

  6. Surge of Bering Glacier and Bagley Ice Field: Parameterization of surge characteristics based on automated analysis of crevasse image data and laser altimeter data

    NASA Astrophysics Data System (ADS)

    Stachura, M.; Herzfeld, U. C.; McDonald, B.; Weltman, A.; Hale, G.; Trantow, T.

    2012-12-01

    The dynamical processes that occur during the surge of a large, complex glacier system are far from being understood. The aim of this paper is to derive a parameterization of surge characteristics that captures the principal processes and can serve as the basis for a dynamic surge model. Innovative mathematical methods are introduced that facilitate derivation of such a parameterization from remote-sensing observations. The methods include automated geostatistical characterization and connectionist-geostatistical classification of dynamic provinces and deformation states, using the vehicle of crevasse patterns. These methods are applied to analyze satellite and airborne image and laser altimeter data collected during the current surge of Bering Glacier and Bagley Ice Field, Alaska.

  7. Betatron motion with coupling of horizontal and vertical degrees of freedom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S. A. Bogacz; V. A. Lebedev

    2002-11-21

    The Courant-Snyder parameterization of one-dimensional linear betatron motion is generalized to two-dimensional coupled linear motion. To represent the 4 x 4 symplectic transfer matrix the following ten parameters were chosen: four beta-functions, four alpha-functions and two betatron phase advances which have a meaning similar to the Courant-Snyder parameterization. Such a parameterization works equally well for weak and strong coupling and can be useful for analysis of coupled betatron motion in circular accelerators as well as in transfer lines. Similarly, the transfer matrix, the bilinear form describing the phase space ellipsoid and the second order moments are related to the eigen-vectors. Corresponding equations can be useful in interpreting tracking results and experimental data.
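
    Any parameterization of coupled betatron motion must keep the 4 x 4 one-turn matrix symplectic, i.e. M^T J M = J. A minimal pure-Python check, using an uncoupled block-diagonal matrix built from Courant-Snyder 2 x 2 blocks with arbitrary Twiss values (the ten-parameter coupled construction itself is not reproduced here):

```python
import math

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

# unit antisymmetric matrix of the (x, x', y, y') phase space
J = [[0, 1, 0, 0], [-1, 0, 0, 0], [0, 0, 0, 1], [0, 0, -1, 0]]

def is_symplectic(M, tol=1e-9):
    """True when M^T J M = J within tolerance."""
    MTJM = matmul(matmul(transpose(M), J), M)
    return all(abs(MTJM[i][j] - J[i][j]) < tol
               for i in range(4) for j in range(4))

def courant_snyder(beta, alpha, mu):
    """One-turn 2x2 block for a single uncoupled plane."""
    c, s = math.cos(mu), math.sin(mu)
    gamma = (1 + alpha * alpha) / beta
    return [[c + alpha * s, beta * s], [-gamma * s, c - alpha * s]]

def block_diag(A, B):
    return [A[0] + [0, 0], A[1] + [0, 0], [0, 0] + B[0], [0, 0] + B[1]]

M = block_diag(courant_snyder(10.0, 0.5, 1.3), courant_snyder(4.0, -0.2, 2.1))
```

    is_symplectic(M) holds, while a non-symplectic perturbation (e.g. scaling M by 2) fails; the same check applies to a fully coupled ten-parameter matrix.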

  8. Analysis of natural convection in nanofluid-filled H-shaped cavity by entropy generation and heatline visualization using lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Rahimi, Alireza; Sepehr, Mohammad; Lariche, Milad Janghorban; Mesbah, Mohammad; Kasaeipoor, Abbas; Malekshah, Emad Hasani

    2018-03-01

    A lattice Boltzmann simulation of natural convection in an H-shaped cavity filled with nanofluid is performed. Entropy generation analysis and heatline visualization are employed to analyze the problem comprehensively. The nanofluid is a SiO2-TiO2/Water-EG (60:40) hybrid nanofluid, whose thermal conductivity and dynamic viscosity were measured experimentally. To use the experimental data for thermal conductivity and dynamic viscosity, two sets of temperature-based correlations are derived for six solid volume fractions of 0.5, 1, 1.5, 2, 2.5 and 3 vol%. The influences of governing parameters such as aspect ratio, solid volume fraction of the nanofluid and Rayleigh number on the fluid flow, temperature field, average/local Nusselt number, total/local entropy generation and heatlines are presented.

  9. Application of exergetic sustainability index to a nano-scale irreversible Brayton cycle operating with ideal Bose and Fermi gasses

    NASA Astrophysics Data System (ADS)

    Açıkkalp, Emin; Caner, Necmettin

    2015-09-01

    In this study, a nano-scale irreversible Brayton cycle operating with quantum gasses, including Bose and Fermi gasses, is investigated. Developments in nanotechnology make the study of nano-scale machines, including thermal systems, unavoidable. A thermodynamic analysis of a nano-scale irreversible Brayton cycle operating with Bose and Fermi gasses was performed, with particular attention to the exergetic sustainability index. In addition, the analysis includes classical evaluation parameters such as work output, exergy output, entropy generation, and energy and exergy efficiencies. Results are presented numerically, and some useful recommendations are made. Among the important results: entropy generation and the exergetic sustainability index are affected most strongly for the Bose gas, and power output and exergy output most strongly for the Fermi gas, by x. Under high-temperature conditions, work output and entropy generation have high values compared with other degeneracy conditions.

  10. Fall risk factors analysis based on sample entropy of plantar kinematic signal during stance phase.

    PubMed

    Shengyun Liang; Huiyu Jia; Zilong Li; Huiqi Li; Xing Gao; Zuchang Ma; Yingnan Ma; Guoru Zhao

    2016-08-01

    Falls are a multi-causal phenomenon with complex interactions. The aim of our research is to study the effect of multiple variables on the potential risk of falls and to construct an elderly fall risk assessment model based on demographic data and gait characteristics. A total of 101 subjects from Malianwa Street, aged above 50 years, participated in a questionnaire survey. Participants were classified into three groups (high, medium and low risk) according to the score of an elderly fall risk assessment scale. In addition, ground reaction force (GRF) and ground reaction moment (GRM) data were recorded while they walked at a comfortable pace. The demographic variables, the sample entropy of GRF and GRM, and the impulse difference between feet were considered as potential explanatory variables of the risk assessment model. First, we investigated whether the groups differed on each variable. Statistical differences were found for the following variables: age (p=2.28e-05); impulse difference (p=0.02036); sample entropy of GRF in the vertical direction (p=0.0144); and sample entropy of GRM in the anterior-posterior direction (p=0.0387). Finally, multiple regression analysis indicated that age, impulse difference and the sample entropy of the resultant GRM could identify individuals with different levels of fall risk. These results could be useful for fall risk assessment and for monitoring physical function in the elderly population.

  11. Analysis of the anomalous mean-field like properties of Gaussian core model in terms of entropy

    NASA Astrophysics Data System (ADS)

    Nandi, Manoj Kumar; Maitra Bhattacharyya, Sarika

    2018-01-01

    Studies of the Gaussian core model (GCM) have shown that it behaves like a mean-field model and that its properties are quite different from those of standard glass formers. In this work, we investigate the entropies, namely, the excess entropy (Sex) and the configurational entropy (Sc) and their different components to address these anomalies. Our study corroborates most of the earlier observations and also sheds new light on the high and low temperature dynamics. We find that unlike in standard glass formers, where high temperature dynamics is dominated by two-body correlations and low temperature dynamics by many-body correlations, in the GCM both are dominated by many-body correlations. We also find that the many-body entropy, which is usually positive at low temperatures and is associated with activated dynamics, is negative in the GCM, suggesting suppression of activation. Interestingly, despite the suppression of activation, the Adam-Gibbs (AG) relation that describes activated dynamics holds in the GCM, thus suggesting a non-activated contribution to the AG relation. We also find an overlap between the AG relation and the mode-coupling power-law regime, leading to a power-law behavior of Sc. From our analysis of this power-law behavior, we predict that in the GCM the high temperature dynamics will disappear at the dynamical transition temperature, below which there will be a transition to the activated regime. Our study further reveals that the activated regime in the GCM is quite narrow.

  12. Stacking fault energies of face-centered cubic concentrated solid solution alloys

    DOE PAGES

    Zhao, Shijun; Stocks, G. Malcolm; Zhang, Yanwen

    2017-06-22

    We report the stacking fault energy (SFE) for a series of face-centered cubic (fcc) equiatomic concentrated solid solution alloys (CSAs), derived as subsystems from the NiCoFeCrMn and NiCoFeCrPd high entropy alloys, based on ab initio calculations. At low temperatures, these CSAs display very low, even negative, SFEs, indicating that the hexagonal close-packed (hcp) structure is energetically more favorable than the fcc structure. The temperature dependence of SFE for some CSAs is studied. With increasing temperature, an hcp-to-fcc transition is revealed for those CSAs with negative SFEs, which can be attributed to the role of intrinsic vibrational entropy. The analysis of the vibrational modes suggests that the vibrational entropy arises from high-frequency states in the hcp structure that originate from local vibrational modes. Furthermore, our results underscore the importance of vibrational entropy in determining the temperature dependence of the SFE for CSAs.

  13. Spatial-dependence recurrence sample entropy

    NASA Astrophysics Data System (ADS)

    Pham, Tuan D.; Yan, Hong

    2018-03-01

    Measuring complexity in terms of the predictability of time series is a major area of research in science and engineering, and its applications are spreading throughout many scientific disciplines, where the analysis of physiological signals is perhaps the most widely reported in the literature. Sample entropy is a popular measure for quantifying signal irregularity. However, sample entropy does not take sequential information, which is inherently useful, into its calculation of sample similarity. Here, we develop a method that is based on the mathematical principle of sample entropy and enables the capture of the sequential information of a time series in the context of the spatial dependence provided by the binary-level co-occurrence matrix of a recurrence plot. Experimental results on time-series data of the Lorenz system, physiological signals of gait maturation in healthy children, and gait dynamics in Huntington's disease show the potential of the proposed method.
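
    The recurrence plot on which the method builds is simply a thresholded distance matrix; the proposed measure then forms a co-occurrence matrix over this binary image. A sketch of the first step only, with embedding dimension 1 and an arbitrary threshold:

```python
def recurrence_matrix(x, eps):
    """Binary recurrence plot: R[i][j] = 1 when |x_i - x_j| <= eps."""
    return [[1 if abs(xi - xj) <= eps else 0 for xj in x] for xi in x]

def recurrence_rate(R):
    """Fraction of recurrent points in the plot."""
    n = len(R)
    return sum(map(sum, R)) / (n * n)

# four samples forming two close pairs
R = recurrence_matrix([0.0, 0.05, 1.0, 1.02], eps=0.1)
```

    Diagonal entries are always recurrent; the two nearby value pairs produce the off-diagonal recurrences, giving a recurrence rate of 8/16.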

  14. A computational study of entropy generation in magnetohydrodynamic flow and heat transfer over an unsteady stretching permeable sheet

    NASA Astrophysics Data System (ADS)

    Saeed Butt, Adnan; Ali, Asif

    2014-01-01

    The present article investigates entropy effects in magnetohydrodynamic flow and heat transfer over an unsteady permeable stretching surface. The time-dependent partial differential equations are converted into non-linear ordinary differential equations by suitable similarity transformations. The solutions of these equations are computed analytically by the Homotopy Analysis Method (HAM) and numerically by a MATLAB built-in routine. The obtained results are compared with the existing literature under limiting cases to validate our study. The effects of the unsteadiness parameter, magnetic field parameter, suction/injection parameter, Prandtl number, group parameter and Reynolds number on the flow and heat transfer characteristics are analysed with the aid of graphs and tables. Moreover, the effects of these parameters on the entropy generation number and Bejan number are shown graphically. It is found that unsteadiness and the presence of a magnetic field augment the entropy production.

  16. Entropy and cosmology.

    NASA Astrophysics Data System (ADS)

    Zucker, M. H.

    This paper is a critical analysis and reassessment of entropic functioning as it applies to the question of whether the ultimate fate of the universe will be determined in the future to be "open" (expanding forever to expire in a big chill), "closed" (collapsing to a big crunch), or "flat" (balanced forever between the two). The second law of thermodynamics declares that entropy can only increase and that this principle extends, inevitably, to the universe as a whole. This paper takes the position that this extension is an unwarranted projection based neither on experience nor fact - an extrapolation that ignores the powerful effect of a gravitational force acting within a closed system. Since it was originally presented by Clausius, the thermodynamic concept of entropy has been redefined in terms of "order" and "disorder" - order being equated with a low degree of entropy and disorder with a high degree. This revised terminology, more subjective than precise, has generated considerable confusion in cosmology in several critical instances. For example, the chaotic fireball of the big bang, interpreted by Stephen Hawking as a state of disorder (high entropy), is infinitely hot and, thermally, represents zero entropy (order). Hawking, apparently focusing on the disorderly "chaotic" aspect, equated it with a high degree of entropy - overlooking the fact that the universe is a thermodynamic system and that the key factor in evaluating the big-bang phenomenon is the infinitely high temperature of the early universe, which can only be equated with zero entropy. This analysis resolves this confusion and reestablishes entropy as a cosmological function integrally linked to temperature. The paper goes on to show that, while all subsystems contained within the universe require external sources of energization to have their temperatures raised, this requirement does not apply to the universe as a whole.
    The universe is the only system that, by itself, can raise its own temperature and thus, by itself, reverse entropy. The vast encompassing gravitational forces that the universe has at its disposal, forces that dominate the phase of contraction, provide the compacting, compressive mechanism that regenerates heat in an expanded, cooled universe and decreases entropy. And this phenomenon takes place without diminishing or depleting the finite amount of mass/energy with which the universe began. The fact that the universe can reverse the entropic process leads to possibilities previously ignored when assessing which of the three models (open, closed, or flat) most probably represents the future of the universe. After analyzing the models, the conclusion reached here is that the open model is only an expanded version of the closed model and therefore is not open, and the closed model will never collapse to a big crunch and, therefore, is not closed. This leaves a modified model, oscillating forever between limited phases of expansion and contraction (a universe in "dynamic equilibrium"), as the only feasible choice.

  17. Entropy of balance - some recent results

    PubMed Central

    2010-01-01

    Background Entropy when applied to biological signals is expected to reflect the state of the biological system. However the physiological interpretation of entropy is not always straightforward: when should high entropy be interpreted as a healthy sign, and when as a marker of deteriorating health? We address this question for the particular case of human standing balance and Center of Pressure data. Methods We measured and analyzed balance data of 136 participants (young, n = 45; elderly, n = 91), comprising in all 1085 trials, and calculated the Sample Entropy (SampEn) for medio-lateral (M/L) and anterior-posterior (A/P) Center of Pressure (COP) together with the Hurst self-similarity (ss) exponent α using Detrended Fluctuation Analysis (DFA). The COP was measured with a force plate in eight 30-second trials with eyes-closed, eyes-open, foam, self-perturbation and nudge conditions. Results 1) There is a significant difference in SampEn for the A/P direction between the elderly and the younger groups: old > young. 2) For the elderly we have in general A/P > M/L. 3) For the younger group there was no significant A/P-M/L difference, with the exception of the nudge trials, where we had the reverse situation, A/P < M/L. 4) For the elderly we have eyes closed > eyes open. 5) For the Hurst ss-exponent we have, for the elderly, M/L > A/P. Conclusions These results seem to require some modification of the more or less established attention-constraint interpretation of entropy, which holds that higher entropy correlates with a more automatic and less constrained mode of balance control, and that higher entropy reflects, in this sense, more efficient balancing. PMID:20670457
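
    The Hurst self-similarity exponent α quoted here comes from detrended fluctuation analysis: integrate the mean-removed signal, detrend it linearly in windows of increasing size, and read α off the log-log slope of the fluctuation function. A plain-Python sketch with arbitrary window sizes (not the authors' code):

```python
import random
from math import log

def dfa_alpha(x, scales=(4, 8, 16, 32)):
    """DFA-1 scaling exponent: ~0.5 for white noise, ~1.5 for a random walk."""
    mean = sum(x) / len(x)
    profile, acc = [], 0.0
    for v in x:                      # integrated, mean-removed profile
        acc += v - mean
        profile.append(acc)
    log_s, log_f = [], []
    for w in scales:
        nwin = len(profile) // w
        t = list(range(w))
        tm = sum(t) / w
        denom = sum((ti - tm) ** 2 for ti in t)
        sq = 0.0
        for k in range(nwin):        # least-squares detrend in each window
            seg = profile[k * w:(k + 1) * w]
            sm = sum(seg) / w
            slope = sum((ti - tm) * (si - sm) for ti, si in zip(t, seg)) / denom
            sq += sum((si - (sm + slope * (ti - tm))) ** 2
                      for ti, si in zip(t, seg))
        log_s.append(log(w))
        log_f.append(0.5 * log(sq / (nwin * w)))   # log of RMS fluctuation
    lm = sum(log_s) / len(log_s)
    fm = sum(log_f) / len(log_f)
    return (sum((a - lm) * (b - fm) for a, b in zip(log_s, log_f))
            / sum((a - lm) ** 2 for a in log_s))

random.seed(5)
noise = [random.random() - 0.5 for _ in range(1024)]
walk, acc = [], 0.0
for v in noise:
    acc += v
    walk.append(acc)
```

    White noise yields α near 0.5 and its running sum (a random walk) yields α near 1.5, the two textbook reference values.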

  18. Thermodynamics of organisms in the context of dynamic energy budget theory.

    PubMed

    Sousa, Tânia; Mota, Rui; Domingos, Tiago; Kooijman, S A L M

    2006-11-01

    We carry out a thermodynamic analysis of an organism. It is applicable to any type of organism because (1) it is based on a thermodynamic formalism applicable to all open thermodynamic systems and (2) it uses a general model to describe the internal structure of the organism--the dynamic energy budget (DEB) model. Our results on the thermodynamics of DEB organisms are the following. (1) Thermodynamic constraints for the following types of organisms: (a) aerobic and exothermic, (b) anaerobic and exothermic, and (c) anaerobic and endothermic, showing that anaerobic organisms have a higher thermodynamic flexibility. (2) A way to compute the changes in the enthalpy and in the entropy of living biomass that accompany changes in growth rate, solving the problem of evaluating the thermodynamic properties of biomass as a function of the amount of reserves. (3) Two expressions for Thornton's coefficient that explain its experimental variability and theoretically underpin its use in metabolic studies. (4) A mechanism that organisms in non-steady state use to rid themselves of internal entropy production: "dilution of entropy production by growth." To demonstrate the practical applicability of DEB theory to quantify thermodynamic changes in organisms, we use published data on Klebsiella aerogenes growing aerobically in a continuous culture. We obtain different values for the molar entropies of the reserve and the structure of Klebsiella aerogenes, proving that the reserve density concept of DEB theory is essential in discussions concerning (a) the relationship between organization and entropy and (b) the mechanism of storing entropy in new biomass. Additionally, our results suggest that the entropy of dead biomass is significantly different from the entropy of living biomass.

  19. Improving Parameterization of Entrainment Rate for Shallow Convection with Aircraft Measurements and Large-Eddy Simulation

    DOE PAGES

    Lu, Chunsong; Liu, Yangang; Zhang, Guang J.; ...

    2016-02-01

    This work examines the relationships of entrainment rate to vertical velocity, buoyancy, and turbulent dissipation rate by applying stepwise principal component regression to observational data from shallow cumulus clouds collected during the Routine AAF [Atmospheric Radiation Measurement (ARM) Aerial Facility] Clouds with Low Optical Water Depths (CLOWD) Optical Radiative Observations (RACORO) field campaign over the ARM Southern Great Plains (SGP) site near Lamont, Oklahoma. The cumulus clouds during the RACORO campaign simulated using a large eddy simulation (LES) model are also examined with the same approach. The analysis shows that a combination of multiple variables can better represent entrainment rate in both the observations and LES than any single-variable fitting. Three commonly used parameterizations are also tested on the individual cloud scale. A new parameterization is therefore presented that relates entrainment rate to vertical velocity, buoyancy and dissipation rate; the effects of treating clouds as ensembles and of humid shells surrounding cumulus clouds on the new parameterization are discussed. Physical mechanisms underlying the relationships of entrainment rate to vertical velocity, buoyancy and dissipation rate are also explored.
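    The combined multi-variable fit described above can be illustrated with an ordinary least-squares regression of entrainment rate on the three predictors. The paper itself uses stepwise principal component regression; this plain regression, with illustrative function and variable names, is only a sketch of the idea:

```python
import numpy as np

def fit_entrainment(w, b, eps, lam):
    """Regress entrainment rate `lam` on vertical velocity `w`, buoyancy
    `b`, and dissipation rate `eps` (all 1-D arrays of equal length).
    Returns [intercept, c_w, c_b, c_eps]."""
    X = np.column_stack([np.ones_like(w), w, b, eps])
    coef, *_ = np.linalg.lstsq(X, lam, rcond=None)
    return coef
```

    A single-variable fit corresponds to dropping two of the columns of X; comparing the residuals of the two fits is what motivates the multi-variable parameterization.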

  20. Cloud Microphysics Parameterization in a Shallow Cumulus Cloud Simulated by a Lagrangian Cloud Model

    NASA Astrophysics Data System (ADS)

    Oh, D.; Noh, Y.; Hoffmann, F.; Raasch, S.

    2017-12-01

    The Lagrangian cloud model (LCM) is a fundamentally new approach to cloud simulation, in which the flow field is simulated by large eddy simulation and droplets are treated as Lagrangian particles undergoing cloud microphysics. The LCM enables us to investigate raindrop formation and examine the parameterization of cloud microphysics directly by tracking the history of individual Lagrangian droplets. Analysis of the magnitude of raindrop formation and the background physical conditions at the moment at which every Lagrangian droplet grows from a cloud droplet to a raindrop in a shallow cumulus cloud reveals how and under which conditions raindrops are formed. It also provides information on how autoconversion and accretion appear and evolve within a cloud, and how they are affected by various factors such as cloud water mixing ratio, rain water mixing ratio, aerosol concentration, drop size distribution, and dissipation rate. Based on these results, the parameterizations of autoconversion and accretion, such as those of Kessler (1969), Tripoli and Cotton (1980), Beheng (1994), and Khairoutdinov and Kogan (2000), are examined, and modifications to improve the parameterizations are proposed.

  1. Amplification of intrinsic emittance due to rough metal cathodes: Formulation of a parameterization model

    NASA Astrophysics Data System (ADS)

    Charles, T. K.; Paganin, D. M.; Dowd, R. T.

    2016-08-01

    Intrinsic emittance is often the limiting factor for brightness in fourth-generation light sources, and as such a good understanding of the factors affecting intrinsic emittance is essential in order to be able to decrease it. Here we present a parameterization model describing the proportional increase in emittance induced by cathode surface roughness. One major benefit of the parameterization approach presented here is that it takes the complexity of a Monte Carlo model and reduces the results to a straightforward empirical model. The resulting models describe the proportional increase in transverse momentum introduced by surface roughness, and are applicable to various metal types, photon wavelengths, applied electric fields, and cathode surface terrains. The analysis includes the increase in emittance due to changes in the electric field induced by roughness as well as the increase in transverse momentum resulting from the spatially varying surface normal. We also compare the results of the Parameterization Model to an Analytical Model which employs various approximations to produce a more compact expression at the cost of a reduction in accuracy.

  2. A satellite observation test bed for cloud parameterization development

    NASA Astrophysics Data System (ADS)

    Lebsock, M. D.; Suselj, K.

    2015-12-01

    We present an observational test-bed of cloud and precipitation properties derived from CloudSat, CALIPSO, and the A-Train. The focus of the test-bed is on marine boundary layer clouds, including stratocumulus and cumulus and the transition between these cloud regimes. Test-bed properties include the cloud cover and three-dimensional cloud fraction, along with the cloud water path, precipitation water content, and associated radiative fluxes. We also include the subgrid-scale distribution of cloud, precipitation, and radiative quantities, which must be diagnosed by a model parameterization. The test-bed further includes meteorological variables from the Modern Era Retrospective-analysis for Research and Applications (MERRA). MERRA variables provide the initialization and forcing datasets to run a parameterization in Single Column Model (SCM) mode. We show comparisons of an Eddy-Diffusivity/Mass-Flux (EDMF) parameterization coupled to microphysics and macrophysics packages run in SCM mode with observed clouds. Comparisons are performed regionally in areas of climatological subsidence as well as stratified by dynamical and thermodynamical variables. Comparisons demonstrate the ability of the EDMF model to capture the observed transitions between subtropical stratocumulus and cumulus cloud regimes.

  3. Dynamically consistent parameterization of mesoscale eddies. Part III: Deterministic approach

    NASA Astrophysics Data System (ADS)

    Berloff, Pavel

    2018-07-01

    This work continues the development of dynamically consistent parameterizations for representing mesoscale eddy effects in non-eddy-resolving and eddy-permitting ocean circulation models and focuses on the classical double-gyre problem, in which the main dynamic eddy effects maintain the eastward jet extension of the western boundary currents and its adjacent recirculation zones via the eddy backscatter mechanism. Despite its fundamental importance, this mechanism remains poorly understood; in this paper we first study it and then propose and test a novel parameterization of it. We start by decomposing the reference eddy-resolving flow solution into large-scale and eddy components defined by spatial filtering, rather than by the Reynolds decomposition. Next, we find that the eastward jet and its recirculations are robustly present not only in the large-scale flow itself, but also in the rectified time-mean eddies and in the transient rectified eddy component, which consists of highly anisotropic ribbons of opposite-sign potential vorticity anomalies straddling the instantaneous eastward jet core and being responsible for its continuous amplification. The transient rectified component is separated from the flow by a novel remapping method. We hypothesize that the above three components of the eastward jet are ultimately driven by the small-scale transient eddy forcing via the eddy backscatter mechanism, rather than by the mean eddy forcing and large-scale nonlinearities. We verify this hypothesis by progressively turning down the backscatter and observing the induced flow anomalies. The backscatter analysis leads us to formulate the key eddy parameterization hypothesis: in an eddy-permitting model, at least partially resolved eddy backscatter can be significantly amplified to improve the flow solution.
Such amplification is a simple and novel eddy parameterization framework, implemented here in terms of local, deterministic flow roughening controlled by a single parameter. We test the parameterization skills in a hierarchy of non-eddy-resolving and eddy-permitting modifications of the original model and demonstrate that it can indeed be highly efficient at restoring the eastward jet extension and its adjacent recirculation zones. The new deterministic parameterization framework not only combines remarkable simplicity with good performance but is also dynamically transparent; it therefore provides a powerful alternative to the common eddy diffusion and emerging stochastic parameterizations.

  4. Elementary exact calculations of degree growth and entropy for discrete equations.

    PubMed

    Halburd, R G

    2017-05-01

    Second-order discrete equations are studied over the field of rational functions in an auxiliary variable z that does not appear in the equation. The exact degree of each iterate as a function of z can be calculated easily using the standard calculations that arise in singularity confinement analysis, even when the singularities are not confined. This produces elementary yet rigorous entropy calculations.

  5. Application of the new Cross Recurrence Plots to multivariate data

    NASA Astrophysics Data System (ADS)

    Thiel, M.; Romano, C.; Kurths, J.

    2003-04-01

    We extend the method of Cross Recurrence Plots (XRPs) and apply it to multivariate data. After introducing the new method, we carry out an analysis of spatiotemporal ecological data. We compute not only the Rényi entropies and cross entropies by XRP, which allow us to draw conclusions about the coupling of the systems, but also find a prediction horizon for intermediate time scales.

  6. Heart rate variability analysis based on time-frequency representation and entropies in hypertrophic cardiomyopathy patients.

    PubMed

    Clariá, F; Vallverdú, M; Baranowski, R; Chojnowska, L; Caminal, P

    2008-03-01

    In hypertrophic cardiomyopathy (HCM) patients there is an increased risk of premature death, which can occur with little or no warning. Furthermore, classification for sudden cardiac death in patients with HCM is very difficult. The aim of our study was to improve the prognostic value of heart rate variability (HRV) in HCM patients, giving insight into changes of the autonomic nervous system. To this end, the suitability of linear and nonlinear measures was studied to assess the HRV. These measures were based on time-frequency representation (TFR) and on Shannon and Rényi entropies, and compared with traditional HRV measures. Holter recordings of 64 patients with HCM and 55 healthy subjects were analyzed. The HCM patients consisted of two groups: 13 high-risk patients, after aborted sudden cardiac death (SCD), and 51 low-risk patients, without SCD. Five-hour RR signals, corresponding to the sleep period of the subjects, were considered for the analysis as a comparable standard situation. These RR signals were filtered in three frequency bands: the very low frequency band (VLF, 0-0.04 Hz), the low frequency band (LF, 0.04-0.15 Hz) and the high frequency band (HF, 0.15-0.45 Hz). TFR variables based on instantaneous frequency and energy functions were able to classify HCM patients and healthy subjects (control group). Results revealed that measures obtained from TFR analysis of the HRV classified the groups of subjects better than traditional HRV parameters, and that nonlinear measures further improved the classification. Entropies calculated in the HF band showed the highest statistically significant levels when comparing the HCM group and the control group (p-value < 0.0005). The entropy measures calculated in the HCM group presented lower values than those calculated from the control group, indicating a decrease of complexity. Moreover, similar behavior was observed when comparing high and low risk of premature death, the entropy values being lower in high-risk patients (p-value < 0.05), indicating an increase of predictability. Furthermore, measures from information entropy, but not from TFR, seem to be useful for enhanced risk stratification in HCM patients with an increased risk of sudden cardiac death.

  7. Multiscale permutation entropy analysis of EEG recordings during sevoflurane anesthesia

    NASA Astrophysics Data System (ADS)

    Li, Duan; Li, Xiaoli; Liang, Zhenhu; Voss, Logan J.; Sleigh, Jamie W.

    2010-08-01

    Electroencephalogram (EEG) monitoring of the effect of anesthetic drugs on the central nervous system has long been used in anesthesia research. Several methods based on nonlinear dynamics, such as permutation entropy (PE), have been proposed to analyze EEG series during anesthesia. However, these measures are still single-scale based and may not completely describe the dynamical characteristics of complex EEG series. In this paper, a novel measure combining multiscale PE information, called CMSPE (composite multi-scale permutation entropy), was proposed for quantifying the anesthetic drug effect on EEG recordings during sevoflurane anesthesia. Three sets of simulated EEG series during awake, light and deep anesthesia were used to select the parameters for the multiscale PE analysis: embedding dimension m, lag τ and scales to be integrated into the CMSPE index. Then, the CMSPE index and raw single-scale PE index were applied to EEG recordings from 18 patients who received sevoflurane anesthesia. Pharmacokinetic/pharmacodynamic (PKPD) modeling was used to relate the measured EEG indices and the anesthetic drug concentration. Prediction probability (Pk) statistics and correlation analysis with the response entropy (RE) index, derived from the spectral entropy (M-entropy module; GE Healthcare, Helsinki, Finland), were investigated to evaluate the effectiveness of the new proposed measure. It was found that raw single-scale PE was blind to subtle transitions between light and deep anesthesia, while the CMSPE index tracked these changes accurately. Around the time of loss of consciousness, CMSPE responded significantly more rapidly than the raw PE, with the absolute slopes of linearly fitted response versus time plots of 0.12 (0.09-0.15) and 0.10 (0.06-0.13), respectively. The prediction probability Pk of 0.86 (0.85-0.88) and 0.85 (0.80-0.86) for CMSPE and raw PE indicated that the CMSPE index correlated well with the underlying anesthetic effect. 
The correlation coefficient between the CMSPE index and the RE index, 0.84 (0.80-0.88), was significantly higher than that for the raw PE index, 0.75 (0.66-0.84). The results show that the CMSPE outperforms the raw single-scale PE in reflecting the sevoflurane drug effect on the central nervous system.
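    Permutation entropy, the building block of the CMSPE index, reduces a series to the frequencies of its ordinal patterns. A minimal single-scale sketch (the composite multiscale variant would additionally coarse-grain the series at each scale; the parameter defaults here are illustrative, not the study's values):

```python
import math

def permutation_entropy(x, m=3, lag=1, normalize=True):
    """Shannon entropy of the ordinal-pattern distribution (Bandt-Pompe),
    normalized by log(m!) so the result lies in [0, 1]."""
    counts = {}
    for i in range(len(x) - (m - 1) * lag):
        window = tuple(x[i + j * lag] for j in range(m))
        # Ordinal pattern: argsort of the window (ties broken by position)
        pattern = tuple(sorted(range(m), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(m)) if normalize else h
```

    A monotone series produces a single pattern and entropy 0; an alternating 0,1,0,1,… series produces exactly two patterns, giving a normalized entropy of log 2 / log 6 for m = 3.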

  8. Development of reactive force fields using ab initio molecular dynamics simulation minimally biased to experimental data

    NASA Astrophysics Data System (ADS)

    Chen, Chen; Arntsen, Christopher; Voth, Gregory A.

    2017-10-01

    Incorporation of quantum mechanical electronic structure data is necessary to properly capture the physics of many chemical processes. Proton hopping in water, which involves rearrangement of chemical and hydrogen bonds, is one such example of an inherently quantum mechanical process. Standard ab initio molecular dynamics (AIMD) methods, however, do not yet accurately predict the structure of water and are therefore less than optimal for developing force fields. We have instead utilized a recently developed method which minimally biases AIMD simulations to match limited experimental data to develop novel multiscale reactive molecular dynamics (MS-RMD) force fields by using relative entropy minimization. In this paper, we present two new MS-RMD models using such a parameterization: one which employs water with harmonic internal vibrations and another which uses anharmonic water. We show that the newly developed MS-RMD models very closely reproduce the solvation structure of the hydrated excess proton in the target AIMD data. We also find that the use of anharmonic water increases proton hopping, thereby increasing the proton diffusion constant.

  9. A comparison of performance of automatic cloud coverage assessment algorithm for Formosat-2 image using clustering-based and spatial thresholding methods

    NASA Astrophysics Data System (ADS)

    Hsu, Kuo-Hsien

    2012-11-01

    Formosat-2 imagery is a kind of high-spatial-resolution (2 m GSD) remote sensing satellite data, which includes one panchromatic band and four multispectral bands (blue, green, red, near-infrared). An essential step in the daily processing of received Formosat-2 imagery is to estimate the cloud statistics of an image using the Automatic Cloud Coverage Assessment (ACCA) algorithm. The cloud statistics are subsequently recorded as important metadata for the image product catalog. In this paper, we propose an ACCA method with two consecutive stages: pre-processing and post-processing analysis. For pre-processing analysis, unsupervised K-means classification, Sobel's method, a thresholding method, non-cloudy pixel reexamination, and a cross-band filter method are implemented in sequence to determine the cloud statistics. For post-processing analysis, a box-counting fractal method is implemented. In other words, the cloud statistics are first determined via pre-processing analysis, and the correctness of the cloud statistics for the different spectral bands is then cross-examined qualitatively and quantitatively via post-processing analysis. The selection of an appropriate thresholding method is critical to the result of the ACCA method. Therefore, in this work we first conduct a series of experiments with clustering-based and spatial thresholding methods, including Otsu's, Local Entropy (LE), Joint Entropy (JE), Global Entropy (GE), and Global Relative Entropy (GRE) methods, for performance comparison. The results show that Otsu's and GE methods both perform better than the others for Formosat-2 imagery. Additionally, our proposed ACCA method, with Otsu's method selected as the thresholding method, successfully extracts the cloudy pixels of Formosat-2 imagery for accurate cloud statistics estimation.
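    Of the thresholding methods compared, Otsu's is the simplest to state: pick the histogram threshold that maximizes the between-class variance. A minimal sketch, not the paper's implementation:

```python
import numpy as np

def otsu_threshold(values, nbins=256):
    """Return the threshold maximizing the between-class variance
    w0 * w1 * (mu0 - mu1)^2 over the histogram of `values`."""
    hist, edges = np.histogram(values, bins=nbins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    mu_total = float((p * centers).sum())
    best_t, best_var, w0, mu0 = centers[0], -1.0, 0.0, 0.0
    for k in range(nbins - 1):
        w0 += p[k]                 # class-0 weight up to bin k
        mu0 += p[k] * centers[k]   # unnormalized class-0 mean
        w1 = 1.0 - w0
        if w0 <= 0.0 or w1 <= 0.0:
            continue
        var_between = w0 * w1 * (mu0 / w0 - (mu_total - mu0) / w1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[k]
    return best_t
```

    On a bimodal brightness histogram (cloud vs. surface), the selected threshold falls between the two modes.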

  10. Betatron motion with coupling of horizontal and vertical degrees of freedom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lebedev, V.A.; /Fermilab; Bogacz, S.A.

    Presently, there are two frequently used parameterizations of linear x-y coupled motion in accelerator physics: the Edwards-Teng and Mais-Ripken parameterizations. The article is devoted to an analysis of the close relationship between the two representations, thus adding clarity to their physical meaning. It also discusses the relationship between the eigen-vectors, the beta-functions, second-order moments and the bilinear form representing the particle ellipsoid in the 4D phase space. It then considers a further development of the Mais-Ripken parameterization in which the particle motion is described by 10 parameters: four beta-functions, four alpha-functions and two betatron phase advances. In comparison with the Edwards-Teng parameterization, the chosen parameterization has the advantage that it works equally well for analysis of coupled betatron motion in circular accelerators and in transfer lines. The considered relationship between second-order moments, eigen-vectors and beta-functions can be useful in interpreting tracking results and experimental data. As an example, the developed formalism is applied to the FNAL electron cooler and Derbenev's vertex-to-plane adapter.

  11. Classification of mathematics deficiency using shape and scale analysis of 3D brain structures

    NASA Astrophysics Data System (ADS)

    Kurtek, Sebastian; Klassen, Eric; Gore, John C.; Ding, Zhaohua; Srivastava, Anuj

    2011-03-01

    We investigate the use of a recent technique for shape analysis of brain substructures in identifying learning disabilities in third-grade children. This Riemannian technique provides a quantification of differences in shapes of parameterized surfaces, using a distance that is invariant to rigid motions and re-parameterizations. Additionally, it provides an optimal registration across surfaces for improved matching and comparisons. We utilize an efficient gradient based method to obtain the optimal re-parameterizations of surfaces. In this study we consider 20 different substructures in the human brain and correlate the differences in their shapes with abnormalities manifested in deficiency of mathematical skills in 106 subjects. The selection of these structures is motivated in part by the past links between their shapes and cognitive skills, albeit in broader contexts. We have studied the use of both individual substructures and multiple structures jointly for disease classification. Using a leave-one-out nearest neighbor classifier, we obtained a 62.3% classification rate based on the shape of the left hippocampus. The use of multiple structures resulted in an improved classification rate of 71.4%.

  12. Parameterization of air temperature in high temporal and spatial resolution from a combination of the SEVIRI and MODIS instruments

    NASA Astrophysics Data System (ADS)

    Zakšek, Klemen; Schroedter-Homscheidt, Marion

    Some applications, e.g. from traffic or energy management, require air temperature data at two metres height above the ground (T2m) in high spatial and temporal resolution, sometimes in near-real-time. Thus, a parameterization based on boundary layer physical principles was developed that determines the air temperature from remote sensing data (SEVIRI data aboard MSG and MODIS data aboard the Terra and Aqua satellites). The method consists of two parts. First, a downscaling procedure from the SEVIRI pixel resolution of several kilometres to a one-kilometre spatial resolution is performed using a regression analysis between the land surface temperature (LST) and the normalized differential vegetation index (NDVI) acquired by the MODIS instrument. Second, the lapse rate between the LST and T2m is removed using an empirical parameterization that requires albedo, down-welling surface short-wave flux, relief characteristics and NDVI data. The method was successfully tested for Slovenia, the French region Franche-Comté and southern Germany for the period from May to December 2005, indicating that the parameterization is valid for Central Europe. The parameterization yields a root mean square deviation (RMSD) of 2.0 K during the daytime, with a bias of -0.01 K and a correlation coefficient of 0.95. This is promising, especially considering the high temporal (30 min) and spatial (1000 m) resolution of the results.
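    The first step, regression-based downscaling from the coarse SEVIRI LST grid to the 1-km NDVI grid, can be sketched as follows. This is a single global LST-NDVI regression with illustrative names; the operational method works per scene and adds the lapse-rate correction described above:

```python
import numpy as np

def downscale_lst(coarse_lst, coarse_ndvi, fine_ndvi):
    """Fit LST = slope * NDVI + intercept at the coarse resolution and
    evaluate the fitted relation on the fine-resolution NDVI field."""
    slope, intercept = np.polyfit(coarse_ndvi.ravel(), coarse_lst.ravel(), 1)
    return slope * fine_ndvi + intercept
```

    The fit exploits the (typically negative) LST-NDVI correlation; denser vegetation maps to cooler surface temperatures.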

  13. A General Framework for Thermodynamically Consistent Parameterization and Efficient Sampling of Enzymatic Reactions

    PubMed Central

    Saa, Pedro; Nielsen, Lars K.

    2015-01-01

    Kinetic models provide the means to understand and predict the dynamic behaviour of enzymes upon different perturbations. Despite their obvious advantages, classical parameterizations require large amounts of data to fit their parameters. In particular, enzymes displaying complex reaction and regulatory (allosteric) mechanisms require a great number of parameters and are therefore often represented by approximate formulae, thereby facilitating the fitting but ignoring many real kinetic behaviours. Here, we show that full exploration of the plausible kinetic space for any enzyme can be achieved using sampling strategies provided a thermodynamically feasible parameterization is used. To this end, we developed a General Reaction Assembly and Sampling Platform (GRASP) capable of consistently parameterizing and sampling accurate kinetic models using minimal reference data. The framework integrates the generalized MWC model and the elementary reaction formalism. By formulating the appropriate thermodynamic constraints, it enables parameterization of any oligomeric enzyme kinetics without sacrificing complexity or using simplifying assumptions. This thermodynamically safe parameterization relies on the definition of a reference state upon which feasible parameter sets can be efficiently sampled. Uniform sampling of the kinetic space enabled dissecting enzyme catalysis and revealing the impact of thermodynamics on reaction kinetics. Our analysis distinguished three reaction elasticity regions for common biochemical reactions: a steep linear region (0 > ΔGr > -2 kJ/mol), a transition region (-2 > ΔGr > -20 kJ/mol) and a constant elasticity region (ΔGr < -20 kJ/mol). We also applied this framework to model more complex kinetic behaviours such as the monomeric cooperativity of the mammalian glucokinase and the ultrasensitive response of the phosphoenolpyruvate carboxylase of Escherichia coli. 
In both cases, our approach not only described the kinetic behaviour of these enzymes appropriately, but also provided insights into the particular features underpinning the observed kinetics. Overall, this framework will enable systematic parameterization and sampling of enzymatic reactions. PMID:25874556

  14. Classic maximum entropy recovery of the average joint distribution of apparent FRET efficiency and fluorescence photons for single-molecule burst measurements.

    PubMed

    DeVore, Matthew S; Gull, Stephen F; Johnson, Carey K

    2012-04-05

    We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions.

  15. Dynamics of entanglement entropy of interacting fermions in a 1D driven harmonic trap

    NASA Astrophysics Data System (ADS)

    McKenney, Joshua R.; Porter, William J.; Drut, Joaquín E.

    2018-03-01

    Following up on a recent analysis of two cold atoms in a time-dependent harmonic trap in one dimension, we explore the entanglement entropy of two and three fermions in the same situation when driven through a parametric resonance. We find that the presence of such a resonance in the two-particle system leaves a clear imprint on the entanglement entropy. We show how the signal is modified by attractive and repulsive contact interactions, and how it remains present for the three-particle system. Additionally, we extend the work of recent experiments to demonstrate how restricting observation to a limited subsystem gives rise to locally thermal behavior.

  16. Entropy in sound and vibration: towards a new paradigm

    PubMed Central

    2017-01-01

    This paper describes a discussion of the method and the status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of the information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart. PMID:28265190

  17. On the Structure of {L^∞}-Entropy Solutions to Scalar Conservation Laws in One-Space Dimension

    NASA Astrophysics Data System (ADS)

    Bianchini, S.; Marconi, E.

    2017-10-01

    We prove that if u is the entropy solution to a scalar conservation law in one space dimension, then the entropy dissipation is a measure concentrated on countably many Lipschitz curves. This result is a consequence of a detailed analysis of the structure of the characteristics. In particular, the characteristic curves are segments outside a countably 1-rectifiable set and the left and right traces of the solution exist in a C^0 sense up to the degeneracy due to the segments where f'' = 0. We prove also that the initial data is taken in a suitably strong sense and we give some examples which show that these results are sharp.

  18. Permutation Entropy and Signal Energy Increase the Accuracy of Neuropathic Change Detection in Needle EMG

    PubMed Central

    2018-01-01

    Background and Objective. Needle electromyography can be used to detect changes in the number and morphology of motor unit potentials in patients with axonal neuropathy. General mathematical methods of pattern recognition and signal analysis were applied to recognize neuropathic changes. This study validates the possibility of extending and refining turns-amplitude analysis using permutation entropy and signal energy. Methods. In this study, we examined needle electromyography in 40 neuropathic individuals and 40 controls. The number of turns, the amplitude between turns, signal energy, and permutation entropy were used as features for support vector machine classification. Results. The obtained results proved the superior classification performance of the combination of all of the above-mentioned features compared to combinations of fewer features. Of the tested combinations of features, peak-ratio analysis had the lowest accuracy. Conclusion. The combination of permutation entropy with signal energy, number of turns and mean amplitude in SVM classification can be used to refine the diagnosis of polyneuropathies examined by needle electromyography. PMID:29606959
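    The turns-amplitude features named in the abstract are simple to compute from a single EMG epoch. A minimal sketch; the 0.1 turn threshold is illustrative, not the study's value:

```python
def emg_features(signal, turn_threshold=0.1):
    """Number of turns (direction changes whose amplitude relative to the
    previous turn reaches the threshold), mean amplitude between
    consecutive turns, and signal energy (sum of squares)."""
    turns = []
    for i in range(1, len(signal) - 1):
        # A turn is a local extremum: the slope changes sign at i
        if (signal[i] - signal[i - 1]) * (signal[i + 1] - signal[i]) < 0:
            if not turns or abs(signal[i] - turns[-1][1]) >= turn_threshold:
                turns.append((i, signal[i]))
    amps = [abs(turns[i + 1][1] - turns[i][1]) for i in range(len(turns) - 1)]
    return {
        "n_turns": len(turns),
        "mean_amplitude": sum(amps) / len(amps) if amps else 0.0,
        "energy": sum(v * v for v in signal),
    }
```

    These scalars, together with a permutation entropy value for the same epoch, would form the feature vector fed to the SVM classifier.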

  19. An EGR performance evaluation and decision-making approach based on grey theory and grey entropy analysis

    PubMed Central

    2018-01-01

    Exhaust gas recirculation (EGR) is one of the main methods of reducing NOx emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method, based on multi-objective grey situation decision theory, grey relation theory, and grey entropy analysis, to evaluate EGR performance and determine the optimal EGR rate, a task that currently lacks clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model from the main EGR parameters, transforming the optimal compromise between diesel engine combustion and emission performance into a decision-making target-weight problem. Then, considering the characteristics of EGR under different operating conditions, an optimized target-weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results illustrate its feasibility, providing theoretical support and a reference for further EGR optimization. PMID:29377956

  20. Multifractal diffusion entropy analysis: Optimal bin width of probability histograms

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Korbel, Jan

    2014-11-01

    In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin width for the histograms generated from the underlying probability distributions of interest. The method uses Rényi's entropy and mean-squared-error analysis to establish the conditions under which the error in the multifractal spectrum estimation is minimal. We illustrate the utility of our approach by focusing on the scaling behavior of financial time series; in particular, we analyze the S&P500 stock index sampled at a daily rate over the period 1950-2013. To demonstrate the strength of the proposed method, we compare the multifractal δ-spectrum for various bin widths and show the robustness of the method, especially for large values of q. For such values, other methods in use, e.g., those based on moment estimation, tend to fail for heavy-tailed data or data with long correlations. The connection between the δ-spectrum and Rényi's q parameter is also discussed and elucidated with a simple example of a multiscale time series.
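    The bin-width sensitivity the record addresses can be sketched with a fixed-width histogram estimator of the Rényi entropy, evaluated for several bin widths. This is a hedged illustration of the estimator only; the paper's optimal-width criterion is not reproduced, and the Gaussian test data is an assumption:

```python
import math
import random

def renyi_entropy(samples, bin_width, q=2.0):
    """Order-q Rényi entropy (nats) from a fixed-bin-width histogram."""
    counts = {}
    for v in samples:
        b = math.floor(v / bin_width)          # assign sample to a bin
        counts[b] = counts.get(b, 0) + 1
    n = len(samples)
    probs = [c / n for c in counts.values()]
    if abs(q - 1.0) < 1e-12:                   # q -> 1 recovers Shannon entropy
        return -sum(p * math.log(p) for p in probs)
    return math.log(sum(p ** q for p in probs)) / (1.0 - q)

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(10000)]
for width in (0.05, 0.2, 0.8):
    # Coarser bins lower the discrete entropy estimate; the error in the
    # multifractal spectrum depends on this choice.
    print(width, renyi_entropy(data, width, q=2.0))
```

    The Rényi entropy is non-increasing in q, so for a fixed bin width the q = 1 (Shannon) value bounds the q = 2 value from above.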

  1. Multiscale entropy-based methods for heart rate variability complexity analysis

    NASA Astrophysics Data System (ADS)

    Silva, Luiz Eduardo Virgilio; Cabella, Brenno Caetano Troca; Neves, Ubiraci Pereira da Costa; Murta Junior, Luiz Otavio

    2015-03-01

    Physiologic complexity is an important concept for characterizing time series from biological systems and, combined with multiscale analysis, can contribute to the comprehension of many complex phenomena. Although multiscale entropy has been applied to physiological time series, it measures irregularity as a function of scale. In this study we propose and evaluate a set of three complexity metrics as functions of time scale. The metrics are derived from nonadditive entropy supported by the generation of surrogate data, namely SDiffqmax, qmax, and qzero. To assess the accuracy of the proposed complexity metrics, receiver operating characteristic (ROC) curves were built and the areas under the curves computed for three physiological situations: heart rate variability (HRV) time series in normal sinus rhythm, atrial fibrillation, and congestive heart failure. The results show that the proposed complexity metrics are accurate and robust compared with classic entropy-based irregularity metrics. Furthermore, SDiffqmax is the most accurate at lower scales, whereas qmax and qzero are the most accurate at higher time scales. The multiscale complexity analysis described here shows potential for assessing complex physiological time series and deserves further investigation in a wider context.
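    The nonadditive entropy underlying these metrics is Tsallis' S_q; a minimal sketch of S_q together with the standard coarse-graining step of multiscale analysis is given below. The SDiffqmax/qmax/qzero metrics themselves involve surrogate-data comparisons and are not reproduced; the distributions used are illustrative:

```python
import math

def tsallis_entropy(probs, q):
    """Nonadditive (Tsallis) entropy S_q; q -> 1 recovers Shannon entropy."""
    if abs(q - 1.0) < 1e-12:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

def coarse_grain(x, scale):
    """Non-overlapping moving average: the usual multiscale coarse-graining."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

uniform = [0.25] * 4
peaked = [0.85, 0.05, 0.05, 0.05]
# For any q > 0 the uniform distribution maximizes S_q.
print(tsallis_entropy(uniform, 2.0), tsallis_entropy(peaked, 2.0))
```

    A multiscale complexity curve is then obtained by applying the entropy to `coarse_grain(x, s)` for increasing scales s.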

  2. An EGR performance evaluation and decision-making approach based on grey theory and grey entropy analysis.

    PubMed

    Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan

    2018-01-01

    Exhaust gas recirculation (EGR) is one of the main methods of reducing NOx emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method, based on multi-objective grey situation decision theory, grey relation theory, and grey entropy analysis, to evaluate EGR performance and determine the optimal EGR rate, a task that currently lacks clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model from the main EGR parameters, transforming the optimal compromise between diesel engine combustion and emission performance into a decision-making target-weight problem. Then, considering the characteristics of EGR under different operating conditions, an optimized target-weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results illustrate its feasibility, providing theoretical support and a reference for further EGR optimization.

  3. Classification enhancement for post-stroke dementia using fuzzy neighborhood preserving analysis with QR-decomposition.

    PubMed

    Al-Qazzaz, Noor Kamal; Ali, Sawal; Ahmad, Siti Anom; Escudero, Javier

    2017-07-01

    The aim of the present study was to discriminate the electroencephalogram (EEG) of 5 patients with vascular dementia (VaD), 15 patients with stroke-related mild cognitive impairment (MCI), and 15 control normal subjects during a working memory (WM) task. We used independent component analysis (ICA) and wavelet transform (WT) as a hybrid preprocessing approach for EEG artifact removal. Three different features were extracted from the cleaned EEG signals: spectral entropy (SpecEn), permutation entropy (PerEn) and Tsallis entropy (TsEn). Two classification schemes were applied - support vector machine (SVM) and k-nearest neighbors (kNN) - with fuzzy neighborhood preserving analysis with QR-decomposition (FNPAQR) as a dimensionality reduction technique. The FNPAQR dimensionality reduction technique increased the SVM classification accuracy from 82.22% to 90.37% and from 82.6% to 86.67% for kNN. These results suggest that FNPAQR consistently improves the discrimination of VaD, MCI patients and control normal subjects and it could be a useful feature selection to help the identification of patients with VaD and MCI.

  4. Numerical study of entropy generation in MHD water-based carbon nanotubes along an inclined permeable surface

    NASA Astrophysics Data System (ADS)

    Soomro, Feroz Ahmed; Rizwan-ul-Haq; Khan, Z. H.; Zhang, Qiang

    2017-10-01

    The main theme of this article is entropy generation analysis for the magnetohydrodynamic mixed convection flow of water-functionalized carbon nanotubes along an inclined stretching surface. Thermophysical properties of both the nanoparticles and the working fluid are incorporated in the governing partial differential equations. The nonlinear system of equations is reduced via similarity transformations, and the solutions are further used to determine the volumetric entropy generation and the characteristic entropy generation. The governing boundary layer equations are solved numerically with a finite difference method. The effects of two types of carbon nanotubes, single-wall (SWCNTs) and multi-wall (MWCNTs), with water as the base fluid, are analyzed for the physical quantities of interest: surface skin friction, heat transfer rate, and entropy generation coefficients. Velocities, temperature, entropy generation, and isotherms are plotted against the emerging parameters: nanoparticle volume fraction 0 ≤ φ ≤ 0.2, thermal convective parameter 0 ≤ λ ≤ 5, Hartmann number 0 ≤ M ≤ 2, suction/injection parameter -1 ≤ S ≤ 1, and Eckert number 0 ≤ Ec ≤ 2. It is concluded that skin friction increases with the magnetic parameter, suction/injection, and nanoparticle volume fraction, whereas the Nusselt number increases with the suction parameter, mixed convection parameter, and nanoparticle volume fraction. Entropy generation shows the opposite behavior with respect to the Hartmann number and the mixed convection parameter for both single-wall and multi-wall carbon nanotubes.

  5. Novel sonar signal processing tool using Shannon entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quazi, A.H.

    1996-06-01

    Traditionally, conventional signal processing extracts information from sonar signals using amplitude, signal energy, or frequency-domain quantities obtained with spectral analysis techniques. The object is to investigate an entirely different approach: using the Shannon entropy as a tool for processing sonar signals, with emphasis on detection, classification, and localization, leading to superior sonar system performance. Traditionally, sonar signals are processed coherently, semi-coherently, or incoherently, depending on the a priori knowledge of the signals and noise. Here, the detection, classification, and localization technique is based on the entropy of the random process. Under a constant energy constraint, the entropy of a received process with a finite number of sample points is maximum when hypothesis H0 (the received process consists of noise alone) is true, and decreases when a correlated signal is present (H1). The detection strategy is therefore: (I) calculate the entropy of the received data; (II) compare the entropy with the maximum value; and (III) decide: H1 is assumed if the difference is large compared with a pre-assigned threshold, and H0 otherwise. The test statistics differ between the entropies under H0 and H1. We show simulated results for detecting stationary and non-stationary signals in noise, and results on detecting defects in a Plexiglas bar using an ultrasonic experiment conducted by Hughes. © 1996 American Institute of Physics.
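    The three-step strategy can be sketched with a histogram-based Shannon entropy. The bin count, threshold, and test signals below are illustrative assumptions, not the paper's experimental setup:

```python
import math
import random

def shannon_entropy_hist(x, bins=32):
    """Shannon entropy (nats) of a histogram of the samples."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in x:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    n = len(x)
    return -sum(c / n * math.log(c / n) for c in counts if c)

def signal_present(x, bins=32, rel_threshold=0.03):
    """(I) entropy of the received data, (II) compare with the maximum
    log(bins), (III) declare H1 if the deficit exceeds the threshold."""
    deficit = math.log(bins) - shannon_entropy_hist(x, bins)
    return deficit > rel_threshold * math.log(bins)

random.seed(1)
noise = [random.uniform(-1.0, 1.0) for _ in range(5000)]   # H0: noise alone
tone = [math.sin(0.05 * i) for i in range(5000)]           # H1: correlated signal
print(signal_present(noise), signal_present(tone))
```

    Uniform noise fills the histogram nearly evenly (entropy near the maximum), while a correlated tone concentrates its amplitude distribution and lowers the entropy, triggering H1.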

  6. EEG entropy measures indicate decrease of cortical information processing in Disorders of Consciousness.

    PubMed

    Thul, Alexander; Lechinger, Julia; Donis, Johann; Michitsch, Gabriele; Pichler, Gerald; Kochs, Eberhard F; Jordan, Denis; Ilg, Rüdiger; Schabus, Manuel

    2016-02-01

    Clinical assessments that rely on behavioral responses to differentiate Disorders of Consciousness are at times inapt because of some patients' motor disabilities. To objectify patients' conditions of reduced consciousness, the present study evaluated the use of electroencephalography to measure residual brain activity. We analyzed entropy values of 18 scalp EEG channels of 15 severely brain-damaged patients with clinically diagnosed Minimally-Conscious-State (MCS) or Unresponsive-Wakefulness-Syndrome (UWS) and compared the results to a sample of 24 control subjects. Permutation entropy (PeEn) and symbolic transfer entropy (STEn), reflecting information processes in the EEG, were calculated for all subjects. Participants were tested on a modified active own-name paradigm to identify correlates of active instruction following. PeEn showed reduced local information content in the patients' EEG, most pronounced in UWS. STEn analysis revealed altered directed information flow in the patients' EEG, indicating impaired feed-backward connectivity. Responses to auditory stimulation yielded differences in entropy measures, indicating reduced information processing in MCS and UWS. Local EEG information content and information flow are thus affected in Disorders of Consciousness, suggesting local cortical information capacity and feedback information transfer as neural correlates of consciousness. The EEG entropy analyses used here were able to differentiate patient groups with different Disorders of Consciousness. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  7. Measuring Complexity and Predictability of Time Series with Flexible Multiscale Entropy for Sensor Networks

    PubMed Central

    Zhou, Renjie; Yang, Chen; Wan, Jian; Zhang, Wei; Guan, Bo; Xiong, Naixue

    2017-01-01

    Measurement of time series complexity and predictability is sometimes the cornerstone for proposing solutions to topology and congestion control problems in sensor networks. As a method of measuring time series complexity and predictability, multiscale entropy (MSE) has been widely applied in many fields. However, sample entropy, the fundamental component of MSE, scores the similarity of two subsequences of a time series as either zero or one, with no in-between values, which causes sudden changes in entropy values even when the time series changes only slightly. This problem becomes especially severe when the time series is short. To solve this problem, we propose flexible multiscale entropy (FMSE), which introduces a novel similarity function taking full-range values from zero to one, and thus increases the reliability and stability of measuring time series complexity. The proposed method is evaluated on both synthetic and real time series, including white noise, 1/f noise, and real vibration signals. The evaluation results demonstrate that FMSE significantly improves the reliability and stability of measuring the complexity of time series, especially short ones, compared with MSE and composite multiscale entropy (CMSE). FMSE is thus capable of improving the performance of time-series-based topology and traffic congestion control techniques. PMID:28383496

  8. Measuring Complexity and Predictability of Time Series with Flexible Multiscale Entropy for Sensor Networks.

    PubMed

    Zhou, Renjie; Yang, Chen; Wan, Jian; Zhang, Wei; Guan, Bo; Xiong, Naixue

    2017-04-06

    Measurement of time series complexity and predictability is sometimes the cornerstone for proposing solutions to topology and congestion control problems in sensor networks. As a method of measuring time series complexity and predictability, multiscale entropy (MSE) has been widely applied in many fields. However, sample entropy, the fundamental component of MSE, scores the similarity of two subsequences of a time series as either zero or one, with no in-between values, which causes sudden changes in entropy values even when the time series changes only slightly. This problem becomes especially severe when the time series is short. To solve this problem, we propose flexible multiscale entropy (FMSE), which introduces a novel similarity function taking full-range values from zero to one, and thus increases the reliability and stability of measuring time series complexity. The proposed method is evaluated on both synthetic and real time series, including white noise, 1/f noise, and real vibration signals. The evaluation results demonstrate that FMSE significantly improves the reliability and stability of measuring the complexity of time series, especially short ones, compared with MSE and composite multiscale entropy (CMSE). FMSE is thus capable of improving the performance of time-series-based topology and traffic congestion control techniques.
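    The key idea, replacing the zero/one match indicator of sample entropy with a full-range similarity, can be sketched as follows. The sigmoid similarity below is an assumed form for illustration, not necessarily the exact function proposed in the paper:

```python
import math
import random

def graded_similarity(d, r, k=5.0):
    """Smooth stand-in for the 0/1 indicator d <= r: ~1 for d << r, ~0 for d >> r."""
    return 1.0 / (1.0 + math.exp(k * (d - r) / r))

def fuzzy_sample_entropy(x, m=2, r=None, k=5.0):
    """Sample entropy with graded (FMSE-style) template matching."""
    if r is None:                              # default tolerance: 0.15 * std
        mean = sum(x) / len(x)
        r = 0.15 * math.sqrt(sum((v - mean) ** 2 for v in x) / len(x))
    def phi(length):
        templates = [x[i:i + length] for i in range(len(x) - length)]
        total, pairs = 0.0, 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                d = max(abs(a - b) for a, b in zip(templates[i], templates[j]))
                total += graded_similarity(d, r, k)
                pairs += 1
        return total / pairs
    return -math.log(phi(m + 1) / phi(m))

random.seed(2)
white = [random.gauss(0.0, 1.0) for _ in range(300)]
sine = [math.sin(0.1 * i) for i in range(300)]
# White noise is far less predictable than a smooth deterministic signal.
print(fuzzy_sample_entropy(white), fuzzy_sample_entropy(sine))
```

    Because the similarity varies continuously with the distance, a small perturbation of the series moves the entropy only slightly, which is the stability property FMSE targets.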

  9. Thermodynamics of extremal rotating thin shells in an extremal BTZ spacetime and the extremal black hole entropy

    NASA Astrophysics Data System (ADS)

    Lemos, José P. S.; Minamitsuji, Masato; Zaslavskii, Oleg B.

    2017-02-01

    In a (2+1)-dimensional spacetime with a negative cosmological constant, the thermodynamics and the entropy of an extremal rotating thin shell, i.e., an extremal rotating ring, are investigated. The outer and inner regions with respect to the shell are taken to be the Bañados-Teitelboim-Zanelli (BTZ) spacetime and the vacuum ground state anti-de Sitter spacetime, respectively. By applying the first law of thermodynamics to the extremal thin shell, one shows that the entropy of the shell is an arbitrary well-behaved function of the gravitational area A+ alone, S = S(A+). When the thin shell approaches its own gravitational radius r+ and turns into an extremal rotating BTZ black hole, it is found that the entropy of the spacetime remains such a function of A+, both when the local temperature of the shell at the gravitational radius is zero and when it is nonzero. This analysis thus vindicates that extremal black holes, here extremal BTZ black holes, have different properties from the corresponding nonextremal black holes, which have a definite entropy, the Bekenstein-Hawking entropy S(A+) = A+/(4G), where G is the gravitational constant. It is argued that for extremal black holes, in particular for extremal BTZ black holes, one should set 0 ≤ S(A+) ≤ A+/(4G); i.e., the extremal black hole entropy has values between zero and the maximum Bekenstein-Hawking entropy A+/(4G). Thus, rather than having just two entropies for extremal black holes, as previous results have debated, namely 0 and A+/(4G), it is shown here that extremal black holes, in particular extremal BTZ black holes, may have a continuous range of entropies, limited by precisely those two values. Surely, the entropy that a particular extremal black hole picks must depend on past processes, notably on how it was formed. A remarkable relation between the third law of thermodynamics and the impossibility for a massive body to reach the velocity of light is also found. In addition, in the procedure it becomes clear that there are two distinct angular velocities for the shell, the mechanical and thermodynamic angular velocities, and we comment on the relationship between them. In passing, we clarify, for a static spacetime with a thermal shell, the meaning of the Tolman temperature formula at a generic radius and at the shell.

  10. Conformational Entropy of Intrinsically Disordered Proteins from Amino Acid Triads

    PubMed Central

    Baruah, Anupaul; Rani, Pooja; Biswas, Parbati

    2015-01-01

    This work quantitatively characterizes intrinsic disorder in proteins in terms of sequence composition and backbone conformational entropy. Analysis of the normalized relative composition of the amino acid triads highlights a distinct boundary between globular and disordered proteins. The conformational entropy is calculated from the dihedral angles of the middle amino acid in the amino acid triad for the conformational ensemble of the globular, partially and completely disordered proteins relative to the non-redundant database. Both Monte Carlo (MC) and Molecular Dynamics (MD) simulations are used to characterize the conformational ensemble of the representative proteins of each group. The results show that the globular proteins span approximately half of the allowed conformational states in the Ramachandran space, while the amino acid triads in disordered proteins sample the entire range of the allowed dihedral angle space following Flory’s isolated-pair hypothesis. Therefore, only the sequence information in terms of the relative amino acid triad composition may be sufficient to predict protein disorder and the backbone conformational entropy, even in the absence of well-defined structure. The predicted entropies are found to agree with those calculated using mutual information expansion and the histogram method. PMID:26138206

  11. The coupling analysis between stock market indices based on permutation measures

    NASA Astrophysics Data System (ADS)

    Shi, Wenbin; Shang, Pengjian; Xia, Jianan; Yeh, Chien-Hung

    2016-04-01

    Many information-theoretic methods have been proposed for analyzing the coupling dependence between time series, and it is important to quantify the correlation between financial sequences, since the financial market is a complex, evolving dynamic system. Recently, we developed a new permutation-based entropy, called cross-permutation entropy (CPE), to detect coupling structures between two synchronous time series. In this paper, we extend the CPE method to weighted cross-permutation entropy (WCPE) to address some of CPE's limitations, mainly its inability to differentiate between distinct patterns of a certain motif and its sensitivity to patterns close to the noise floor. WCPE shows more stable and reliable results than CPE when applied to spiky data and AR(1) processes. In addition, we adapt the CPE method to infer the complexity of short time series by freely changing the time delay, and test it with Gaussian random series and random walks; the modified method reduces deviations in the entropy estimation compared with the conventional one. Finally, the weighted cross-permutation entropy of eight important stock indices from the world financial markets is investigated, and some useful and interesting empirical results are obtained.

  12. The High Temperature Tensile and Creep Behaviors of High Entropy Superalloy.

    PubMed

    Tsao, Te-Kang; Yeh, An-Chou; Kuo, Chen-Ming; Kakehi, Koji; Murakami, Hideyuki; Yeh, Jien-Wei; Jian, Sheng-Rui

    2017-10-04

    This article presents the high-temperature tensile and creep behaviors of a novel high entropy alloy (HEA). The microstructure of this HEA resembles that of advanced superalloys, with a high entropy FCC matrix and L12-ordered precipitates, so it is also named "high entropy superalloy" (HESA). The tensile yield strengths of HESA surpass those of previously reported HEAs from room temperature to elevated temperatures; furthermore, its creep resistance at 982 °C is comparable to that of some Ni-based superalloys. Analysis of the experimental results indicates that HESA is strengthened by the low stacking-fault energy of the matrix, the high anti-phase boundary energy of the strengthening precipitate, and a thermally stable microstructure. Positive misfit between the FCC matrix and the precipitate yields a parallel raft microstructure during creep at 982 °C, and the creep curves of HESA are dominated by tertiary creep behavior. To the best of the authors' knowledge, this article is the first to present an elevated-temperature tensile creep study on full-scale specimens of a high entropy alloy, and the potential of HESA for high-temperature structural applications is discussed.

  13. Analysis of crude oil markets with improved multiscale weighted permutation entropy

    NASA Astrophysics Data System (ADS)

    Niu, Hongli; Wang, Jun; Liu, Cheng

    2018-03-01

    Entropy measures have recently been used extensively to study complexity in nonlinear systems. Weighted permutation entropy (WPE) overcomes PE's neglect of the amplitude information of a time series and shows a distinctive ability to extract complexity information from data with abrupt changes in magnitude. The improved (sometimes called composite) multiscale (MS) method has the advantage of reducing errors and improving accuracy when evaluating multiscale entropy values of time series that are not long enough. In this paper, we combine the merits of WPE and the improved MS method to propose improved multiscale weighted permutation entropy (IMWPE) for investigating the complexity of a time series. The method is validated on artificial data, white noise and 1/f noise, and on real market data for Brent and Daqing crude oil. The complexity properties of the crude oil markets are then explored for the return series, for volatility series with multiple exponents, and for EEMD-produced intrinsic mode functions (IMFs), which represent different frequency components of the return series. Moreover, the instantaneous amplitude and frequency of Brent and Daqing crude oil are analyzed via the Hilbert transform applied to each IMF.
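    The amplitude-aware weighting that distinguishes WPE from plain PE can be sketched as follows. Weighting each ordinal pattern by the variance of its subsequence is a common choice, assumed here for illustration; the improved multiscale averaging step is omitted:

```python
import math
from collections import defaultdict

def weighted_permutation_entropy(x, order=3, delay=1, normalize=True):
    """Permutation entropy in which each ordinal pattern is weighted by the
    variance of its subsequence, so large-amplitude motifs count more."""
    weights = defaultdict(float)
    n = len(x) - (order - 1) * delay
    for i in range(n):
        window = [x[i + k * delay] for k in range(order)]
        mean = sum(window) / order
        w = sum((v - mean) ** 2 for v in window) / order   # subsequence variance
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        weights[pattern] += w
    total = sum(weights.values())
    probs = [w / total for w in weights.values() if w > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order)) if normalize else h

# Two spikes on an otherwise gentle ramp dominate the weighted statistics.
spiky = [0.001 * i for i in range(200)]
spiky[50] = spiky[150] = 10.0
print(weighted_permutation_entropy(spiky))
```

    On the plain ramp the entropy is zero (a single motif); the spikes introduce heavily weighted new motifs, which is exactly the amplitude information plain PE discards.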

  14. Probabilistic modelling of flood events using the entropy copula

    NASA Astrophysics Data System (ADS)

    Li, Fan; Zheng, Qian

    2016-11-01

    The estimation of flood frequency is vital for flood control strategies and hydraulic structure design. Generating synthetic flood events according to the statistical properties of observations is one plausible way to analyze flood frequency. Because of the statistical dependence among the flood event variables (i.e., the flood peak, volume, and duration), a multidimensional joint probability estimation is required. Recently, the copula method has been widely used to construct multivariable dependence structures; however, the copula family must be chosen before application, and the choice process is sometimes rather subjective. The entropy copula, a new copula family employed in this research, offers a way to avoid this relatively subjective step by combining the theories of copulas and entropy. The analysis shows the effectiveness of the entropy copula for probabilistic modelling of the flood events at two hydrological gauges, and a comparison of accuracy with the popular copulas was made. The Gibbs sampling technique was applied for trivariate flood event simulation in order to mitigate the computational difficulties of extending directly to three dimensions. The simulation results indicate that the entropy copula is a simple and effective copula family for trivariate flood simulation.

  15. Analysis of Age and Gender Structures for ICD-10 Diagnoses in Outpatient Treatment Using Shannon's Entropy.

    PubMed

    Schuster, Fabian; Ostermann, Thomas; Emcke, Timo; Schuster, Reinhard

    2017-01-01

    Diagnostic diversity has been the focus of several studies in health services research. As the fraction of people with statutory health insurance changes with age and gender, it is assumed that diagnostic diversity may be influenced by these parameters. We analyze fractions of patients in Schleswig-Holstein with respect to the chapters of the ICD-10 code in outpatient treatment for quarter 2/2016, by age and gender of the patient. In a first approach we analyzed which diagnosis chapters are most relevant as a function of age and gender. To quantify diagnostic diversity, we then applied Shannon's entropy measure, using different standardizations to account for multimorbidity. Shannon entropy increases strongly for women after the age of 15, reaching a plateau at about 50 years. Between 15 and 70 years the values are higher for women, and after 75 years for men. This article describes a straightforward, pragmatic approach to diagnostic diversity using Shannon's entropy. From a methodological point of view, the use of Shannon's entropy as a measure of diversity deserves more attention from researchers in health services research.
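    The diversity measure itself is a one-liner: the Shannon entropy of the distribution of diagnosis fractions over ICD-10 chapters, renormalized so that multimorbidity counts summing above one are tolerated. The chapter fractions below are made-up illustrations, not the study's data:

```python
import math

def shannon_diversity(fractions):
    """Shannon entropy (nats) of a distribution of diagnosis fractions;
    renormalization tolerates multimorbidity totals that do not sum to 1."""
    total = sum(fractions)
    probs = [f / total for f in fractions if f > 0]
    return -sum(p * math.log(p) for p in probs)

# Hypothetical chapter fractions for two strata:
young = [0.70, 0.15, 0.10, 0.05]          # concentrated on few chapters
older = [0.25, 0.25, 0.20, 0.15, 0.15]    # spread over more chapters
print(shannon_diversity(young) < shannon_diversity(older))
```

    The more evenly the diagnoses spread across chapters, the higher the entropy, which is what drives the observed rise with age.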

  16. Optimization of rainfall networks using information entropy and temporal variability analysis

    NASA Astrophysics Data System (ADS)

    Wang, Wenqi; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin

    2018-04-01

    Rainfall networks are the most direct sources of precipitation data, and their optimization and evaluation are essential. Information entropy can not only represent the uncertainty of the rainfall distribution but also reflect the correlation and information transmission between rainfall stations. Using entropy, this study optimizes rainfall networks of similar size located in two large Chinese cities, Shanghai (Yangtze River basin) and Xi'an (Yellow River basin), with respect to temporal variability. Through an easy-to-implement greedy ranking algorithm based on the Maximum Information Minimum Redundancy (MIMR) criterion, the stations of the networks in the two areas (each further divided into two subareas) are ranked over sliding inter-annual series and under different meteorological conditions. We find that observation series with different starting days affect the ranking, pointing to temporal variability during network evaluation. We therefore propose a dynamic network evaluation framework that accounts for this variability by ranking stations under different starting days with a fixed time window (1-year, 2-year, and 5-year). In this way we can identify rainfall stations that are temporarily important or redundant and provide useful suggestions for decision makers. The proposed framework can serve as a supplement to the primary MIMR optimization approach. In addition, the optimal networks from MIMR differ in entropy values between periods (wet or dry season), with the wet-season network tending to produce higher entropy values. Differences in the spatial distribution of the optimal networks suggest that optimizing the rainfall network for changing meteorological conditions is recommended.
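    A greedy entropy-based ranking of stations can be sketched on discretized rainfall-occurrence series. The objective below, maximizing the joint entropy of the selected set, is a simplified surrogate for the full MIMR information-minus-redundancy criterion, and the station data is synthetic:

```python
import math
import random
from collections import Counter

def joint_entropy(columns):
    """Shannon entropy (nats) of the joint distribution of discrete series."""
    n = len(columns[0])
    counts = Counter(tuple(col[t] for col in columns) for t in range(n))
    return -sum(c / n * math.log(c / n) for c in counts.values())

def greedy_rank(stations):
    """Greedy ranking in the spirit of MIMR: repeatedly add the station that
    maximizes the joint entropy of the selected set."""
    selected, remaining = [], set(stations)
    while remaining:
        best = max(remaining, key=lambda s: joint_entropy(
            [stations[s]] + [stations[t] for t in selected]))
        selected.append(best)
        remaining.remove(best)
    return selected

random.seed(3)
occur = [random.randint(0, 1) for _ in range(500)]         # rain occurrence
stations = {"A": occur, "B": list(occur),                  # B duplicates A
            "C": [random.randint(0, 1) for _ in range(500)]}
order = greedy_rank(stations)
# A duplicated station is pure redundancy: A and B never both rank
# ahead of the independent station C.
print(order)
```

    Re-running the ranking over sliding windows of the observation series then exposes the temporal variability that motivates the dynamic evaluation framework.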

  17. Shannon and Renyi Entropies to Classify Effects of Mild Traumatic Brain Injury on Postural Sway

    PubMed Central

    Gao, Jianbo; Hu, Jing; Buckley, Thomas; White, Keith; Hass, Chris

    2011-01-01

    Background Mild Traumatic Brain Injury (mTBI) has been identified as a major public and military health concern both in the United States and worldwide. Characterizing the effects of mTBI on postural sway could be an important tool for assessing recovery from the injury. Methodology/Principal Findings We assess postural sway by motion of the center of pressure (COP). Methods for data reduction include calculation of area of COP and fractal analysis of COP motion time courses. We found that fractal scaling appears applicable to sway power above about 0.5 Hz, thus fractal characterization is only quantifying the secondary effects (a small fraction of total power) in the sway time series, and is not effective in quantifying long-term effects of mTBI on postural sway. We also found that the area of COP sensitively depends on the length of data series over which the COP is obtained. These weaknesses motivated us to use instead Shannon and Renyi entropies to assess postural instability following mTBI. These entropy measures have a number of appealing properties, including capacity for determination of the optimal length of the time series for analysis and a new interpretation of the area of COP. Conclusions Entropy analysis can readily detect postural instability in athletes at least 10 days post-concussion so that it appears promising as a sensitive measure of effects of mTBI on postural sway. Availability The programs for analyses may be obtained from the authors. PMID:21931720

  18. Shannon and Renyi entropies to classify effects of Mild Traumatic Brain Injury on postural sway.

    PubMed

    Gao, Jianbo; Hu, Jing; Buckley, Thomas; White, Keith; Hass, Chris

    2011-01-01

    Mild Traumatic Brain Injury (mTBI) has been identified as a major public and military health concern both in the United States and worldwide. Characterizing the effects of mTBI on postural sway could be an important tool for assessing recovery from the injury. We assess postural sway by motion of the center of pressure (COP). Methods for data reduction include calculation of area of COP and fractal analysis of COP motion time courses. We found that fractal scaling appears applicable to sway power above about 0.5 Hz, thus fractal characterization is only quantifying the secondary effects (a small fraction of total power) in the sway time series, and is not effective in quantifying long-term effects of mTBI on postural sway. We also found that the area of COP sensitively depends on the length of data series over which the COP is obtained. These weaknesses motivated us to use instead Shannon and Renyi entropies to assess postural instability following mTBI. These entropy measures have a number of appealing properties, including capacity for determination of the optimal length of the time series for analysis and a new interpretation of the area of COP. Entropy analysis can readily detect postural instability in athletes at least 10 days post-concussion so that it appears promising as a sensitive measure of effects of mTBI on postural sway. The programs for analyses may be obtained from the authors.

  19. Estimation of the magnetic entropy change by means of Landau theory and phenomenological model in La0.6Ca0.2Sr0.2MnO3/Sb2O3 ceramic composites

    NASA Astrophysics Data System (ADS)

    Nasri, M.; Dhahri, E.; Hlil, E. K.

    2018-06-01

    In this paper, the magnetocaloric properties of La0.6Ca0.2Sr0.2MnO3/Sb2O3 oxides have been investigated. The composite samples were prepared using the conventional solid-state reaction method. The second-order nature of the phase transition is confirmed by the positive slope of the Arrott plots. An excellent agreement has been found between the -ΔSM values estimated by Landau theory and those obtained using the classical Maxwell relation. Analysis of the field dependence of the magnetic entropy change shows a power-law dependence, |ΔSM| ≈ H^n, with n(TC) = 0.65. Moreover, the scaling analysis of the magnetic entropy change shows that the ΔSM(T) curves collapse onto a single universal curve, indicating that the observed paramagnetic-to-ferromagnetic phase transition is an authentic second-order phase transition. The maximum value of the magnetic entropy change of the composites is found to decrease slightly with increasing Sb2O3 concentration. A phenomenological model was used to predict the magnetocaloric properties of the La0.6Ca0.2Sr0.2MnO3/Sb2O3 composites. The theoretical calculations are compared with the available experimental data.

  20. Dynamical noise filter and conditional entropy analysis in chaos synchronization.

    PubMed

    Wang, Jiao; Lai, C-H

    2006-06-01

    It is shown that, in a chaotic synchronization system whose driving signal is exposed to channel noise, the estimation of the drive system states can be greatly improved by applying the dynamical noise filtering to the response system states. If the noise is bounded in a certain range, the estimation errors, i.e., the difference between the filtered responding states and the driving states, can be made arbitrarily small. This property can be used in designing an alternative digital communication scheme. An analysis based on the conditional entropy justifies the application of dynamical noise filtering in generating quality synchronization.

  1. Thermodynamic laws in isolated systems.

    PubMed

    Hilbert, Stefan; Hänggi, Peter; Dunkel, Jörn

    2014-12-01

    The recent experimental realization of exotic matter states in isolated quantum systems and the ensuing controversy about the existence of negative absolute temperatures demand a careful analysis of the conceptual foundations underlying microcanonical thermostatistics. Here we provide a detailed comparison of the most commonly considered microcanonical entropy definitions, focusing specifically on whether they satisfy or violate the zeroth, first, and second laws of thermodynamics. Our analysis shows that, for a broad class of systems that includes all standard classical Hamiltonian systems, only the Gibbs volume entropy fulfills all three laws simultaneously. To avoid ambiguities, the discussion is restricted to exact results and analytically tractable examples.
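
For reference, the two entropy definitions at the center of this controversy are usually written as follows (standard statistical-mechanics notation, assumed here rather than copied from the paper):

```latex
S_{\mathrm{B}}(E) = k_{\mathrm{B}} \ln\!\left[\epsilon\,\omega(E)\right],
\qquad
S_{\mathrm{G}}(E) = k_{\mathrm{B}} \ln \Omega(E),
\qquad
\Omega(E) = \operatorname{Tr}\!\left[\Theta(E - H)\right],\quad
\omega(E) = \frac{\partial \Omega}{\partial E},
```

where ε is an arbitrary small energy width. The Gibbs temperature T_G = (∂S_G/∂E)^{-1} = Ω/(k_B ω) is manifestly non-negative, which is why the Gibbs volume entropy excludes negative absolute temperatures while the Boltzmann (surface) entropy does not.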

  2. EEG analysis using wavelet-based information tools.

    PubMed

    Rosso, O A; Martin, M T; Figliola, A; Keller, K; Plastino, A

    2006-06-15

    Wavelet-based informational tools for quantitative electroencephalogram (EEG) record analysis are reviewed. Relative wavelet energies, wavelet entropies and wavelet statistical complexities are used in the characterization of scalp EEG records corresponding to secondary generalized tonic-clonic epileptic seizures. In particular, we show that the epileptic recruitment rhythm observed during seizure development is well described in terms of the relative wavelet energies. In addition, during the concomitant time-period the entropy diminishes while complexity grows. This is construed as evidence supporting the conjecture that an epileptic focus, for this kind of seizures, triggers a self-organized brain state characterized by both order and maximal complexity.
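
The quantities reviewed here rest on standard definitions (notation assumed): with C_j(k) the wavelet coefficients at resolution level j,

```latex
E_j = \sum_k \lvert C_j(k) \rvert^2,
\qquad
p_j = \frac{E_j}{\sum_{j'} E_{j'}},
\qquad
S_{WT} = -\sum_j p_j \ln p_j ,
```

where the p_j are the relative wavelet energies and S_WT is the total wavelet entropy; wavelet statistical complexity measures combine S_WT with a disequilibrium factor measuring distance from the uniform distribution.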

  3. Assessing model uncertainty using hexavalent chromium and ...

    EPA Pesticide Factsheets

    Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations with hexavalent chromium [Cr(VI)] and lung cancer mortality as an example. The objective of this analysis is to characterize model uncertainty by evaluating the variance in estimates across several epidemiologic analyses. Methods: This analysis compared 7 publications analyzing two different chromate production sites in Ohio and Maryland. The Ohio cohort consisted of 482 workers employed from 1940-72, while the Maryland site employed 2,357 workers from 1950-74. Cox and Poisson models were the only model forms considered by study authors to assess the effect of Cr(VI) on lung cancer mortality. All models adjusted for smoking and included a 5-year exposure lag; however, other latency periods and model covariates, such as age and race, were also considered. Published effect estimates were standardized to the same units and normalized by their variances to produce a standardized metric to compare variability in estimates across and within model forms. A total of 7 similarly parameterized analyses were considered across model forms, and 23 analyses with alternative parameterizations were considered within model form (14 Cox; 9 Poisson). Results: Across Cox and Poisson model forms, adjusted cumulative exposure coefficients for 7 similar analyses ranged from 2.47

  4. Fracture behaviour of fcc metals: brittle/ductile behaviour criteria with ab-initio, embedded atom and pseudopotential parameterization for Au, Ir and Al.

    NASA Astrophysics Data System (ADS)

    Gornostyrev, Yu. N.; Katsnelson, M. I.; Mryasov, Oleg N.; Freeman, A. J.; Trefilov, M. V.

    1998-03-01

    Theoretical analysis of the fracture behaviour of fcc Au, Ir and Al has been performed within various brittle/ductile criteria (BDC) with ab-initio, embedded atom (EAM), and pseudopotential parameterizations. We systematically examined several important aspects of the fracture behaviour: (i) dislocation structure, (ii) energetics of the cleavage decohesion and (iii) character of the interatomic interactions. Unit dislocation structures were analyzed within a two-dimensional generalization of the Peierls-Nabarro model, with restoring forces determined from ab-initio total energy calculations, and were found to be split with well-defined, highly mobile partials for all the metals considered. We find from ab-initio and pseudopotential calculations that, in contrast with most fcc metals, the cleavage decohesion curve for Al differs appreciably from the UBER relation. Finally, using ab-initio, EAM and pseudopotential parameterizations, we demonstrate that (i) Au (as a typical example of a ductile metal) is well described within existing BDCs, (ii) the anomalous cleavage-like crack propagation of Ir is driven predominantly by its high elastic modulus and (iii) Al is not described within the BDC due to its long-range interatomic interactions (and hence requires adjustments of the brittle/ductile criteria).

  5. On quantum Rényi entropies: A new generalization and some properties

    NASA Astrophysics Data System (ADS)

    Müller-Lennert, Martin; Dupuis, Frédéric; Szehr, Oleg; Fehr, Serge; Tomamichel, Marco

    2013-12-01

    The Rényi entropies constitute a family of information measures that generalizes the well-known Shannon entropy, inheriting many of its properties. They appear in the form of unconditional and conditional entropies, relative entropies, or mutual information, and have found many applications in information theory and beyond. Various generalizations of Rényi entropies to the quantum setting have been proposed, most prominently Petz's quasi-entropies and Renner's conditional min-, max-, and collision entropy. However, these quantum extensions are incompatible and thus unsatisfactory. We propose a new quantum generalization of the family of Rényi entropies that contains the von Neumann entropy, min-entropy, collision entropy, and the max-entropy as special cases, thus encompassing most quantum entropies in use today. We show several natural properties for this definition, including data-processing inequalities, a duality relation, and an entropic uncertainty relation.
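
For orientation, the special cases the new family is required to reproduce can be read off from the order parameter alone. For the entropy of a single density operator ρ (where the sandwiched and Petz definitions coincide; standard definitions, assumed here):

```latex
S_\alpha(\rho) = \frac{1}{1-\alpha}\,\log \operatorname{Tr}\rho^{\alpha},
\qquad
\lim_{\alpha \to 1} S_\alpha(\rho) = -\operatorname{Tr}\rho\log\rho,
\quad
S_\infty(\rho) = -\log \lVert \rho \rVert_\infty,
\quad
S_2(\rho) = -\log \operatorname{Tr}\rho^2,
\quad
S_{1/2}(\rho) = 2 \log \operatorname{Tr}\sqrt{\rho},
```

i.e. the von Neumann, min-, collision and max-entropies, respectively; the paper's generalization concerns the conditional and relative versions, where the different quantum extensions genuinely disagree.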

  6. Use of Multiscale Entropy to Facilitate Artifact Detection in Electroencephalographic Signals

    PubMed Central

    Mariani, Sara; Borges, Ana F. T.; Henriques, Teresa; Goldberger, Ary L.; Costa, Madalena D.

    2016-01-01

    Electroencephalographic (EEG) signals present a myriad of challenges to analysis, beginning with the detection of artifacts. Prior approaches to noise detection have utilized multiple techniques, including visual methods, independent component analysis and wavelets. However, no single method is broadly accepted, inviting alternative ways to address this problem. Here, we introduce a novel approach based on a statistical physics method, multiscale entropy (MSE) analysis, which quantifies the complexity of a signal. We postulate that noise corrupted EEG signals have lower information content, and, therefore, reduced complexity compared with their noise free counterparts. We test the new method on an open-access database of EEG signals with and without added artifacts due to electrode motion. PMID:26738116
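
A sketch of the MSE pipeline this approach builds on: coarse-grain the series at increasing scales, then compute sample entropy at each scale. The tolerance convention (r fixed from the scale-1 series) and the white-noise surrogate are assumptions of this sketch, not the authors' exact settings:

```python
import math
import random

def coarse_grain(x, scale):
    """Non-overlapping window averages: the standard MSE coarse-graining step."""
    n = len(x) // scale
    return [sum(x[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(x, m, r):
    """SampEn(m, r) = -ln(A/B), where B and A count template pairs whose
    Chebyshev distance is <= r at lengths m and m+1, respectively."""
    def pair_count(length):
        t = [x[i:i + length] for i in range(len(x) - length + 1)]
        return sum(
            1
            for i in range(len(t))
            for j in range(i + 1, len(t))
            if max(abs(a - b) for a, b in zip(t[i], t[j])) <= r
        )
    b, a = pair_count(m), pair_count(m + 1)
    return -math.log(a / b) if a and b else float("inf")

# MSE curve of white noise: the tolerance r is fixed from the scale-1 series,
# so entropy falls with scale (averaging removes "new" information).
random.seed(1)
x = [random.gauss(0, 1) for _ in range(500)]
sd = math.sqrt(sum(v * v for v in x) / len(x))
mse = {s: sample_entropy(coarse_grain(x, s), 2, 0.2 * sd) for s in (1, 2, 4)}
```

The decreasing MSE curve of uncorrelated noise is exactly what makes the measure useful for artifact detection: a corrupted, less complex segment separates from clean EEG in how its entropy evolves across scales.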

  7. Interim heterogeneity changes measured using entropy texture features on T2-weighted MRI at 3.0 T are associated with pathological response to neoadjuvant chemotherapy in primary breast cancer.

    PubMed

    Henderson, Shelley; Purdie, Colin; Michie, Caroline; Evans, Andrew; Lerski, Richard; Johnston, Marilyn; Vinnicombe, Sarah; Thompson, Alastair M

    2017-11-01

    To investigate whether interim changes in heterogeneity (measured using entropy features) on MRI were associated with pathological residual cancer burden (RCB) at final surgery in patients receiving neoadjuvant chemotherapy (NAC) for primary breast cancer. This was a retrospective study of 88 consenting women (age: 30-79 years). Scanning was performed on a 3.0 T MRI scanner prior to NAC (baseline) and after 2-3 cycles of treatment (interim). Entropy was derived from the grey-level co-occurrence matrix, on slice-matched baseline/interim T2-weighted images. Response, assessed using the RCB score on surgically resected specimens, was compared statistically with entropy/heterogeneity changes, and ROC analysis was performed. The association with pCR within each tumour immunophenotype was evaluated. Mean entropy percent differences between examinations, by response category, were: pCR: 32.8%, RCB-I: 10.5%, RCB-II: 9.7% and RCB-III: 3.0%. Association of ultimate pCR with coarse entropy changes between baseline/interim MRI across all lesions yielded 85.2% accuracy (area under ROC curve: 0.845). Excellent sensitivity/specificity was obtained for pCR prediction within each immunophenotype: ER+: 100%/100%; HER2+: 83.3%/95.7%, TNBC: 87.5%/80.0%. Lesion T2 heterogeneity changes are associated with response to NAC using RCB scores, particularly for pCR, and can be useful across all immunophenotypes with good diagnostic accuracy. • Texture analysis provides a means of measuring lesion heterogeneity on MRI images. • Heterogeneity changes between baseline/interim MRI can be linked with ultimate pathological response. • Heterogeneity changes give good diagnostic accuracy of pCR response across all immunophenotypes. • Percentage reduction in heterogeneity is associated with pCR with good accuracy and NPV.
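
Entropy derived from the grey-level co-occurrence matrix can be sketched as follows (the toy 4 × 4 patches, 4 grey levels, and single horizontal offset are illustrative choices for this sketch, not the study's acquisition or quantization settings):

```python
import math

def glcm(img, dx=1, dy=0, levels=4):
    """Symmetric grey-level co-occurrence matrix for one pixel offset,
    normalized to probabilities."""
    m = [[0.0] * levels for _ in range(levels)]
    rows, cols = len(img), len(img[0])
    for y in range(rows):
        for x in range(cols):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < rows and 0 <= x2 < cols:
                a, b = img[y][x], img[y2][x2]
                m[a][b] += 1
                m[b][a] += 1  # symmetric accumulation
    total = sum(sum(row) for row in m)
    return [[v / total for v in row] for row in m]

def glcm_entropy(p):
    """Texture entropy -sum p_ij * log2 p_ij: higher = more heterogeneous."""
    return -sum(v * math.log2(v) for row in p for v in row if v > 0)

uniform = [[0, 0, 0, 0] for _ in range(4)]                 # homogeneous patch
checker = [[(x + y) % 2 * 3 for x in range(4)] for y in range(4)]  # high-contrast
e_uniform = glcm_entropy(glcm(uniform))
e_checker = glcm_entropy(glcm(checker))
```

A homogeneous patch concentrates all co-occurrence mass in one cell (entropy 0), while a heterogeneous one spreads it out; the study tracks how this spread changes between baseline and interim scans.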

  8. Probing the History of Galaxy Clusters with Metallicity and Entropy Measurements

    NASA Astrophysics Data System (ADS)

    Elkholy, Tamer Yohanna

    Galaxy clusters are the largest gravitationally bound objects found today in our Universe. The gas they contain, the intra-cluster medium (ICM), is heated to temperatures in the approximate range of 1 to 10 keV, and thus emits X-ray radiation. Studying the ICM through the spatial and spectral analysis of its emission returns the richest information about both the overall cosmological context that governs the formation of clusters and the physical processes occurring within. The aim of this thesis is to learn about the history of the physical processes that drive the evolution of galaxy clusters, through careful, spatially resolved measurements of their metallicity and entropy content. A sample of 45 nearby clusters observed with Chandra is analyzed to produce radial density, temperature, entropy and metallicity profiles. The entropy profiles are computed to larger radial extents than in previous Chandra analyses. The results of this analysis are made available to the scientific community in an electronic database. Comparing metallicity and entropy in the outskirts of clusters, we find no signature on the entropy profiles of the ensemble of supernovae that produced the observed metals. In the centers of clusters, we find that the metallicities of high-mass clusters are much less dispersed than those of low-mass clusters. A comparison of metallicity with the regularity of the X-ray emission morphology suggests that metallicities in low-mass clusters are more susceptible to increase from violent events such as mergers. We also find that the variation in the stellar-to-gas mass ratio as a function of cluster mass can explain the variation of central metallicity with cluster mass, only if we assume that there is a constant level of metallicity for clusters of all masses, above which the observed galaxies add more metals in proportion to their mass. (Copies available exclusively from MIT Libraries, libraries.mit.edu/docs)

  9. Quantification of knee vibroarthrographic signal irregularity associated with patellofemoral joint cartilage pathology based on entropy and envelope amplitude measures.

    PubMed

    Wu, Yunfeng; Chen, Pinnan; Luo, Xin; Huang, Hui; Liao, Lifang; Yao, Yuchen; Wu, Meihong; Rangayyan, Rangaraj M

    2016-07-01

    Injury of knee joint cartilage may result in pathological vibrations between the articular surfaces during extension and flexion motions. The aim of this paper is to analyze and quantify vibroarthrographic (VAG) signal irregularity associated with articular cartilage degeneration and injury in the patellofemoral joint. The symbolic entropy (SyEn), approximate entropy (ApEn), fuzzy entropy (FuzzyEn), and the mean, standard deviation, and root-mean-squared (RMS) values of the envelope amplitude, were utilized to quantify the signal fluctuations associated with articular cartilage pathology of the patellofemoral joint. The quadratic discriminant analysis (QDA), generalized logistic regression analysis (GLRA), and support vector machine (SVM) methods were used to perform signal pattern classifications. The experimental results showed that the patients with cartilage pathology (CP) possess larger SyEn and ApEn, but smaller FuzzyEn, over the statistical significance level of the Wilcoxon rank-sum test (p<0.01), than the healthy subjects (HS). The mean, standard deviation, and RMS values computed from the amplitude difference between the upper and lower signal envelopes are also consistently and significantly larger (p<0.01) for the group of CP patients than for the HS group. The SVM based on the entropy and envelope amplitude features can provide superior classification performance as compared with QDA and GLRA, with an overall accuracy of 0.8356, sensitivity of 0.9444, specificity of 0.8, Matthews correlation coefficient of 0.6599, and an area of 0.9212 under the receiver operating characteristic curve. The SyEn, ApEn, and FuzzyEn features can provide useful information about pathological VAG signal irregularity based on different entropy metrics. The statistical parameters of signal envelope amplitude can be used to characterize the temporal fluctuations related to the cartilage pathology. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. Gait alterations in the UAE population with and without diabetic complications using both traditional and entropy measures.

    PubMed

    Khalaf, Kinda; Al-Angari, Haitham M; Khandoker, Ahsan H; Lee, Sungmun; Almahmeed, Wael; Al Safar, Habiba S; Jelinek, Herbert F

    2017-10-01

    Diabetic foot, one of the most common and debilitating manifestations of type 2 diabetes mellitus (T2DM), is the leading cause of worldwide non-traumatic lower extremity amputations. Diabetics who are at risk of ulceration are currently mainly identified by a thorough clinical examination of the feet, which typically does not show clear symptoms during the early stages of disease progression. In this study, we used a non-linear dynamics tool, gait entropy (GaitEN), in addition to traditional linear gait analysis methods, to investigate gait alterations amongst diabetic patients with combinations of three types of T2DM-related complications: retinopathy, diabetic peripheral neuropathy (DPN) and nephropathy. Peak plantar pressure (PPP) was not significantly different in the group with DPN as compared to the control group (diabetics with no complications, CONT) in the forefoot region (DPN: mean±SD: 396±69.4kPa, CONT: 409±68.9kPa), although it was significantly lower in the heel region (DPN: mean±SD: 285±43.1kPa, CONT: 295±61.8kPa). On the other hand, gait entropy was significantly lower for the DPN group compared to the CONT group (DPN: 0.95±0.34, CONT: 1.03±0.28, p<0.05). This significantly lower entropy was maintained when neuropathy was combined with either retinopathy or nephropathy. For the group with all three complications (ALL-C), the entropy was higher than CONT (ALL-C: 1.07±0.26). This may indicate an intrinsic sensorimotor feedback mechanism by which DPN patients regulate their gait. However, this feedback gets weaker as patients develop multiple complications. Further analysis with longer walking times and different speeds is needed to verify the entropy results. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Parameter Estimation and Sensitivity Analysis of an Urban Surface Energy Balance Parameterization at a Tropical Suburban Site

    NASA Astrophysics Data System (ADS)

    Harshan, S.; Roth, M.; Velasco, E.

    2014-12-01

    Forecasting of urban weather and climate is of great importance as our cities become more populated, particularly considering the combined effects of global warming and local land-use changes, which make urban inhabitants more vulnerable to, e.g., heat waves and flash floods. In meso- and global-scale models, urban parameterization schemes are used to represent urban effects. However, these schemes require a large set of input parameters related to urban morphological and thermal properties. Obtaining all these parameters through direct measurements is usually not feasible. A number of studies have reported on parameter estimation and sensitivity analysis to adjust and determine the most influential parameters for land surface schemes in non-urban areas. Similar work for urban areas is scarce; in particular, studies on urban parameterization schemes in tropical cities have so far not been reported. In order to address the above issues, the town energy balance (TEB) urban parameterization scheme (part of the SURFEX land surface modeling system) was subjected to a sensitivity and optimization/parameter estimation experiment at a suburban site in tropical Singapore. The sensitivity analysis was carried out as a screening test to identify the most sensitive or influential parameters. Thereafter, an optimization/parameter estimation experiment was performed to calibrate the input parameters. The sensitivity experiment was based on the improved Sobol' global variance decomposition method. The analysis showed that parameters related to roads, roofs and soil moisture have a significant influence on the performance of the model. The optimization/parameter estimation experiment was performed using the AMALGAM (a multi-algorithm genetically adaptive multi-objective method) evolutionary algorithm. The experiment showed a remarkable improvement compared to simulations using the default parameter set. The calibrated parameters from this optimization experiment can be used in further model validation studies to identify inherent deficiencies in the model physics.

  12. Upper entropy axioms and lower entropy axioms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Jin-Li, E-mail: phd5816@163.com; Suo, Qi

    2015-04-15

    The paper suggests the concepts of an upper entropy and a lower entropy. We propose a new axiomatic definition, namely, upper entropy axioms, inspired by the axioms of metric spaces, and also formulate lower entropy axioms. We also develop weak upper entropy axioms and weak lower entropy axioms. Their conditions are weaker than those of the Shannon–Khinchin axioms and the Tsallis axioms, while being stronger than those of the axiomatics based on the first three Shannon–Khinchin axioms. The subadditivity and strong subadditivity of entropy are obtained in the new axiomatics. Tsallis statistics is a special case satisfying our axioms. Moreover, different forms of information measures, such as the Shannon entropy, Daroczy entropy, Tsallis entropy and other entropies, can be unified under the same axiomatics.

  13. Entropy-conservative spatial discretization of the multidimensional quasi-gasdynamic system of equations

    NASA Astrophysics Data System (ADS)

    Zlotnik, A. A.

    2017-04-01

    The multidimensional quasi-gasdynamic system written in the form of mass, momentum, and total energy balance equations for a perfect polytropic gas with allowance for a body force and a heat source is considered. A new conservative symmetric spatial discretization of these equations on a nonuniform rectangular grid is constructed (with the basic unknown functions—density, velocity, and temperature—defined on a common grid and with fluxes and viscous stresses defined on staggered grids). Primary attention is given to the analysis of entropy behavior: the discretization is specially constructed so that the total entropy does not decrease. This is achieved via a substantial revision of the standard discretization and applying numerous original features. A simplification of the constructed discretization serves as a conservative discretization with nondecreasing total entropy for the simpler quasi-hydrodynamic system of equations. In the absence of regularizing terms, the results also hold for the Navier-Stokes equations of a viscous compressible heat-conducting gas.

  14. Multiscale Shannon's Entropy Modeling of Orientation and Distance in Steel Fiber Micro-Tomography Data.

    PubMed

    Chiverton, John P; Ige, Olubisi; Barnett, Stephanie J; Parry, Tony

    2017-11-01

    This paper is concerned with the modeling and analysis of the orientation of, and distance between, steel fibers in X-ray micro-tomography data. The advantage of combining both orientation and separation in a model is that it helps provide a detailed understanding of how the steel fibers are arranged, in a form that is easy to compare. The developed models are designed to summarize the randomness of the orientation distribution of the steel fibers both locally and across an entire volume based on multiscale entropy. Theoretical modeling, simulation, and application to real imaging data are shown here. The theoretical modeling of multiscale entropy for orientation includes a proof showing the final form of the multiscale entropy taken over a linear range of scales. A series of image processing operations are also included to overcome interslice connectivity issues and help derive the statistical descriptions of the orientation distributions of the steel fibers. The results demonstrate that multiscale entropy provides unique insights into both simulated and real imaging data of steel fiber reinforced concrete.

  15. EEG based topography analysis in string recognition task

    NASA Astrophysics Data System (ADS)

    Ma, Xiaofei; Huang, Xiaolin; Shen, Yuxiaotong; Qin, Zike; Ge, Yun; Chen, Ying; Ning, Xinbao

    2017-03-01

    Vision perception and recognition is a complex process, during which different parts of brain are involved depending on the specific modality of the vision target, e.g. face, character, or word. In this study, brain activities in string recognition task compared with idle control state are analyzed through topographies based on multiple measurements, i.e. sample entropy, symbolic sample entropy and normalized rhythm power, extracted from simultaneously collected scalp EEG. Our analyses show that, for most subjects, both symbolic sample entropy and normalized gamma power in string recognition task are significantly higher than those in idle state, especially at locations of P4, O2, T6 and C4. It implies that these regions are highly involved in string recognition task. Since symbolic sample entropy measures complexity, from the perspective of new information generation, and normalized rhythm power reveals the power distributions in frequency domain, complementary information about the underlying dynamics can be provided through the two types of indices.

  16. Rényi entropy measure of noise-aided information transmission in a binary channel.

    PubMed

    Chapeau-Blondeau, François; Rousseau, David; Delahaies, Agnès

    2010-05-01

    This paper analyzes a binary channel by means of information measures based on the Rényi entropy. The analysis extends, and contains as a special case, the classic reference model of binary information transmission based on the Shannon entropy measure. The extended model is used to investigate further possibilities and properties of stochastic resonance or noise-aided information transmission. The results demonstrate that stochastic resonance occurs in the information channel and is registered by the Rényi entropy measures at any finite order, including the Shannon order. Furthermore, in definite conditions, when seeking the Rényi information measures that best exploit stochastic resonance, then nontrivial orders differing from the Shannon case usually emerge. In this way, through binary information transmission, stochastic resonance identifies optimal Rényi measures of information differing from the classic Shannon measure. A confrontation of the quantitative information measures with visual perception is also proposed in an experiment of noise-aided binary image transmission.
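
The stochastic-resonance effect described here can be reproduced in miniature with the Shannon special case (the α = 1 limit of the Rényi measures): a subthreshold binary signal plus Gaussian noise passed through a threshold detector, with mutual information peaking at a nonzero noise level. The threshold, signal amplitude, and noise grid below are arbitrary illustrative values, not the paper's settings:

```python
import math

def q(z):
    """Gaussian tail probability P(N(0,1) > z)."""
    return 0.5 * math.erfc(z / math.sqrt(2))

def mutual_information(theta, a, sigma):
    """I(X;Y) in bits for a threshold detector y = 1{x + noise > theta},
    equiprobable binary input x in {0, a}, Gaussian noise of std sigma."""
    p1_0 = q(theta / sigma)          # P(y=1 | x=0)
    p1_a = q((theta - a) / sigma)    # P(y=1 | x=a)
    p1 = 0.5 * (p1_0 + p1_a)         # marginal P(y=1)

    def h(p):  # binary Shannon entropy
        if p <= 0.0 or p >= 1.0:
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    return h(p1) - 0.5 * (h(p1_0) + h(p1_a))

theta, a = 1.0, 0.5                  # subthreshold signal: a < theta
sigmas = [0.05 * k for k in range(1, 61)]
mi = [mutual_information(theta, a, s) for s in sigmas]
best = sigmas[mi.index(max(mi))]     # noise level maximizing information
```

With no noise the subthreshold input never crosses the threshold and I(X;Y) = 0; with too much noise the output is independent of the input; in between, the transmitted information peaks, which is the resonance the paper then re-measures with Rényi entropies of nontrivial order.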

  17. Ectopic beats in approximate entropy and sample entropy-based HRV assessment

    NASA Astrophysics Data System (ADS)

    Singh, Butta; Singh, Dilbag; Jaryal, A. K.; Deepak, K. K.

    2012-05-01

    Approximate entropy (ApEn) and sample entropy (SampEn) are promising techniques for extracting the complex characteristics of cardiovascular variability. Ectopic beats, originating from a site other than the normal one, are artefacts that pose a serious limitation to heart rate variability (HRV) analysis. Approaches such as deletion and interpolation are currently used to eliminate the bias produced by ectopic beats. In this study, normal R-R interval time series of 10 healthy subjects and 10 acute myocardial infarction (AMI) patients were analysed after inserting artificial ectopic beats. The effects of editing ectopic beats by deletion, degree-zero interpolation and degree-one interpolation on ApEn and SampEn were then assessed. The addition of ectopic beats (even 2%) led to reduced complexity, resulting in decreased ApEn and SampEn for both healthy and AMI patient data. This reduction was found to depend on the level of ectopic beats. Editing ectopic beats by degree-one interpolation was found to be superior to the other methods.
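
The three editing strategies compared in the study can be sketched as simple list operations (the split fraction, the toy RR series, and the `insert_ectopic` helper are illustrative assumptions of this sketch, not the authors' procedure):

```python
def insert_ectopic(rr, i, frac=0.4):
    """Crude surrogate for a premature (ectopic) beat: the i-th RR interval
    splits into a short interval and the remainder (total time preserved)."""
    return rr[:i] + [rr[i] * frac, rr[i] * (1 - frac)] + rr[i + 1:]

def edit_deletion(rr, i):
    """Editing by deletion: drop the ectopic pair outright."""
    return rr[:i] + rr[i + 2:]

def edit_degree_zero(rr, i):
    """Degree-zero interpolation: replace the pair with two copies of its mean."""
    m = (rr[i] + rr[i + 1]) / 2
    return rr[:i] + [m, m] + rr[i + 2:]

def edit_degree_one(rr, i):
    """Degree-one (linear) interpolation between the neighbouring intervals."""
    lo, hi = rr[i - 1], rr[i + 2]
    step = (hi - lo) / 3
    return rr[:i] + [lo + step, lo + 2 * step] + rr[i + 2:]

rr = [0.80, 0.82, 0.81, 0.83, 0.82, 0.80]  # toy RR intervals (seconds)
corrupted = insert_ectopic(rr, 3)
```

The study's finding is that the entropy of the series recovered by degree-one interpolation tracks the entropy of the clean series most closely; the repeated values produced by deletion and degree-zero editing artificially alter the template-matching statistics underlying ApEn and SampEn.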

  18. Cardiorespiratory Information Dynamics during Mental Arithmetic and Sustained Attention

    PubMed Central

    Widjaja, Devy; Montalto, Alessandro; Vlemincx, Elke; Marinazzo, Daniele; Van Huffel, Sabine; Faes, Luca

    2015-01-01

    An analysis of cardiorespiratory dynamics during mental arithmetic, which induces stress, and sustained attention was conducted using information theory. The information storage and internal information of heart rate variability (HRV) were determined respectively as the self-entropy of the tachogram, and the self-entropy of the tachogram conditioned to the knowledge of respiration. The information transfer and cross information from respiration to HRV were assessed as the transfer and cross-entropy, both measures of cardiorespiratory coupling. These information-theoretic measures identified significant nonlinearities in the cardiorespiratory time series. Additionally, it was shown that, although mental stress is related to a reduction in vagal activity, no difference in cardiorespiratory coupling was found when several mental states (rest, mental stress, sustained attention) are compared. However, the self-entropy of HRV conditioned to respiration was very informative to study the predictability of RR interval series during mental tasks, and showed higher predictability during mental arithmetic compared to sustained attention or rest. PMID:26042824
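
In the notation commonly used for these information-dynamic measures (assumed here rather than copied from the paper), with H the heart-period (tachogram) series and R respiration, the storage and transfer terms read:

```latex
S_H = I\!\left(H_n;\, H_n^-\right),
\qquad
T_{R \to H} = \sum p\!\left(h_n, h_n^-, r_n^-\right)\,
\log \frac{p\!\left(h_n \mid h_n^-, r_n^-\right)}{p\!\left(h_n \mid h_n^-\right)},
```

where H_n^- and R_n^- denote the past histories of the two series: S_H is the self-entropy (information storage), and T_{R→H} is the transfer entropy quantifying cardiorespiratory coupling; conditioning S_H on respiration separates internal from respiration-driven predictability.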

  19. Shape design sensitivity analysis and optimization of three dimensional elastic solids using geometric modeling and automatic regridding. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Yao, Tse-Min; Choi, Kyung K.

    1987-01-01

    An automatic regridding method and a three dimensional shape design parameterization technique were constructed and integrated into a unified theory of shape design sensitivity analysis. An algorithm was developed for general shape design sensitivity analysis of three dimensional elastic solids. Numerical implementation of this shape design sensitivity analysis method was carried out using the finite element code ANSYS. The unified theory of shape design sensitivity analysis uses the material derivative of continuum mechanics with a design velocity field that represents shape change effects over the structural design. Automatic regridding methods were developed by generating a domain velocity field with the boundary displacement method. Shape design parameterization for three dimensional surface design problems was illustrated using a Bezier surface with boundary perturbations that depend linearly on the perturbation of design parameters. A linearization method of optimization, LINRM, was used to obtain optimum shapes. Three examples from different engineering disciplines were investigated to demonstrate the accuracy and versatility of this shape design sensitivity analysis method.

  20. Principle of maximum entropy for reliability analysis in the design of machine components

    NASA Astrophysics Data System (ADS)

    Zhang, Yimin

    2018-03-01

    We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.
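
The PME construction used here follows the standard recipe: maximize differential entropy subject to the known moment constraints. With the first four moments m_1, …, m_4 (matching the paper's use of the first four moments of the state function), Lagrangian stationarity yields an exponential-polynomial density (standard result, notation assumed):

```latex
\max_{p}\; H[p] = -\int p(x)\,\ln p(x)\,dx
\quad\text{s.t.}\quad
\int x^{i}\,p(x)\,dx = m_i,\;\; i = 0,\dots,4\;\;(m_0 = 1)
\;\;\Longrightarrow\;\;
p(x) = \exp\!\Big(-\lambda_0 - \sum_{i=1}^{4}\lambda_i\,x^{i}\Big),
```

where the multipliers λ_i are fixed numerically so the constraints hold. This is the "fewest human bias factors" property: among all densities consistent with the measured moments, the PME density adds no further structure.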

  1. On holographic Rényi entropy in some modified theories of gravity

    NASA Astrophysics Data System (ADS)

    Dey, Anshuman; Roy, Pratim; Sarkar, Tapobrata

    2018-04-01

    We perform a detailed analysis of holographic entanglement Rényi entropy in some modified theories of gravity with four-dimensional conformal field theory duals. First, we construct perturbative black hole solutions in a recently proposed model of Einsteinian cubic gravity in five dimensions, and compute the Rényi entropy as well as the scaling dimension of the twist operators in the dual field theory. The consistency of these results is verified from the AdS/CFT correspondence, via a corresponding computation of the Weyl anomaly on the gravity side. Similar analyses are then carried out for three other examples of modified gravity in five dimensions that include a chemical potential, namely Born-Infeld gravity, charged quasi-topological gravity and a class of Weyl corrected gravity theories with a gauge field, with the last example being treated perturbatively. Some interesting bounds on the dual conformal field theory parameters in quasi-topological gravity are pointed out. We also provide arguments on the validity of our perturbative analysis, whenever applicable.

  2. Fractal-Based Analysis of the Influence of Music on Human Respiration

    NASA Astrophysics Data System (ADS)

    Reza Namazi, H.

    An important challenge in respiration-related studies is to investigate the influence of external stimuli on human respiration. Auditory stimuli are an important type of stimulus that influences human respiration. However, no clear trend relating the characteristics of an auditory stimulus to the characteristics of the respiratory signal has previously been established. In this paper, we investigate the correlation between auditory stimuli and the respiratory signal from a fractal point of view. We found that the fractal structure of the respiratory signal is correlated with the fractal structure of the applied music: music with a greater fractal dimension results in a respiratory signal with a smaller fractal dimension. To verify this result, we used approximate entropy. The results show that the respiratory signal has a smaller approximate entropy when music with a smaller approximate entropy is chosen. The method of analysis could be further investigated to analyze the variations of different physiological time series due to various types of stimuli when complexity is the main concern.
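    Approximate entropy, used above as the verification measure, is a standard regularity statistic (Pincus, 1991): lower values indicate a more regular, self-similar signal. A compact reference sketch, not the authors' implementation:

```python
import math

def approximate_entropy(series, m=2, r=0.2):
    """Approximate entropy ApEn(m, r) of a 1-D series: the average
    log-likelihood that runs of length m that match within tolerance r
    still match when extended to length m + 1. Self-matches are counted,
    as in the original definition, so the log argument is never zero."""
    n = len(series)

    def phi(m):
        templates = [series[i:i + m] for i in range(n - m + 1)]
        total = 0.0
        for t1 in templates:
            matches = sum(
                1 for t2 in templates
                if max(abs(a - b) for a, b in zip(t1, t2)) <= r
            )
            total += math.log(matches / len(templates))
        return total / len(templates)

    return phi(m) - phi(m + 1)
```

    A perfectly periodic signal yields an ApEn near zero, while an irregular one yields a larger value, which is the sense in which the abstract compares music and respiration complexity.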

  3. Chain architecture and micellization: A mean-field coarse-grained model for poly(ethylene oxide) alkyl ether surfactants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    García Daza, Fabián A.; Mackie, Allan D., E-mail: allan.mackie@urv.cat; Colville, Alexander J.

    2015-03-21

    Microscopic modeling of surfactant systems is expected to be an important tool to describe, understand, and take full advantage of the micellization process for different molecular architectures. Here, we implement a single chain mean field theory to study the relevant equilibrium properties such as the critical micelle concentration (CMC) and aggregation number for three sets of surfactants with different geometries, maintaining constant the number of hydrophobic and hydrophilic monomers. The results demonstrate the direct effect of the block organization for the surfactants under study by means of an analysis of the excess energy and entropy, which can be accurately determined from the mean-field scheme. Our analysis reveals that the CMC values are sensitive to branching in the hydrophilic head part of the surfactant and can be observed in the entropy-enthalpy balance, while aggregation numbers are also affected by splitting the hydrophobic tail of the surfactant and are manifested by slight changes in the packing entropy.

  4. Chain architecture and micellization: A mean-field coarse-grained model for poly(ethylene oxide) alkyl ether surfactants

    NASA Astrophysics Data System (ADS)

    García Daza, Fabián A.; Colville, Alexander J.; Mackie, Allan D.

    2015-03-01

    Microscopic modeling of surfactant systems is expected to be an important tool to describe, understand, and take full advantage of the micellization process for different molecular architectures. Here, we implement a single chain mean field theory to study the relevant equilibrium properties such as the critical micelle concentration (CMC) and aggregation number for three sets of surfactants with different geometries maintaining constant the number of hydrophobic and hydrophilic monomers. The results demonstrate the direct effect of the block organization for the surfactants under study by means of an analysis of the excess energy and entropy which can be accurately determined from the mean-field scheme. Our analysis reveals that the CMC values are sensitive to branching in the hydrophilic head part of the surfactant and can be observed in the entropy-enthalpy balance, while aggregation numbers are also affected by splitting the hydrophobic tail of the surfactant and are manifested by slight changes in the packing entropy.

  5. Analysis of Eye Movements and Linguistic Boundaries in a Text for the Investigation of Japanese Reading Processes

    NASA Astrophysics Data System (ADS)

    Tera, Akemi; Shirai, Kiyoaki; Yuizono, Takaya; Sugiyama, Kozo

    In order to investigate reading processes of Japanese language learners, we have conducted an experiment to record eye movements during Japanese text reading using an eye-tracking system. We showed that Japanese native speakers use “forward and backward jumping eye movements” frequently[13],[14]. In this paper, we analyzed further the same eye tracking data. Our goal is to examine whether Japanese learners fix their eye movements at boundaries of linguistic units such as words, phrases or clauses when they start or end “backward jumping”. We consider conventional linguistic boundaries as well as boundaries empirically defined based on the entropy of the N-gram model. Another goal is to examine the relation between the entropy of the N-gram model and the depth of syntactic structures of sentences. Our analysis shows that (1) Japanese learners often fix their eyes at linguistic boundaries, (2) the average of the entropy is the greatest at the fifth depth of syntactic structures.
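    The entropy-of-the-N-gram-model boundary criterion can be illustrated with a simplified character-level sketch (an assumption for illustration; the paper works with Japanese linguistic units): positions where the conditional next-symbol entropy peaks are candidate empirical boundaries, since the reader's uncertainty about what follows is highest there.

```python
import math
from collections import Counter, defaultdict

def ngram_entropy_profile(text, n=2):
    """For each position, the Shannon entropy (bits) of the
    next-character distribution conditioned on the preceding n-gram,
    estimated from the text itself. Entropy peaks mark positions of
    high uncertainty, i.e. candidate segment boundaries."""
    continuations = defaultdict(Counter)
    for i in range(len(text) - n):
        continuations[text[i:i + n]][text[i + n]] += 1

    profile = []
    for i in range(n, len(text)):
        counts = continuations[text[i - n:i]]
        total = sum(counts.values())
        h = -sum(c / total * math.log2(c / total) for c in counts.values())
        profile.append(h)
    return profile
```

    In a fully deterministic sequence every context has a single continuation and the profile is flat at zero; any branching context raises the local entropy.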

  6. Studies in the parameterization of cloudiness in climate models and the analysis of radiation fields in general circulation models

    NASA Technical Reports Server (NTRS)

    HARSHVARDHAN

    1990-01-01

    Broad-band parameterizations for atmospheric radiative transfer were developed for clear and cloudy skies. These were in the shortwave and longwave regions of the spectrum. These models were compared with other models in an international effort called ICRCCM (Intercomparison of Radiation Codes for Climate Models). The radiation package developed was used for simulations of a General Circulation Model (GCM). A synopsis is provided of the research accomplishments in the two areas separately. Details are available in the published literature.

  7. Study of water based nanofluid flows in annular tubes using numerical simulation and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Siadaty, Moein; Kazazi, Mohsen

    2018-04-01

    Convective heat transfer, entropy generation and pressure drop of two water-based nanofluids (Cu-water and Al2O3-water) in horizontal annular tubes are scrutinized by means of computational fluid dynamics (CFD), response surface methodology (RSM) and sensitivity analysis. First, central composite design is used to plan a series of experiments over the diameter ratio, length-to-diameter ratio, Reynolds number and solid volume fraction. Then, CFD is used to calculate the Nusselt number, Euler number and entropy generation. After that, RSM is applied to fit second-order polynomials to the responses. Finally, sensitivity analysis is conducted to assess the influence of the above-mentioned parameters inside the tube. In total, 62 different cases are examined. CFD results show that Cu-water and Al2O3-water have the highest and lowest heat transfer rates, respectively. In addition, analysis of variance indicates that an increase in solid volume fraction increases the dimensionless pressure drop for Al2O3-water. Moreover, it has a significant negative effect on the Cu-water Nusselt number and an insignificant effect on the Cu-water Euler number. Analysis of the Bejan number indicates that frictional and thermal entropy generation are the dominant irreversibilities in the Al2O3-water and Cu-water flows, respectively. Sensitivity analysis indicates that the sensitivity of the dimensionless pressure drop to tube length for Cu-water is independent of the diameter ratio at different Reynolds numbers.

  8. Optimization and uncertainty assessment of strongly nonlinear groundwater models with high parameter dimensionality

    NASA Astrophysics Data System (ADS)

    Keating, Elizabeth H.; Doherty, John; Vrugt, Jasper A.; Kang, Qinjun

    2010-10-01

    Highly parameterized and CPU-intensive groundwater models are increasingly being used to understand and predict flow and transport through aquifers. Despite their frequent use, these models pose significant challenges for parameter estimation and predictive uncertainty analysis algorithms, particularly global methods which usually require very large numbers of forward runs. Here we present a general methodology for parameter estimation and uncertainty analysis that can be utilized in these situations. Our proposed method includes extraction of a surrogate model that mimics key characteristics of a full process model, followed by testing and implementation of a pragmatic uncertainty analysis technique, called null-space Monte Carlo (NSMC), that merges the strengths of gradient-based search and parameter dimensionality reduction. As part of the surrogate model analysis, the results of NSMC are compared with a formal Bayesian approach using the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. Such a comparison has never been accomplished before, especially in the context of high parameter dimensionality. Despite the highly nonlinear nature of the inverse problem, the existence of multiple local minima, and the relatively large parameter dimensionality, both methods performed well and results compare favorably with each other. Experiences gained from the surrogate model analysis are then transferred to calibrate the full highly parameterized and CPU intensive groundwater model and to explore predictive uncertainty of predictions made by that model. The methodology presented here is generally applicable to any highly parameterized and CPU-intensive environmental model, where efficient methods such as NSMC provide the only practical means for conducting predictive uncertainty analysis.

  9. Statistical properties of the normalized ice particle size distribution

    NASA Astrophysics Data System (ADS)

    Delanoë, Julien; Protat, Alain; Testud, Jacques; Bouniol, Dominique; Heymsfield, A. J.; Bansemer, A.; Brown, P. R. A.; Forbes, R. M.

    2005-05-01

    Testud et al. (2001) have recently developed a formalism, known as the "normalized particle size distribution (PSD)", which consists in scaling the diameter and concentration axes in such a way that the normalized PSDs are independent of water content and mean volume-weighted diameter. In this paper we investigate the statistical properties of the normalized PSD for the particular case of ice clouds, which are known to play a crucial role in the Earth's radiation balance. To do so, an extensive database of airborne in situ microphysical measurements has been constructed. A remarkable stability in shape of the normalized PSD is obtained. The impact of using a single analytical shape to represent all PSDs in the database is estimated through an error analysis on the instrumental (radar reflectivity and attenuation) and cloud (ice water content, effective radius, terminal fall velocity of ice crystals, visible extinction) properties. This resulted in a roughly unbiased estimate of the instrumental and cloud parameters, with small standard deviations ranging from 5 to 12%. This error is found to be roughly independent of the temperature range. This stability in shape and its single analytical approximation imply that two parameters are now sufficient to describe any normalized PSD in ice clouds: the intercept parameter N*0 and the mean volume-weighted diameter Dm. Statistical relationships (parameterizations) between N*0 and Dm have then been evaluated in order to further reduce the number of unknowns. It has been shown that a parameterization of N*0 and Dm by temperature could not be envisaged to retrieve the cloud parameters. Nevertheless, Dm-T and mean maximum dimension diameter-T parameterizations have been derived and compared to the parameterization of Kristjánsson et al. (2000) currently used to characterize particle size in climate models. The new parameterization generally produces larger particle sizes at any temperature than the Kristjánsson et al. (2000) parameterization. These new parameterizations are believed to better represent particle size at the global scale, owing to the better representativity of the in situ microphysical database used to derive them. We then evaluated the potential of a direct N*0-Dm relationship. While the model parameterized by temperature produces strong errors on the cloud parameters, the N*0-Dm model parameterized by radar reflectivity produces accurate cloud parameters (less than 3% bias and 16% standard deviation). This result implies that the cloud parameters can be estimated from the estimate of only one parameter of the normalized PSD (N*0 or Dm) and a radar reflectivity measurement.

  10. Low-temperature heat capacity of diopside glass (CaMgSi2O6): A calorimetric test of the configurational-entropy theory applied to the viscosity of liquid silicates

    USGS Publications Warehouse

    Richet, P.; Robie, R.A.; Hemingway, B.S.

    1986-01-01

    Heat-capacity measurements have been made between 8 and 370 K on an annealed and a rapidly quenched diopside glass. Between 15 and 200 K, Cp does not depend significantly on the thermal history of the glass. Below 15 K, Cp is larger for the quenched than for the annealed specimen. The opposite is true above 200 K as a result of what is interpreted as a secondary relaxation around room temperature. The magnitude of these effects, however, is small enough that the relative entropies S(298)-S(0) of the glasses differ by only 0.5 J/mol K, i.e., a figure within the combined experimental uncertainties. The insensitivity of relative entropies to thermal history supports the assumption that the configurational heat capacity of the liquid may be taken as the heat capacity difference between the liquid and the glass (ΔCp). Furthermore, this insensitivity allows calculation of the residual entropies at 0 K of diopside glasses as a function of the fictive temperature from the entropy of fusion of diopside and the heat capacities of the crystalline, glassy and liquid phases. For a glass with a fictive temperature of 1005 K, for example, this calorimetric residual entropy is 24.3 ± 3 J/mol K, in agreement with the prediction made by Richet (1984) from an analysis of the viscosity data with the configurational-entropy theory of relaxation processes of Adam and Gibbs (1965). In turn, all the viscosity measurements for liquid diopside, which span the range 0.5-4 × 10^13 poise, can be quantitatively reproduced through this theory with the calorimetrically determined entropies and ΔCp data. Finally, the unclear significance of "activation energies" for structural interpretations of viscosity data is emphasized, and the importance of ΔCp and glass-transition temperature systematics for determining the composition and temperature dependences of the viscosity is pointed out.

  11. Parameterization of Model Validating Sets for Uncertainty Bound Optimizations. Revised

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    2000-01-01

    Given measurement data, a nominal model and a linear fractional transformation uncertainty structure with an allowance on unknown but bounded exogenous disturbances, easily computable tests for the existence of a model validating uncertainty set are given. Under mild conditions, these tests are necessary and sufficient for the case of complex, nonrepeated, block-diagonal structure. For the more general case which includes repeated and/or real scalar uncertainties, the tests are only necessary but become sufficient if a collinearity condition is also satisfied. With the satisfaction of these tests, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization is used as a basis for a systematic way to construct or perform uncertainty tradeoff with model validating uncertainty sets which have specific linear fractional transformation structure for use in robust control design and analysis. An illustrative example which includes a comparison of candidate model validating sets is given.

  12. Markov and non-Markov processes in complex systems by the dynamical information entropy

    NASA Astrophysics Data System (ADS)

    Yulmetyev, R. M.; Gafarov, F. M.

    1999-12-01

    We consider Markov and non-Markov processes in complex systems using the dynamical information Shannon entropy (DISE) method. The influence and important role of two mutually dependent channels of entropy, alternation (creation or generation of correlation) and anti-correlation (destruction or annihilation of correlation), are discussed. The developed method has been used for the analysis of complex systems of various natures: slow neutron scattering in liquid cesium; psychology (short-term numeral and pattern human memory and the effect of stress on the dynamical tapping test); the random dynamics of RR intervals in the human ECG (the problem of diagnosing various diseases of the human cardiovascular system); and the chaotic dynamics of the parameters of financial markets and ecological systems.

  13. Classic Maximum Entropy Recovery of the Average Joint Distribution of Apparent FRET Efficiency and Fluorescence Photons for Single-molecule Burst Measurements

    PubMed Central

    DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.

    2012-01-01

    We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions. PMID:22338694

  14. Gravitational surface Hamiltonian and entropy quantization

    NASA Astrophysics Data System (ADS)

    Bakshi, Ashish; Majhi, Bibhas Ranjan; Samanta, Saurav

    2017-02-01

    The surface Hamiltonian corresponding to the surface part of a gravitational action has xp structure where p is conjugate momentum of x. Moreover, it leads to TS on the horizon of a black hole. Here T and S are temperature and entropy of the horizon. Imposing the hermiticity condition we quantize this Hamiltonian. This leads to an equidistant spectrum of its eigenvalues. Using this we show that the entropy of the horizon is quantized. This analysis holds for any order of Lanczos-Lovelock gravity. For general relativity, the area spectrum is consistent with Bekenstein's observation. This provides a more robust confirmation of this earlier result as the calculation is based on the direct quantization of the Hamiltonian in the sense of usual quantum mechanics.

  15. Entropy production in a Glauber–Ising irreversible model with dynamical competition

    NASA Astrophysics Data System (ADS)

    Barbosa, Oscar A.; Tomé, Tânia

    2018-06-01

    An out of equilibrium Glauber–Ising model, evolving in accordance with an irreversible and stochastic Markovian dynamics, is analyzed in order to improve our comprehension concerning critical behavior and phase transitions in nonequilibrium systems. Therefore, a lattice model ruled by the competition between two Glauber dynamics acting on interlaced square lattices is proposed. Previous results have shown how the entropy production provides information about irreversibility and criticality. Mean-field approximations and Monte Carlo simulations were used in the analysis. The results obtained here show a continuous phase transition, reflected in the entropy production as a logarithmic divergence of its derivative, which suggests a shared universality class with the irreversible models invariant under the symmetry operations of the Ising model.

  16. Effects of radial distribution of entropy diffusivity on critical modes of anelastic thermal convection in rotating spherical shells

    NASA Astrophysics Data System (ADS)

    Sasaki, Youhei; Takehiro, Shin-ichi; Ishiwatari, Masaki; Yamada, Michio

    2018-03-01

    Linear stability analysis of anelastic thermal convection in a rotating spherical shell with entropy diffusivities varying in the radial direction is performed. The structures of critical convection are obtained in the cases of four different radial distributions of entropy diffusivity: (1) κ is constant, (2) κT0 is constant, (3) κρ0 is constant, and (4) κρ0T0 is constant, where κ is the entropy diffusivity, T0 is the temperature of the basic state, and ρ0 is the density of the basic state. The ratio of inner and outer radii, the Prandtl number, the polytropic index, and the density ratio are 0.35, 1, 2, and 5, respectively. The value of the Ekman number is 10^-3 or 10^-5. In case (1), where the setup is the same as that of the anelastic dynamo benchmark (Jones et al., 2011), the structure of critical convection is concentrated near the outer boundary of the spherical shell around the equator. However, in cases (2), (3) and (4), the convection columns attach to the inner boundary of the spherical shell. A rapidly rotating annulus model for anelastic systems is developed by assuming that the convection structure is uniform in the axial direction, taking into account the strong effect of the Coriolis force. The annulus model well explains the characteristics of critical convection obtained numerically, such as the critical azimuthal wavenumber, frequency, Rayleigh number, and the cylindrically radial location of convection columns. The radial distribution of entropy diffusivity, or more generally of diffusion properties in the entropy equation, is important for the convection structure, because it determines the distribution of the radial basic entropy gradient, which is crucial for the location of convection columns.

  17. Entropy of Masseter Muscle Pain Sensitivity: A New Technique for Pain Assessment.

    PubMed

    Castrillon, Eduardo E; Exposto, Fernando G; Sato, Hitoshi; Tanosoto, Tomohiro; Arima, Taro; Baad-Hansen, Lene; Svensson, Peter

    2017-01-01

    To test whether manipulation of mechanical pain sensitivity (MPS) of the masseter muscle is reflected in quantitative measures of entropy. In a randomized, single-blinded, placebo-controlled design, 20 healthy volunteers had glutamate, lidocaine, and isotonic saline injected into the masseter muscle. Self-assessed pain intensity on a numeric rating scale (NRS) was evaluated up to 10 minutes following the injection, and MPS was evaluated after application (at 5 minutes and 30 minutes) of three different forces (0.5 kg, 1 kg, and 2 kg) to 15 different sites of the masseter muscle. Finally, the entropy and center of gravity (COG) of the pain sensitivity scores were calculated. Analysis of variance was used to test differences in means of tested outcomes and Tukey post hoc tests were used to adjust for multiple comparisons. The main findings were: (1) Compared with both lidocaine and isotonic saline, glutamate injections caused an increase in peak, duration, and area under the NRS pain curve (P < .01); (2) A pressure of 2 kg caused the highest NRS pain scores (P < .03) and entropy values (P < .02); (3) Glutamate injections caused increases in entropy values when assessed with 0.5 kg and 1.0 kg but not with 2.0 kg of pressure; and (4) COG coordinates revealed differences between the x coordinates for time (P < .01) and time and force for the y coordinates (P < .01). These results suggest that manipulation of MPS of the masseter muscle with painful glutamate injections can increase the diversity of MPS, which is reflected in entropy measures. Entropy allows quantification of the diversity of MPS, which may be important in clinical assessment of pain states such as myofascial temporomandibular disorders.
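    One plausible reading of the entropy measure used here, sketched under the assumption that the 15 site-wise sensitivity scores are normalized into a probability distribution over sites (the authors' exact formula is not given in the abstract): uniform sensitivity across sites gives maximal entropy, while a single dominant site gives zero.

```python
import math

def pain_entropy(scores):
    """Shannon entropy (bits) of site-wise pain-sensitivity scores,
    treating the normalized non-zero scores as a probability
    distribution over measurement sites. Higher entropy = more
    'diverse' (spatially spread-out) sensitivity."""
    total = sum(scores)
    probs = [s / total for s in scores if s > 0]
    return -sum(p * math.log2(p) for p in probs)
```

    With 15 equally sensitive sites the entropy is log2(15) ≈ 3.91 bits; concentrating all sensitivity at one site drives it to zero, matching the abstract's notion of entropy quantifying the diversity of MPS.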

  18. Performance Analysis of Entropy Methods on K Means in Clustering Process

    NASA Astrophysics Data System (ADS)

    Dicky Syahputra Lubis, Mhd.; Mawengkang, Herman; Suwilo, Saib

    2017-12-01

    K Means is a non-hierarchical data clustering method that attempts to partition existing data into one or more clusters/groups, so that data with the same characteristics are grouped into the same cluster and data with different characteristics are grouped into other clusters. The purpose of this clustering is to minimize the objective function set in the clustering process, which generally attempts to minimize variation within a cluster and maximize the variation between clusters. However, the main disadvantage of this method is that the number k is often not known beforehand. Furthermore, randomly chosen starting points may place two initial centroids too close to each other. Therefore, the entropy method is used to determine the starting points in K Means; this method can assign a weight and support a decision among a set of alternatives. Entropy is able to investigate the harmony in discrimination among a multitude of data sets: the criteria with the highest variation in values receive the highest weights. The entropy method can thus assist the K Means process by determining the starting points, which are usually chosen at random, so that the iteration process converges faster than the standard K Means process. On the postoperative-patient dataset from the UCI Machine Learning Repository, using only 12 records as a worked example of the calculations, the entropy-based method reaches the desired end result in only 2 iterations.
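    The entropy weighting referred to above is commonly computed as in multi-criteria decision analysis; a hedged sketch (column values assumed positive, and at least one feature assumed to vary; the paper's exact recipe may differ): features whose values vary more across samples have lower normalized entropy and therefore receive higher weight.

```python
import math

def entropy_weights(data):
    """Entropy weights for the columns (features) of a data matrix,
    as in the classical entropy weight method: normalize each column
    to a distribution, compute its Shannon entropy scaled by log(n),
    and weight each feature by its 'degree of diversification' 1 - h.
    Assumes positive column values and at least one varying feature."""
    n, d = len(data), len(data[0])
    weights = []
    for j in range(d):
        col = [row[j] for row in data]
        total = sum(col)
        p = [v / total for v in col]
        h = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(n)
        weights.append(1.0 - h)
    s = sum(weights)
    return [w / s for w in weights]
```

    A constant feature has maximal normalized entropy and thus zero weight; such weights can then score data points when picking non-random initial centroids for K Means.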

  19. Hydrostatic Chandra X-ray analysis of SPT-selected galaxy clusters - I. Evolution of profiles and core properties

    NASA Astrophysics Data System (ADS)

    Sanders, J. S.; Fabian, A. C.; Russell, H. R.; Walker, S. A.

    2018-02-01

    We analyse Chandra X-ray Observatory observations of a set of galaxy clusters selected by the South Pole Telescope using a new publicly available forward-modelling projection code, MBPROJ2, assuming hydrostatic equilibrium. By fitting a power law plus constant entropy model we find no evidence for a central entropy floor in the lowest entropy systems. A model of the underlying central entropy distribution shows a narrow peak close to zero entropy which accounts for 60 per cent of the systems, and a second broader peak around 130 keV cm2. We look for evolution over the 0.28-1.2 redshift range of the sample in density, pressure, entropy and cooling time at 0.015R500 and at 10 kpc radius. By modelling the evolution of the central quantities with a simple model, we find no evidence for a non-zero slope with redshift. In addition, a non-parametric sliding median shows no significant change. The fraction of cool-core clusters with central cooling times below 2 Gyr is consistent above and below z = 0.6 (˜30-40 per cent). Both by comparing the median thermodynamic profiles, centrally biased towards cool cores, in two redshift bins, and by modelling the evolution of the unbiased average profile as a function of redshift, we find no significant evolution beyond self-similar scaling in any of our examined quantities. Our average modelled radial density, entropy and cooling-time profiles appear as power laws with breaks around 0.2R500. The dispersion in these quantities rises inwards of this radius to around 0.4 dex, although some of this scatter can be fitted by a bimodal model.

  20. Entropy considerations applied to shock unsteadiness in hypersonic inlets

    NASA Astrophysics Data System (ADS)

    Bussey, Gillian Mary Harding

    The stability of curved or rectangular shocks in hypersonic inlets in response to flow perturbations can be determined analytically from the principle of minimum entropy. Unsteady shock wave motion can have a significant effect on the flow in a hypersonic inlet or combustor. According to the principle of minimum entropy, a stable thermodynamic state is one with the lowest entropy gain. A model based on piston theory and its limits has been developed for applying the principle of minimum entropy to quasi-steady flow. Relations are derived for analyzing the time-averaged entropy gain flux across a shock for quasi-steady perturbations in atmospheric conditions and angle as a perturbation in entropy gain flux from the steady state. Initial results from sweeping a wedge at Mach 10 through several degrees in AEDC's Tunnel 9 indicate that the bow shock becomes unsteady near the predicted normal Mach number. Several curved shocks of varying curvature are compared to a straight shock with the same mean normal Mach number, pressure ratio, or temperature ratio. The present work provides analysis and guidelines for designing an inlet robust to off-design flight or to perturbations in flow conditions that an inlet is likely to face. It also suggests that inlets with curved shocks are less robust to off-design flight than those with straight shocks, such as rectangular inlets. Relations for evaluating entropy perturbations for highly unsteady flow across a shock, and limits on their use, were also developed. The normal Mach number at which a shock can be stable to high-frequency upstream perturbations increases as the speed of the shock motion increases and slightly decreases as the perturbation size increases. The present work advances the principle of minimum entropy by providing additional validity for using the theory for time-varying flows and applying it to shocks, specifically those in inlets. While this analytic tool is applied in the present work to evaluating the stability of shocks in hypersonic inlets, it can be used for an arbitrary application with a shock.

  1. Fundamental statistical relationships between monthly and daily meteorological variables: Temporal downscaling of weather based on a global observational dataset

    NASA Astrophysics Data System (ADS)

    Sommer, Philipp; Kaplan, Jed

    2016-04-01

    Accurate modelling of large-scale vegetation dynamics, hydrology, and other environmental processes requires meteorological forcing on daily timescales. While meteorological data with high temporal resolution is becoming increasingly available, simulations for the future or distant past are limited by lack of data and poor performance of climate models, e.g., in simulating daily precipitation. To overcome these limitations, we may temporally downscale monthly summary data to a daily time step using a weather generator. Parameterization of such statistical models has traditionally been based on a limited number of observations. Recent developments in the archiving, distribution, and analysis of "big data" datasets provide new opportunities for the parameterization of a temporal downscaling model that is applicable over a wide range of climates. Here we parameterize a WGEN-type weather generator using more than 50 million individual daily meteorological observations, from over 10'000 stations covering all continents, based on the Global Historical Climatology Network (GHCN) and Synoptic Cloud Reports (EECRA) databases. Using the resulting "universal" parameterization and driven by monthly summaries, we downscale mean temperature (minimum and maximum), cloud cover, and total precipitation, to daily estimates. We apply a hybrid gamma-generalized Pareto distribution to calculate daily precipitation amounts, which overcomes much of the inability of earlier weather generators to simulate high amounts of daily precipitation. Our globally parameterized weather generator has numerous applications, including vegetation and crop modelling for paleoenvironmental studies.
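    A hybrid gamma-generalized Pareto draw for daily precipitation might be spliced as in the following sketch; the splice rule (gamma body, generalized-Pareto excess above a threshold) and all parameter names are assumptions for illustration, not the authors' calibrated model.

```python
import random

def daily_precip(shape, scale, threshold, gpd_shape, gpd_scale, rng=random):
    """Draw one daily precipitation amount (same units as `scale`):
    a gamma body, with values above `threshold` replaced by
    threshold + a generalized-Pareto excess, giving a heavier tail
    for extreme rain days. Assumes gpd_shape > 0 (heavy tail)."""
    x = rng.gammavariate(shape, scale)
    if x <= threshold:
        return x
    u = rng.random()
    # inverse CDF of the generalized Pareto distribution for the excess
    excess = gpd_scale / gpd_shape * ((1 - u) ** (-gpd_shape) - 1)
    return threshold + excess
```

    With a positive GPD shape the tail is heavier than the gamma's, which is what allows such a generator to reproduce large daily precipitation amounts that earlier weather generators underestimated.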

  2. Shannon information entropy in heavy-ion collisions

    NASA Astrophysics Data System (ADS)

    Ma, Chun-Wang; Ma, Yu-Gang

    2018-03-01

The general idea of information entropy provided by C.E. Shannon "hangs over everything we do" and can be applied to a great variety of problems once the connection between a distribution and the quantities of interest is found. Shannon information entropy quantifies the information carried by a quantity through its distribution, and entropy-based methods have been developed extensively in many scientific areas, including physics. The dynamical nature of the heavy-ion collision (HIC) process makes the study of nuclear matter and its evolution difficult and complex; here Shannon information entropy theory can provide new methods and observables for understanding the physical phenomena both theoretically and experimentally. To set the stage, the main characteristics of typical models used to describe HICs, including quantum molecular dynamics models, thermodynamical models, and statistical models, are briefly introduced. Typical applications of Shannon information theory in HICs are then collected, covering the chaotic behavior in the branching process of hadron collisions, the liquid-gas phase transition in HICs, and the isobaric difference scaling phenomenon for intermediate-mass fragments produced in HICs of neutron-rich systems. Even though the present applications in heavy-ion collision physics are still relatively simple, they shed light on the key questions being pursued. It is suggested that information entropy methods be developed further within nuclear reaction models, and that new analysis methods be devised to study the properties of nuclear matter in HICs, especially the evolution of the dynamical system.
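At their core, the applications above all start from the same quantity: the Shannon entropy of a discrete distribution, for example of measured fragment yields. A minimal sketch:

```python
import math

def shannon_entropy(probs, base=math.e):
    """H = -sum_i p_i * log(p_i); zero-probability terms contribute nothing."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A uniform distribution over four outcomes carries 2 bits of information.
h_uniform = shannon_entropy([0.25, 0.25, 0.25, 0.25], base=2)
```

A sharply peaked distribution gives low entropy, a spread-out one gives high entropy, which is what makes the quantity useful as an observable for phenomena such as phase transitions.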

  3. Pion, Kaon, Proton and Antiproton Production in Proton-Proton Collisions

    NASA Technical Reports Server (NTRS)

    Norbury, John W.; Blattnig, Steve R.

    2008-01-01

    Inclusive pion, kaon, proton, and antiproton production from proton-proton collisions is studied at a variety of proton energies. Various available parameterizations of Lorentz-invariant differential cross sections as a function of transverse momentum and rapidity are compared with experimental data. The Badhwar and Alper parameterizations are moderately satisfactory for charged pion production. The Badhwar parameterization provides the best fit for charged kaon production. For proton production, the Alper parameterization is best, and for antiproton production the Carey parameterization works best. However, no parameterization is able to fully account for all the data.

  4. Quadrantal multi-scale distribution entropy analysis of heartbeat interval series based on a modified Poincaré plot

    NASA Astrophysics Data System (ADS)

    Huo, Chengyu; Huang, Xiaolin; Zhuang, Jianjun; Hou, Fengzhen; Ni, Huangjing; Ning, Xinbao

    2013-09-01

The Poincaré plot is one of the most important approaches in human cardiac rhythm analysis. However, further investigation is still needed into techniques that can characterize the dispersion of the points displayed by a Poincaré plot. Based on a modified Poincaré plot, we provide a novel measurement named distribution entropy (DE) and propose a quadrantal multi-scale distribution entropy analysis (QMDE) for the quantitative description of scatter distribution patterns in various regions and at various temporal scales. We apply this method to heartbeat interval series derived from healthy subjects and congestive heart failure (CHF) sufferers, respectively, and find that the discrimination between them is most significant in the first quadrant, which implies significant impacts of CHF on vagal regulation. We also investigate the day-night differences of young healthy people, and the results present a clear circadian rhythm, especially in the first quadrant. In addition, the multi-scale analysis indicates that the results of healthy subjects and CHF sufferers follow different trends as the scale factor varies. The same phenomenon also appears in the circadian rhythm investigation of young healthy subjects, which implies that the cardiac dynamic system is affected differently at various temporal scales by physiological or pathological factors.

  5. Microcanonical entropy for classical systems

    NASA Astrophysics Data System (ADS)

    Franzosi, Roberto

    2018-03-01

The entropy definition in the microcanonical ensemble is revisited. We propose a novel definition for the microcanonical entropy that resolves the debate on the correct definition of the microcanonical entropy. In particular, we show that this entropy definition fixes the problem inherent in the exact extensivity of the caloric equation. Furthermore, this entropy reproduces results in agreement with those predicted by the standard Boltzmann entropy when applied to macroscopic systems. In contrast, the predictions obtained with the standard Boltzmann entropy and with the entropy we propose differ for small system sizes. Thus, we conclude that the Boltzmann entropy provides a correct description for macroscopic systems, whereas extremely small systems are better described by the entropy proposed here.

  6. Shallow water equations: viscous solutions and inviscid limit

    NASA Astrophysics Data System (ADS)

    Chen, Gui-Qiang; Perepelitsa, Mikhail

    2012-12-01

We establish the inviscid limit of the viscous shallow water equations to the Saint-Venant system. For the viscous equations, the viscosity terms are more degenerate when the shallow water is close to the bottom than in the classical Navier-Stokes equations for barotropic gases; thus, the analysis in our earlier work for the classical Navier-Stokes equations does not apply directly, and new estimates are required to deal with the additional degeneracy. We first introduce a notion of entropy solutions to the viscous shallow water equations and develop an approach to establish the global existence of such solutions and their uniform energy-type estimates with respect to the viscosity coefficient. These uniform estimates yield the existence of measure-valued solutions to the Saint-Venant system generated by the viscous solutions. Based on the uniform energy-type estimates and the features of the Saint-Venant system, we further establish that the entropy dissipation measures of the viscous solutions for weak entropy-entropy flux pairs, generated by compactly supported C^2 test functions, are confined in a compact set in H^{-1}, which yields that the measure-valued solutions are confined by the Tartar-Murat commutator relation. Then, the reduction theorem established in Chen and Perepelitsa [5] for measure-valued solutions with unbounded support leads to the convergence of the viscous solutions to a finite-energy entropy solution of the Saint-Venant system with finite-energy initial data, defined relative to the different end-states of the bottom topography of the shallow water at infinity. The analysis also applies to the inviscid limit problem for the Saint-Venant system in the presence of friction.

  7. Information Transfer Analysis of Spontaneous Low-frequency Fluctuations in Cerebral Hemodynamics and Cardiovascular Dynamics

    NASA Astrophysics Data System (ADS)

    Katura, Takusige; Tanaka, Naoki; Obata, Akiko; Sato, Hiroki; Maki, Atsushi

    2005-08-01

In this study, we analyzed, from the information-theoretic viewpoint, the interrelation between the spontaneous low-frequency fluctuations around 0.1 Hz in hemoglobin concentration in the cerebral cortex, mean arterial blood pressure, and heart rate. For this analysis, as measures of information transfer, we used the transfer entropy (TE) proposed for two-factor systems by Schreiber and the intrinsic transfer entropy (ITE), introduced for the further analysis of three-factor systems by extending the original TE. Information transfer analysis based on both TE and ITE suggests that the systemic cardiovascular fluctuations alone cannot account for the cerebrovascular fluctuations; that is, the regulation of regional cerebral energetic metabolism is an important candidate for their generation mechanism. Such an information transfer analysis seems useful for revealing the interrelation between elements that regulate each other in a complex manner.
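Transfer entropy for two symbolic (e.g. binarized) series can be estimated with simple plug-in frequencies. The sketch below uses history length 1 and is only an illustration of Schreiber's quantity, not the multivariate ITE extension used in the study:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of the transfer entropy T(Y -> X) in nats,
    with history length 1, for two equally long symbolic series."""
    n = len(x) - 1
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))    # (x_{t+1}, x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    singles = Counter(x[:-1])
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_both = c / pairs_xy[(x0, y0)]         # p(x_{t+1} | x_t, y_t)
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]  # p(x_{t+1} | x_t)
        te += p_joint * math.log(p_cond_both / p_cond_x)
    return te

# A driver Y that literally announces X's next value transfers ~ln(2)
# nats per step; an independent Y transfers ~0.
random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y_coupled = x[1:] + [0]
y_indep = [random.randint(0, 1) for _ in range(5000)]
```

The asymmetry of the measure (T(Y→X) ≠ T(X→Y) in general) is what lets it indicate a direction of information flow, unlike correlation or mutual information.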

  8. Respiration and heart rate complexity: Effects of age and gender assessed by band-limited transfer entropy

    PubMed Central

    Nemati, Shamim; Edwards, Bradley A.; Lee, Joon; Pittman-Polletta, Benjamin; Butler, James P.; Malhotra, Atul

    2013-01-01

Aging and disease are accompanied by a reduction of complex variability in the temporal patterns of heart rate. This reduction has been attributed to a breakdown of the underlying regulatory feedback mechanisms that maintain a homeodynamic state. Previous work has established the utility of entropy, as an index of disorder, for quantifying changes in heart rate complexity. However, questions remain regarding the origin of heart rate complexity and the mechanisms involved in its reduction with aging and disease. In this work we use a newly developed technique based on the concept of band-limited transfer entropy to assess the aging-related changes in the contributions of respiration and blood pressure to the entropy of heart rate in different frequency bands. Noninvasive measurements of heartbeat interval, respiration, and systolic blood pressure were recorded from 20 young (21–34 years) and 20 older (68–85 years) healthy adults. Band-limited transfer entropy analysis revealed a reduction in the high-frequency contribution of respiration to heart rate complexity (p < 0.001) with normal aging, particularly in men. These results have the potential for dissecting the relative contributions of respiration and blood pressure-related reflexes to heart rate complexity and their degeneration with normal aging. PMID:23811194

  9. Naturalistic stimulation changes the dynamic response of action potential encoding in a mechanoreceptor

    PubMed Central

    Pfeiffer, Keram; French, Andrew S.

    2015-01-01

Naturalistic signals were created from vibrations made by locusts walking on a Sansevieria plant. Both naturalistic and Gaussian noise signals were used to mechanically stimulate VS-3 slit-sense mechanoreceptor neurons of the spider, Cupiennius salei, with stimulus amplitudes adjusted to give similar firing rates for either stimulus. Intracellular microelectrodes recorded action potentials, receptor potential, and receptor current, using current clamp and voltage clamp. Frequency response analysis showed that naturalistic stimulation contained relatively more power at low frequencies and caused increased neuronal sensitivity to higher frequencies. In contrast, varying the amplitude of Gaussian stimulation did not change neuronal dynamics. Naturalistic stimulation contained less entropy than Gaussian stimulation, but entropy in the resultant receptor current was higher than in the stimulus, indicating the addition of uncorrelated noise during transduction. The presence of added noise was supported by measuring linear information capacity in the receptor current. Total entropy and information capacity in action potentials produced by either stimulus were much lower than in earlier stages, and limited to the maximum entropy of binary signals. We conclude that the dynamics of action potential encoding in VS-3 neurons are sensitive to the form of stimulation, but entropy and information capacity of action potentials are limited by firing rate. PMID:26578975

  10. Inverting ion images without Abel inversion: maximum entropy reconstruction of velocity maps.

    PubMed

    Dick, Bernhard

    2014-01-14

A new method for the reconstruction of velocity maps from ion images is presented, which is based on the maximum entropy concept. In contrast to other methods used for Abel inversion, the new method never applies an inversion or smoothing to the data. Instead, it iteratively finds the map which is the most likely cause of the observed data, using the correct likelihood criterion for data sampled from a Poissonian distribution. The entropy criterion minimizes the information content in this map, which hence contains no information for which there is no evidence in the data. Two implementations are proposed, and their performance is demonstrated with simulated and experimental data: Maximum Entropy Velocity Image Reconstruction (MEVIR) obtains a two-dimensional slice through the velocity distribution and can be compared directly to Abel inversion. Maximum Entropy Velocity Legendre Reconstruction (MEVELER) finds one-dimensional distribution functions Q_l(v) in an expansion of the velocity distribution in Legendre polynomials P_l(cos θ) for the angular dependence. Both MEVIR and MEVELER can be used for the analysis of ion images with intensities as low as 0.01 counts per pixel, with MEVELER performing significantly better than MEVIR for images with low intensity. Both methods perform better than pBASEX, in particular for images with less than one average count per pixel.

  11. Nonlinear digital signal processing in mental health: characterization of major depression using instantaneous entropy measures of heartbeat dynamics.

    PubMed

    Valenza, Gaetano; Garcia, Ronald G; Citi, Luca; Scilingo, Enzo P; Tomaz, Carlos A; Barbieri, Riccardo

    2015-01-01

    Nonlinear digital signal processing methods that address system complexity have provided useful computational tools for helping in the diagnosis and treatment of a wide range of pathologies. More specifically, nonlinear measures have been successful in characterizing patients with mental disorders such as Major Depression (MD). In this study, we propose the use of instantaneous measures of entropy, namely the inhomogeneous point-process approximate entropy (ipApEn) and the inhomogeneous point-process sample entropy (ipSampEn), to describe a novel characterization of MD patients undergoing affective elicitation. Because these measures are built within a nonlinear point-process model, they allow for the assessment of complexity in cardiovascular dynamics at each moment in time. Heartbeat dynamics were characterized from 48 healthy controls and 48 patients with MD while emotionally elicited through either neutral or arousing audiovisual stimuli. Experimental results coming from the arousing tasks show that ipApEn measures are able to instantaneously track heartbeat complexity as well as discern between healthy subjects and MD patients. Conversely, standard heart rate variability (HRV) analysis performed in both time and frequency domains did not show any statistical significance. We conclude that measures of entropy based on nonlinear point-process models might contribute to devising useful computational tools for care in mental health.

  12. The effect of orthostatic stress on multiscale entropy of heart rate and blood pressure.

    PubMed

    Turianikova, Zuzana; Javorka, Kamil; Baumert, Mathias; Calkovska, Andrea; Javorka, Michal

    2011-09-01

    Cardiovascular control acts over multiple time scales, which introduces a significant amount of complexity to heart rate and blood pressure time series. Multiscale entropy (MSE) analysis has been developed to quantify the complexity of a time series over multiple time scales. In previous studies, MSE analyses identified impaired cardiovascular control and increased cardiovascular risk in various pathological conditions. Despite the increasing acceptance of the MSE technique in clinical research, information underpinning the involvement of the autonomic nervous system in the MSE of heart rate and blood pressure is lacking. The objective of this study is to investigate the effect of orthostatic challenge on the MSE of heart rate and blood pressure variability (HRV, BPV) and the correlation between MSE (complexity measures) and traditional linear (time and frequency domain) measures. MSE analysis of HRV and BPV was performed in 28 healthy young subjects on 1000 consecutive heart beats in the supine and standing positions. Sample entropy values were assessed on scales of 1-10. We found that MSE of heart rate and blood pressure signals is sensitive to changes in autonomic balance caused by postural change from the supine to the standing position. The effect of orthostatic challenge on heart rate and blood pressure complexity depended on the time scale under investigation. Entropy values did not correlate with the mean values of heart rate and blood pressure and showed only weak correlations with linear HRV and BPV measures. In conclusion, the MSE analysis of heart rate and blood pressure provides a sensitive tool to detect changes in autonomic balance as induced by postural change.
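The MSE procedure itself is straightforward: coarse-grain the series at each scale and compute the sample entropy of each coarse-grained series, keeping the tolerance fixed at a fraction of the original standard deviation. The sketch below is a compact toy illustration of that procedure, not the authors' analysis pipeline:

```python
import math
import random

def sample_entropy(series, m=2, r=0.2):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points (within absolute tolerance r) also
    match for m + 1 points."""
    n = len(series)

    def count_matches(length):
        templates = [series[i:i + length] for i in range(n - m)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b)
                       for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits

    b, a = count_matches(m), count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def coarse_grain(series, scale):
    """Averages over non-overlapping windows of the given scale."""
    return [sum(series[i:i + scale]) / scale
            for i in range(0, len(series) - scale + 1, scale)]

def multiscale_entropy(series, max_scale=5, m=2, r_frac=0.2):
    """SampEn of the coarse-grained series at scales 1..max_scale,
    tolerance fixed at r_frac times the original SD (standard MSE)."""
    mean = sum(series) / len(series)
    sd = (sum((x - mean) ** 2 for x in series) / len(series)) ** 0.5
    return [sample_entropy(coarse_grain(series, s), m, r_frac * sd)
            for s in range(1, max_scale + 1)]

# Toy demonstration on Gaussian white noise.
random.seed(2)
white = [random.gauss(0.0, 1.0) for _ in range(300)]
mse = multiscale_entropy(white, max_scale=3)
```

For white noise the curve decreases with scale, whereas genuinely complex (e.g. 1/f-like) signals stay high across scales; it is this scale dependence that the orthostatic-challenge comparison above exploits.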

  13. On S-mixing entropy of quantum channels

    NASA Astrophysics Data System (ADS)

    Mukhamedov, Farrukh; Watanabe, Noboru

    2018-06-01

In this paper, an S-mixing entropy of quantum channels is introduced as a generalization of Ohya's S-mixing entropy, and several of its properties are investigated. Moreover, certain relations between the S-mixing entropy and the existing map and output entropies of quantum channels are established. These relations allow us to find connections between separable states and the introduced entropy, which yield a sufficient condition to detect entangled states. Finally, entropies of qubit and phase-damping channels are calculated.

  14. Latent Class Analysis of Incomplete Data via an Entropy-Based Criterion

    PubMed Central

    Larose, Chantal; Harel, Ofer; Kordas, Katarzyna; Dey, Dipak K.

    2016-01-01

    Latent class analysis is used to group categorical data into classes via a probability model. Model selection criteria then judge how well the model fits the data. When addressing incomplete data, the current methodology restricts the imputation to a single, pre-specified number of classes. We seek to develop an entropy-based model selection criterion that does not restrict the imputation to one number of clusters. Simulations show the new criterion performing well against the current standards of AIC and BIC, while a family studies application demonstrates how the criterion provides more detailed and useful results than AIC and BIC. PMID:27695391

  15. Analysis of rapid eye movement periodicity in narcoleptics based on maximum entropy method.

    PubMed

    Honma, H; Ohtomo, N; Kohsaka, M; Fukuda, N; Kobayashi, R; Sakakibara, S; Nakamura, F; Koyama, T

    1999-04-01

    We examined REM sleep periodicity in typical narcoleptics and patients who had shown signs of a narcoleptic tetrad without HLA-DRB1*1501/DQB1*0602 or DR2 antigens, using spectral analysis based on the maximum entropy method. The REM sleep period of typical narcoleptics showed two peaks, one at 70-90 min and one at 110-130 min at night, and a single peak at around 70-90 min during the daytime. The nocturnal REM sleep period of typical narcoleptics may be composed of several different periods, one of which corresponds to that of their daytime REM sleep.

  16. An automatic classifier of emotions built from entropy of noise.

    PubMed

    Ferreira, Jacqueline; Brás, Susana; Silva, Carlos F; Soares, Sandra C

    2017-04-01

The electrocardiogram (ECG) signal has been widely used to study the physiological substrates of emotion. However, searching for better filtering techniques in order to obtain a signal with better quality and with the maximum relevant information remains an important issue for researchers in this field. Signal processing is largely performed for ECG analysis and interpretation, but this process can be susceptible to error in the delineation phase. In addition, it can lead to the loss of important information that is usually considered as noise and, consequently, discarded from the analysis. The goal of this study was to evaluate whether the ECG noise allows for the classification of emotions, using its entropy as an input to a decision tree classifier. We collected the ECG signal from 25 healthy participants while they were presented with videos eliciting negative (fear and disgust) and neutral emotions. The results indicated that the neutral condition showed perfect identification (100%), whereas the classification of negative emotions indicated good identification performance (60% sensitivity and 80% specificity). These results suggest that the entropy of noise contains relevant information that can be useful to improve the analysis of the physiological correlates of emotion. © 2016 Society for Psychophysiological Research.

  17. Scaling characteristics of one-dimensional fractional diffusion processes in the presence of power-law distributed random noise

    NASA Astrophysics Data System (ADS)

    Nezhadhaghighi, Mohsen Ghasemi

    2017-08-01

Here, we present results of numerical simulations and the scaling characteristics of one-dimensional random fluctuations with heavy-tailed probability distribution functions. Assuming that the distribution function of the random fluctuations obeys Lévy statistics with a power-law scaling exponent, we investigate the fractional diffusion equation in the presence of μ-stable Lévy noise. We study the scaling properties of the global width and two-point correlation functions and then compare the analytical and numerical results for the growth exponent β and the roughness exponent α. We also investigate the fractional Fokker-Planck equation for heavy-tailed random fluctuations. We show that the fractional diffusion processes in the presence of μ-stable Lévy noise display special scaling properties in the probability distribution function (PDF). Finally, we numerically study the scaling properties of the heavy-tailed random fluctuations by using the diffusion entropy analysis. This method is based on the evaluation of the Shannon entropy of the PDF generated by the random fluctuations, rather than on the measurement of the global width of the process. We apply the diffusion entropy analysis to extract the growth exponent β and to confirm the validity of our numerical analysis.
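Diffusion entropy analysis can be sketched in a few lines: form window sums of the fluctuations, histogram them, and track how the Shannon entropy of that histogram grows with the window length; the slope against ln(t) estimates the scaling exponent (0.5 for ordinary diffusion). The toy version below assumes a fixed histogram bin width and is only a schematic of the method:

```python
import math
import random
from collections import Counter

def diffusion_entropy(series, window, bin_width=1.0):
    """Shannon entropy (nats) of the binned distribution of window sums,
    i.e. of the displacement PDF of the induced diffusion process."""
    sums = [sum(series[i:i + window])
            for i in range(len(series) - window + 1)]
    counts = Counter(int(s // bin_width) for s in sums)
    n = len(sums)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# For uncorrelated Gaussian noise, S(t) ~ const + 0.5 * ln(t).
random.seed(7)
noise = [random.gauss(0.0, 1.0) for _ in range(20000)]
s_small = diffusion_entropy(noise, 4)
s_large = diffusion_entropy(noise, 64)
delta = (s_large - s_small) / (math.log(64) - math.log(4))
```

For Lévy-stable fluctuations the fitted exponent deviates from 1/2, which is precisely the anomalous scaling the diffusion entropy analysis is designed to detect.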

  18. Thin Interface Asymptotics for an Energy/Entropy Approach to Phase-Field Models with Unequal Conductivities

    NASA Technical Reports Server (NTRS)

    McFadden, G. B.; Wheeler, A. A.; Anderson, D. M.

    1999-01-01

Karma and Rappel recently developed a new sharp-interface asymptotic analysis of the phase-field equations that is especially appropriate for modeling dendritic growth at low undercoolings. Their approach relieves a stringent restriction on the interface thickness that applies in the conventional asymptotic analysis, and has the added advantage that interfacial kinetic effects can also be eliminated. However, their analysis focused on the case of equal thermal conductivities in the solid and liquid phases; when applied to a standard phase-field model with unequal conductivities, anomalous terms arise in the limiting forms of the boundary conditions for the interfacial temperature that are not present in conventional sharp-interface solidification models, as discussed further by Almgren. In this paper we apply their asymptotic methodology to a generalized phase-field model which is derived using a thermodynamically consistent approach that is based on independent entropy and internal energy gradient functionals that include double wells in both the entropy and internal energy densities. The additional degrees of freedom associated with the generalized phase-field equations can be chosen to eliminate the anomalous terms that arise for unequal conductivities.

  20. Cattaneo-Christov based study of {TiO}_2 -CuO/EG Casson hybrid nanofluid flow over a stretching surface with entropy generation

    NASA Astrophysics Data System (ADS)

    Jamshed, Wasim; Aziz, Asim

    2018-06-01

In the present research, a simplified mathematical model is presented to study the heat transfer and entropy generation of a thermal system containing a hybrid nanofluid. The nanofluid occupies the space over an infinite horizontal surface and the flow is induced by the non-linear stretching of the surface. A uniform transverse magnetic field, the Cattaneo-Christov heat flux model, and thermal radiation effects are also included in the present study. The similarity technique is employed to reduce the governing non-linear partial differential equations to a set of ordinary differential equations. The Keller box numerical scheme is then used to approximate the solutions for the thermal analysis. Results are presented for conventional copper oxide-ethylene glycol (CuO-EG) and hybrid titanium oxide-copper oxide/ethylene glycol ({TiO}_2-CuO/EG) nanofluids. Spherical, hexahedron, tetrahedron, cylindrical, and lamina-shaped nanoparticles are considered in the present analysis. The significant findings of the study are the enhanced heat transfer capability of hybrid nanofluids over conventional nanofluids, the greatest heat transfer rate occurring for the smallest value of the shape factor parameter, and the increase of the overall entropy of the system with increasing Reynolds number and Brinkman number.

  1. Multiscale entropy analysis of human gait dynamics

    NASA Astrophysics Data System (ADS)

    Costa, M.; Peng, C.-K.; L. Goldberger, Ary; Hausdorff, Jeffrey M.

    2003-12-01

    We compare the complexity of human gait time series from healthy subjects under different conditions. Using the recently developed multiscale entropy algorithm, which provides a way to measure complexity over a range of scales, we observe that normal spontaneous walking has the highest complexity when compared to slow and fast walking and also to walking paced by a metronome. These findings have implications for modeling locomotor control and for quantifying gait dynamics in physiologic and pathologic states.

  2. An entropy regularization method applied to the identification of wave distribution function for an ELF hiss event

    NASA Astrophysics Data System (ADS)

    Prot, Olivier; SantolíK, OndřEj; Trotignon, Jean-Gabriel; Deferaudy, Hervé

    2006-06-01

An entropy regularization algorithm (ERA) has been developed to compute the wave-energy density from electromagnetic field measurements. It is based on the wave distribution function (WDF) concept. To assess its suitability and efficiency, the algorithm is applied to experimental data that have already been analyzed using other inversion techniques. The FREJA satellite data used consist of six spectral matrices corresponding to six time-frequency points of an ELF hiss-event spectrogram. The WDF analysis is performed on these six points and the results are compared with those obtained previously. A statistical stability analysis confirms the stability of the solutions. The WDF computation is fast and requires no prespecified parameters. The regularization parameter has been chosen in accordance with Morozov's discrepancy principle. The Generalized Cross Validation and L-curve criteria were then tentatively used to provide a fully data-driven method; however, these criteria fail to determine a suitable value of the regularization parameter. Although the entropy regularization leads to solutions that agree fairly well with those already published, some differences are observed, and these are discussed in detail. The main advantage of the ERA is that it returns the WDF exhibiting the largest entropy and avoids the use of a priori models, which sometimes seem more accurate but lack justification.

  3. Detection of structural damage in multiwire cables by monitoring the entropy evolution of wavelet coefficients

    NASA Astrophysics Data System (ADS)

    Ibáñez, Flor; Baltazar, Arturo; Mijarez, Rito; Aranda, Jorge

    2015-03-01

Multiwire cables are widely used in important civil structures. Since they are exposed to several dynamic and static loads, their structural health can be compromised. The cables can also be subjected to mechanical contact, tension, and energy propagation, in addition to changes in size and material within their wires. Due to the critical role played by multiwire cables, it is necessary to develop a non-destructive health monitoring method to maintain their structure and proper performance. Ultrasonic inspection using guided waves is a promising non-destructive damage monitoring technique for rods, single wires, and multiwire cables. The propagated guided waves are composed of an infinite number of vibrational modes, making their analysis difficult. In this work, an entropy-based method to identify small changes in non-stationary signals is proposed. A system to capture and post-process acoustic signals is implemented. The Discrete Wavelet Transform (DWT) is computed in order to obtain the reconstructed wavelet coefficients of the signals and to analyze the energy at different scales. The feasibility of using the concept of entropy evolution of non-stationary signals to detect damage in multiwire cables is evaluated. The results show that there is a high correlation between the entropy value and the damage level of the cable. The proposed method has low sensitivity to noise and reduces the computational complexity found in a typical time-frequency analysis.
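The core quantity, the entropy of the distribution of wavelet energy across scales, can be sketched with a plain Haar DWT. This is a simplified stand-in for the reconstructed-coefficient analysis described above (signal length assumed to be a power of two), not the authors' monitoring system:

```python
import math
import random

def haar_dwt_details(signal):
    """Multi-level Haar DWT: list of detail-coefficient arrays, one per
    scale (signal length assumed to be a power of two)."""
    levels, approx = [], list(signal)
    while len(approx) > 1:
        nxt, det = [], []
        for a, b in zip(approx[0::2], approx[1::2]):
            nxt.append((a + b) / math.sqrt(2))
            det.append((a - b) / math.sqrt(2))
        levels.append(det)
        approx = nxt
    return levels

def wavelet_entropy(signal):
    """Shannon entropy of the relative wavelet energy across scales:
    near zero when one scale dominates, larger when energy is spread."""
    energies = [sum(c * c for c in det) for det in haar_dwt_details(signal)]
    total = sum(energies)
    return -sum((e / total) * math.log(e / total)
                for e in energies if e > 0)

# A pure alternation puts all energy at the finest scale (entropy ~ 0);
# white noise spreads energy over scales (entropy well above zero).
random.seed(3)
noise = [random.gauss(0.0, 1.0) for _ in range(128)]
```

Tracking how this entropy evolves as a structure is loaded is the idea behind the damage-detection scheme: damage redistributes signal energy across scales and shifts the entropy.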

  4. Theory and Normal Mode Analysis of Change in Protein Vibrational Dynamics on Ligand Binding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Moritsugu, Kei; Njunda, Brigitte; Smith, Jeremy C

    2009-12-01

The change of protein vibrations on ligand binding is of functional and thermodynamic importance. Here, this process is characterized using a simple analytical 'ball-and-spring' model and all-atom normal-mode analysis (NMA) of the binding of the cancer drug methotrexate (MTX) to its target, dihydrofolate reductase (DHFR). The analytical model predicts that the coupling between protein vibrations and ligand external motion generates entropy-rich, low-frequency vibrations in the complex. This is consistent with the atomistic NMA, which reveals vibrational softening on forming the DHFR-MTX complex, a result also in qualitative agreement with neutron-scattering experiments. Energy minimization of the atomistic bound-state (B) structure while gradually decreasing the ligand interaction to zero allows the generation of a hypothetical 'intermediate' (I) state, without the ligand force field but with a structure similar to that of B. In going from I to B, it is found that the vibrational entropies of both the protein and MTX decrease while the complex structure becomes enthalpically stabilized. However, the relatively weak DHFR:MTX interaction energy results in the net entropy gain arising from coupling between the protein and MTX external motion being larger than the loss of vibrational entropy on complex formation. This, together with the I structure being more flexible than the unbound structure, results in the observed vibrational softening on ligand binding.

  5. Entropy and equilibrium via games of complexity

    NASA Astrophysics Data System (ADS)

    Topsøe, Flemming

    2004-09-01

    It is suggested that thermodynamical equilibrium equals game theoretical equilibrium. Aspects of this thesis are discussed. The philosophy is consistent with maximum entropy thinking of Jaynes, but goes one step deeper by deriving the maximum entropy principle from an underlying game theoretical principle. The games introduced are based on measures of complexity. Entropy is viewed as minimal complexity. It is demonstrated that Tsallis entropy (q-entropy) and Kaniadakis entropy (κ-entropy) can be obtained in this way, based on suitable complexity measures. A certain unifying effect is obtained by embedding these measures in a two-parameter family of entropy functions.
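
The deformed entropies named here reduce to the Shannon entropy in the appropriate limits (q → 1 for Tsallis, κ → 0 for Kaniadakis), which is easy to check numerically. A minimal sketch, with function names that are mine rather than the paper's:

```python
import numpy as np

def shannon(p):
    """Shannon entropy -sum(p ln p) in nats."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def tsallis(p, q):
    """Tsallis q-entropy S_q = (1 - sum(p^q)) / (q - 1); Shannon as q -> 1."""
    p = np.asarray(p, float)
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def kaniadakis(p, kappa):
    """Kaniadakis kappa-entropy -sum p_i (p_i^k - p_i^-k)/(2k); Shannon as k -> 0."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float(-np.sum(p * (p ** kappa - p ** (-kappa)) / (2.0 * kappa)))
```

For a fair coin, all three should give ln 2 in the respective limits.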

  6. Introduction of High Throughput Magnetic Resonance T2-Weighted Image Texture Analysis for WHO Grade 2 and 3 Gliomas.

    PubMed

    Kinoshita, Manabu; Sakai, Mio; Arita, Hideyuki; Shofuda, Tomoko; Chiba, Yasuyoshi; Kagawa, Naoki; Watanabe, Yoshiyuki; Hashimoto, Naoya; Fujimoto, Yasunori; Yoshimine, Toshiki; Nakanishi, Katsuyuki; Kanemura, Yonehiro

    2016-01-01

    Reports have suggested that tumor textures presented on T2-weighted images correlate with the genetic status of glioma. Therefore, development of an image-analysis framework capable of objective and high throughput image texture analysis for large scale image data collection is needed. The current study aimed to address the development of such a framework by introducing two novel parameters for image textures on T2-weighted images, i.e., Shannon entropy and Prewitt filtering. Twenty-two WHO grade 2 and 28 WHO grade 3 glioma patients with available pre-surgical MRI and IDH1 mutation status were included. Heterogeneous lesions showed statistically higher Shannon entropy than homogeneous lesions (p = 0.006) and ROC curve analysis proved that Shannon entropy on T2WI was a reliable indicator for discrimination of homogeneous and heterogeneous lesions (p = 0.015, AUC = 0.73). Lesions with well-defined borders exhibited statistically higher Edge mean and Edge median values using Prewitt filtering than those with vague lesion borders (p = 0.0003 and p = 0.0005 respectively). ROC curve analysis also proved that both Edge mean and median values were promising indicators for discrimination of lesions with vague and well-defined borders and both Edge mean and median values performed in a comparable manner (p = 0.0002, AUC = 0.81 and p < 0.0001, AUC = 0.83, respectively). Finally, IDH1 wild type gliomas showed statistically lower Shannon entropy on T2WI than IDH1 mutated gliomas (p = 0.007) but no difference was observed between IDH1 wild type and mutated gliomas in Edge median values using Prewitt filtering. The current study introduced two image metrics that reflect lesion texture described on T2WI. These two metrics were validated by readings of a neuro-radiologist who was blinded to the results. This observation will facilitate further use of this technique in future large scale image analysis of glioma.
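
Both texture metrics used in this study are standard operations and can be sketched compactly. The snippet below is an illustrative reimplementation under assumptions (8-bit grey levels, 'valid' convolution region), not the authors' exact pipeline:

```python
import numpy as np

def shannon_entropy(img, bins=256):
    """Shannon entropy (bits) of the grey-level histogram of an image."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def prewitt_magnitude(img):
    """Gradient magnitude from 3x3 Prewitt kernels (valid region only)."""
    img = np.asarray(img, float)
    kx = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]])  # horizontal gradient
    ky = kx.T                                            # vertical gradient
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(3):
        for j in range(3):
            patch = img[i:i + h - 2, j:j + w - 2]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    return np.hypot(gx, gy)
```

A flat lesion region gives zero histogram entropy and zero edge response; a two-valued region with a sharp border gives 1 bit of entropy and a strong Prewitt response at the border.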

  7. Monitoring sleepiness with on-board electrophysiological recordings for preventing sleep-deprived traffic accidents.

    PubMed

    Papadelis, Christos; Chen, Zhe; Kourtidou-Papadeli, Chrysoula; Bamidis, Panagiotis D; Chouvarda, Ioanna; Bekiaris, Evangelos; Maglaveras, Nikos

    2007-09-01

    The objective of this study is the development and evaluation of efficient neurophysiological signal statistics, which may assess the driver's alertness level and serve as potential indicators of sleepiness in the design of an on-board countermeasure system. Multichannel EEG, EOG, EMG, and ECG were recorded from sleep-deprived subjects exposed to real field driving conditions. A number of severe driving errors occurred during the experiments. The analysis was performed in two main dimensions: the macroscopic analysis that estimates the on-going temporal evolution of physiological measurements during the driving task, and the microscopic event analysis that focuses on the physiological measurements' alterations just before, during, and after the driving errors. Two independent neurophysiologists visually interpreted the measurements. The EEG data were analyzed by using both linear and non-linear analysis tools. We observed the occurrence of brief paroxysmal bursts of alpha activity and an increased synchrony among EEG channels before the driving errors. The alpha relative band ratio (RBR) significantly increased, and the Cross Approximate Entropy that quantifies the synchrony among channels also significantly decreased before the driving errors. Quantitative EEG analysis revealed significant variations of RBR by driving time in the frequency bands of delta, alpha, beta, and gamma. Most of the estimated EEG statistics, such as the Shannon Entropy, Kullback-Leibler Entropy, Coherence, and Cross-Approximate Entropy, were significantly affected by driving time. We also observed an alteration of eye-blinking duration by increased driving time and a significant increase of the eye blinks' number and duration before driving errors. EEG and EOG are promising neurophysiological indicators of driver sleepiness and have the potential of monitoring sleepiness in occupational settings incorporated in a sleepiness countermeasure device. The occurrence of brief paroxysmal bursts of alpha activity before severe driving errors is described in detail for the first time. Clear evidence is presented that eye-blinking statistics are sensitive to the driver's sleepiness and should be considered in the design of an efficient and driver-friendly sleepiness detection countermeasure device.

  8. Vergence variability: a key to understanding oculomotor adaptability?

    PubMed

    Petrock, Annie Marie; Reisman, S; Alvarez, T

    2006-01-01

    Vergence eye movements were recorded from three different populations: healthy young (ages 18-35 years), adaptive presbyopic, and non-adaptive presbyopic (both presbyopic groups aged over 45 years) to determine how the variability of the eye movements made by the populations differs. The variability was determined using Shannon entropy calculations of wavelet transform coefficients, yielding a non-linear analysis of the vergence movement variability. The data were then fed through a k-means clustering algorithm to classify each subject, with no a priori knowledge of the true subject classification. The results indicate a highly significant difference in the total entropy values between the three groups, indicating a difference in the level of information content, and thus hypothetically the oculomotor adaptability, between the three groups. Further, the frequency distribution of the entropy varied across groups.

  9. Using the principle of entropy maximization to infer genetic interaction networks from gene expression patterns.

    PubMed

    Lezon, Timothy R; Banavar, Jayanth R; Cieplak, Marek; Maritan, Amos; Fedoroff, Nina V

    2006-12-12

    We describe a method based on the principle of entropy maximization to identify the gene interaction network with the highest probability of giving rise to experimentally observed transcript profiles. In its simplest form, the method yields the pairwise gene interaction network, but it can also be extended to deduce higher-order interactions. Analysis of microarray data from genes in Saccharomyces cerevisiae chemostat cultures exhibiting energy metabolic oscillations identifies a gene interaction network that reflects the intracellular communication pathways that adjust cellular metabolic activity and cell division to the limiting nutrient conditions that trigger metabolic oscillations. The success of the present approach in extracting meaningful genetic connections suggests that the maximum entropy principle is a useful concept for understanding living systems, as it is for other complex, nonequilibrium systems.
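
In the Gaussian setting, the maximum-entropy distribution consistent with observed means and pairwise covariances is a multivariate Gaussian, and the pairwise coupling matrix is then, up to sign and scale, the inverse of the sample covariance (the precision matrix). The sketch below shows only that core step under the Gaussian ansatz; it is not the authors' full procedure, which includes additional normalization and higher-order extensions:

```python
import numpy as np

def maxent_pairwise_interactions(expr):
    """
    expr: (n_samples, n_genes) expression matrix.
    Under a maximum-entropy (Gaussian) ansatz constrained by means and
    pairwise covariances, the inferred coupling matrix is the negative
    inverse of the sample covariance, up to scale.
    """
    c = np.cov(expr, rowvar=False)
    return -np.linalg.inv(c)
```

A quick sanity check: if gene 2 is driven by gene 0 while gene 1 is independent, the inferred coupling between genes 0 and 2 should dominate the (spurious) 0-1 coupling.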

  10. Low-dimensional approximation searching strategy for transfer entropy from non-uniform embedding

    PubMed Central

    2018-01-01

    Transfer entropy from non-uniform embedding is a popular tool for the inference of causal relationships among dynamical subsystems. In this study we present an approach that makes use of low-dimensional conditional mutual information quantities to decompose the original high-dimensional conditional mutual information in the searching procedure of non-uniform embedding for significant variables at different lags. We perform a series of simulation experiments to assess the sensitivity and specificity of our proposed method to demonstrate its advantage compared to previous algorithms. The results provide concrete evidence that low-dimensional approximations can help to improve the statistical accuracy of transfer entropy in multivariate causality analysis and yield a better performance over other methods. The proposed method is especially efficient as the data length grows. PMID:29547669
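
Transfer entropy itself, before any non-uniform embedding or low-dimensional approximation, can be estimated for symbolic (small-alphabet) sequences by plug-in counting of joint frequencies. The sketch below uses 1-step histories only and a hypothetical function name; the paper's method generalizes this to multivariate, multi-lag embeddings:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(X -> Y), in bits, with 1-step histories:
    TE = sum p(y1,y0,x0) log2[ p(y1|y0,x0) / p(y1|y0) ]."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))  # (y_next, y_prev, x_prev)
    n = len(triples)
    c_xyz = Counter(triples)
    c_yz = Counter((yn, yp) for yn, yp, _ in triples)
    c_z = Counter((yp, xp) for _, yp, xp in triples)
    c_y = Counter(yp for _, yp, _ in triples)
    te = 0.0
    for (yn, yp, xp), k in c_xyz.items():
        num = k / c_z[(yp, xp)]           # p(y_next | y_prev, x_prev)
        den = c_yz[(yn, yp)] / c_y[yp]    # p(y_next | y_prev)
        te += (k / n) * np.log2(num / den)
    return te
```

If Y simply copies X with a one-step lag, TE(X→Y) should approach 1 bit while TE(Y→X) stays near zero (up to plug-in estimation bias).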

  11. Study of the cross-market effects of Brexit based on the improved symbolic transfer entropy GARCH model—An empirical analysis of stock–bond correlations

    PubMed Central

    Chen, Xiurong; Zhao, Rubo

    2017-01-01

    In this paper, we study the cross-market effects of Brexit on the stock and bond markets of nine major countries in the world. By incorporating information theory, we introduce time-varying impact weights based on symbolic transfer entropy to improve the traditional GARCH model. The empirical results show that under the influence of Brexit, flight-to-quality not only commonly occurs between the stocks and bonds of each country but also occurs simultaneously among different countries. We also find that the accuracy of the time-varying symbolic transfer entropy GARCH model proposed in this paper is improved compared to the traditional GARCH model, which indicates its practical applicability. PMID:28817712

  12. Quantile based Tsallis entropy in residual lifetime

    NASA Astrophysics Data System (ADS)

    Khammar, A. H.; Jahanshahi, S. M. A.

    2018-02-01

    Tsallis entropy is a one-parameter (α) generalization of the Shannon entropy that, unlike the Shannon entropy, is nonadditive. Shannon entropy may be negative for some distributions, but Tsallis entropy can always be made nonnegative by choosing an appropriate value of α. In this paper, we derive the quantile form of this nonadditive entropy function in the residual lifetime, namely the residual quantile Tsallis entropy (RQTE), and obtain bounds for it in terms of the Renyi residual quantile entropy. We also obtain a relationship between the RQTE and the concept of the proportional hazards model in the quantile setup. Based on the new measure, we propose a stochastic order and aging classes, and study their properties. Finally, we prove characterization theorems for some well-known lifetime distributions. It is shown that the RQTE uniquely determines the parent distribution, unlike the residual Tsallis entropy.

  13. Assessing the performance of wave breaking parameterizations in shallow waters in spectral wave models

    NASA Astrophysics Data System (ADS)

    Lin, Shangfei; Sheng, Jinyu

    2017-12-01

    Depth-induced wave breaking is the primary dissipation mechanism for ocean surface waves in shallow waters. Different parameterizations have been developed to represent the depth-induced wave breaking process in ocean surface wave models. The performance of six commonly used parameterizations in simulating significant wave heights (SWHs) is assessed in this study. The main differences between these six parameterizations are their representations of the breaker index and the fraction of breaking waves. Laboratory and field observations consisting of 882 cases from 14 sources of published observational data are used in the assessment. We demonstrate that the six parameterizations perform reasonably in parameterizing depth-induced wave breaking in shallow waters, but each has its own limitations and drawbacks. The widely used parameterization suggested by Battjes and Janssen (1978, BJ78) has the drawback of underpredicting the SWHs in locally-generated wave conditions and overpredicting them in remotely-generated wave conditions over flat bottoms. This drawback of BJ78 was addressed by a parameterization suggested by Salmon et al. (2015, SA15), but SA15 had relatively larger errors in SWHs over sloping bottoms than BJ78. We follow SA15 and propose a new parameterization with a dependence of the breaker index on the normalized water depth in deep waters similar to SA15. In shallow waters, the breaker index of the new parameterization has a nonlinear dependence on the local bottom slope rather than the linear dependence used in SA15. Overall, this new parameterization has the best performance, with an average scatter index of ∼8.2%, in comparison with the three best-performing existing parameterizations with average scatter indices between 9.2% and 13.6%.
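
The BJ78 parameterization referenced here ties the fraction of breaking waves Q_b to the ratio of the rms wave height to a maximum height H_max = γd via the implicit relation (1 − Q_b)/ln Q_b = −(H_rms/H_max)². A minimal sketch solving that relation by bisection; γ = 0.73 is the commonly quoted default, and the function name is mine:

```python
import numpy as np

def bj78_breaking_fraction(h_rms, depth, gamma=0.73, iters=60):
    """Fraction of breaking waves Qb from Battjes & Janssen (1978):
    (1 - Qb) / ln(Qb) = -(h_rms / h_max)^2, with h_max = gamma * depth.
    The left side decreases monotonically from 0 to -1 on Qb in (0, 1),
    so the root is found by bisection."""
    b2 = (h_rms / (gamma * depth)) ** 2
    if b2 >= 1.0:
        return 1.0                      # saturated: all waves breaking
    lo, hi = 1e-12, 1.0 - 1e-12
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        f = (1.0 - mid) / np.log(mid) + b2
        if f > 0.0:                     # f decreasing: root lies to the right
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For (H_rms/H_max)² = 0.5 the relation gives Q_b ≈ 0.2; small height-to-depth ratios give a vanishing breaking fraction.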

  14. Time-dependent entropy evolution in microscopic and macroscopic electromagnetic relaxation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker-Jarvis, James

    This paper is a study of entropy and its evolution in the time and frequency domains upon application of electromagnetic fields to materials. An understanding of entropy and its evolution in electromagnetic interactions bridges the boundaries between electromagnetism and thermodynamics. The approach used here is a Liouville-based statistical-mechanical theory. I show that the microscopic entropy is reversible and the macroscopic entropy satisfies an H theorem. The spectral entropy development can be very useful for studying the frequency response of materials. Using a projection-operator based nonequilibrium entropy, different equations are derived for the entropy and entropy production and are applied to the polarization, magnetization, and macroscopic fields. I begin by proving an exact H theorem for the entropy, progress to application of time-dependent entropy in electromagnetics, and then apply the theory to relevant applications in electromagnetics. The paper concludes with a discussion of the relationship of the frequency-domain form of the entropy to the permittivity, permeability, and impedance.

  15. Entropy flow and entropy production in the human body in basal conditions.

    PubMed

    Aoki, I

    1989-11-08

    Entropy inflow and outflow for the naked human body in basal conditions in the respiration calorimeter due to infrared radiation, convection, evaporation of water and mass-flow are calculated by use of the energetic data obtained by Hardy & Du Bois. Also, the change of entropy content in the body is estimated. The entropy production in the human body is obtained as the change of entropy content minus the net entropy flow into the body. The entropy production thus calculated becomes positive. The magnitude of entropy production per effective radiating surface area does not show any significant variation with subjects. The entropy production is nearly constant at the calorimeter temperatures of 26-32 degrees C; the average in this temperature range is 0.172 J m^-2 s^-1 K^-1. The forced air currents around the human body and also clothing have almost no effect in changing the entropy production. Thus, the entropy production of the naked human body in basal conditions does not depend on its environmental factors.

  16. Generalized skew-symmetric interfacial probability distribution in reflectivity and small-angle scattering analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Zhang; Chen, Wei

    Generalized skew-symmetric probability density functions are proposed to model asymmetric interfacial density distributions for the parameterization of any arbitrary density profiles in the 'effective-density model'. The penetration of the densities into adjacent layers can be selectively controlled and parameterized. A continuous density profile is generated and discretized into many independent slices of very thin thickness with constant density values and sharp interfaces. The discretized profile can be used to calculate reflectivities via Parratt's recursive formula, or small-angle scattering via the concentric onion model that is also developed in this work.
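
A common member of the generalized skew-symmetric family is the skew-normal density, f(x) = (2/σ)φ(z)Φ(αz) with z = (x − μ)/σ, where the shape parameter α controls how far the interface penetrates into one of the adjacent layers. The sketch below is an illustration under that assumption, not the authors' code: it accumulates such a density into an asymmetric interfacial profile, which could then be sliced into thin constant-density layers for Parratt's recursion.

```python
import math
import numpy as np

def skew_normal_pdf(x, loc=0.0, scale=1.0, alpha=0.0):
    """Skew-normal density: (2/scale) * phi(z) * Phi(alpha*z), z=(x-loc)/scale."""
    z = (np.asarray(x, float) - loc) / scale
    phi = np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
    Phi = 0.5 * (1.0 + np.vectorize(math.erf)(alpha * z / math.sqrt(2.0)))
    return 2.0 * phi * Phi / scale

def interface_profile(x, rho_bottom, rho_top, loc, scale, alpha):
    """Asymmetric interfacial profile: density steps from rho_bottom to
    rho_top following the (numerically accumulated) CDF of the
    skew-normal interface distribution."""
    pdf = skew_normal_pdf(x, loc, scale, alpha)
    cdf = np.cumsum(pdf) * (x[1] - x[0])   # crude rectangle-rule CDF
    return rho_bottom + (rho_top - rho_bottom) * cdf
```

With α = 0 the profile reduces to the usual error-function interface; α ≠ 0 skews the transition toward one layer while keeping the profile monotonic.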

  17. Parameterizations of the Vertical Variability of Tropical Cirrus Cloud Microphysical and Optical Properties

    NASA Technical Reports Server (NTRS)

    Twohy, Cynthia; Heymsfield, Andrew; Gerber, Hermann

    2005-01-01

    Our multi-investigator effort was targeted at the following areas of interest to CRYSTAL-FACE: (1) the water budgets of anvils, (2) parameterizations of the particle size distributions and related microphysical and optical properties, (3) characterizations of the primary ice particle habits, (4) the relationship of the optical properties to the microphysics and particle habits, and (5) investigation of the ice-nuclei types and mechanisms in anvil cirrus. Dr. Twohy's effort focused on (1), (2), and (5), with the measurement and analysis of ice water content and cirrus residual nuclei using the counterflow virtual impactor (CVI).

  18. Generalized skew-symmetric interfacial probability distribution in reflectivity and small-angle scattering analysis

    DOE PAGES

    Jiang, Zhang; Chen, Wei

    2017-11-03

    Generalized skew-symmetric probability density functions are proposed to model asymmetric interfacial density distributions for the parameterization of any arbitrary density profiles in the 'effective-density model'. The penetration of the densities into adjacent layers can be selectively controlled and parameterized. A continuous density profile is generated and discretized into many independent slices of very thin thickness with constant density values and sharp interfaces. The discretized profile can be used to calculate reflectivities via Parratt's recursive formula, or small-angle scattering via the concentric onion model that is also developed in this work.

  19. Short-term Wind Forecasting at Wind Farms using WRF-LES and Actuator Disk Model

    NASA Astrophysics Data System (ADS)

    Kirkil, Gokhan

    2017-04-01

    Short-term wind forecasts are obtained for a wind farm on mountainous terrain using WRF-LES. Multi-scale simulations are also performed using different PBL parameterizations. Turbines are parameterized using the Actuator Disc Model. The LES models improved the forecasts. A statistical error analysis is performed and ramp events are analyzed. The complex topography of the study area affects model performance; in particular, the accuracy of wind forecasts was poor for cross valley-mountain flows. By means of LES, we gain new knowledge about the sources of spatial and temporal variability of wind fluctuations, such as the configuration of wind turbines.

  20. The response of the SSM/I to the marine environment. Part 2: A parameterization of the effect of the sea surface slope distribution on emission and reflection

    NASA Technical Reports Server (NTRS)

    Petty, Grant W.; Katsaros, Kristina B.

    1994-01-01

    Based on a geometric optics model and the assumption of an isotropic Gaussian surface slope distribution, the component of ocean surface microwave emissivity variation due to large-scale surface roughness is parameterized for the frequencies and approximate viewing angle of the Special Sensor Microwave/Imager. Independent geophysical variables in the parameterization are the effective (microwave frequency dependent) slope variance and the sea surface temperature. Using the same physical model, the change in the effective zenith angle of reflected sky radiation arising from large-scale roughness is also parameterized. Independent geophysical variables in this parameterization are the effective slope variance and the atmospheric optical depth at the frequency in question. Both of the above model-based parameterizations are intended for use in conjunction with empirical parameterizations relating effective slope variance and foam coverage to near-surface wind speed. These empirical parameterizations are the subject of a separate paper.

  1. Uncertainty and feasibility of dynamical downscaling for modeling tropical cyclones for storm surge simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Zhaoqing; Taraphdar, Sourav; Wang, Taiping

    This paper presents a modeling study conducted to evaluate the uncertainty of a regional model in simulating hurricane wind and pressure fields, and the feasibility of driving coastal storm surge simulation using an ensemble of regional model outputs produced by 18 combinations of three convection schemes and six microphysics parameterizations, using Hurricane Katrina as a test case. Simulated wind and pressure fields were compared to observed H*Wind data for Hurricane Katrina and simulated storm surge was compared to observed high-water marks on the northern coast of the Gulf of Mexico. The ensemble modeling analysis demonstrated that the regional model was able to reproduce the characteristics of Hurricane Katrina with reasonable accuracy and can be used to drive the coastal ocean model for simulating coastal storm surge. Results indicated that the regional model is sensitive to both convection and microphysics parameterizations that simulate moist processes closely linked to the tropical cyclone dynamics that influence hurricane development and intensification. The Zhang and McFarlane (ZM) convection scheme and the Lim and Hong (WDM6) microphysics parameterization are the most skillful in simulating Hurricane Katrina maximum wind speed and central pressure, among the three convection and the six microphysics parameterizations. Error statistics of simulated maximum water levels were calculated for a baseline simulation with H*Wind forcing and the 18 ensemble simulations driven by the regional model outputs. The storm surge model produced the overall best results in simulating the maximum water levels using wind and pressure fields generated with the ZM convection scheme and the WDM6 microphysics parameterization.

  2. Impact of different parameterization schemes on simulation of mesoscale convective system over south-east India

    NASA Astrophysics Data System (ADS)

    Madhulatha, A.; Rajeevan, M.

    2018-02-01

    The main objective of the present paper is to examine the role of various parameterization schemes in simulating the evolution of a mesoscale convective system (MCS) that occurred over south-east India. Using the Weather Research and Forecasting (WRF) model, numerical experiments are conducted with various planetary boundary layer, microphysics, and cumulus parameterization schemes. The performance of the different schemes is evaluated by examining boundary layer, reflectivity, and precipitation features of the MCS using ground-based and satellite observations. Among the various physical parameterization schemes, the Mellor-Yamada-Janjic (MYJ) boundary layer scheme is able to produce a deep boundary layer by simulating the warm temperatures necessary for storm initiation; the Thompson (THM) microphysics scheme is able to simulate the reflectivity through a reasonable distribution of the different hydrometeors during the various stages of the system; and the Betts-Miller-Janjic (BMJ) cumulus scheme is able to capture the precipitation through a proper representation of the convective instability associated with the MCS. The present analysis suggests that MYJ, a local turbulent-kinetic-energy boundary layer scheme that accounts for strong vertical mixing; THM, a six-class hybrid-moment microphysics scheme that considers the number concentration along with the mixing ratio of rain hydrometeors; and BMJ, a closure cumulus scheme that adjusts thermodynamic profiles based on climatological profiles, are likely to have contributed to the better performance of the respective model simulations. A numerical simulation carried out using the above combination of schemes captures storm initiation, propagation, surface variations, thermodynamic structure, and precipitation features reasonably well. This study clearly demonstrates that the simulation of MCS characteristics is highly sensitive to the choice of parameterization schemes.

  3. Alternative ways of using field-based estimates to calibrate ecosystem models and their implications for carbon cycle studies

    USGS Publications Warehouse

    He, Yujie; Zhuang, Qianlai; McGuire, David; Liu, Yaling; Chen, Min

    2013-01-01

    Model-data fusion is a process in which field observations are used to constrain model parameters. How observations are used to constrain parameters has a direct impact on the carbon cycle dynamics simulated by ecosystem models. In this study, we present an evaluation of several options for the use of observations in modeling regional carbon dynamics and explore the implications of those options. We calibrated the Terrestrial Ecosystem Model on a hierarchy of three vegetation classification levels for the Alaskan boreal forest: species level, plant-functional-type level (PFT level), and biome level, and we examined the differences in simulated carbon dynamics. Species-specific field-based estimates were directly used to parameterize the model for species-level simulations, while weighted averages based on species percent cover were used to generate estimates for PFT- and biome-level model parameterization. We found that calibrated key ecosystem process parameters differed substantially among species and overlapped for species that are categorized into different PFTs. Our analysis of parameter sets suggests that the PFT-level parameterizations primarily reflected the dominant species and that functional information of some species was lost from the PFT-level parameterizations. The biome-level parameterization was primarily representative of the needleleaf PFT and lost information on broadleaf species or PFT function. Our results indicate that PFT-level simulations may be potentially representative of the performance of species-level simulations while biome-level simulations may result in biased estimates. Improved theoretical and empirical justifications for grouping species into PFTs or biomes are needed to adequately represent the dynamics of ecosystem functioning and structure.

  4. An adaptive technique to maximize lossless image data compression of satellite images

    NASA Technical Reports Server (NTRS)

    Stewart, Robert J.; Lure, Y. M. Fleming; Liou, C. S. Joe

    1994-01-01

    Data compression will play an increasingly important role in the storage and transmission of image data within NASA science programs as the Earth Observing System comes into operation. It is important that the science data be preserved at the fidelity the instrument and the satellite communication systems were designed to produce. Lossless compression must therefore be applied, at least, to archive the processed instrument data. In this paper, we present an analysis of the performance of lossless compression techniques and develop an adaptive approach which applies image remapping, feature-based image segmentation to determine regions of similar entropy, and high-order arithmetic coding to obtain significant improvements over the use of conventional compression techniques alone. Image remapping is used to transform the original image into a lower-entropy state. Several techniques were tested on satellite images, including differential pulse code modulation, bi-linear interpolation, and block-based linear predictive coding. The results of these experiments are discussed, and trade-offs between computation requirements and entropy reductions are used to identify the optimum approach for a variety of satellite images. Further entropy reduction can be achieved by segmenting the image based on local entropy properties and then applying a coding technique which maximizes compression for the region. Experimental results are presented showing the effect of different coding techniques on regions of different entropy. A rule base is developed through which the technique giving the best compression is selected. The paper concludes that maximum compression can be achieved cost-effectively and at acceptable performance rates with a combination of techniques selected based on image contextual information.

  5. Characterizing groundwater quality ranks for drinking purposes in Sylhet district, Bangladesh, using entropy method, spatial autocorrelation index, and geostatistics.

    PubMed

    Islam, Abu Reza Md Towfiqul; Ahmed, Nasir; Bodrud-Doza, Md; Chu, Ronghao

    2017-12-01

    Drinking water is susceptible to contamination that affects human health. It is thus essential to investigate the factors affecting groundwater quality and its suitability for drinking uses. In this paper, entropy theory, multivariate statistics, a spatial autocorrelation index, and geostatistics are applied to characterize groundwater quality and its spatial variability in the Sylhet district of Bangladesh. A total of 91 samples have been collected from wells (e.g., shallow, intermediate, and deep tube wells at 15-300-m depth) in the study area. The results show that NO3^-, SO4^2-, and As are the parameters contributing most to groundwater quality according to entropy theory. The principal component analysis (PCA) and correlation coefficients also confirm the results of the entropy theory. However, Na+ has the highest spatial autocorrelation and the most entropy, thus affecting the groundwater quality. Based on the entropy-weighted water quality index (EWQI) and groundwater quality index (GWQI) classifications, 60.45 and 53.86% of water samples are classified as having excellent to good quality, while the remaining samples vary from medium to extremely poor quality for drinking purposes. Furthermore, the EWQI classification provides more reasonable results than the GWQI due to its simplicity, accuracy, and avoidance of artificial weights. A Gaussian semivariogram model has been chosen as the best-fit model, and the groundwater quality indices have a weak spatial dependence, suggesting that both geogenic and anthropogenic factors play a pivotal role in the spatial heterogeneity of groundwater quality.
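
The entropy weighting behind an EWQI-type index is a standard construction: parameters whose measured values vary little across samples carry high entropy and receive low weight, while strongly varying parameters receive high weight. A minimal sketch; the function names and the simple linear rating q_j = 100·c_j/limit_j are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weights for an (n_samples, m_parameters) data matrix."""
    X = np.asarray(X, float)
    n = X.shape[0]
    P = X / X.sum(axis=0)                     # column-wise proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    e = -plogp.sum(axis=0) / np.log(n)        # entropy per parameter, in [0, 1]
    d = 1.0 - e                               # degree of diversification
    return d / d.sum()

def ewqi(X, limits):
    """Entropy-weighted water quality index: weighted sum of ratings
    q_j = 100 * c_j / limit_j for each sample."""
    w = entropy_weights(X)
    q = 100.0 * np.asarray(X, float) / np.asarray(limits, float)
    return q @ w
```

A column that is constant across samples gets (essentially) zero weight, so the index is driven by the parameters that actually discriminate between samples.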

  6. Quantitative comparison of entropy analysis of fetal heart rate variability related to the different stages of labor.

    PubMed

    Lim, Jongil; Kwon, Ji Young; Song, Juhee; Choi, Hosoon; Shin, Jong Chul; Park, In Yang

    2014-02-01

    The interpretation of the fetal heart rate (FHR) signal considering labor progression may improve perinatal morbidity and mortality. However, there have been few studies that evaluate the fetus in each labor stage quantitatively. To evaluate whether the entropy indices of FHR are different according to labor progression. A retrospective comparative study of FHR recordings in three groups: 280 recordings in the second stage of labor before vaginal delivery, 31 recordings in the first stage of labor before emergency cesarean delivery, and 23 recordings in the pre-labor before elective cesarean delivery. The stored FHR recordings of external cardiotocography during labor. Approximate entropy (ApEn) and sample entropy (SampEn) for the final 2000 RR intervals. The median ApEn and SampEn for the 2000 RR intervals showed the lowest values in the second stage of labor, followed by the emergency cesarean group and the elective cesarean group for all time segments (all P<0.001). Also, in the second stage of labor, the final 5 min of 2000 RR intervals had a significantly lower median ApEn (0.49 vs. 0.44, P=0.001) and lower median SampEn (0.34 vs. 0.29, P<0.001) than the initial 5 min of 2000 RR intervals. Entropy indices of FHR were significantly different according to labor progression. This result supports the necessity of considering labor progression when developing intrapartum fetal monitoring using the entropy indices of FHR.
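
Sample entropy, one of the two indices compared here, has a compact definition: SampEn(m, r) = −ln(A/B), where B counts pairs of length-m templates within tolerance r (Chebyshev distance, self-matches excluded) and A counts the same for length m + 1. A brute-force sketch, not optimized for the 2000-interval recordings used in the study:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) = -ln(A / B); r defaults to 0.2 * std, a common choice."""
    x = np.asarray(x, float)
    if r is None:
        r = 0.2 * x.std()
    n = len(x)

    def count_matches(mm):
        # all length-mm templates, then count pairs within tolerance r
        templates = np.array([x[i:i + mm] for i in range(n - mm)])
        total = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            total += int(np.sum(d <= r))
        return total

    b = count_matches(m)
    a = count_matches(m + 1)
    return float(-np.log(a / b)) if a > 0 and b > 0 else float("inf")
```

A regular (periodic) series produces many template matches at both lengths and hence low SampEn; an irregular series loses matches when the template is extended and scores high, which is the direction of the group differences reported above.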

  7. Spectral entropy analysis of the respiratory signal and its relationship with the cyclic alternating pattern during sleep

    NASA Astrophysics Data System (ADS)

    Reyes-Sanchez, E.; Alba, A.; Méndez, M. O.; Milioli, G.; Parrino, L.

    2016-07-01

    A-phases are transient cortical events that normally occur during NREM sleep and can be observed directly in EEG signals. One particular kind of A-phase, the A3-phase, is related to arousals from sleep during which increased activity in other systems (such as the cardiovascular and respiratory systems) can also be observed. This study aims to characterize disruptions in the oscillations of the airflow signal during A3-phases of sleep. Spectral entropy was used to quantify the bandwidth of the airflow signal, which under baseline conditions (prior to an A3-phase) resembles a sinusoidal wave with a frequency of about 0.25 Hz and has low spectral entropy values. It was found that during most A3-phases the spectral entropy increases significantly in 70% of the test subjects. These changes occur with higher probability during A3-phases that are longer than 10 s, suggesting a delay between the onset of an A3-phase and its effect on the respiratory system.
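
    A minimal sketch of the spectral entropy measure described above: the Shannon entropy of the normalized power spectrum, rescaled to [0, 1]. The naive DFT and the choice to drop the DC bin are illustrative assumptions, not details from the paper.

```python
import cmath
import math

def spectral_entropy(x):
    """Normalized Shannon entropy of the power spectrum of x.
    Near 0: all power in one frequency bin (pure sinusoid).
    Near 1: power spread evenly across bins (broadband signal)."""
    n = len(x)
    half = n // 2
    # one-sided power spectrum via a naive O(n^2) DFT (fine for short epochs)
    power = []
    for k in range(1, half + 1):  # skip the DC bin
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        power.append(abs(s) ** 2)
    total = sum(power)
    p = [v / total for v in power]
    h = -sum(pi * math.log(pi) for pi in p if pi > 0)
    return h / math.log(len(p))
```

A clean ~0.25 Hz airflow oscillation concentrates its power in one bin and scores near 0; the broadband disruption during an A3-phase spreads power across bins and pushes the value toward 1.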

  8. Elastic moduli and thermal expansion coefficients of medium-entropy subsystems of the CrMnFeCoNi high-entropy alloy

    DOE PAGES

    Laplanche, Guillaume; Gadaud, P.; Barsch, C.; ...

    2018-02-23

    Elastic moduli of a set of equiatomic alloys (CrFeCoNi, CrCoNi, CrFeNi, FeCoNi, MnCoNi, MnFeNi, and CoNi), which are medium-entropy subsystems of the CrMnFeCoNi high-entropy alloy, were determined as a function of temperature over the range 293 K–1000 K. Thermal expansion coefficients were determined for these alloys over the temperature range 100 K–673 K. All alloys were single-phase with the face-centered cubic (FCC) crystal structure, except CrFeNi, a two-phase alloy containing a small amount of body-centered cubic (BCC) precipitates in an FCC matrix. The temperature dependences of the thermal expansion coefficients and elastic moduli obtained here are useful for quantifying fundamental aspects such as solid solution strengthening, and for structural analysis/design. Furthermore, using these results, the yield strengths reported in the literature for these alloys were normalized by their shear moduli to reveal the influence of shear modulus on solid solution strengthening.

  9. Phase Evolution and Mechanical Properties of AlCoCrFeNiSi x High-Entropy Alloys Synthesized by Mechanical Alloying and Spark Plasma Sintering

    NASA Astrophysics Data System (ADS)

    Kumar, Anil; Swarnakar, Akhilesh Kumar; Chopkar, Manoj

    2018-05-01

    In the current investigation, AlCoCrFeNiSi x (x = 0, 0.3, 0.6 and 0.9 in atomic ratio) high-entropy alloys are prepared by mechanical alloying and subsequently consolidated by spark plasma sintering. Their microstructural and mechanical properties were analyzed to understand the effect of Si addition to the AlCoCrFeNi alloy. X-ray diffraction analysis reveals a supersaturated solid solution with the body-centered cubic structure after 20 h of ball milling. Consolidation, however, promotes the partial transformation of the body-centered phase into face-centered cubic and sigma phases. A recently proposed geometric model based on the atomic stress theory is extended for the first time to classify single-phase and multi-phase high-entropy alloys prepared by mechanical alloying and spark plasma sintering. Improved microhardness and better wear resistance were achieved as the Si content increased from 0 to 0.9 in the present high-entropy alloy.

  10. The Effects of Aging and Dual Tasking on Human Gait Complexity During Treadmill Walking: A Comparative Study Using Quantized Dynamical Entropy and Sample Entropy.

    PubMed

    Ahmadi, Samira; Wu, Christine; Sepehri, Nariman; Kantikar, Anuprita; Nankar, Mayur; Szturm, Tony

    2018-01-01

    Quantized dynamical entropy (QDE) has recently been proposed as a measure of the complexity of dynamical systems that offers better computational efficiency. This paper further investigates the viability of this method using five different human gait signals, recorded during normal walking and while performing secondary tasks, in two age groups (young and older). The results are compared with the outcomes of the previously established sample entropy (SampEn) measure for the same signals. We also study how analyzing segmented, spatially and temporally normalized signals differs from analyzing the whole data. Our findings show that human gait signals become more complex as people age and while they are cognitively loaded. Center of pressure (COP) displacement in the mediolateral direction is the best signal for showing these gait changes. Moreover, the results suggest that segmenting the data yields more information about intrastride dynamical features. Most importantly, QDE is shown to be a reliable measure for human gait complexity analysis.

  11. Prediction of microsleeps using pairwise joint entropy and mutual information between EEG channels.

    PubMed

    Baseer, Abdul; Weddell, Stephen J; Jones, Richard D

    2017-07-01

    Microsleeps are involuntary and brief instances of complete loss of responsiveness, typically of 0.5-15 s duration. They adversely affect performance in extended attention-driven jobs and can be fatal. Our aim was to predict microsleeps from 16-channel EEG signals. Two information-theoretic concepts - pairwise joint entropy and mutual information - were independently used to continuously extract features from the EEG signals. A k-nearest-neighbor (kNN) estimator with k = 3 was used to calculate both joint entropy and mutual information. Highly correlated features were discarded and the rest were ranked using the Fisher score followed by the average 3-fold cross-validation area under the receiver operating characteristic curve (AUC-ROC). The leave-one-out method (LOOM) was used to test the performance of the microsleep prediction system on independent data. The best prediction, 0.25 s ahead, achieved an AUC-ROC, sensitivity, precision, geometric mean (GM), and φ of 0.93, 0.68, 0.33, 0.75, and 0.38, respectively, with joint entropy features and a single linear discriminant analysis (LDA) classifier.
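
    The paper estimates joint entropy and mutual information with a k-nearest-neighbor estimator (k = 3); as a simpler stand-in, the sketch below uses histogram binning, which computes the same quantity I(X;Y) = H(X) + H(Y) - H(X,Y) but is more biased on small samples. The bin count and names are illustrative.

```python
import math
from collections import Counter

def _entropy(counts, n):
    """Shannon entropy (nats) of a Counter of observations over n samples."""
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def mutual_information(x, y, bins=8):
    """Histogram estimate of I(X;Y) between two equal-length series."""
    n = len(x)
    def digitize(v):
        lo, hi = min(v), max(v)
        w = (hi - lo) / bins or 1.0  # guard against a constant channel
        return [min(int((vi - lo) / w), bins - 1) for vi in v]
    dx, dy = digitize(x), digitize(y)
    hx = _entropy(Counter(dx), n)
    hy = _entropy(Counter(dy), n)
    hxy = _entropy(Counter(zip(dx, dy)), n)
    return hx + hy - hxy
```

Applied to a sliding window over two EEG channels, mutual_information would yield the kind of continuous pairwise feature stream described above.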

  12. Linear growth of the entanglement entropy and the Kolmogorov-Sinai rate

    NASA Astrophysics Data System (ADS)

    Bianchi, Eugenio; Hackl, Lucas; Yokomizo, Nelson

    2018-03-01

    The rate of entropy production in a classical dynamical system is characterized by the Kolmogorov-Sinai entropy rate h_KS, given by the sum of all positive Lyapunov exponents of the system. We prove a quantum version of this result valid for bosonic systems with unstable quadratic Hamiltonian. The derivation takes into account the case of time-dependent Hamiltonians with Floquet instabilities. We show that the entanglement entropy S_A of a Gaussian state grows linearly for large times in unstable systems, with a rate Λ_A ≤ h_KS determined by the Lyapunov exponents and the choice of the subsystem A. We apply our results to the analysis of entanglement production in unstable quadratic potentials and due to periodic quantum quenches in many-body quantum systems. Our results are relevant for quantum field theory, for which we present three applications: a scalar field in a symmetry-breaking potential, parametric resonance during post-inflationary reheating, and cosmological perturbations during inflation. Finally, we conjecture that the same rate Λ_A appears in the entanglement growth of chaotic quantum systems prepared in a semiclassical state.
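
    The classical statement behind this result, that h_KS equals the sum of the positive Lyapunov exponents, can be checked numerically on the simplest unstable quadratic Hamiltonian: the inverted oscillator H = (p² - λ²x²)/2, whose single positive exponent is λ. The integrator and parameter values below are illustrative, not from the paper.

```python
import math

def lyapunov_rate(lam=0.5, dt=2e-3, steps=100000):
    """Numerically estimate the positive Lyapunov exponent of the
    inverted oscillator H = (p^2 - lam^2 x^2)/2, for which the
    Kolmogorov-Sinai rate is h_KS = lam (one unstable mode)."""
    x, p = 1.0, 0.0
    log_growth = 0.0
    for _ in range(steps):
        # symplectic Euler step of dx/dt = p, dp/dt = lam^2 * x
        p += lam * lam * x * dt
        x += p * dt
        norm = math.hypot(x, p)
        log_growth += math.log(norm)
        x, p = x / norm, p / norm  # renormalize to avoid overflow
    return log_growth / (steps * dt)
```

For λ = 0.5 the estimate converges to h_KS = 0.5; in the paper's quantum version, the entanglement entropy of a Gaussian state grows at a rate Λ_A bounded by this number.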

  13. Analysis of swarm behaviors based on an inversion of the fluctuation theorem.

    PubMed

    Hamann, Heiko; Schmickl, Thomas; Crailsheim, Karl

    2014-01-01

    A grand challenge in the field of artificial life is to find a general theory of emergent self-organizing systems. In swarm systems most of the observed complexity is based on motion of simple entities. Similarly, statistical mechanics focuses on collective properties induced by the motion of many interacting particles. In this article we apply methods from statistical mechanics to swarm systems. We try to explain the emergent behavior of a simulated swarm by applying methods based on the fluctuation theorem. Empirical results indicate that swarms are able to produce negative entropy within an isolated subsystem due to frozen accidents. Individuals of a swarm are able to locally detect fluctuations of the global entropy measure and store them, if they are negative entropy productions. By accumulating these stored fluctuations over time the swarm as a whole is producing negative entropy and the system ends up in an ordered state. We claim that this indicates the existence of an inverted fluctuation theorem for emergent self-organizing dissipative systems. This approach bears the potential of general applicability.

  14. Conformational entropic maps of functional coupling domains in GPCR activation: A case study with beta2 adrenergic receptor

    NASA Astrophysics Data System (ADS)

    Liu, Fan; Abrol, Ravinder; Goddard, William, III; Dougherty, Dennis

    2014-03-01

    Entropic effects in GPCR activation are poorly understood. Based on recently solved structures, researchers in the GPCR structural biology field have proposed several "local activating switches" consisting of a small number of conserved residues, but have long ignored the collective dynamical effect (conformational entropy) of a domain comprising an ensemble of residues. A new paradigm has recently been proposed in which a GPCR is viewed as a composition of several functional coupling domains, each of which undergoes order-to-disorder or disorder-to-order transitions upon activation. Here we identified and studied these functional coupling domains by comparing the local entropy changes of each residue between the inactive and active states of the β2 adrenergic receptor in computational simulations. We found that agonist and G-protein binding increases the heterogeneity of the entropy distribution in the receptor. This activation paradigm and computational entropy analysis scheme provide new ways to design functionally modified mutants and to identify new allosteric sites for GPCRs. The authors thank NIH and Sanofi for funding this project.

  15. Dynamical complexity detection in geomagnetic activity indices using wavelet transforms and Tsallis entropy

    NASA Astrophysics Data System (ADS)

    Balasis, G.; Daglis, I. A.; Papadimitriou, C.; Kalimeri, M.; Anastasiadis, A.; Eftaxias, K.

    2008-12-01

    Dynamical complexity detection for output time series of complex systems is one of the foremost problems in physics, biology, engineering, and economic sciences. Especially in magnetospheric physics, accurate detection of the dissimilarity between normal and abnormal states (e.g., pre-storm activity and magnetic storms) can vastly improve space weather diagnosis and, consequently, the mitigation of space weather hazards. Herein, we examine the fractal spectral properties of the Dst data using a wavelet analysis technique. We show that distinct changes in the associated scaling parameters occur (i.e., a transition from anti-persistent to persistent behavior) as an intense magnetic storm approaches. We then analyze the Dst time series by introducing the non-extensive Tsallis entropy, Sq, as an appropriate complexity measure. The Tsallis entropy sensitively shows the complexity dissimilarity among different "physiological" (normal) and "pathological" states (intense magnetic storms). The Tsallis entropy implies the emergence of two distinct patterns: (i) a pattern associated with intense magnetic storms, characterized by a higher degree of organization, and (ii) a pattern associated with normal periods, characterized by a lower degree of organization.
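
    The Tsallis entropy S_q used here has a simple closed form for a discrete distribution; a minimal sketch (in nats), with the q → 1 limit recovering the Shannon entropy:

```python
import math

def tsallis_entropy(p, q):
    """Non-extensive Tsallis entropy S_q of a discrete distribution p.
    S_q = (1 - sum_i p_i^q) / (q - 1); Shannon entropy as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)
```

Applied to a histogram of Dst values within a sliding window, lower S_q indicates the more organized, storm-like pattern described above; the windowing scheme and the value of q would need to be taken from the original analysis.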

  16. Classification of pulmonary pathology from breath sounds using the wavelet packet transform and an extreme learning machine.

    PubMed

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian; Huliraj, N; Revadi, S S

    2017-06-08

    Auscultation is a medical procedure used for the initial diagnosis and assessment of lung and heart diseases. From this perspective, we propose assessing the performance of extreme learning machine (ELM) classifiers for the diagnosis of pulmonary pathology using breath sounds. Energy and entropy features were extracted from the breath sounds using the wavelet packet transform. The statistical significance of the extracted features was evaluated by one-way analysis of variance (ANOVA). The extracted features were input to the ELM classifier. The maximum classification accuracies obtained for the conventional validation (CV) of the energy and entropy features were 97.36% and 98.37%, respectively, whereas the accuracies obtained for the cross-validation (CRV) of the energy and entropy features were 96.80% and 97.91%, respectively. In addition, maximum classification accuracies of 98.25% and 99.25% were obtained for the CV and CRV of the ensemble features, respectively. These results indicate that the classification accuracy obtained with the ensemble features was higher than those obtained with the energy and entropy features alone.
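
    Wavelet-packet energy and entropy features of the kind described can be sketched with a Haar basis; the abstract does not state which mother wavelet or decomposition depth was used, so both are illustrative assumptions here.

```python
import math

def haar_step(x):
    """One Haar analysis step: (approximation, detail), each half length.
    Input length must be even."""
    a = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def wavelet_packet_features(x, levels=2):
    """Energies of the terminal wavelet-packet nodes and the Shannon
    entropy of their normalized distribution (len(x) divisible by 2**levels)."""
    nodes = [x]
    for _ in range(levels):
        nxt = []
        for node in nodes:
            a, d = haar_step(node)  # split every node, unlike a plain DWT
            nxt.extend([a, d])
        nodes = nxt
    energies = [sum(c * c for c in node) for node in nodes]
    total = sum(energies) or 1.0
    p = [e / total for e in energies]
    entropy = -sum(pi * math.log(pi) for pi in p if pi > 0)
    return energies, entropy
```

The per-node energies and the entropy of their distribution form the feature vector fed to the classifier; because the Haar step is orthonormal, the node energies sum to the energy of the input epoch.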

  17. Extremal black holes, Stueckelberg scalars and phase transitions

    NASA Astrophysics Data System (ADS)

    Marrani, Alessio; Miskovic, Olivera; Leon, Paula Quezada

    2018-02-01

    We calculate the entropy of a static extremal black hole in 4D gravity, non-linearly coupled to a massive Stueckelberg scalar. We find that the scalar field does not allow the black hole to be magnetically charged. We also show that the system can exhibit a phase transition due to electric charge variations. For spherical and hyperbolic horizons, the critical point exists only in the presence of a cosmological constant, and only if the scalar is massive and non-linearly coupled to the electromagnetic field. On one side of the critical point, two extremal solutions coexist: the Reissner-Nordström (A)dS black hole and the charged hairy (A)dS black hole; on the other side, the black hole has no hair. A near-critical analysis reveals that the hairy black hole has larger entropy, giving rise to a zero-temperature phase transition characterized by a discontinuous second derivative of the entropy with respect to the electric charge at the critical point. The results obtained here are analytical and based on the entropy function formalism and the second law of thermodynamics.

  18. Evaluating Cloud Initialization in a Convection-Permitting NWP Model

    NASA Astrophysics Data System (ADS)

    Li, Jia; Chen, Baode

    2015-04-01

    In convection-permitting NWP models, it is common practice to turn off the convective parameterization to avoid the "double counting of precipitation" problem. However, without any cloud information in the initial conditions, the onset of precipitation can be delayed by the spin-up of the cloud field and microphysical variables. In this study, we used the complex cloud analysis package from the Advanced Regional Prediction System (ARPS) to adjust the model's initial water-substance fields (cloud water, cloud ice, rain water, etc.), that is, to initialize the microphysical variables (i.e., hydrometeors), based mainly on radar reflectivity observations. Using the Advanced Research WRF (ARW) model, numerical experiments with and without cloud initialization and convective parameterization were carried out at grey-zone resolutions (1, 3, and 9 km). The experiments without convective parameterization indicate that initialization with radar reflectivity can significantly reduce spin-up time and accurately simulate precipitation at the initial time; it also improves the location and intensity of the predicted precipitation. At grey-zone resolutions, using a cumulus convective parameterization scheme without radar data cannot produce realistic precipitation at early times. Issues related to the microphysical parameterization associated with cloud initialization are also discussed.

  19. Highly parameterized model calibration with cloud computing: an example of regional flow model calibration in northeast Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Hayley, Kevin; Schumacher, J.; MacMillan, G. J.; Boutin, L. C.

    2014-05-01

    Expanding groundwater datasets collected by automated sensors, together with improved groundwater databases, have caused a rapid increase in the calibration data available for groundwater modeling projects. Improved methods of subsurface characterization have increased the model complexity needed to represent geological and hydrogeological interpretations. The larger calibration datasets and the need for meaningful predictive uncertainty analysis have both increased the degree of parameterization necessary during model calibration. Due to these competing demands, modern groundwater modeling efforts require massive parallelization in order to remain computationally tractable. A methodology for the calibration of highly parameterized, computationally expensive models using the Amazon EC2 cloud computing service is presented, with the calibration of a regional-scale model of groundwater flow in Alberta, Canada, as an example. The model covers a 30,865-km2 domain and includes 28 hydrostratigraphic units. Aquifer properties were calibrated to more than 1,500 static hydraulic head measurements and 10 years of measurements during industrial groundwater use. Three regionally extensive aquifers were parameterized with spatially variable hydraulic conductivity fields, as was the areal recharge boundary condition, leading to 450 adjustable parameters in total. The PEST-based model calibration was parallelized on up to 250 computing nodes on Amazon's EC2 servers.

  20. Using Laboratory Experiments to Improve Ice-Ocean Parameterizations

    NASA Astrophysics Data System (ADS)

    McConnochie, C. D.; Kerr, R. C.

    2017-12-01

    Numerical models of ice-ocean interactions are typically unable to resolve the transport of heat and salt to the ice face. Instead, models rely upon parameterizations that have not been sufficiently validated by observations. Recent laboratory experiments on ice-saltwater interactions allow us to test the standard parameterization of heat and salt transport to ice faces, the three-equation model. The three-equation model predicts that the melt rate is proportional to the fluid velocity, while the experimental results typically show that the melt rate is independent of the fluid velocity. By analyzing the boundary layer that forms next to a melting ice face, we suggest a resolution to this disagreement: the three-equation model implicitly assumes that the thickness of the diffusive sublayer next to the ice is set by a shear instability, whereas at low flow velocities the sublayer is instead set by a convective instability. This distinction leads to a threshold velocity of approximately 4 cm/s under geophysically relevant conditions, above which the form of the parameterization should be valid. At flow speeds below 4 cm/s, in contrast, the three-equation model will underestimate the melt rate. By incorporating such a minimum velocity into the three-equation model, predictions made by numerical simulations could easily be improved.
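
    The proposed modification amounts to a one-line change in how the transfer velocity enters the melt-rate calculation. The sketch below is schematic: the real three-equation model solves coupled heat and salt balances at the interface, and every coefficient here is a placeholder rather than a calibrated value.

```python
def melt_rate(u, gamma=6e-4, thermal_driving=2.0, u_min=0.04):
    """Illustrative shear-regime melt rate m ~ gamma * u * (T - T_f),
    with the convective floor suggested by the laboratory experiments:
    below u_min (~4 cm/s, i.e. 0.04 m/s) the diffusive sublayer is set
    by convection, so the effective transfer velocity stops decreasing.
    gamma and thermal_driving are placeholder values."""
    u_eff = max(u, u_min)  # convective instability sets the floor
    return gamma * u_eff * thermal_driving  # schematic melt rate, m/s of ice
```

Below the ~4 cm/s threshold the melt rate stops decreasing with velocity, matching the laboratory observation that melting is velocity-independent in the convective regime; above it, the usual linear dependence is recovered.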
