Sample records for statistical mechanics based

  1. The Statistical Basis of Chemical Equilibria.

    ERIC Educational Resources Information Center

    Hauptmann, Siegfried; Menger, Eva

    1978-01-01

    Describes a machine which demonstrates the statistical basis of chemical equilibrium and, in doing so, conveys insight into the connections among statistical mechanics, quantum mechanics, Maxwell-Boltzmann statistics, statistical thermodynamics, and transition state theory. (GA)

  2. Statistical mechanics based on fractional classical and quantum mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com

    2014-03-15

    The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. In the first stage, we present the thermodynamic properties of the classical ideal gas and of a system of N classical oscillators; in both cases, the Hamiltonian contains fractional exponents of the phase-space variables (position and momentum). In the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamic properties of black-body radiation, study Bose-Einstein statistics with the related condensation problem, and treat Fermi-Dirac statistics.

  3. Six new mechanics corresponding to further shape theories

    NASA Astrophysics Data System (ADS)

    Anderson, Edward

    2016-02-01

    In this paper, a suite of relational notions of shape is presented at the level of configuration space geometry, with corresponding new theories of shape mechanics and shape statistics. These further generalize two quite well known examples: (i) Kendall's (metric) shape space with his shape statistics and Barbour's mechanics thereupon. (ii) Leibnizian relational space alias metric scale-and-shape space, to which corresponds Barbour-Bertotti mechanics. This paper's new theories include, using the invariant and group namings, (iii) angle alias conformal shape mechanics, (iv) area ratio alias e shape mechanics, and (v) area alias e scale-and-shape mechanics. (iii)-(v) rest respectively on angle space, area-ratio space, and area space configuration spaces. Probability and statistics applications are also pointed to in outline. (vi) Various supersymmetric counterparts of (i)-(v) are considered. Since supergravity differs considerably from GR-based conceptions of background independence, some of the new supersymmetric shape mechanics are compared with both. These reveal compatibility between supersymmetry and GR-based conceptions of background independence, at least within these simpler model arenas.

  4. Landau's statistical mechanics for quasi-particle models

    NASA Astrophysics Data System (ADS)

    Bannur, Vishnu M.

    2014-04-01

    Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for pressure and develops all of thermodynamics. It is a general formalism, consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)], in which one starts from the expression for energy density and develops thermodynamics. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, often incorrectly called the thermodynamic consistency relation, we recover another formalism for quasi-particle systems, like that of M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied in quark-gluon plasma.

  5. Image compression system and method having optimized quantization tables

    NASA Technical Reports Server (NTRS)

    Ratnakar, Viresh (Inventor); Livny, Miron (Inventor)

    1998-01-01

    A digital image compression preprocessor for use in a discrete cosine transform-based digital image compression device is provided. The preprocessor includes a gathering mechanism for determining discrete cosine transform statistics from input digital image data. A computing mechanism is operatively coupled to the gathering mechanism to calculate an image distortion array and a rate of image compression array based upon the discrete cosine transform statistics for each possible quantization value. A dynamic programming mechanism is operatively coupled to the computing mechanism to optimize the rate of image compression array against the image distortion array such that a rate-distortion-optimal quantization table is derived. In addition, a discrete cosine transform-based digital image compression device and a discrete cosine transform-based digital image compression and decompression system are provided. Also provided are methods for generating a rate-distortion-optimal quantization table, for performing discrete cosine transform-based digital image compression, and for operating a discrete cosine transform-based digital image compression and decompression system.
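
    A minimal sketch of the rate-distortion trade-off the record above describes, in Python. The patent optimizes full 64-entry quantization tables by dynamic programming; this illustration instead picks a single position's step with a Lagrangian cost, and the Laplacian coefficient statistics are an invented stand-in for the gathered DCT data.

```python
# Sketch: choosing a per-position quantization step for DCT coefficients by
# trading distortion against rate. Simplified Lagrangian variant, not the
# patent's dynamic program; the statistics below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical gathered statistics: samples of one DCT coefficient position
coeffs = rng.laplace(scale=12.0, size=10_000)

def distortion_and_rate(samples, q):
    """MSE and empirical entropy (bits/coefficient) for quantization step q."""
    levels = np.round(samples / q)
    mse = float(np.mean((samples - levels * q) ** 2))
    _, counts = np.unique(levels, return_counts=True)
    p = counts / counts.sum()
    bits = float(-np.sum(p * np.log2(p)))
    return mse, bits

lam = 30.0  # Lagrange multiplier; larger values trade distortion for lower rate
best_q, best_cost = None, np.inf
for q in range(1, 65):
    d, r = distortion_and_rate(coeffs, q)
    cost = d + lam * r
    if cost < best_cost:
        best_q, best_cost = q, cost
print("rate-distortion-optimal step for this position:", best_q)
```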

  6. Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization

    NASA Astrophysics Data System (ADS)

    Eroglu, Sertac

    2014-10-01

    The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, is termed the statistical mechanical Menzerath-Altmann model. The derived model allows the model parameters to be interpreted in terms of physical concepts. We also propose that many organizations exhibiting Menzerath-Altmann behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through a properly defined structure-dependent parameter and the energy-associated states.
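
    For orientation, the classical Menzerath-Altmann law is commonly written y(x) = a x^b e^{-cx}. The sketch below fits that classical form with scipy on made-up data; it is not the authors' transformed statistical mechanical model.

```python
# Sketch: fitting the classical Menzerath-Altmann law to invented data.
import numpy as np
from scipy.optimize import curve_fit

def ma_law(x, a, b, c):
    return a * x**b * np.exp(-c * x)

x = np.arange(1, 9, dtype=float)                           # construct size (e.g. syllables)
y = np.array([3.2, 2.6, 2.3, 2.1, 2.0, 1.95, 1.9, 1.88])   # mean constituent size (hypothetical)

(a, b, c), _ = curve_fit(ma_law, x, y, p0=(3.0, -0.2, 0.01))
print(f"a={a:.3f}, b={b:.3f}, c={c:.3f}")
```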

  7. Physical concepts in the development of constitutive equations

    NASA Technical Reports Server (NTRS)

    Cassenti, B. N.

    1985-01-01

    Proposed viscoplastic material models incorporate observed material response in their formulation but do not generally incorporate principles from thermodynamics, statistical mechanics, or quantum mechanics. Numerous hypotheses about material response have been made based on first principles, and many of these hypotheses were tested experimentally. The proposed viscoplastic theories must therefore be checked against these hypotheses and their experimental basis. The physics of thermodynamics, statistical mechanics, and quantum mechanics, together with the effects of defects, are reviewed for their application to the development of constitutive laws.

  8. Statistics on gene-based laser speckles with a small number of scatterers: implications for the detection of polymorphism in the Chlamydia trachomatis omp1 gene

    NASA Astrophysics Data System (ADS)

    Ulyanov, Sergey S.; Ulianova, Onega V.; Zaytsev, Sergey S.; Saltykov, Yury V.; Feodorova, Valentina A.

    2018-04-01

    The transformation mechanism for a nucleotide sequence of the Chlamydia trachomatis gene into a speckle pattern has been considered. The first and second-order statistics of gene-based speckles have been analyzed. It has been demonstrated that gene-based speckles do not obey Gaussian statistics and belong to the class of speckles with a small number of scatterers. It has been shown that gene polymorphism can be easily detected through analysis of the statistical characteristics of gene-based speckles.

  9. Non-equilibrium statistical mechanics theory for the large scales of geophysical flows

    NASA Astrophysics Data System (ADS)

    Eric, S.; Bouchet, F.

    2010-12-01

    The aim of any theory of turbulence is to understand the statistical properties of the velocity field. Since a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject, from the statistical mechanics basis of the theory up to applications to Jupiter's troposphere and to ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean-field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations is discussed. We also present recent results for non-equilibrium situations, for which forcing and dissipation are in statistical balance. As an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed. References: F. Bouchet and E. Simonnet, Random Changes of Flow Topology in Two-Dimensional and Geophysical Turbulence, Physical Review Letters 102 (2009), no. 9, 094504. F. Bouchet and J. Sommeria, Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures, Journal of Fluid Mechanics 464 (2002), 165-207. A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO. F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, submitted to Physics Reports. [Figure: time series and probability density functions of the modulus of the largest-scale Fourier component for the 2D Navier-Stokes equations with stochastic forcing, showing non-equilibrium phase transitions and bistability between dipole and unidirectional flows; this bistability is predicted by statistical mechanics.]

  10. A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sinitskiy, Anton V.; Voth, Gregory A., E-mail: gavoth@uchicago.edu

    2015-09-07

    Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman's imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representations of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.

  11. A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals.

    PubMed

    Sinitskiy, Anton V; Voth, Gregory A

    2015-09-07

    Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman's imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representations of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.

  12. Capture approximations beyond a statistical quantum mechanical method for atom-diatom reactions

    NASA Astrophysics Data System (ADS)

    Barrios, Lizandra; Rubayo-Soneira, Jesús; González-Lezana, Tomás

    2016-03-01

    Statistical techniques constitute useful approaches to investigate atom-diatom reactions mediated by insertion dynamics, which involve complex-forming mechanisms. Different capture schemes, based on energy considerations regarding the specific diatom rovibrational states, are suggested to evaluate the corresponding probabilities of formation of such collision species between reactants and products, in an attempt to test reliable alternatives to computationally demanding processes. These approximations are tested in combination with a statistical quantum mechanical method for the S + H2(v = 0, j = 1) → SH + H and Si + O2(v = 0, j = 1) → SiO + O reactions, where this dynamical mechanism plays a significant role, in order to probe their validity.

  13. Statistical Thermodynamics and Microscale Thermophysics

    NASA Astrophysics Data System (ADS)

    Carey, Van P.

    1999-08-01

    Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.

  14. Integrating Statistical Mechanics with Experimental Data from the Rotational-Vibrational Spectrum of HCl into the Physical Chemistry Laboratory

    ERIC Educational Resources Information Center

    Findley, Bret R.; Mylon, Steven E.

    2008-01-01

    We introduce a computer exercise that bridges spectroscopy and thermodynamics using statistical mechanics and the experimental data taken from the commonly used laboratory exercise involving the rotational-vibrational spectrum of HCl. Based on the results from the analysis of their HCl spectrum, students calculate bulk thermodynamic properties…
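
    A sketch of the kind of follow-on calculation the exercise describes: partition functions and a thermodynamic property from spectroscopic constants. The HCl constants used are typical literature values, not data from the paper.

```python
# Sketch: rotational and vibrational partition functions for HCl at 298 K,
# as students might compute after extracting B and nu from the spectrum.
import numpy as np

h, c, kB = 6.626e-34, 2.998e10, 1.381e-23   # J s, cm/s (so wavenumbers work), J/K
B, nu = 10.59, 2885.9                        # cm^-1, typical for H35Cl
T, NA = 298.15, 6.022e23

q_rot = kB * T / (h * c * B)                          # high-temperature rotational q
x = h * c * nu / (kB * T)
q_vib = 1.0 / (1.0 - np.exp(-x))                      # vibrational q, v = 0 origin

U_vib = NA * h * c * nu * np.exp(-x) / (1 - np.exp(-x))  # vibrational U per mole
print(f"q_rot = {q_rot:.1f}, q_vib = {q_vib:.6f}, U_vib = {U_vib:.3e} J/mol")
```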

  15. Statistical Mechanics of Prion Diseases

    NASA Astrophysics Data System (ADS)

    Slepoy, A.; Singh, R. R.; Pázmándi, F.; Kulkarni, R. V.; Cox, D. L.

    2001-07-01

    We present a two-dimensional, lattice-based, protein-level statistical mechanical model for prion diseases (e.g., mad cow disease) with concomitant prion protein misfolding and aggregation. Our studies lead us to the hypothesis that the broad incubation-time distribution observed in epidemiological data reflects fluctuation-dominated growth seeded by a few nanometer-scale aggregates, while the much narrower incubation-time distributions for inoculated lab animals arise from statistical self-averaging. We model "species barriers" to prion infection and assess a related treatment protocol.
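
    The following toy illustrates the flavor of such a lattice model: random walkers aggregate onto a seed, and the first-passage time to a critical aggregate size stands in for the incubation time. It is an illustrative cartoon with invented parameters, not the authors' model.

```python
# Sketch: fluctuation-dominated aggregation on a 2D lattice; the spread of
# first-passage times mimics a broad incubation-time distribution.
import numpy as np

rng = np.random.default_rng(1)
L, n_walkers, critical_size = 30, 40, 12

def incubation_time(max_steps=20_000):
    pos = rng.integers(0, L, size=(n_walkers, 2))
    aggregated = np.zeros(n_walkers, dtype=bool)
    aggregated[0] = True                       # a single nanometer-scale seed
    for t in range(1, max_steps):
        moves = rng.integers(-1, 2, size=(n_walkers, 2))
        pos[~aggregated] = (pos[~aggregated] + moves[~aggregated]) % L
        agg_sites = {tuple(p) for p in pos[aggregated]}
        for i in np.flatnonzero(~aggregated):  # free walker sticks on contact
            if tuple(pos[i]) in agg_sites:
                aggregated[i] = True
        if aggregated.sum() >= critical_size:
            return t
    return np.inf                              # did not reach threshold in time

times = np.array([incubation_time() for _ in range(10)])
print("mean:", times.mean(), "std:", times.std())  # broad, fluctuation-dominated
```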

  16. Parametric Analysis to Study the Influence of Aerogel-Based Renders' Components on Thermal and Mechanical Performance.

    PubMed

    Ximenes, Sofia; Silva, Ana; Soares, António; Flores-Colen, Inês; de Brito, Jorge

    2016-05-04

    Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study's objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect.

  17. Parametric Analysis to Study the Influence of Aerogel-Based Renders’ Components on Thermal and Mechanical Performance

    PubMed Central

    Ximenes, Sofia; Silva, Ana; Soares, António; Flores-Colen, Inês; de Brito, Jorge

    2016-01-01

    Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study’s objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect. PMID:28773460
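
    A sketch of the same style of analysis, multiple linear regression of a performance property on mix components, using statsmodels on synthetic data (the 85 laboratory mixes and SPSS outputs are not reproduced; the three predictors are hypothetical).

```python
# Sketch: OLS regression of thermal conductivity on mix-component fractions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 85                                   # same sample size as the study
X = np.column_stack([
    rng.uniform(0, 1, n),                # aerial lime fraction (hypothetical)
    rng.uniform(0, 1, n),                # fly ash fraction (hypothetical)
    rng.uniform(0, 1, n),                # aerogel fraction (hypothetical)
])
# Synthetic response: lime and aerogel lower conductivity, plus noise
y = 0.08 - 0.03 * X[:, 0] - 0.01 * X[:, 1] - 0.04 * X[:, 2] + rng.normal(0, 0.004, n)

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.summary())                   # coefficients quantify each influence
```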

  18. Realistic thermodynamic and statistical-mechanical measures for neural synchronization.

    PubMed

    Kim, Sang-Yoon; Lim, Woochang

    2014-04-15

    Synchronized brain rhythms, associated with diverse cognitive functions, have been observed in electrical recordings of brain activity. Neural synchronization may be well described by using the population-averaged global potential VG in computational neuroscience. The time-averaged fluctuation of VG plays the role of a "thermodynamic" order parameter O used for describing the synchrony-asynchrony transition in neural systems. Population spike synchronization may be well visualized in the raster plot of neural spikes. The degree of neural synchronization seen in the raster plot is well measured in terms of a "statistical-mechanical" spike-based measure Ms, introduced by considering the occupation and the pacing patterns of spikes. The global potential VG is also used to give a reference global cycle for the calculation of Ms. Hence, VG becomes an important collective quantity because it is associated with the calculation of both O and Ms. However, it is practically difficult to obtain VG directly in real experiments. To overcome this difficulty, instead of VG we employ the instantaneous population spike rate (IPSR), which can be obtained in experiments, and develop realistic thermodynamic and statistical-mechanical measures, based on the IPSR, for practical characterization of neural synchronization in both computational and experimental neuroscience. In particular, more accurate characterization of weak sparse spike synchronization can be achieved in terms of the realistic statistical-mechanical IPSR-based measure, in comparison with the conventional measure based on VG. Copyright © 2014. Published by Elsevier B.V.
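
    A sketch of the type of measure described: an IPSR obtained by Gaussian kernel smoothing of pooled spike times, and a "thermodynamic" order parameter as its time-averaged fluctuation. Spike data and all parameters are invented for illustration.

```python
# Sketch: IPSR from kernel-smoothed spikes, then its fluctuation as an
# order parameter (large when the population is synchronized).
import numpy as np

rng = np.random.default_rng(3)
T, dt, n_neurons = 2.0, 0.001, 100           # seconds, bin width, population size
t = np.arange(0, T, dt)

# Toy spikes: each neuron fires near multiples of 0.1 s (partially synchronized)
spikes = np.concatenate([
    k * 0.1 + rng.normal(0, 0.01, n_neurons) for k in range(1, 20)
])

sigma = 0.005                                 # kernel width (s)
ipsr = np.zeros_like(t)
for s in spikes:
    ipsr += np.exp(-0.5 * ((t - s) / sigma) ** 2)
ipsr /= n_neurons * sigma * np.sqrt(2 * np.pi)  # spikes/s per neuron

order_parameter = np.mean((ipsr - ipsr.mean()) ** 2)
print(f"time-averaged IPSR fluctuation: {order_parameter:.2f}")
```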

  19. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM, with classical probability measures defined on the Herbrand base, and extending it to the quantum context. In the classical case, H-interpretations form the sample space, and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples combining statistical ensembles and predicates of first order logic to reason about situations involving uncertainty.

  20. Generalized statistical mechanics approaches to earthquakes and tectonics.

    PubMed

    Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios

    2016-12-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract wide scientific interest and are incorporated in the probabilistic forecasting of seismicity on local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale for the macroscopic properties frequently observed. The scale-invariant properties, the (multi)fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.

  1. Generalized statistical mechanics approaches to earthquakes and tectonics

    PubMed Central

    Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios

    2016-01-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract wide scientific interest and are incorporated in the probabilistic forecasting of seismicity on local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale for the macroscopic properties frequently observed. The scale-invariant properties, the (multi)fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes. PMID:28119548
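
    For concreteness, the q-exponential function central to NESM, with a numerical check of its q → 1 (Boltzmann-Gibbs) limit. This is purely illustrative; no seismicity catalog is fitted.

```python
# Sketch: the q-exponential e_q(-beta*x) = [1 - (1-q)*beta*x]_+^(1/(1-q)),
# which reduces to exp(-beta*x) as q -> 1.
import numpy as np

def q_exponential(x, q, beta):
    if abs(q - 1.0) < 1e-12:
        return np.exp(-beta * x)
    base = 1.0 - (1.0 - q) * beta * x
    return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)

x = np.linspace(0, 10, 5)
print(q_exponential(x, q=1.3, beta=1.0))                  # fat-tailed vs exp(-x)
print(q_exponential(x, q=1.0 + 1e-9, beta=1.0) - np.exp(-x))  # ~0: BG limit
```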

  2. Complete integrability of information processing by biochemical reactions

    PubMed Central

    Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio

    2016-01-01

    Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such a framework, far-reaching analogies are established among (anti-)cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics, and operational amplifiers/flip-flops in cybernetics. The underlying modeling, based on spin systems, has proved to be accurate for a wide class of systems matching classical (e.g. Michaelis-Menten, Hill, Adair) scenarios in the infinite-size approximation. However, current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy, based on completely integrable hydrodynamic-type systems of PDEs, which provides explicit finite-size solutions matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions. PMID:27812018

  3. Complete integrability of information processing by biochemical reactions

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio

    2016-11-01

    Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such a framework, far-reaching analogies are established among (anti-)cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics, and operational amplifiers/flip-flops in cybernetics. The underlying modeling, based on spin systems, has proved to be accurate for a wide class of systems matching classical (e.g. Michaelis-Menten, Hill, Adair) scenarios in the infinite-size approximation. However, current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy, based on completely integrable hydrodynamic-type systems of PDEs, which provides explicit finite-size solutions matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions.

  4. Complete integrability of information processing by biochemical reactions.

    PubMed

    Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio

    2016-11-04

    Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such a framework, far-reaching analogies are established among (anti-)cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics, and operational amplifiers/flip-flops in cybernetics. The underlying modeling, based on spin systems, has proved to be accurate for a wide class of systems matching classical (e.g. Michaelis-Menten, Hill, Adair) scenarios in the infinite-size approximation. However, current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy, based on completely integrable hydrodynamic-type systems of PDEs, which provides explicit finite-size solutions matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions.

  5. Using Bayes' theorem for free energy calculations

    NASA Astrophysics Data System (ADS)

    Rogers, David M.

    Statistical mechanics is fundamentally based on calculating the probabilities of molecular-scale events. Although Bayes' theorem has generally been recognized as providing key guiding principles for the setup and analysis of statistical experiments [83], classical frequentist models still predominate in the world of computational experimentation. As a starting point for widespread application of Bayesian methods in statistical mechanics, we investigate the central quantity of free energies from this perspective. This dissertation thus reviews the basics of Bayes' view of probability theory and the maximum entropy formulation of statistical mechanics before providing examples of its application to several advanced research areas. We first apply Bayes' theorem to a multinomial counting problem in order to determine inner shell and hard sphere solvation free energy components of Quasi-Chemical Theory [140]. We proceed to consider the general problem of free energy calculations from samples of interaction energy distributions. From there, we turn to spline-based estimation of the potential of mean force [142], and empirical modeling of observed dynamics using integrator matching. The results of this research are expected to advance the state of the art in coarse-graining methods, as they allow a systematic connection from high-resolution (atomic) to low-resolution (coarse) structure and dynamics. In total, our work on these problems constitutes a critical starting point for further application of Bayes' theorem in all areas of statistical mechanics. It is hoped that the understanding so gained will allow for improvements in comparisons between theory and experiment.
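
    A minimal sketch of the multinomial counting idea mentioned above: a Dirichlet-multinomial posterior over state occupancies yields free energies F_i = -kT ln p_i. The counts and flat prior are invented; this is not the dissertation's Quasi-Chemical Theory calculation.

```python
# Sketch: Bayesian (Dirichlet-multinomial) occupancy probabilities and the
# corresponding free energies, up to an additive constant.
import numpy as np

counts = np.array([512, 203, 61, 9])          # observed occupancies of 4 states
alpha = np.ones_like(counts, dtype=float)     # flat Dirichlet prior

p_mean = (counts + alpha) / (counts + alpha).sum()  # posterior mean of p

kT = 0.593                                    # kcal/mol at ~298 K
F = -kT * np.log(p_mean)
print("posterior mean probabilities:", np.round(p_mean, 4))
print("free energies (kcal/mol):", np.round(F, 3))
```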

  6. Systems Models for Transportation Problems: Volume 1. Introducing a Systems Science for Transportation Planning.

    DOT National Transportation Integrated Search

    1976-03-01

    This introductory portion of a systems science for transportation planning, which is based on the statistical physics of ensembles, lays foundations for how statistical mechanics, equilibrium thermodynamics, and near-equilibrium thermodynamics can be u...

  7. Estimation of elastic moduli of graphene monolayer in lattice statics approach at nonzero temperature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zubko, I. Yu., E-mail: zoubko@list.ru; Kochurov, V. I.

    2015-10-27

    With the aim of controlling crystal temperature, a computational-statistical approach to studying the thermo-mechanical properties of finite-size crystals is presented. The approach combines high-performance computational techniques with statistical analysis of the crystal's response to external thermo-mechanical actions for specimens with a statistically small number of atoms (for instance, nanoparticles). The heat motion of atoms is imitated within the statics approach by including independent degrees of freedom for the atoms associated with their oscillations. We find that, under heating, the material response of graphene is nonsymmetric.

  8. Nonadditive entropy Sq and nonextensive statistical mechanics: Applications in geophysics and elsewhere

    NASA Astrophysics Data System (ADS)

    Tsallis, Constantino

    2012-06-01

    The celebrated Boltzmann-Gibbs (BG) entropy, $S_{BG} = -k \sum_i p_i \ln p_i$, and its associated statistical mechanics are essentially based on hypotheses such as ergodicity, i.e., that ensemble averages coincide with time averages. This dynamical simplification occurs in classical systems (and quantum counterparts) whose microscopic evolution is governed by a positive largest Lyapunov exponent (LLE). Under such circumstances, relevant microscopic variables behave, from the probabilistic viewpoint, as (nearly) independent. Many phenomena exist, however, in natural, artificial and social systems (geophysics, astrophysics, biophysics, economics, and others) that violate ergodicity. To cover a (possibly) wide class of such systems, a generalization (nonextensive statistical mechanics) of the BG theory was proposed in 1988. This theory is based on nonadditive entropies such as $S_q = k \frac{1 - \sum_i p_i^q}{q - 1}$ (with $S_1 = S_{BG}$). Here we comment on some central aspects of this theory, and briefly review typical predictions, verifications and applications in geophysics and elsewhere, as illustrated through theoretical, experimental, observational, and computational results.
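
    The two entropies quoted above, with a numerical check (k = 1) that S_q recovers S_BG as q → 1:

```python
# Sketch: Boltzmann-Gibbs and Tsallis entropies for a discrete distribution.
import numpy as np

def S_BG(p):
    return -np.sum(p * np.log(p))

def S_q(p, q):
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.array([0.5, 0.25, 0.125, 0.125])
for q in (2.0, 1.5, 1.01, 1.001):
    print(q, S_q(p, q))
print("BG limit:", S_BG(p))   # S_q approaches this value as q -> 1
```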

  9. Maximum entropy models of ecosystem functioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertram, Jason, E-mail: jason.bertram@anu.edu.au

    2014-12-05

    Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes' broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense, using a savanna plant ecology model as an example.
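
    A minimal Jaynes-style MaxEnt sketch: maximizing entropy over discrete states subject to a mean constraint reduces to solving for one Lagrange multiplier. The die-face states and target mean are the standard textbook illustration, not an ecological model.

```python
# Sketch: MaxEnt distribution p_i ~ exp(-lambda * x_i) matching <x> = mu.
import numpy as np
from scipy.optimize import brentq

x = np.arange(1, 7)    # states, e.g. faces of a die (Jaynes' classic example)
mu = 4.5               # observed mean, above the uniform value 3.5

def mean_given_lam(lam):
    w = np.exp(-lam * x)
    return np.sum(x * w) / np.sum(w)

lam = brentq(lambda l: mean_given_lam(l) - mu, -10, 10)
p = np.exp(-lam * x)
p /= p.sum()
print("lambda:", lam)
print("MaxEnt distribution:", np.round(p, 4))   # tilted toward larger faces
```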

  10. Unified risk analysis of fatigue failure in ductile alloy components during all three stages of fatigue crack evolution process.

    PubMed

    Patankar, Ravindra

    2003-10-01

    Statistical fatigue life of a ductile alloy specimen is traditionally divided into three stages, namely crack nucleation, small crack growth, and large crack growth. Crack nucleation and small crack growth show wide variation and hence a large spread on the cycles-versus-crack-length graph; large crack growth, by comparison, shows less variation. Therefore, different models are fitted to the different stages of the fatigue evolution process, thus treating different stages as different phenomena. With these independent models, it is impossible to predict one phenomenon based on information available about another. Experimentally, it is easier to carry out crack length measurements of large cracks than of nucleating and small cracks. Thus, it is easier to collect statistical data for large crack growth than to undertake the painstaking effort required to collect statistical data for crack nucleation and small crack growth. This article presents a fracture mechanics-based stochastic model of fatigue crack growth in ductile alloys that are commonly encountered in mechanical structures and machine components. The model has been validated by Ray (1998) for crack propagation against various statistical fatigue data. Based on the model, this article proposes a technique to predict statistical information on fatigue crack nucleation and small crack growth properties from the statistical properties of large crack growth under constant amplitude stress excitation, which can be obtained via experiments.
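
    A generic toy of the self-averaging argument: Paris-law crack growth with a coefficient that fluctuates along the path, so relative scatter in cycle counts shrinks as the crack grows. All constants are invented; this is not the Ray (1998) model the article builds on.

```python
# Sketch: Monte Carlo integration of da/dN = C * (dS * sqrt(pi*a))**m with a
# randomly fluctuating coefficient C along the growth path.
import numpy as np

rng = np.random.default_rng(4)
m, a0, da = 3.0, 0.5e-3, 1e-5          # Paris exponent, initial crack (m), step (m)
dS, C0 = 100e6, 1e-29                   # stress range (Pa), median coefficient

def cycles_to(a_target):
    a, N = a0, 0.0
    while a < a_target:
        C = C0 * rng.lognormal(0.0, 1.0)              # local growth-rate fluctuation
        N += da / (C * (dS * np.sqrt(np.pi * a)) ** m)
        a += da
    return N

small = np.array([cycles_to(1e-3) for _ in range(200)])
large = np.array([cycles_to(10e-3) for _ in range(200)])
print("relative spread, small crack:", small.std() / small.mean())
print("relative spread, large crack:", large.std() / large.mean())  # smaller
```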

  11. Statistical State Dynamics Based Study of the Role of Nonlinearity in the Maintenance of Turbulence in Couette Flow

    NASA Astrophysics Data System (ADS)

    Farrell, Brian; Ioannou, Petros; Nikolaidis, Marios-Andreas

    2017-11-01

    While linear non-normality underlies the mechanism of energy transfer from the externally driven flow to the perturbation field, nonlinearity is also known to play an essential role in sustaining turbulence. We report a study based on the statistical state dynamics of Couette flow turbulence with the goal of better understanding the role of nonlinearity in sustaining turbulence. The statistical state dynamics implementations used are ensemble closures at second order in a cumulant expansion of the Navier-Stokes equations, in which the averaging operator is the streamwise mean. Two fundamentally non-normal mechanisms potentially contributing to maintaining the second cumulant are identified: essentially parametric perturbation growth, arising from interaction of the perturbations with the fluctuating mean flow, and transient growth of perturbations, arising from nonlinear interaction between components of the perturbation field. By selectively including these mechanisms, parametric growth is found to maintain the perturbation field in the turbulent state, while the more commonly invoked mechanism, associated with transient growth of perturbations arising from scattering by nonlinear interaction, is found to suppress perturbation variance. Funded by the ERC Coturb Madrid Summer Program and NSF AGS-1246929.

  12. New statistical potential for quality assessment of protein models and a survey of energy functions

    PubMed Central

    2010-01-01

    Background: Scoring functions, such as molecular mechanics force fields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results: The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions, such as using information on solvent accessibility and torsion angles and accounting for secondary structure preferences and side chain orientation. Partially based on these observations, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. The atom- and residue-level statistical potentials proposed in this work, along with Linux executables to calculate the energy of a given protein, can be downloaded from http://www.fiserlab.org/potentials. Conclusions: Among the most influential terms, we observed a critical role for a proper reference state definition and benefits from including information about the microenvironment of interaction centers. Molecular mechanics potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores began to correlate with model quality. PMID:20226048
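
    A sketch of the core of any knowledge-based potential: converting pair counts into energies via E(a,b) = -ln[p_obs(a,b)/p_ref(a,b)]. The counts are invented, and the independence reference below is a crude stand-in for the paper's shuffled reference state.

```python
# Sketch: a residue-pair contact potential from observed counts, in kT units.
import numpy as np

residues = ["ALA", "GLU", "LYS", "LEU"]
# Hypothetical observed contact counts between residue types (symmetric)
N_obs = np.array([[120,  40,  35, 150],
                  [ 40,  20, 110,  30],
                  [ 35, 110,  15,  25],
                  [150,  30,  25, 200]], dtype=float)

p_obs = N_obs / N_obs.sum()
f = N_obs.sum(axis=1) / N_obs.sum()     # marginal residue-type frequencies
p_ref = np.outer(f, f)                  # independence ("shuffled") reference

E = -np.log(p_obs / p_ref)              # negative: favorable contact
for i, a in enumerate(residues):
    for j in range(i, len(residues)):
        print(f"E({a},{residues[j]}) = {E[i, j]:+.2f} kT")
```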

  13. 47 CFR 54.5 - Terms and definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., address translation, protocol conversion, billing management, introductory information content, and... 1990s and identifiable from the most recent Metropolitan Statistical Area (MSA) list released by OMB, or... support mechanism, a “rural area” is an area that is entirely outside of a Core Based Statistical Area; is...

  14. Stochastical modeling for Viral Disease: Statistical Mechanics and Network Theory

    NASA Astrophysics Data System (ADS)

    Zhou, Hao; Deem, Michael

    2007-04-01

    Theoretical methods of statistical mechanics are developed and applied to study the immunological response against viral diseases such as dengue. We use this theory to show how the immune response to the four different dengue serotypes may be sculpted. It is the ability of avian influenza to change and to mix that has given rise to the fear of a new human flu pandemic. Here we propose to utilize a scale-free-network-based stochastic model to investigate mitigation strategies and analyze the risk.
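
    A sketch of the proposed ingredient that is easiest to make concrete: a stochastic SIR epidemic on a scale-free (Barabási-Albert) network, using networkx. Rates, sizes, and seeding are arbitrary illustrative choices.

```python
# Sketch: discrete-time stochastic SIR dynamics on a scale-free network.
import numpy as np
import networkx as nx

rng = np.random.default_rng(5)
G = nx.barabasi_albert_graph(n=2000, m=3, seed=5)

beta, gamma = 0.08, 0.05                 # per-contact infection, recovery prob.
state = {v: "S" for v in G}              # S, I, or R
for v in rng.choice(G.number_of_nodes(), 5, replace=False):
    state[v] = "I"                       # initial seeds

history = []
for t in range(200):
    new_state = dict(state)
    for v in G:
        if state[v] == "I":
            if rng.random() < gamma:
                new_state[v] = "R"
            for u in G.neighbors(v):     # infect susceptible neighbors
                if state[u] == "S" and rng.random() < beta:
                    new_state[u] = "I"
    state = new_state
    history.append(sum(1 for s in state.values() if s == "I"))
print("peak infected:", max(history))
```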

  15. Mechanism-based Pharmacovigilance over the Life Sciences Linked Open Data Cloud.

    PubMed

    Kamdar, Maulik R; Musen, Mark A

    2017-01-01

    Adverse drug reactions (ADR) result in significant morbidity and mortality in patients, and a substantial proportion of these ADRs are caused by drug-drug interactions (DDIs). Pharmacovigilance methods are used to detect unanticipated DDIs and ADRs by mining Spontaneous Reporting Systems, such as the US FDA Adverse Event Reporting System (FAERS). However, these methods do not provide mechanistic explanations for the discovered drug-ADR associations in a systematic manner. In this paper, we present a systems pharmacology-based approach to perform mechanism-based pharmacovigilance. We integrate data and knowledge from four different sources using Semantic Web Technologies and Linked Data principles to generate a systems network. We present a network-based Apriori algorithm for association mining in FAERS reports. We evaluate our method against existing pharmacovigilance methods for three different validation sets. Our method has AUROC statistics of 0.7-0.8, similar to current methods, and event-specific thresholds generate AUROC statistics greater than 0.75 for certain ADRs. Finally, we discuss the benefits of using Semantic Web technologies to attain the objectives for mechanism-based pharmacovigilance.
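
    A much-reduced sketch of frequent-pair mining over spontaneous reports, plain support counting rather than the authors' network-constrained Apriori over FAERS and Linked Data; the reports are invented.

```python
# Sketch: counting drug/ADR pair co-occurrences across reports and keeping
# pairs above a minimum support, Apriori-style.
from itertools import combinations
from collections import Counter

reports = [
    {"warfarin", "aspirin", "bleeding"},
    {"warfarin", "bleeding"},
    {"aspirin", "nausea"},
    {"warfarin", "aspirin", "bleeding"},
    {"simvastatin", "myopathy"},
]

min_support = 2
pair_counts = Counter()
for r in reports:
    for pair in combinations(sorted(r), 2):
        pair_counts[pair] += 1

frequent = {p: c for p, c in pair_counts.items() if c >= min_support}
print(frequent)   # e.g. ('bleeding', 'warfarin') co-occurs often
```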

  16. A novel conductivity mechanism of highly disordered carbon systems based on an investigation of graph zeta function

    NASA Astrophysics Data System (ADS)

    Matsutani, Shigeki; Sato, Iwao

    2017-09-01

    In the previous report (Matsutani and Suzuki, 2000 [21]), by proposing a mechanism in which electric conductivity is caused by activational hopping conduction with the Wigner surmise of level statistics, the temperature dependence of the electronic conductivity of a highly disordered carbon system was evaluated, including an apparent metal-insulator transition. Since the system consists of small pieces of graphite, it was assumed that the level statistics appears because of quantum chaos in each piece of granular graphite. In this article, we revise that assumption and show another origin of the Wigner surmise, one more natural for the carbon system, based on a recent investigation of the graph zeta function in graph theory. Our method can be applied to the statistical treatment of the electronic properties of randomized molecular systems in general.
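
    For background, the Wigner surmise P(s) = (πs/2) exp(-πs²/4) describes nearest-neighbor level spacings of GOE-like random matrices. The sketch below samples such matrices and compares the (approximately unfolded) spacing histogram to the surmise; sizes and counts are arbitrary.

```python
# Sketch: bulk level-spacing statistics of random symmetric matrices vs the
# Wigner surmise.
import numpy as np

rng = np.random.default_rng(6)
spacings = []
for _ in range(200):
    A = rng.normal(size=(100, 100))
    H = (A + A.T) / 2                      # GOE-like symmetric matrix
    ev = np.sort(np.linalg.eigvalsh(H))
    s = np.diff(ev[40:60])                 # bulk spacings, away from edges
    spacings.extend(s / s.mean())          # crude unfolding: mean spacing 1

s = np.asarray(spacings)
hist, edges = np.histogram(s, bins=20, range=(0, 3), density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
wigner = (np.pi * mid / 2) * np.exp(-np.pi * mid**2 / 4)
print(np.round(np.abs(hist - wigner).mean(), 3))  # roughly matches the surmise
```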

  17. THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES.

    PubMed

    Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil

    2016-10-01

    In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors of bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on Micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50μm). Ground truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and biomechanical test results was studied using regression analysis. The results showed Cluster-Shade was strongly correlated with the microarchitecture of the trabecular bone and related to mechanical properties. Once the principal thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors.

  18. THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES

    PubMed Central

    Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil

    2016-01-01

    In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors of bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on Micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50μm). Ground truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and biomechanical test results was studied using regression analysis. The results showed Cluster-Shade was strongly correlated with the microarchitecture of the trabecular bone and related to mechanical properties. Once the principal thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors. PMID:28042512
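
    A sketch of the GLCM computation and the Cluster-Shade statistic highlighted above, using scikit-image on a random test image (no Micro-CT data). Cluster Shade is not among skimage's built-in graycoprops, so it is computed directly from the normalized matrix.

```python
# Sketch: GLCM at distance 1, angle 0, and the Cluster-Shade texture feature.
import numpy as np
from skimage.feature import graycomatrix

rng = np.random.default_rng(7)
img = rng.integers(0, 16, size=(64, 64), dtype=np.uint8)   # 16 gray levels

glcm = graycomatrix(img, distances=[1], angles=[0], levels=16,
                    symmetric=True, normed=True)[:, :, 0, 0]

i, j = np.meshgrid(np.arange(16), np.arange(16), indexing="ij")
mu_i = np.sum(i * glcm)
mu_j = np.sum(j * glcm)
cluster_shade = np.sum(((i + j - mu_i - mu_j) ** 3) * glcm)
print(f"cluster shade: {cluster_shade:.3f}")   # ~0 for this symmetric random image
```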

  19. Edwards statistical mechanics for jammed granular matter

    NASA Astrophysics Data System (ADS)

    Baule, Adrian; Morone, Flaviano; Herrmann, Hans J.; Makse, Hernán A.

    2018-01-01

    In 1989, Sir Sam Edwards made the visionary proposition to treat jammed granular materials using a volume ensemble of equiprobable jammed states, in analogy to thermal equilibrium statistical mechanics, despite their inherent athermal features. Since then, the statistical mechanics approach for jammed matter, one of the very few generalizations of Gibbs-Boltzmann statistical mechanics to out-of-equilibrium matter, has garnered an extraordinary amount of attention from both theorists and experimentalists. Its importance stems from the fact that jammed states of matter are ubiquitous in nature, appearing in a broad range of granular and soft materials such as colloids, emulsions, glasses, and biomatter. Indeed, despite being one of the simplest states of matter (primarily governed by the steric interactions between the constitutive particles), a theoretical understanding based on first principles has proved exceedingly challenging. Here a systematic approach to jammed matter based on the Edwards statistical mechanical ensemble is reviewed. The construction of microcanonical and canonical ensembles based on the volume function, which replaces the Hamiltonian in jammed systems, is discussed. The importance of approximation schemes at various levels is emphasized, leading to quantitative predictions for ensemble-averaged quantities such as packing fractions and contact force distributions. An overview is given of the phenomenology of jammed states and of experiments, simulations, and theoretical models scrutinizing the strong assumptions underlying the Edwards approach, including recent results suggesting the validity of the Edwards ergodic hypothesis for jammed states. A theoretical framework is discussed for packings whose constitutive particles range from spherical to nonspherical shapes such as dimers, polymers, ellipsoids, spherocylinders or tetrahedra, hard and soft, frictional, frictionless and adhesive, monodisperse and polydisperse particles in any dimension, providing insight into a unifying phase diagram for all jammed matter. Furthermore, the connection between the Edwards ensemble of metastable jammed states and metastability in spin glasses is established. This highlights the fact that the packing problem can be understood as a constraint satisfaction problem for excluded volume and for force and torque balance, leading to a unifying framework between the Edwards ensemble of equiprobable jammed states and out-of-equilibrium spin glasses.

  20. Introduction to the topical issue: Nonadditive entropy and nonextensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Sugiyama, Masaru

    Dear CMT readers, it is my pleasure to introduce you to this topical issue dealing with a new research field of great interest, nonextensive statistical mechanics. This theory was initiated by Constantino Tsallis' work in 1988, as a possible generalization of Boltzmann-Gibbs thermostatistics. It is based on a nonadditive entropy, nowadays referred to as the Tsallis entropy. Nonextensive statistical mechanics is expected to be a consistent and unified theoretical framework for describing the macroscopic properties of complex systems that are anomalous in view of ordinary thermostatistics. In such systems, the long-standing problem regarding the relationship between statistical and dynamical laws becomes highlighted, since ergodicity and mixing may not be well realized in situations such as the edge of chaos. The phase space appears to self-organize in a structure that is not simply Euclidean but (multi)fractal. Due to this nontrivial structure, the concept of homogeneity of the system, which is the basic premise in ordinary thermodynamics, is violated, and accordingly the additivity postulate for thermodynamic quantities such as the internal energy and entropy may not be justified in general. (Physically, nonadditivity is deeply relevant to nonextensivity of a system, in which the thermodynamic quantities do not scale with size in a simple way. Typical examples are systems with long-range interactions, like self-gravitating systems as well as nonneutral charged ones.) A point of crucial importance here is that, phenomenologically, such an exotic phase-space structure has a fairly long lifetime. Therefore, this state, referred to as a metaequilibrium state or a nonequilibrium stationary state, appears to be described by a generalized entropic principle different from the traditional Boltzmann-Gibbs form, even though it may eventually approach the Boltzmann-Gibbs equilibrium state. The limits t → ∞ and N → ∞ do not commute, where t and N are time and the number of particles, respectively. The present topical issue is devoted to summarizing the current status of nonextensive statistical mechanics from various perspectives. It is my hope that this issue can inform the reader of one of the foremost research areas in thermostatistics. This issue consists of eight articles. The first one, by Tsallis and Brigatti, presents a general introduction and an overview of nonextensive statistical mechanics. At first glance, generalization of the ordinary Boltzmann-Gibbs-Shannon entropy might seem completely arbitrary. But Abe's article explains how Tsallis' generalization of the statistical entropy can be uniquely characterized by both physical and mathematical principles. Then, the article by Pluchino, Latora, and Rapisarda presents strong evidence that nonextensive statistical mechanics is in fact relevant to nonextensive systems with long-range interactions. The articles by Rajagopal, by Wada, and by Plastino, Miller, and Plastino are concerned with the macroscopic thermodynamic properties of nonextensive statistical mechanics. Rajagopal discusses the first and second laws of thermodynamics. Wada develops a discussion about the condition under which the nonextensive statistical-mechanical formalism is thermodynamically stable. The work of Plastino, Miller, and Plastino addresses the thermodynamic Legendre-transform structure and its robustness for generalizations of entropy. After these fundamental investigations, Sakagami and Taruya examine the theory for self-gravitating systems.
Finally, Beck presents a novel idea of the so-called superstatistics, which provides nonextensive statistical mechanics with a physical interpretation based on nonequilibrium concepts including temperature fluctuations. Its applications to hydrodynamic turbulence and pattern formation in thermal convection states are also discussed. Nonextensive statistical mechanics is already a well-studied field, and a number of works are available in the literature. It is recommended that the interested reader visit the URL http://tsallis.cat.cbpf.br/TEMUCO.pdf. There, one can find a comprehensive list of references to more than one thousand papers including important results that, due to lack of space, have not been mentioned in the present issue. Though there are so many published works, nonextensive statistical mechanics is still a developing field. This can naturally be understood, since the program that has been undertaken is an extremely ambitious one that makes a serious attempt to enlarge the horizons of the realm of statistical mechanics. The possible influence of nonextensive statistical mechanics on continuum mechanics and thermodynamics seems to be wide and deep. I will therefore be happy if this issue contributes to attracting the interest of researchers and stimulates research activities not only in the very field of nonextensive statistical mechanics but also in the field of continuum mechanics and thermodynamics in a wider context. As the editor of the present topical issue, I would like to express my sincere thanks to all those who joined up to make this issue. I cordially thank Professor S. Abe for advising me on the editorial policy. Without his help, the present topical issue would never have been brought out.

  1. Version pressure feedback mechanisms for speculative versioning caches

    DOEpatents

    Eichenberger, Alexandre E.; Gara, Alan; O'Brien, Kathryn M.; Ohmacht, Martin; Zhuang, Xiaotong

    2013-03-12

    Mechanisms are provided for controlling version pressure on a speculative versioning cache. Raw version pressure data is collected based on one or more threads accessing cache lines of the speculative versioning cache. One or more statistical measures of version pressure are generated based on the collected raw version pressure data. A determination is made as to whether one or more modifications to an operation of a data processing system are to be performed based on the one or more statistical measures of version pressure, the one or more modifications affecting version pressure exerted on the speculative versioning cache. An operation of the data processing system is modified based on the one or more determined modifications, in response to a determination that one or more modifications to the operation of the data processing system are to be performed, to affect the version pressure exerted on the speculative versioning cache.
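
    The patent text above describes the feedback loop abstractly; the following minimal Python sketch shows one way such a loop could look. All names and thresholds (VersionPressureMonitor, high_water, etc.) are illustrative assumptions, not taken from the patent.

    import statistics

    class VersionPressureMonitor:
        def __init__(self, high_water=0.6):
            self.samples = []          # raw per-access version-pressure readings
            self.high_water = high_water

        def record(self, versions_in_use, capacity):
            # Raw pressure: fraction of a cache set's version capacity in use.
            self.samples.append(versions_in_use / capacity)

        def measures(self):
            # Statistical measures derived from the raw pressure data.
            return {"mean": statistics.mean(self.samples),
                    "max": max(self.samples),
                    "stdev": statistics.pstdev(self.samples)}

        def should_throttle(self):
            # Feedback decision: modify system operation (e.g., throttle thread
            # speculation) when average pressure crosses the high-water mark.
            return self.measures()["mean"] > self.high_water

    monitor = VersionPressureMonitor()
    for v in (3, 5, 7, 8):             # versions in use out of a capacity of 8
        monitor.record(v, 8)
    if monitor.should_throttle():
        print("reduce speculation depth:", monitor.measures())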

  2. Cation Selectivity in Biological Cation Channels Using Experimental Structural Information and Statistical Mechanical Simulation.

    PubMed

    Finnerty, Justin John; Peyser, Alexander; Carloni, Paolo

    2015-01-01

    Cation selective channels constitute the gate for ion currents through the cell membrane. Here we present an improved statistical mechanical model based on atomistic structural information, cation hydration state and without tuned parameters that reproduces the selectivity of biological Na+ and Ca2+ ion channels. The importance of the inclusion of step-wise cation hydration in these results confirms the essential role partial dehydration plays in the bacterial Na+ channels. The model, proven reliable against experimental data, could be straightforwardly used for designing Na+ and Ca2+ selective nanopores.

  3. Comparisons of non-Gaussian statistical models in DNA methylation analysis.

    PubMed

    Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-06-16

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
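
    As a concrete illustration of why bounded-support (non-Gaussian) models suit methylation "beta values" better than Gaussians, here is a minimal Python sketch using a moment-matched beta distribution; it is not the authors' code, and the synthetic data stand in for real methylation measurements.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    x = rng.beta(a=0.5, b=3.0, size=2000)      # synthetic, boundary-heavy data in [0, 1]
    x = np.clip(x, 1e-12, 1 - 1e-12)           # guard against exact 0/1

    # Method-of-moments beta fit.
    m, v = x.mean(), x.var()
    common = m * (1 - m) / v - 1
    a_hat, b_hat = m * common, (1 - m) * common

    # Mean log-likelihood: the beta model captures the pile-up near 0,
    # which a Gaussian fit cannot.
    print("beta  :", stats.beta.logpdf(x, a_hat, b_hat).mean())
    print("normal:", stats.norm.logpdf(x, x.mean(), x.std()).mean())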

  4. Comparisons of Non-Gaussian Statistical Models in DNA Methylation Analysis

    PubMed Central

    Ma, Zhanyu; Teschendorff, Andrew E.; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-01-01

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance. PMID:24937687

  5. Statistical mechanics of protein structural transitions: Insights from the island model

    PubMed Central

    Kobayashi, Yukio

    2016-01-01

    The so-called island model of protein structural transition holds that hydrophobic interactions are the key to both the folding and function of proteins. Herein, the genesis and statistical mechanical basis of the island model of transitions are reviewed, by presenting the results of simulations of such transitions. Elucidating the physicochemical mechanism of protein structural formation is the foundation for understanding the hierarchical structure of life at the microscopic level. Based on the results obtained to date using the island model, remaining problems and future work in the field of protein structures are discussed, referencing Professor Saitô’s views on the hierarchic structure of science. PMID:28409078

  6. ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prigogine, I.; Balescu, R.; Henin, F.

    1960-12-01

    Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)

  7. Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity

    PubMed Central

    Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny

    2015-01-01

    Recent experimental breakthroughs have finally made it possible to implement in vitro reaction kinetics (so-called enzyme-based logic) that code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way for the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g., thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can serve this purpose: here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single- and double-ligand systems, with the purpose of exploring their practical capabilities to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing the resulting findings quantitatively on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences with classical cooperativity (and anti-cooperativity). PMID:25976626
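
    For orientation, the single-ligand Monod-Wyman-Changeux (MWC) state function, the building block behind such gates, can be sketched as follows; the parameter values (n, L, kR, kT) are illustrative, and the article's two-ligand construction generalizes this.

    # Minimal sketch of the MWC state function; parameters are illustrative.
    def mwc_active_fraction(s, n=4, L=1000.0, kR=1.0, kT=100.0):
        """Fraction of proteins in the active (R) state at ligand concentration s."""
        a = s / kR            # ligand activity in the R state
        c = kR / kT           # ratio of dissociation constants
        num = (1 + a) ** n
        return num / (num + L * (1 + c * a) ** n)

    # Thresholding the sigmoidal response yields a noisy YES gate; the article's
    # two-ligand extension plays the analogous role for AND/OR behaviour.
    for s in (0.01, 1.0, 100.0):
        print(s, round(mwc_active_fraction(s), 3))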

  8. Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity.

    PubMed

    Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny

    2015-05-15

    Recent experimental breakthroughs have finally made it possible to implement in vitro reaction kinetics (so-called enzyme-based logic) that code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way for the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g., thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can serve this purpose: here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single- and double-ligand systems, with the purpose of exploring their practical capabilities to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing the resulting findings quantitatively on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences with classical cooperativity (and anti-cooperativity).

  9. Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny

    2015-05-01

    Recent experimental breakthroughs have finally made it possible to implement in vitro reaction kinetics (so-called enzyme-based logic) that code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way for the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g., thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can serve this purpose: here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single- and double-ligand systems, with the purpose of exploring their practical capabilities to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing the resulting findings quantitatively on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences with classical cooperativity (and anti-cooperativity).

  10. Statistical learning and probabilistic prediction in music cognition: mechanisms of stylistic enculturation.

    PubMed

    Pearce, Marcus T

    2018-05-11

    Music perception depends on internal psychological models derived through exposure to a musical culture. It is hypothesized that this musical enculturation depends on two cognitive processes: (1) statistical learning, in which listeners acquire internal cognitive models of statistical regularities present in the music to which they are exposed; and (2) probabilistic prediction based on these learned models that enables listeners to organize and process their mental representations of music. To corroborate these hypotheses, I review research that uses a computational model of probabilistic prediction based on statistical learning (the information dynamics of music (IDyOM) model) to simulate data from empirical studies of human listeners. The results show that a broad range of psychological processes involved in music perception (expectation, emotion, memory, similarity, segmentation, and meter) can be understood in terms of a single, underlying process of probabilistic prediction using learned statistical models. Furthermore, IDyOM simulations of listeners from different musical cultures demonstrate that statistical learning can plausibly predict causal effects of differential cultural exposure to musical styles, providing a quantitative model of cultural distance. Understanding the neural basis of musical enculturation will benefit from close coordination between empirical neuroimaging and computational modeling of underlying mechanisms, as outlined here. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
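
    IDyOM itself combines multiple long- and short-term models, but the core quantity it computes, the information content of each event under a learned statistical model, can be illustrated with a toy first-order Markov predictor (everything below is a simplified stand-in, not IDyOM code).

    from collections import Counter, defaultdict
    from math import log2

    corpus = ["CDEFG", "CDECD", "EFGEF"]          # toy "melodies"
    counts = defaultdict(Counter)
    for melody in corpus:
        for prev, nxt in zip(melody, melody[1:]):
            counts[prev][nxt] += 1

    def information_content(prev, nxt, alpha=1.0, alphabet="CDEFGAB"):
        # Laplace-smoothed conditional probability, then -log2.
        c = counts[prev]
        p = (c[nxt] + alpha) / (sum(c.values()) + alpha * len(alphabet))
        return -log2(p)

    print(information_content("C", "D"))   # expected continuation: low information content
    print(information_content("C", "B"))   # surprising continuation: high information content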

  11. [Applications of mathematical statistics methods on compatibility researches of traditional Chinese medicines formulae].

    PubMed

    Mai, Lan-Yin; Li, Yi-Xuan; Chen, Yong; Xie, Zhen; Li, Jie; Zhong, Ming-Yu

    2014-05-01

    The compatibility of traditional Chinese medicines (TCMs) formulae, containing enormous information, is a complex component system. Applying mathematical statistics methods to compatibility research of TCM formulae has great significance for promoting the modernization of TCMs and for improving the clinical efficacy and optimization of formulae. As a tool for quantitative analysis, data inference, and exploring the inherent rules of substances, mathematical statistics methods can reveal the working mechanisms of the compatibility of TCM formulae both qualitatively and quantitatively. By reviewing studies that apply mathematical statistics methods, this paper summarizes the field from the perspectives of dosage optimization, efficacy, changes of chemical components, and the rules of incompatibility and contraindication of formulae, and provides references for further studying and revealing the working mechanisms and connotations of TCMs.

  12. Quantum theory of multiscale coarse-graining.

    PubMed

    Han, Yining; Jin, Jaehyeok; Wagner, Jacob W; Voth, Gregory A

    2018-03-14

    Coarse-grained (CG) models serve as a powerful tool to simulate molecular systems at much longer temporal and spatial scales. Previously, CG models and methods have been built upon classical statistical mechanics. The present paper develops a theory and numerical methodology for coarse-graining in quantum statistical mechanics, by generalizing the multiscale coarse-graining (MS-CG) method to quantum Boltzmann statistics. A rigorous derivation of the sufficient thermodynamic consistency condition is first presented via imaginary time Feynman path integrals. It identifies the optimal choice of CG action functional and effective quantum CG (qCG) force field to generate a quantum MS-CG (qMS-CG) description of the equilibrium system that is consistent with the quantum fine-grained model projected onto the CG variables. A variational principle then provides a class of algorithms for optimally approximating the qMS-CG force fields. Specifically, a variational method based on force matching, which was also adopted in the classical MS-CG theory, is generalized to quantum Boltzmann statistics. The qMS-CG numerical algorithms and practical issues in implementing this variational minimization procedure are also discussed. Then, two numerical examples are presented to demonstrate the method. Finally, as an alternative strategy, a quasi-classical approximation for the thermal density matrix expressed in the CG variables is derived. This approach provides an interesting physical picture for coarse-graining in quantum Boltzmann statistical mechanics in which the consistency with the quantum particle delocalization is obviously manifest, and it opens up an avenue for using path integral centroid-based effective classical force fields in a coarse-graining methodology.
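
    The variational force-matching step is easiest to see in its classical form, which the paper generalizes to quantum Boltzmann statistics: fit CG force-field parameters by least squares against mapped fine-grained forces. The sketch below uses synthetic Lennard-Jones-like data and an illustrative linear basis, not the paper's systems.

    import numpy as np

    rng = np.random.default_rng(1)
    r = rng.uniform(0.8, 2.0, size=500)             # sampled CG pair distances
    # "Mapped" fine-grained forces: noisy LJ pair force with epsilon = sigma = 1.
    f_ref = 24 * (2 / r**13 - 1 / r**7) + rng.normal(0, 0.1, r.size)

    # Linear basis for the CG pair force: f(r) = c1/r^13 + c2/r^7.
    A = np.column_stack([r**-13, r**-7])
    coef, *_ = np.linalg.lstsq(A, f_ref, rcond=None)
    print("fitted coefficients:", coef)             # ~ (48, -24) for this toy LJ data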

  13. Proceedings for the Annual Symposium and Exhibition on Situational Awareness in the Tactical Air Environment, (2nd), Held at Patuxent River, Maryland, on 3-4 June 1997

    DTIC Science & Technology

    1997-06-01

    made based on a learning mechanism. Traditional statistical regression and neural network approaches offer some utility, but suffer from practical...Columbus, OH. Kraiger, K., Ford, J. K., & Salas, E. (1993). Application of cognitive, skill- based , and affective theories of learning outcomes to new...and Feature Effects 151 Enhanced Spatial State Feedback for Night Vision Goggle Displays 159 Statistical Network Applications of Decision Aiding for

  14. Confounding in statistical mediation analysis: What it is and how to address it.

    PubMed

    Valente, Matthew J; Pelham, William E; Smyth, Heather; MacKinnon, David P

    2017-11-01

    Psychology researchers are often interested in mechanisms underlying how randomized interventions affect outcomes such as substance use and mental health. Mediation analysis is a common statistical method for investigating psychological mechanisms that has benefited from exciting new methodological improvements over the last 2 decades. One of the most important new developments is methodology for estimating causal mediated effects using the potential outcomes framework for causal inference. Potential outcomes-based methods developed in epidemiology and statistics have important implications for understanding psychological mechanisms. We aim to provide a concise introduction to and illustration of these new methods and emphasize the importance of confounder adjustment. First, we review the traditional regression approach for estimating mediated effects. Second, we describe the potential outcomes framework. Third, we define what a confounder is and how the presence of a confounder can provide misleading evidence regarding mechanisms of interventions. Fourth, we describe experimental designs that can help rule out confounder bias. Fifth, we describe new statistical approaches to adjust for measured confounders of the mediator-outcome relation and sensitivity analyses to probe effects of unmeasured confounders on the mediated effect. All approaches are illustrated with application to a real counseling intervention dataset. Counseling psychologists interested in understanding the causal mechanisms of their interventions can benefit from incorporating the most up-to-date techniques into their mediation analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
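
    The "traditional regression approach" reviewed first in the article is the product-of-coefficients estimator; a minimal sketch (synthetic data, no confounder adjustment) looks like this.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 1000
    x = rng.integers(0, 2, n).astype(float)      # randomized intervention
    m = 0.5 * x + rng.normal(0, 1, n)            # mediator
    y = 0.7 * m + 0.2 * x + rng.normal(0, 1, n)  # outcome

    def coefs(y, predictors):
        # Ordinary least squares with an intercept column.
        X = np.column_stack([np.ones(len(y))] + list(predictors))
        return np.linalg.lstsq(X, y, rcond=None)[0]

    a = coefs(m, [x])[1]             # effect of X on M
    b = coefs(y, [x, m])[2]          # effect of M on Y, adjusting for X
    print("mediated effect a*b ≈", a * b)   # ~ 0.35 in this toy example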

  15. Mechanical Impact Testing: A Statistical Measurement

    NASA Technical Reports Server (NTRS)

    Engel, Carl D.; Herald, Stephen D.; Davis, S. Eddie

    2005-01-01

    In the decades since the 1950s, when NASA first developed mechanical impact testing of materials, researchers have continued efforts to gain a better understanding of the chemical, mechanical, and thermodynamic nature of the phenomenon. The impact mechanism is a real combustion ignition mechanism that needs to be understood in the design of an oxygen system. The use of test data from this test method has been questioned due to the lack of a clear method for applying the data and the variability found between tests, material batches, and facilities. This effort explores a large database that has accumulated over a number of years and characterizes its overall nature. Moreover, testing was performed to determine the statistical nature of the test procedure to help establish sample-size guidelines for material characterization. The current method of determining a pass/fail criterion, based on light emission, sound report, or material charring, is questioned.
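
    One standard way to turn pass/fail impact data into sample-size guidance, consistent with the statistical framing above though not necessarily the paper's exact procedure, is the binomial zero-failure bound.

    from math import ceil, log

    def samples_for_zero_ignitions(p_max, confidence=0.95):
        # Smallest n of consecutive non-ignitions such that (1 - p_max)^n <= 1 - confidence,
        # i.e., enough tests to claim the ignition probability is below p_max.
        return ceil(log(1 - confidence) / log(1 - p_max))

    print(samples_for_zero_ignitions(0.05))   # 59 clean tests bound p < 5% at 95% confidence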

  16. Cation Selectivity in Biological Cation Channels Using Experimental Structural Information and Statistical Mechanical Simulation

    PubMed Central

    Finnerty, Justin John

    2015-01-01

    Cation selective channels constitute the gate for ion currents through the cell membrane. Here we present an improved statistical mechanical model based on atomistic structural information, cation hydration state and without tuned parameters that reproduces the selectivity of biological Na+ and Ca2+ ion channels. The importance of the inclusion of step-wise cation hydration in these results confirms the essential role partial dehydration plays in the bacterial Na+ channels. The model, proven reliable against experimental data, could be straightforwardly used for designing Na+ and Ca2+ selective nanopores. PMID:26460827

  17. Statistical Mechanics of the Delayed Reward-Based Learning with Node Perturbation

    NASA Astrophysics Data System (ADS)

    Saito, Hiroshi; Katahira, Kentaro; Okanoya, Kazuo; Okada, Masato

    2010-06-01

    In reward-based learning, reward is typically given with some delay after a behavior that causes the reward. In the machine learning literature, the framework of the eligibility trace has been used as one of the solutions for handling delayed reward in reinforcement learning. Recent studies suggest that the eligibility trace is important for a difficult neuroscience problem known as the "distal reward problem". Node perturbation is one of the stochastic gradient methods among the many kinds of reinforcement learning implementations; it estimates the gradient by introducing perturbations into a network. Since the stochastic gradient method does not require the derivative of an objective function, it is expected to be able to account for the learning mechanism of a complex system, like a brain. We study node perturbation with the eligibility trace as a specific example of delayed reward-based learning and analyze it using a statistical mechanics approach. As a result, we obtain the optimal time constant of the eligibility trace with respect to the reward delay and show the existence of unlearnable parameter configurations.
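
    A schematic implementation of node perturbation with an eligibility trace under delayed reward is sketched below; the paper analyzes this setting analytically, and all hyperparameter values here are illustrative.

    import numpy as np

    rng = np.random.default_rng(3)
    dim, eta, sigma, gamma, delay = 10, 0.05, 0.1, 0.7, 3
    w_teacher = rng.normal(size=dim)       # target mapping to be learned
    w = np.zeros(dim)
    trace, pending = np.zeros(dim), []

    for t in range(5000):
        x = rng.normal(size=dim) / np.sqrt(dim)
        xi = rng.normal(0, sigma)                  # node perturbation
        y = w @ x + xi
        reward = -(y - w_teacher @ x) ** 2         # reward: negative squared error
        trace = gamma * trace + xi * x             # eligibility trace of perturbed inputs
        pending.append(reward)
        if len(pending) > delay:                   # reward arrives after a delay
            w += eta * pending.pop(0) * trace
    overlap = w @ w_teacher / (np.linalg.norm(w) * np.linalg.norm(w_teacher))
    print("teacher-student overlap:", overlap)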

  18. Forecasting runout of rock and debris avalanches

    USGS Publications Warehouse

    Iverson, Richard M.; Evans, S.G.; Mugnozza, G.S.; Strom, A.; Hermanns, R.L.

    2006-01-01

    Physically based mathematical models and statistically based empirical equations each may provide useful means of forecasting runout of rock and debris avalanches. This paper compares the foundations, strengths, and limitations of a physically based model and a statistically based forecasting method, both of which were developed to predict runout across three-dimensional topography. The chief advantage of the physically based model results from its ties to physical conservation laws and well-tested axioms of soil and rock mechanics, such as the Coulomb friction rule and effective-stress principle. The output of this model provides detailed information about the dynamics of avalanche runout, at the expense of high demands for accurate input data, numerical computation, and experimental testing. In comparison, the statistical method requires relatively modest computation and no input data except identification of prospective avalanche source areas and a range of postulated avalanche volumes. Like the physically based model, the statistical method yields maps of predicted runout, but it provides no information on runout dynamics. Although the two methods differ significantly in their structure and objectives, insights gained from one method can aid refinement of the other.
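
    For orientation, statistically based runout forecasting of this kind typically rests on volume-scaling relations of the form

    $$A = c_1\,V^{2/3}, \qquad B = c_2\,V^{2/3},$$

    where V is the postulated avalanche volume, A and B are the inundated cross-sectional and planimetric areas, and c_1, c_2 are coefficients calibrated statistically against field data (the calibrated values themselves are not reproduced here).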

  19. Statistical analysis of Geopotential Height (GH) timeseries based on Tsallis non-extensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Karakatsanis, L. P.; Iliopoulos, A. C.; Pavlos, E. G.; Pavlos, G. P.

    2018-02-01

    In this paper, we perform statistical analysis of time series deriving from Earth's climate. The time series concern Geopotential Height (GH) and correspond to temporal and spatial components of the global distribution of monthly average values during the period 1948-2012. The analysis is based on Tsallis non-extensive statistical mechanics, in particular on the estimation of Tsallis' q-triplet, namely {qstat, qsens, qrel}, the reconstructed phase space, the estimation of the correlation dimension, and the Hurst exponent from rescaled range (R/S) analysis. The deviation of the Tsallis q-triplet from unity indicates a non-Gaussian (Tsallis q-Gaussian) non-extensive character with heavy-tailed probability density functions (PDFs), multifractal behavior, and long-range dependences for all time series considered. Noticeable differences in the q-triplet estimates were also found between time series from distinct spatial or temporal regions. Moreover, the reconstructed phase space revealed a lower-dimensional fractal set in the GH dynamical phase space (strong self-organization), and the estimation of the Hurst exponent indicated multifractality, non-Gaussianity, and persistence. The analysis provides significant information for identifying and characterizing the dynamical characteristics of Earth's climate.
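
    Of the tools listed above, the rescaled-range (R/S) estimate of the Hurst exponent is the most compact to sketch; window sizes and other choices below are illustrative, not those of the paper.

    import numpy as np

    def hurst_rs(x, window_sizes=(8, 16, 32, 64, 128)):
        x = np.asarray(x, dtype=float)
        logs_n, logs_rs = [], []
        for n in window_sizes:
            rs = []
            for start in range(0, len(x) - n + 1, n):
                w = x[start:start + n]
                z = np.cumsum(w - w.mean())          # cumulative deviation profile
                r = z.max() - z.min()                # range of the profile
                s = w.std()                          # standard deviation
                if s > 0:
                    rs.append(r / s)
            logs_n.append(np.log(n))
            logs_rs.append(np.log(np.mean(rs)))
        return np.polyfit(logs_n, logs_rs, 1)[0]     # slope = Hurst exponent H

    # ≈0.5 for white noise (slightly above, due to small-sample bias); H > 0.5 means persistence.
    print(hurst_rs(np.random.default_rng(4).normal(size=4096)))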

  20. Spin Glass a Bridge Between Quantum Computation and Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Ohzeki, Masayuki

    2013-09-01

    In this chapter, we present two fascinating topics lying between quantum information processing and statistical mechanics. First, we introduce an elaborate technique, the surface code, to prepare a particular quantum state with robustness against decoherence. Interestingly, the theoretical limitation of the surface code, the accuracy threshold for restoring the quantum state, has a close connection with the problem of the phase transition in a special model known as spin glasses, one of the most active research areas in statistical mechanics. The phase transition in spin glasses is an intractable problem, since we must deal with a many-body system with complicated interactions whose signs change depending on the distance between spins. Fortunately, recent progress in spin-glass theory enables us to predict the precise location of the critical point at which the phase transition occurs. This means that statistical mechanics is available for revealing one of the most interesting parts of quantum information processing. We show how to import this special tool of statistical mechanics into the problem of the accuracy threshold in quantum computation. Second, we describe another interesting technique that employs quantum nature, quantum annealing. The purpose of quantum annealing is to search for the most favored solution of a multivariable function, namely to solve an optimization problem. The most typical instance is the traveling salesman problem: finding the minimum-length tour that visits all cities. In quantum annealing, we introduce quantum fluctuations to drive a particular system with an artificial Hamiltonian whose ground state represents the optimal solution of the specific problem we desire to solve. Inducing the quantum fluctuations gives rise to the quantum tunneling effect, which allows nontrivial hopping from state to state. We then sketch a strategy to control the quantum fluctuations so as to reach the ground state efficiently. Such a generic framework is called quantum annealing; its most typical instance is quantum adiabatic computation, based on the adiabatic theorem. Quantum adiabatic computation, as discussed in another chapter, unfortunately has a crucial bottleneck for some optimization problems. We introduce several recent attempts to overcome this weak point by using developments in statistical mechanics. Through both topics, we shed light on the birth of an interdisciplinary field between quantum mechanics and statistical mechanics.
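
    As a concrete anchor for the quantum-annealing discussion, the standard interpolating Hamiltonian (transverse-field Ising form; notation generic rather than the chapter's) is

    $$\hat{H}(t) \;=\; -\,s(t)\sum_{\langle ij\rangle} J_{ij}\,\hat{\sigma}^{z}_{i}\hat{\sigma}^{z}_{j} \;-\; \big(1-s(t)\big)\,\Gamma\sum_{i}\hat{\sigma}^{x}_{i}, \qquad s(0)=0,\; s(T)=1,$$

    so the system starts in the trivial ground state of the transverse field and, if the schedule s(t) is slow enough (adiabatic theorem), ends in the ground state of the Ising term encoding the optimal solution.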

  1. On modelling the interaction between two rotating bodies with statistically distributed features: an application to dressing of grinding wheels

    NASA Astrophysics Data System (ADS)

    Spampinato, A.; Axinte, D. A.

    2017-12-01

    The mechanisms of interaction between bodies with statistically arranged features present characteristics common to different abrasive processes, such as dressing of abrasive tools. In contrast with the current empirical approach used to estimate the results of operations based on attritive interactions, the method we present in this paper allows us to predict the output forces and the topography of a simulated grinding wheel for a set of specific operational parameters (speed ratio and radial feed-rate), providing a thorough understanding of the complex mechanisms regulating these processes. In modelling the dressing mechanisms, the abrasive characteristics of both bodies (grain size, geometry, inter-space and protrusion) are first simulated; thus, their interaction is simulated in terms of grain collisions. Exploiting a specifically designed contact/impact evaluation algorithm, the model simulates the collisional effects of the dresser abrasives on the grinding wheel topography (grain fracture/break-out). The method has been tested for the case of a diamond rotary dresser, predicting output forces within less than 10% error and obtaining experimentally validated grinding wheel topographies. The study provides a fundamental understanding of the dressing operation, enabling the improvement of its performance in an industrial scenario, while being of general interest in modelling collision-based processes involving statistically distributed elements.

  2. PREFACE: ELC International Meeting on Inference, Computation, and Spin Glasses (ICSG2013)

    NASA Astrophysics Data System (ADS)

    Kabashima, Yoshiyuki; Hukushima, Koji; Inoue, Jun-ichi; Tanaka, Toshiyuki; Watanabe, Osamu

    2013-12-01

    The close relationship between probability-based inference and statistical mechanics of disordered systems has been noted for some time. This relationship has provided researchers with a theoretical foundation in various fields of information processing for analytical performance evaluation and construction of efficient algorithms based on message-passing or Monte Carlo sampling schemes. The ELC International Meeting on 'Inference, Computation, and Spin Glasses (ICSG2013)', was held in Sapporo 28-30 July 2013. The meeting was organized as a satellite meeting of STATPHYS25 in order to offer a forum where concerned researchers can assemble and exchange information on the latest results and newly established methodologies, and discuss future directions of the interdisciplinary studies between statistical mechanics and information sciences. Financial support from Grant-in-Aid for Scientific Research on Innovative Areas, MEXT, Japan 'Exploring the Limits of Computation (ELC)' is gratefully acknowledged. We are pleased to publish 23 papers contributed by invited speakers of ICSG2013 in this volume of Journal of Physics: Conference Series. We hope that this volume will promote further development of this highly vigorous interdisciplinary field between statistical mechanics and information/computer science. Editors and ICSG2013 Organizing Committee: Koji Hukushima Jun-ichi Inoue (Local Chair of ICSG2013) Yoshiyuki Kabashima (Editor-in-Chief) Toshiyuki Tanaka Osamu Watanabe (General Chair of ICSG2013)

  3. The maximum entropy production principle: two basic questions.

    PubMed

    Martyushev, Leonid M

    2010-05-12

    The overwhelming majority of maximum entropy production applications to ecological and environmental systems are based on thermodynamics and statistical physics. Here, we briefly discuss the maximum entropy production principle and raise two questions: (i) can this principle be used as the basis for non-equilibrium thermodynamics and statistical mechanics, and (ii) is it possible to 'prove' the principle? We adduce one more proof, which is the most concise available today.

  4. Quantum mechanics: why complex Hilbert space?

    NASA Astrophysics Data System (ADS)

    Cassinelli, G.; Lahti, P.

    2017-10-01

    We outline a programme for an axiomatic reconstruction of quantum mechanics based on the statistical duality of states and effects that combines the use of a theorem of Solér with the idea of symmetry. We also discuss arguments favouring the choice of the complex field. This article is part of the themed issue `Second quantum revolution: foundational questions'.

  5. Mechanical properties of silicate glasses exposed to a low-Earth orbit

    NASA Technical Reports Server (NTRS)

    Wiedlocher, David E.; Tucker, Dennis S.; Nichols, Ron; Kinser, Donald L.

    1992-01-01

    The effects of a 5.8-year exposure to the low-Earth-orbit environment upon the mechanical properties of commercial optical fused silica, low-iron soda-lime-silica, Pyrex 7740, Vycor 7913, BK-7, and the glass ceramic Zerodur were examined. Mechanical testing employed the ASTM-F-394 piston on 3-ball method in a liquid nitrogen environment. Samples were exposed on the Long Duration Exposure Facility (LDEF) in two locations. Impacts were observed on all specimens except Vycor. Weibull analysis as well as a standard statistical evaluation were conducted. The Weibull analysis revealed no differences between control samples and the two exposed samples. We thus concluded that radiation components of the Earth orbital environment did not degrade the mechanical strength of the samples examined within the limits of experimental error. The upper bound of strength degradation for meteorite-impacted samples based upon statistical analysis and observation was 50 percent.

  6. Incorporating an Interactive Statistics Workshop into an Introductory Biology Course-Based Undergraduate Research Experience (CURE) Enhances Students’ Statistical Reasoning and Quantitative Literacy Skills †

    PubMed Central

    Olimpo, Jeffrey T.; Pevey, Ryan S.; McCabe, Thomas M.

    2018-01-01

    Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students’ reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce. PMID:29904549

  7. Incorporating an Interactive Statistics Workshop into an Introductory Biology Course-Based Undergraduate Research Experience (CURE) Enhances Students' Statistical Reasoning and Quantitative Literacy Skills.

    PubMed

    Olimpo, Jeffrey T; Pevey, Ryan S; McCabe, Thomas M

    2018-01-01

    Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students' reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce.

  8. Pseudocontingencies and Choice Behavior in Probabilistic Environments with Context-Dependent Outcomes

    ERIC Educational Resources Information Center

    Meiser, Thorsten; Rummel, Jan; Fleig, Hanna

    2018-01-01

    Pseudocontingencies are inferences about correlations in the environment that are formed on the basis of statistical regularities like skewed base rates or varying base rates across environmental contexts. Previous research has demonstrated that pseudocontingencies provide a pervasive mechanism of inductive inference in numerous social judgment…

  9. Linking Mechanics and Statistics in Epidermal Tissues

    NASA Astrophysics Data System (ADS)

    Kim, Sangwoo; Hilgenfeldt, Sascha

    2015-03-01

    Disordered cellular structures, such as foams, polycrystals, or living tissues, can be characterized by quantitative measurements of domain size and topology. In recent work, we showed that correlations between size and topology in 2D systems are sensitive to the shape (eccentricity) of the individual domains: From a local model of neighbor relations, we derived an analytical justification for the famous empirical Lewis law, confirming the theory with experimental data from cucumber epidermal tissue. Here, we go beyond this purely geometrical model and identify mechanical properties of the tissue as the root cause for the domain eccentricity and thus the statistics of tissue structure. The simple model approach is based on the minimization of an interfacial energy functional. Simulations with Surface Evolver show that the domain statistics depend on a single mechanical parameter, while parameter fluctuations from cell to cell play an important role in simultaneously explaining the shape distribution of cells. The simulations are in excellent agreement with experiments and analytical theory, and establish a general link between the mechanical properties of a tissue and its structure. The model is relevant to diagnostic applications in a variety of animal and plant tissues.
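
    For reference, the empirical Lewis law mentioned above is commonly written in normalized form as

    $$\bar{A}_n \;=\; \frac{\bar{A}}{4}\,(n-2),$$

    where \bar{A}_n is the mean area of n-sided cells and \bar{A} the overall mean cell area; the work described here derives such size-topology correlations from a local model and ties them to tissue mechanics.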

  10. Quantum mechanics: why complex Hilbert space?

    PubMed

    Cassinelli, G; Lahti, P

    2017-11-13

    We outline a programme for an axiomatic reconstruction of quantum mechanics based on the statistical duality of states and effects that combines the use of a theorem of Solér with the idea of symmetry. We also discuss arguments favouring the choice of the complex field. This article is part of the themed issue 'Second quantum revolution: foundational questions'. © 2017 The Author(s).

  11. James Clerk Maxwell and the Kinetic Theory of Gases: A Review Based on Recent Historical Studies

    ERIC Educational Resources Information Center

    Brush, Stephen G.

    1971-01-01

    Maxwell's four major papers and some shorter publications relating to kinetic theory and statistical mechanics are discussed in the light of subsequent research. Reviews Maxwell's ideas on such topics as velocity, distribution law, the theory of heat conduction, the mechanism of the radiometer effect, the ergodic hypothesis, and his views on the…

  12. cgDNAweb: a web interface to the cgDNA sequence-dependent coarse-grain model of double-stranded DNA.

    PubMed

    De Bruin, Lennart; Maddocks, John H

    2018-06-14

    The sequence-dependent statistical mechanical properties of fragments of double-stranded DNA are believed to be pertinent to its biological function at length scales from a few base pairs (or bp) to a few hundreds of bp, e.g. indirect read-out protein binding sites, nucleosome positioning sequences, phased A-tracts, etc. In turn, the equilibrium statistical mechanics behaviour of DNA depends upon its ground state configuration, or minimum free energy shape, as well as on its fluctuations as governed by its stiffness (in an appropriate sense). We here present cgDNAweb, which provides browser-based interactive visualization of the sequence-dependent ground states of double-stranded DNA molecules, as predicted by the underlying cgDNA coarse-grain rigid-base model of fragments with arbitrary sequence. The cgDNAweb interface is specifically designed to facilitate comparison between ground state shapes of different sequences. The server is freely available at cgDNAweb.epfl.ch with no login requirement.
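
    In Gaussian rigid-base models of this kind, the free energy of a fragment with internal coordinates w is quadratic, of the form

    $$U(\mathbf{w}) \;=\; \tfrac{1}{2}\,(\mathbf{w}-\hat{\mathbf{w}})^{T} K\,(\mathbf{w}-\hat{\mathbf{w}}),$$

    so the ground state visualized by cgDNAweb is the sequence-dependent minimizer \hat{\mathbf{w}}, while the sequence-dependent stiffness matrix K governs the fluctuations about it.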

  13. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iliopoulos, AS; Sun, X; Floros, D

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial signal/noise variations. An efficient multi-scale computational mechanism is developed to curtail processing latency. Spatially adaptive filtering may impact subsequent processing tasks such as reconstruction and numerical gradient computations for deformable registration. NIH Grant No. R01-184173.
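
    A minimal sketch of the local-statistics pyramid idea (illustrative only, not the authors' implementation): local mean and standard deviation at several window scales via box filtering, whose inter-scale differences indicate spatially varying noise.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def local_stats_pyramid(img, scales=(3, 7, 15, 31)):
        pyramid = {}
        for k in scales:
            mean = uniform_filter(img, size=k)                  # local mean
            sq_mean = uniform_filter(img * img, size=k)
            std = np.sqrt(np.maximum(sq_mean - mean**2, 0.0))   # local standard deviation
            pyramid[k] = (mean, std)
        return pyramid

    img = np.random.default_rng(5).normal(size=(64, 64))
    pyr = local_stats_pyramid(img)
    # Inter-scale differences of local statistics flag where the noise level varies.
    print(np.abs(pyr[3][1] - pyr[31][1]).mean())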

  14. Which Type of Risk Information to Use for Whom? Moderating Role of Outcome-Relevant Involvement in the Effects of Statistical and Exemplified Risk Information on Risk Perceptions.

    PubMed

    So, Jiyeon; Jeong, Se-Hoon; Hwang, Yoori

    2017-04-01

    The extant empirical research examining the effectiveness of statistical and exemplar-based health information is largely inconsistent. Under the premise that the inconsistency may be due to an unacknowledged moderator (O'Keefe, 2002), this study examined a moderating role of outcome-relevant involvement (Johnson & Eagly, 1989) in the effects of statistical and exemplified risk information on risk perception. Consistent with predictions based on elaboration likelihood model (Petty & Cacioppo, 1984), findings from an experiment (N = 237) concerning alcohol consumption risks showed that statistical risk information predicted risk perceptions of individuals with high, rather than low, involvement, while exemplified risk information predicted risk perceptions of those with low, rather than high, involvement. Moreover, statistical risk information contributed to negative attitude toward drinking via increased risk perception only for highly involved individuals, while exemplified risk information influenced the attitude through the same mechanism only for individuals with low involvement. Theoretical and practical implications for health risk communication are discussed.

  15. Simulating biochemical physics with computers: 1. Enzyme catalysis by phosphotriesterase and phosphodiesterase; 2. Integration-free path-integral method for quantum-statistical calculations

    NASA Astrophysics Data System (ADS)

    Wong, Kin-Yiu

    We have simulated two enzymatic reactions with molecular dynamics (MD) and combined quantum mechanical/molecular mechanical (QM/MM) techniques. One reaction is the hydrolysis of the insecticide paraoxon catalyzed by phosphotriesterase (PTE). PTE is a bioremediation candidate for environments contaminated by toxic nerve gases (e.g., sarin) or pesticides. Based on the potential of mean force (PMF) and the structural changes of the active site during the catalysis, we propose a revised reaction mechanism for PTE. Another reaction is the hydrolysis of the second-messenger cyclic adenosine 3'-5'-monophosphate (cAMP) catalyzed by phosphodiesterase (PDE). Cyclic-nucleotide PDE is a vital protein in signal-transduction pathways and thus a popular target for inhibition by drugs (e.g., Viagra®). A two-dimensional (2-D) free-energy profile has been generated showing that the catalysis by PDE proceeds in a two-step SN2-type mechanism. Furthermore, to characterize a chemical reaction mechanism in experiment, a direct probe is measuring kinetic isotope effects (KIEs). KIEs primarily arise from internuclear quantum-statistical effects, e.g., quantum tunneling and quantization of vibration. To systematically incorporate the quantum-statistical effects during MD simulations, we have developed an automated integration-free path-integral (AIF-PI) method based on Kleinert's variational perturbation theory for the centroid density of Feynman's path integral. Using this analytic method, we have performed ab initio path-integral calculations to study the origin of KIEs on several series of proton-transfer reactions from carboxylic acids to aryl substituted alpha-methoxystyrenes in water. In addition, we also demonstrate that the AIF-PI method can be used to systematically compute the exact value of zero-point energy (beyond the harmonic approximation) by simply minimizing the centroid effective potential.

  16. Towards the feasibility of using ultrasound to determine mechanical properties of tissues in a bioreactor.

    PubMed

    Mansour, Joseph M; Gu, Di-Win Marine; Chung, Chen-Yuan; Heebner, Joseph; Althans, Jake; Abdalian, Sarah; Schluchter, Mark D; Liu, Yiying; Welter, Jean F

    2014-10-01

    Our ultimate goal is to non-destructively evaluate mechanical properties of tissue-engineered (TE) cartilage using ultrasound (US). We used agarose gels as surrogates for TE cartilage. Previously, we showed that mechanical properties measured using conventional methods were related to those measured using US, which suggested a way to non-destructively predict mechanical properties of samples with known volume fractions. In this study, we sought to determine whether the mechanical properties of samples with unknown volume fractions could be predicted by US. Aggregate moduli were calculated for hydrogels as a function of SOS, based on concentration and density, using a poroelastic model. The data were used to train a statistical model, which we then used to predict volume fractions and mechanical properties of unknown samples. Young's and storage moduli were measured mechanically. The statistical model generally predicted the Young's moduli in compression to within <10% of their mechanically measured value. We defined positive linear correlations between the aggregate modulus predicted from US and both the storage and Young's moduli determined from mechanical tests. Mechanical properties of hydrogels with unknown volume fractions can be predicted successfully from US measurements. This method has the potential to predict mechanical properties of TE cartilage non-destructively in a bioreactor.
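
    The basic acoustic relation underlying such modulus-from-SOS estimates is the plane-wave identity

    $$M \;=\; \rho\,c^{2},$$

    where c is the measured speed of sound and ρ the density; in the paper's poroelastic setting the aggregate modulus plays this role, with the model supplying the refinements (the exact model is given in the paper, not reproduced here).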

  17. Towards the feasibility of using ultrasound to determine mechanical properties of tissues in a bioreactor

    PubMed Central

    Mansour, Joseph M.; Gu, Di-Win Marine; Chung, Chen-Yuan; Heebner, Joseph; Althans, Jake; Abdalian, Sarah; Schluchter, Mark D.; Liu, Yiying; Welter, Jean F.

    2016-01-01

    Introduction Our ultimate goal is to non-destructively evaluate mechanical properties of tissue-engineered (TE) cartilage using ultrasound (US). We used agarose gels as surrogates for TE cartilage. Previously, we showed that mechanical properties measured using conventional methods were related to those measured using US, which suggested a way to non-destructively predict mechanical properties of samples with known volume fractions. In this study, we sought to determine whether the mechanical properties of samples with unknown volume fractions could be predicted by US. Methods Aggregate moduli were calculated for hydrogels as a function of SOS, based on concentration and density, using a poroelastic model. The data were used to train a statistical model, which we then used to predict volume fractions and mechanical properties of unknown samples. Young's and storage moduli were measured mechanically. Results The statistical model generally predicted the Young's moduli in compression to within <10% of their mechanically measured value. We defined positive linear correlations between the aggregate modulus predicted from US and both the storage and Young's moduli determined from mechanical tests. Conclusions Mechanical properties of hydrogels with unknown volume fractions can be predicted successfully from US measurements. This method has the potential to predict mechanical properties of TE cartilage non-destructively in a bioreactor. PMID:25092421

  18. Markov Random Fields, Stochastic Quantization and Image Analysis

    DTIC Science & Technology

    1990-01-01

    Markov random fields based on the lattice Z2 have been extensively used in image analysis in a Bayesian framework as a-priori models for the...of Image Analysis can be given some fundamental justification then there is a remarkable connection between Probabilistic Image Analysis , Statistical Mechanics and Lattice-based Euclidean Quantum Field Theory.

  19. Prediction of Chemical Function: Model Development and Application

    EPA Science Inventory

    The United States Environmental Protection Agency’s Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (...

  20. Theory-based Bayesian Models of Inductive Inference

    DTIC Science & Technology

    2010-07-19

    Subjective randomness and natural scene statistics. Psychonomic Bulletin & Review . http://cocosci.berkeley.edu/tom/papers/randscenes.pdf Page 1...in press). Exemplar models as a mechanism for performing Bayesian inference. Psychonomic Bulletin & Review . http://cocosci.berkeley.edu/tom

  1. The energetic cost of walking: a comparison of predictive methods.

    PubMed

    Kramer, Patricia Ann; Sylvester, Adam D

    2011-01-01

    The energy that animals devote to locomotion has been of intense interest to biologists for decades and two basic methodologies have emerged to predict locomotor energy expenditure: those based on metabolic and those based on mechanical energy. Metabolic energy approaches share the perspective that prediction of locomotor energy expenditure should be based on statistically significant proxies of metabolic function, while mechanical energy approaches, which derive from many different perspectives, focus on quantifying the energy of movement. Some controversy exists as to which mechanical perspective is "best", but from first principles all mechanical methods should be equivalent if the inputs to the simulation are of similar quality. Our goals in this paper are 1) to establish the degree to which the various methods of calculating mechanical energy are correlated, and 2) to investigate to what degree the prediction methods explain the variation in energy expenditure. We use modern humans as the model organism in this experiment because their data are readily attainable, but the methodology is appropriate for use in other species. Volumetric oxygen consumption and kinematic and kinetic data were collected on 8 adults while walking at their self-selected slow, normal and fast velocities. Using hierarchical statistical modeling via ordinary least squares and maximum likelihood techniques, the predictive ability of several metabolic and mechanical approaches were assessed. We found that all approaches are correlated and that the mechanical approaches explain similar amounts of the variation in metabolic energy expenditure. Most methods predict the variation within an individual well, but are poor at accounting for variation between individuals. Our results indicate that the choice of predictive method is dependent on the question(s) of interest and the data available for use as inputs. Although we used modern humans as our model organism, these results can be extended to other species.

  2. Irradiation defect dispersions and effective dislocation mobility in strained ferritic grains: A statistical analysis based on 3D dislocation dynamics simulations

    NASA Astrophysics Data System (ADS)

    Li, Y.; Robertson, C.

    2018-06-01

    The influence of irradiation defect dispersions on plastic strain spreading is investigated by means of three-dimensional dislocation dynamics (DD) simulations, accounting for thermally activated slip and cross-slip mechanisms in Fe-2.5%Cr grains. The defect-induced evolutions of the effective screw dislocation mobility are evaluated by means of statistical comparisons, for various defect number density and defect size cases. Each comparison is systematically associated with a quantitative Defect-Induced Apparent Straining Temperature shift (or «ΔDIAT»), calculated without any adjustable parameters. In the investigated cases, the ΔDIAT level associated with a given defect dispersion closely replicates the measured ductile to brittle transition temperature shift (ΔDBTT) due to the same, actual defect dispersion. The results are further analyzed in terms of dislocation-based plasticity mechanisms and their possible relations with the dose-dependent changes of the ductile to brittle transition temperature.

  3. Interpreting support vector machine models for multivariate group wise analysis in neuroimaging

    PubMed Central

    Gaonkar, Bilwaj; Shinohara, Russell T; Davatzikos, Christos

    2015-01-01

    Machine learning based classification algorithms like support vector machines (SVMs) have shown great promise for turning high-dimensional neuroimaging data into clinically useful decision criteria. However, tracing imaging-based patterns that contribute significantly to classifier decisions remains an open problem. This is an issue of critical importance in imaging studies seeking to determine which anatomical or physiological imaging features contribute to the classifier's decision, thereby allowing users to critically evaluate the findings of such machine learning methods and to understand disease mechanisms. The majority of published work addresses the question of statistical inference for support vector classification using permutation tests based on SVM weight vectors. Such permutation testing ignores the SVM margin, which is critical in SVM theory. In this work we emphasize the use of a statistic that explicitly accounts for the SVM margin and show that the null distributions associated with this statistic are asymptotically normal. Further, our experiments show that this statistic is far less conservative than weight-based permutation tests and yet specific enough to tease out multivariate patterns in the data. Thus, we can better understand the multivariate patterns that the SVM uses for neuroimaging-based classification. PMID:26210913

  4. Statistical mechanics of soft-boson phase transitions

    NASA Technical Reports Server (NTRS)

    Gupta, Arun K.; Hill, Christopher T.; Holman, Richard; Kolb, Edward W.

    1991-01-01

    The existence of structure on large (100 Mpc) scales, and limits to anisotropies in the cosmic microwave background radiation (CMBR), have imperiled models of structure formation based solely upon the standard cold dark matter scenario. Novel scenarios, which may be compatible with large-scale structure and small CMBR anisotropies, invoke nonlinear fluctuations in the density appearing after recombination, accomplished via late-time phase transitions involving ultralow-mass scalar bosons. Herein, the statistical mechanics of such phase transitions is studied in several models involving naturally ultralow-mass pseudo-Nambu-Goldstone bosons (pNGB's). These models can exhibit several interesting effects at high temperature, which are believed to be generic possibilities for pNGB's.

  5. Statistical Analysis of Crystallization Database Links Protein Physico-Chemical Features with Crystallization Mechanisms

    PubMed Central

    Fusco, Diana; Barnum, Timothy J.; Bruno, Andrew E.; Luft, Joseph R.; Snell, Edward H.; Mukherjee, Sayan; Charbonneau, Patrick

    2014-01-01

    X-ray crystallography is the predominant method for obtaining atomic-scale information about biological macromolecules. Despite the success of the technique, obtaining well diffracting crystals still critically limits going from protein to structure. In practice, the crystallization process proceeds through knowledge-informed empiricism. Better physico-chemical understanding remains elusive because of the large number of variables involved, hence little guidance is available to systematically identify solution conditions that promote crystallization. To help determine relationships between macromolecular properties and their crystallization propensity, we have trained statistical models on samples for 182 proteins supplied by the Northeast Structural Genomics consortium. Gaussian processes, which capture trends beyond the reach of linear statistical models, distinguish between two main physico-chemical mechanisms driving crystallization. One is characterized by low levels of side chain entropy and has been extensively reported in the literature. The other identifies specific electrostatic interactions not previously described in the crystallization context. Because evidence for two distinct mechanisms can be gleaned both from crystal contacts and from solution conditions leading to successful crystallization, the model offers future avenues for optimizing crystallization screens based on partial structural information. The availability of crystallization data coupled with structural outcomes analyzed through state-of-the-art statistical models may thus guide macromolecular crystallization toward a more rational basis. PMID:24988076
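
    The record above does not specify the model implementation. The sketch below shows the general flavor of Gaussian-process classification on physico-chemical features; the two features, the synthetic outcome rule, and the kernel are illustrative assumptions, not the authors' trained model.

      # Minimal Gaussian-process classification sketch on invented features.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessClassifier
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(1)
      n = 182                                        # number of proteins in the study
      side_chain_entropy = rng.normal(size=n)        # hypothetical, standardized
      surface_charge = rng.normal(size=n)            # hypothetical, standardized
      X = np.column_stack([side_chain_entropy, surface_charge])
      # invented outcome: low entropy OR strong electrostatics favors crystals
      y = ((side_chain_entropy < -0.3) | (np.abs(surface_charge) > 1.0)).astype(int)

      gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0)).fit(X, y)
      print("training accuracy:", gpc.score(X, y))
      print("P(crystal | low entropy):", gpc.predict_proba([[-1.0, 0.0]])[0, 1])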

  6. Statistical analysis of crystallization database links protein physico-chemical features with crystallization mechanisms.

    PubMed

    Fusco, Diana; Barnum, Timothy J; Bruno, Andrew E; Luft, Joseph R; Snell, Edward H; Mukherjee, Sayan; Charbonneau, Patrick

    2014-01-01

    X-ray crystallography is the predominant method for obtaining atomic-scale information about biological macromolecules. Despite the success of the technique, obtaining well diffracting crystals still critically limits going from protein to structure. In practice, the crystallization process proceeds through knowledge-informed empiricism. Better physico-chemical understanding remains elusive because of the large number of variables involved, hence little guidance is available to systematically identify solution conditions that promote crystallization. To help determine relationships between macromolecular properties and their crystallization propensity, we have trained statistical models on samples for 182 proteins supplied by the Northeast Structural Genomics consortium. Gaussian processes, which capture trends beyond the reach of linear statistical models, distinguish between two main physico-chemical mechanisms driving crystallization. One is characterized by low levels of side chain entropy and has been extensively reported in the literature. The other identifies specific electrostatic interactions not previously described in the crystallization context. Because evidence for two distinct mechanisms can be gleaned both from crystal contacts and from solution conditions leading to successful crystallization, the model offers future avenues for optimizing crystallization screens based on partial structural information. The availability of crystallization data coupled with structural outcomes analyzed through state-of-the-art statistical models may thus guide macromolecular crystallization toward a more rational basis.

  7. Properties of a memory network in psychology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wedemann, Roseli S.; Donangelo, Raul; Carvalho, Luis A. V. de

    We have previously described neurotic psychopathology and psychoanalytic working-through by an associative memory mechanism, based on a neural network model, where memory was modelled by a Boltzmann machine (BM). Since brain neural topology is selectively structured, we simulated known microscopic mechanisms that control synaptic properties, showing that the network self-organizes to a hierarchical, clustered structure. Here, we show some statistical mechanical properties of the complex networks which result from this self-organization. They indicate that a generalization of the BM may be necessary to model memory.

  8. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show the PFEM is a very powerful tool in determining the second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties and crack length, orientation, and location.

  9. Properties of a memory network in psychology

    NASA Astrophysics Data System (ADS)

    Wedemann, Roseli S.; Donangelo, Raul; de Carvalho, Luís A. V.

    2007-12-01

    We have previously described neurotic psychopathology and psychoanalytic working-through by an associative memory mechanism, based on a neural network model, where memory was modelled by a Boltzmann machine (BM). Since brain neural topology is selectively structured, we simulated known microscopic mechanisms that control synaptic properties, showing that the network self-organizes to a hierarchical, clustered structure. Here, we show some statistical mechanical properties of the complex networks which result from this self-organization. They indicate that a generalization of the BM may be necessary to model memory.

  10. The need for conducting forensic analysis of decommissioned bridges.

    DOT National Transportation Integrated Search

    2014-01-01

    A limiting factor in current bridge management programs is a lack of detailed knowledge of bridge deterioration mechanisms and processes. The current state of the art is to predict future condition using statistical forecasting models based upon ...

  11. Acoustic Mechanisms of a Species-Based Discrimination of the chick-a-dee Call in Sympatric Black-Capped (Poecile atricapillus) and Mountain Chickadees (P. gambeli).

    PubMed

    Guillette, Lauren M; Farrell, Tara M; Hoeschele, Marisa; Sturdy, Christopher B

    2010-01-01

    Previous perceptual research with black-capped and mountain chickadees has demonstrated that these species treat each other's namesake chick-a-dee calls as belonging to separate, open-ended categories. Further, the terminal dee portion of the call has been implicated as the most prominent species marker. However, statistical classification using acoustic summary features suggests that all note-types contained within the chick-a-dee call should be sufficient for species classification. The current study seeks to better understand the note-type based mechanisms underlying species-based classification of the chick-a-dee call by black-capped and mountain chickadees. In two complementary operant discrimination experiments, both species were trained to discriminate the species of the signaler using either entire chick-a-dee calls or individual note-types from chick-a-dee calls. In agreement with previous perceptual work, we find that the D note had significant stimulus control over species-based discrimination. However, in line with statistical classifications, we find that all note-types carry species information. We discuss reasons why the most easily discriminated note-types are likely candidates to carry species-based cues.

  12. Many-Body Localization and Thermalization in Quantum Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Nandkishore, Rahul; Huse, David A.

    2015-03-01

    We review some recent developments in the statistical mechanics of isolated quantum systems. We provide a brief introduction to quantum thermalization, paying particular attention to the eigenstate thermalization hypothesis (ETH) and the resulting single-eigenstate statistical mechanics. We then focus on a class of systems that fail to quantum thermalize and whose eigenstates violate the ETH: These are the many-body Anderson-localized systems; their long-time properties are not captured by the conventional ensembles of quantum statistical mechanics. These systems can forever locally remember information about their local initial conditions and are thus of interest for possibilities of storing quantum information. We discuss key features of many-body localization (MBL) and review a phenomenology of the MBL phase. Single-eigenstate statistical mechanics within the MBL phase reveal dynamically stable ordered phases, and phase transitions among them, that are invisible to equilibrium statistical mechanics and can occur at high energy and low spatial dimensionality, where equilibrium ordering is forbidden.

  13. Statistical analysis of early failures in electromigration

    NASA Astrophysics Data System (ADS)

    Gall, M.; Capasso, C.; Jawarani, D.; Hernandez, R.; Kawasaki, H.; Ho, P. S.

    2001-07-01

    The detection of early failures in electromigration (EM) and the complicated statistical nature of this important reliability phenomenon have been difficult issues to treat in the past. A satisfactory experimental approach for the detection and the statistical analysis of early failures has not yet been established. This is mainly due to the rare occurrence of early failures and difficulties in testing of large sample populations. Furthermore, experimental data on the EM behavior as a function of varying number of failure links are scarce. In this study, a technique utilizing large interconnect arrays in conjunction with the well-known Wheatstone Bridge is presented. Three types of structures with a varying number of Ti/TiN/Al(Cu)/TiN-based interconnects were used, starting from a small unit of five lines in parallel. A serial arrangement of this unit enabled testing of interconnect arrays encompassing 480 possible failure links. In addition, a Wheatstone Bridge-type wiring using four large arrays in each device enabled simultaneous testing of 1920 interconnects. In conjunction with a statistical deconvolution to the single interconnect level, the results indicate that the electromigration failure mechanism studied here follows perfect lognormal behavior down to the four sigma level. The statistical deconvolution procedure is described in detail. Over a temperature range from 155 to 200 °C, a total of more than 75 000 interconnects were tested. None of the samples have shown an indication of early, or alternate, failure mechanisms. The activation energy of the EM mechanism studied here, namely the Cu incubation time, was determined to be Q=1.08±0.05 eV. We surmise that interface diffusion of Cu along the Al(Cu) sidewalls and along the top and bottom refractory layers, coupled with grain boundary diffusion within the interconnects, constitutes the Cu incubation mechanism.
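
    The statistical deconvolution from array-level to single-interconnect lifetimes is described in the paper itself, not in this record. A minimal sketch of the core weakest-link step is given below, assuming a series arrangement of N identical links so that F_1(t) = 1 - (1 - F_N(t))^(1/N); the synthetic lognormal lifetimes are illustrative.

      # Weakest-link deconvolution sketch with synthetic lognormal lifetimes.
      import numpy as np
      from scipy import stats

      N = 480                                 # failure links per array (from the text)
      rng = np.random.default_rng(2)
      t_single = rng.lognormal(mean=5.0, sigma=0.4, size=(1000, N))
      t_array = t_single.min(axis=1)          # each array fails at its weakest link

      t_sorted = np.sort(t_array)
      F_N = (np.arange(1, len(t_sorted) + 1) - 0.5) / len(t_sorted)  # empirical CDF
      F_1 = 1.0 - (1.0 - F_N) ** (1.0 / N)    # deconvolved single-link CDF

      # on a lognormal probability plot, log(t) is linear in the normal quantile
      z = stats.norm.ppf(F_1)
      sigma, mu = np.polyfit(z, np.log(t_sorted), 1)
      print(f"recovered mu = {mu:.2f} (true 5.0), sigma = {sigma:.2f} (true 0.4)")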

  14. Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.

    DTIC Science & Technology

    1983-09-01

    Research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems has grown markedly... probability, statistics, soil mechanics, rock mechanics, and engineering geology. 2. The purpose of this report is to make available to the U.S... Deformation; Dynamic Response Analysis; Seepage, Soil Permeability and Piping; Earthquake Engineering, Seismology, Settlement and Heave; Seismic Risk Analysis

  15. Computer program for calculation of ideal gas thermodynamic data

    NASA Technical Reports Server (NTRS)

    Gordon, S.; Mc Bride, B. J.

    1968-01-01

    Computer program calculates ideal gas thermodynamic properties for any species for which molecular constant data are available. Partition functions and derivatives from formulas based on statistical mechanics are provided by the program, which is written in FORTRAN 4 and MAP.
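
    The original program is FORTRAN; as a small modern analogue of the same statistical-mechanics formulas, the sketch below computes the heat capacity of a diatomic ideal gas from the rigid-rotor and harmonic-oscillator partition function contributions. The vibrational frequency value is illustrative.

      # Ideal-gas Cp from partition-function formulas (illustrative sketch).
      import numpy as np

      k_B = 1.380649e-23      # J/K
      h = 6.62607015e-34      # J*s
      N_A = 6.02214076e23     # 1/mol

      def cp_diatomic(T, nu):
          """Molar Cp of a rigid-rotor/harmonic-oscillator diatomic ideal gas."""
          x = h * nu / (k_B * T)                        # vibrational quantum / kT
          c_vib = x**2 * np.exp(x) / np.expm1(x)**2     # Einstein function, units of R
          cv = 1.5 + 1.0 + c_vib                        # translation + rotation + vibration
          return (cv + 1.0) * k_B * N_A                 # Cp = Cv + R for an ideal gas

      for T in (300.0, 1000.0, 3000.0):
          print(f"T = {T:6.0f} K   Cp = {cp_diatomic(T, nu=6.5e13):5.2f} J/(mol K)")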

  16. A Generalized Quantum Theory

    NASA Astrophysics Data System (ADS)

    Niestegge, Gerd

    2014-09-01

    In quantum mechanics, the self-adjoint Hilbert space operators play a triple role as observables, generators of the dynamical groups and statistical operators defining the mixed states. One might expect that this is typical of Hilbert space quantum mechanics, but it is not. The same triple role occurs for the elements of a certain ordered Banach space in a much more general theory based upon quantum logics and a conditional probability calculus (which is a quantum logical model of the Lueders-von Neumann measurement process). It is shown how positive groups, automorphism groups, Lie algebras and statistical operators emerge from one major postulate: the non-existence of third-order interference (third-order interference and its impossibility in quantum mechanics were discovered by R. Sorkin in 1994). This again underlines the power of the combination of the conditional probability calculus with the postulate that there is no third-order interference. In two earlier papers, its impact on contextuality and nonlocality had already been revealed.

  17. Science and Facebook: The same popularity law!

    PubMed

    Néda, Zoltán; Varga, Levente; Biró, Tamás S

    2017-01-01

    The distribution of scientific citations for publications selected with different rules (author, topic, institution, country, journal, etc.) collapses onto a single curve if one plots the citations relative to their mean value. We find that the distribution of "shares" for Facebook posts rescales in the same manner onto the very same curve as scientific citations. This finding suggests that citations are subject to the same growth mechanism as Facebook popularity measures, being influenced by a statistically similar social environment and selection mechanism. In a simple master-equation approach, the exponential growth of the number of publications together with a preferential selection mechanism leads to a Tsallis-Pareto distribution, offering an excellent description of the observed statistics. Based on our model and on data derived from PubMed, we predict that according to the present trend the average number of citations per scientific publication relaxes exponentially to about 4.
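
    The Tsallis-Pareto density is not written out in this record; a common parameterization is the Lomax form p(x) = (a/b)(1 + x/b)^-(a+1) with mean b/(a-1). The sketch below reproduces the rescaling collapse on synthetic samples; the parameter values are illustrative, not fitted to the PubMed or Facebook data.

      # Rescaling collapse of synthetic Tsallis-Pareto (Lomax) "share" counts.
      import numpy as np

      rng = np.random.default_rng(3)
      a = 2.4                          # illustrative shape parameter
      b = a - 1.0                      # scale chosen so the mean equals 1
      shares = b * rng.pareto(a, size=100_000)      # numpy's pareto is the Lomax form

      def tsallis_pareto_pdf(x):
          return (a / b) * (1.0 + x / b) ** -(a + 1.0)

      x = shares / shares.mean()       # rescale by the mean, as in the paper
      hist, edges = np.histogram(x, bins=np.logspace(-2, 2, 40), density=True)
      centers = np.sqrt(edges[:-1] * edges[1:])
      mask = hist > 0
      dev = np.max(np.abs(hist[mask] / tsallis_pareto_pdf(centers[mask]) - 1.0))
      print(f"histogram vs. model, max relative deviation: {dev:.2f}")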

  18. Science and Facebook: The same popularity law!

    PubMed Central

    Varga, Levente; Biró, Tamás S.

    2017-01-01

    The distribution of scientific citations for publications selected with different rules (author, topic, institution, country, journal, etc.) collapses onto a single curve if one plots the citations relative to their mean value. We find that the distribution of “shares” for Facebook posts rescales in the same manner onto the very same curve as scientific citations. This finding suggests that citations are subject to the same growth mechanism as Facebook popularity measures, being influenced by a statistically similar social environment and selection mechanism. In a simple master-equation approach, the exponential growth of the number of publications together with a preferential selection mechanism leads to a Tsallis-Pareto distribution, offering an excellent description of the observed statistics. Based on our model and on data derived from PubMed, we predict that according to the present trend the average number of citations per scientific publication relaxes exponentially to about 4. PMID:28678796

  19. Teaching Classical Statistical Mechanics: A Simulation Approach.

    ERIC Educational Resources Information Center

    Sauer, G.

    1981-01-01

    Describes a one-dimensional model for an ideal gas to study development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches for a pseudoexperimental foundation of statistical mechanics. (Author/JN)

  20. A Hierarchical Approach to Fracture Mechanics

    NASA Technical Reports Server (NTRS)

    Saether, Erik; Taasan, Shlomo

    2004-01-01

    Recent research conducted under NASA LaRC's Creativity and Innovation Program has led to the development of an initial approach for a hierarchical fracture mechanics. This methodology unites failure mechanisms occurring at different length scales and provides a framework for a physics-based theory of fracture. At the nanoscale, parametric molecular dynamic simulations are used to compute the energy associated with atomic level failure mechanisms. This information is used in a mesoscale percolation model of defect coalescence to obtain statistics of fracture paths and energies through Monte Carlo simulations. The mathematical structure of predicted crack paths is described using concepts of fractal geometry. The non-integer fractal dimension relates geometric and energy measures between meso- and macroscales. For illustration, a fractal-based continuum strain energy release rate is derived for inter- and transgranular fracture in polycrystalline metals.

  1. Confinement, holonomy, and correlated instanton-dyon ensemble: SU(2) Yang-Mills theory

    NASA Astrophysics Data System (ADS)

    Lopez-Ruiz, Miguel Angel; Jiang, Yin; Liao, Jinfeng

    2018-03-01

    The mechanism of confinement in Yang-Mills theories remains a challenge to our understanding of nonperturbative gauge dynamics. While it is widely perceived that confinement may arise from chromomagnetically charged gauge configurations with nontrivial topology, it is not clear what types of configurations could do that and how, in pure Yang-Mills and QCD-like (nonsupersymmetric) theories. Recently, a promising approach has emerged, based on statistical ensembles of dyons/anti-dyons that are constituents of instanton/anti-instanton solutions with nontrivial holonomy, where the holonomy plays a vital role as an effective "Higgsing" mechanism. We report a thorough numerical investigation of the confinement dynamics in SU(2) Yang-Mills theory by constructing such a statistical ensemble of correlated instanton-dyons.

  2. A life in statistical mechanics. Part 1: From Chedar in Taceva to Yeshiva University in New York

    NASA Astrophysics Data System (ADS)

    Lebowitz, Joel L.; Bonolis, Luisa

    2017-02-01

    This is the first part of an oral history interview on the lifelong involvement of Joel Lebowitz in the development of statistical mechanics. Here the covered topics include the formative years, which overlapped the tragic period of Nazi power and World War II in Europe, the emigration to the United States in 1946 and the schooling there. It also includes the beginnings and early scientific works with Peter Bergmann, Oliver Penrose and many others. The second part will appear in a forthcoming issue of Eur. Phys. J. H. The text presented here has been revised by the authors based on the original oral history interview conducted by Luisa Bonolis and recorded in Paris, France, 11-16 October 2014.

  3. A quantitative link between microplastic instability and macroscopic deformation behaviors in metallic glasses

    NASA Astrophysics Data System (ADS)

    Wu, Y.; Chen, G. L.; Hui, X. D.; Liu, C. T.; Lin, Y.; Shang, X. C.; Lu, Z. P.

    2009-10-01

    Based on the mechanical instability of individual shear transformation zones (STZs), a quantitative link between microplastic instability and the macroscopic deformation behavior of metallic glasses was proposed. Our analysis confirms that macroscopic metallic glasses comprise a statistical distribution of STZ embryos with distributed values of activation energy, and that the microplastic instability of all the individual STZs dictates the macroscopic deformation behavior of amorphous solids. The statistical model presented in this paper successfully reproduces the macroscopic stress-strain curves determined experimentally and can readily be used to predict strain-rate effects on the macroscopic responses, given the material parameters at a certain strain rate, offering new insights into the actual deformation mechanism in amorphous solids.

  4. Statistical mechanics of competitive resource allocation using agent-based models

    NASA Astrophysics Data System (ADS)

    Chakraborti, Anirban; Challet, Damien; Chatterjee, Arnab; Marsili, Matteo; Zhang, Yi-Cheng; Chakrabarti, Bikas K.

    2015-01-01

    Demand outstrips available resources in most situations, which gives rise to competition, interaction and learning. In this article, we review a broad spectrum of multi-agent models of competition (El Farol Bar problem, Minority Game, Kolkata Paise Restaurant problem, Stable marriage problem, Parking space problem and others) and the methods used to understand them analytically. We emphasize the power of concepts and tools from statistical mechanics to understand and explain fully collective phenomena such as phase transitions and long memory, and the mapping between agent heterogeneity and physical disorder. As these methods can be applied to any large-scale model of competitive resource allocation made up of heterogeneous adaptive agents with non-linear interactions, they provide a prospective unifying paradigm for many scientific disciplines.
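
    Of the models reviewed, the Minority Game is the easiest to reproduce in a few lines. The sketch below is a minimal version (agents with fixed random strategy tables scored by virtual payoffs); the parameter choices are illustrative.

      # Minimal Minority Game: N agents, memory m, S strategies per agent.
      import numpy as np

      rng = np.random.default_rng(4)
      N, m, S, T = 301, 5, 2, 2000
      P = 2 ** m                                   # number of possible histories
      strategies = rng.choice([-1, 1], size=(N, S, P))
      scores = np.zeros((N, S))
      history = 0                                  # encodes the last m outcomes
      attendance = []

      for _ in range(T):
          best = scores.argmax(axis=1)                        # each agent's best strategy
          actions = strategies[np.arange(N), best, history]   # the action it prescribes
          A = actions.sum()                                   # aggregate attendance
          attendance.append(A)
          winner = -np.sign(A)                                # the minority side wins
          scores += strategies[:, :, history] * winner        # update virtual payoffs
          history = ((history << 1) | int(winner > 0)) % P    # slide the history window

      print("volatility sigma^2/N:", np.var(attendance) / N)  # the standard observable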

  5. Statistical mechanics of broadcast channels using low-density parity-check codes.

    PubMed

    Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David

    2003-03-01

    We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.

  6. Spatio-temporal analysis of aftershock sequences in terms of Non Extensive Statistical Physics.

    NASA Astrophysics Data System (ADS)

    Chochlaki, Kalliopi; Vallianatos, Filippos

    2017-04-01

    Earth's seismicity is considered an extremely complicated process in which long-range interactions and fracturing exist (Vallianatos et al., 2016). For this reason, we analyze it with a methodological approach introduced by Tsallis (Tsallis, 1988; 2009), named Non Extensive Statistical Physics. This approach is a generalization of Boltzmann-Gibbs statistical mechanics and is based on the definition of the Tsallis entropy Sq, whose maximization leads to the so-called q-exponential function, the probability distribution function that maximizes Sq. In the present work, we utilize the concept of Non Extensive Statistical Physics to analyze the spatiotemporal properties of several aftershock series. Marekova (Marekova, 2014) suggested that the probability densities of the inter-event distances between successive aftershocks follow a beta distribution. Using the same data set, we analyze the inter-event distance distribution of several aftershock sequences in different geographic regions by calculating the non-extensive parameters that determine the behavior of the system and by fitting the q-exponential function, which expresses the degree of non-extensivity of the investigated system. Furthermore, the inter-event time distribution of the aftershocks, as well as the frequency-magnitude distribution, has been analyzed. The results support the applicability of Non Extensive Statistical Physics ideas to aftershock sequences, where a strong correlation exists along with memory effects. References: C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1988) 479-487. doi:10.1007/BF01016429. C. Tsallis, Introduction to nonextensive statistical mechanics: Approaching a complex world, 2009. doi:10.1007/978-0-387-85359-8. E. Marekova, Analysis of the spatial distribution between successive earthquakes in aftershocks series, Annals of Geophysics, 57, 5, doi:10.4401/ag-6556, 2014. F. Vallianatos, G. Papadakis, G. Michas, Generalized statistical mechanics approaches to earthquakes and tectonics, Proc. R. Soc. A, 472, 20160497, 2016.
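
    For q > 1 the q-exponential takes the form expq(-x/x0) = [1 + (q-1) x/x0]^(1/(1-q)), recovering exp(-x/x0) as q -> 1. A minimal fitting sketch is shown below; the "inter-event distances" are synthetic stand-ins for an aftershock catalogue, and the starting values and bounds are assumptions.

      # Fit a q-exponential survival function to synthetic inter-event distances.
      import numpy as np
      from scipy.optimize import curve_fit

      def q_exponential(x, q, x0):
          return (1.0 + (q - 1.0) * x / x0) ** (1.0 / (1.0 - q))

      rng = np.random.default_rng(5)
      distances = np.sort(rng.exponential(12.0, size=500))      # synthetic, in km
      # empirical complementary CDF (survival function)
      ccdf = 1.0 - (np.arange(1, len(distances) + 1) - 0.5) / len(distances)

      (q, x0), _ = curve_fit(q_exponential, distances, ccdf, p0=(1.2, 10.0),
                             bounds=([1.0001, 0.1], [2.0, 1000.0]))
      print(f"fitted q = {q:.3f}, x0 = {x0:.1f} km")   # q -> 1 recovers exp(-x/x0)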

  7. Validation of a Statistical Methodology for Extracting Vegetation Feedbacks: Focus on North African Ecosystems in the Community Earth System Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Yan; Notaro, Michael; Wang, Fuyao

    Generalized equilibrium feedback assessment (GEFA) is a potentially valuable multivariate statistical tool for extracting vegetation feedbacks to the atmosphere in either observations or coupled Earth system models. The reliability of GEFA at capturing the terrestrial impacts on regional climate is demonstrated in this paper using the National Center for Atmospheric Research Community Earth System Model (CESM), with focus on North Africa. The feedback is assessed statistically by applying GEFA to output from a fully coupled control run. To reduce the sampling error caused by short data records, the traditional or full GEFA is refined through stepwise GEFA by dropping unimportant forcings. Two ensembles of dynamical experiments are developed for the Sahel or West African monsoon region against which GEFA-based vegetation feedbacks are evaluated. In these dynamical experiments, regional leaf area index (LAI) is modified either alone or in conjunction with soil moisture, with the latter runs motivated by strong regional soil moisture–LAI coupling. Stepwise GEFA boasts higher consistency between statistically and dynamically assessed atmospheric responses to land surface anomalies than full GEFA, especially with short data records. GEFA-based atmospheric responses are more consistent with the coupled soil moisture–LAI experiments, indicating that GEFA is assessing the combined impacts of coupled vegetation and soil moisture. Finally, both the statistical and dynamical assessments reveal a negative vegetation–rainfall feedback in the Sahel associated with an atmospheric stability mechanism in CESM versus a weaker positive feedback in the West African monsoon region associated with a moisture recycling mechanism in CESM.

  8. Validation of a Statistical Methodology for Extracting Vegetation Feedbacks: Focus on North African Ecosystems in the Community Earth System Model

    DOE PAGES

    Yu, Yan; Notaro, Michael; Wang, Fuyao; ...

    2018-02-05

    Generalized equilibrium feedback assessment (GEFA) is a potentially valuable multivariate statistical tool for extracting vegetation feedbacks to the atmosphere in either observations or coupled Earth system models. The reliability of GEFA at capturing the terrestrial impacts on regional climate is demonstrated in this paper using the National Center for Atmospheric Research Community Earth System Model (CESM), with focus on North Africa. The feedback is assessed statistically by applying GEFA to output from a fully coupled control run. To reduce the sampling error caused by short data records, the traditional or full GEFA is refined through stepwise GEFA by dropping unimportant forcings. Two ensembles of dynamical experiments are developed for the Sahel or West African monsoon region against which GEFA-based vegetation feedbacks are evaluated. In these dynamical experiments, regional leaf area index (LAI) is modified either alone or in conjunction with soil moisture, with the latter runs motivated by strong regional soil moisture–LAI coupling. Stepwise GEFA boasts higher consistency between statistically and dynamically assessed atmospheric responses to land surface anomalies than full GEFA, especially with short data records. GEFA-based atmospheric responses are more consistent with the coupled soil moisture–LAI experiments, indicating that GEFA is assessing the combined impacts of coupled vegetation and soil moisture. Finally, both the statistical and dynamical assessments reveal a negative vegetation–rainfall feedback in the Sahel associated with an atmospheric stability mechanism in CESM versus a weaker positive feedback in the West African monsoon region associated with a moisture recycling mechanism in CESM.

  9. Analysis of swarm behaviors based on an inversion of the fluctuation theorem.

    PubMed

    Hamann, Heiko; Schmickl, Thomas; Crailsheim, Karl

    2014-01-01

    A grand challenge in the field of artificial life is to find a general theory of emergent self-organizing systems. In swarm systems, most of the observed complexity is based on motion of simple entities. Similarly, statistical mechanics focuses on collective properties induced by the motion of many interacting particles. In this article we apply methods from statistical mechanics to swarm systems. We try to explain the emergent behavior of a simulated swarm by applying methods based on the fluctuation theorem. Empirical results indicate that swarms are able to produce negative entropy within an isolated subsystem due to frozen accidents. Individuals of a swarm are able to locally detect fluctuations of the global entropy measure and store them if they are negative entropy productions. By accumulating these stored fluctuations over time, the swarm as a whole produces negative entropy and the system ends up in an ordered state. We claim that this indicates the existence of an inverted fluctuation theorem for emergent self-organizing dissipative systems. This approach has the potential for general applicability.

  10. Facile Fabrication of 100% Bio-Based and Degradable Ternary Cellulose/PHBV/PLA Composites

    PubMed Central

    Wang, Jinwu

    2018-01-01

    Modifying bio-based degradable polymers such as polylactide (PLA) and poly(hydroxybutyrate-co-hydroxyvalerate) (PHBV) with non-degradable agents will compromise the 100% degradability of their resultant composites. This work developed a facile and solvent-free route to fabricate 100% bio-based and degradable ternary cellulose/PHBV/PLA composite materials. The effects of ball milling on the physicochemical properties of pulp cellulose fibers, and of the ball-milled cellulose particles on the morphology and mechanical properties of PHBV/PLA blends, were investigated experimentally and statistically. The results showed that longer ball-milling times resulted in a smaller particle size and lower crystallinity by way of mechanical disintegration. Filling PHBV/PLA blends with the ball-milled celluloses dramatically increased the stiffness at all levels of particle size and filling content, and improved the elongation at break and fracture work at certain levels of particle size and filling content. It was also found that a high filling content of the ball-milled cellulose particles was detrimental to the mechanical properties of the resultant composite materials. The ternary cellulose/PHBV/PLA composite materials have some potential applications, such as in packaging materials and automobile inner decoration parts. Furthermore, filling content contributes more to the variations of their mechanical properties than particle size does. Statistical analysis combined with experimental tests provides a new pathway to quantitatively evaluate the effects of multiple variables on a specific property and to identify the dominant one for the resultant composite materials. PMID:29495315

  11. Statistical Analysis of the Processes Controlling Choline and Ethanolamine Glycerophospholipid Molecular Species Composition

    PubMed Central

    Kiebish, Michael A.; Yang, Kui; Han, Xianlin; Gross, Richard W.; Chuang, Jeffrey

    2012-01-01

    The regulation and maintenance of the cellular lipidome through biosynthetic, remodeling, and catabolic mechanisms are critical for biological homeostasis during development, health and disease. These complex mechanisms control the architectures of lipid molecular species, which have diverse yet highly regulated fatty acid chains at both the sn1 and sn2 positions. Phosphatidylcholine (PC) and phosphatidylethanolamine (PE) serve as the predominant biophysical scaffolds in membranes, acting as reservoirs for potent lipid signals and regulating numerous enzymatic processes. Here we report the first rigorous computational dissection of the mechanisms influencing PC and PE molecular architectures from high-throughput shotgun lipidomic data. Using novel statistical approaches, we have analyzed multidimensional mass spectrometry-based shotgun lipidomic data from developmental mouse heart and mature mouse heart, lung, brain, and liver tissues. We show that in PC and PE, sn1 and sn2 positions are largely independent, though for low abundance species regulatory processes may interact with both the sn1 and sn2 chain simultaneously, leading to cooperative effects. Chains with similar biochemical properties appear to be remodeled similarly. We also see that sn2 positions are more regulated than sn1, and that PC exhibits stronger cooperative effects than PE. A key aspect of our work is a novel statistically rigorous approach to determine cooperativity based on a modified Fisher's exact test using Markov Chain Monte Carlo sampling. This computational approach provides a novel tool for developing mechanistic insight into lipidomic regulation. PMID:22662143
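
    The record names the method (a modified Fisher's exact test with Markov chain Monte Carlo sampling) without giving details. As a toy stand-in only, the sketch below applies the ordinary Fisher's exact test to an invented 2x2 contingency table of sn1/sn2 chain co-occurrence counts.

      # Toy sn1/sn2 independence test with Fisher's exact test (invented counts).
      from scipy import stats

      # rows: sn1 chain is 16:0 or not; columns: sn2 chain is 18:1 or not
      table = [[42, 18],
               [23, 37]]
      odds, p = stats.fisher_exact(table)
      print(f"odds ratio = {odds:.2f}, p = {p:.4f}")  # small p suggests sn1-sn2 coupling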

  12. Conditional maximum-entropy method for selecting prior distributions in Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Abe, Sumiyoshi

    2014-11-01

    The conditional maximum-entropy method (abbreviated here as C-MaxEnt) is formulated for selecting prior probability distributions in Bayesian statistics for parameter estimation. This method is inspired by a statistical-mechanical approach to systems governed by dynamics with largely separated time scales and is based on three key concepts: conjugate pairs of variables, dimensionless integration measures with coarse-graining factors and partial maximization of the joint entropy. The method enables one to calculate a prior purely from a likelihood in a simple way. It is shown, in particular, how it not only yields Jeffreys's rules but also reveals new structures hidden behind them.

  13. Partitioning-based mechanisms under personalized differential privacy.

    PubMed

    Li, Haoran; Xiong, Li; Ji, Zhanglong; Jiang, Xiaoqian

    2017-05-01

    Differential privacy has recently emerged in private statistical aggregate analysis as one of the strongest privacy guarantees. A limitation of the model is that it provides the same privacy protection for all individuals in the database. However, it is common that data owners may have different privacy preferences for their data. Consequently, a global differential privacy parameter may provide excessive privacy protection for some users, while insufficient for others. In this paper, we propose two partitioning-based mechanisms, privacy-aware and utility-based partitioning, to handle personalized differential privacy parameters for each individual in a dataset while maximizing utility of the differentially private computation. The privacy-aware partitioning is to minimize the privacy budget waste, while utility-based partitioning is to maximize the utility for a given aggregate analysis. We also develop a t-round partitioning to take full advantage of remaining privacy budgets. Extensive experiments using real datasets show the effectiveness of our partitioning mechanisms.
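
    A minimal sketch of the underlying idea follows: each record carries a personal epsilon, every partition is released through the Laplace mechanism at that partition's strictest (minimum) epsilon, and sorting by epsilon before splitting keeps the budget waste down. This illustrates the general partitioning principle under those assumptions, not the paper's algorithms.

      # Personalized-epsilon count query via a partitioned Laplace mechanism.
      import numpy as np

      rng = np.random.default_rng(6)
      values = rng.integers(0, 2, size=1000)        # 0/1 records, count query
      epsilons = rng.uniform(0.1, 1.0, size=1000)   # personal privacy budgets

      def partitioned_count(k):
          order = np.argsort(epsilons)              # group records with similar epsilons
          total = 0.0
          for part in np.array_split(order, k):
              eps = epsilons[part].min()            # strictest budget in the group
              total += values[part].sum() + rng.laplace(0.0, 1.0 / eps)  # sensitivity 1
          return total

      true = values.sum()
      for k in (1, 4, 16):                          # k trades noise draws vs. budget waste
          runs = [partitioned_count(k) for _ in range(200)]
          print(f"k={k:2d}: mean {np.mean(runs):7.1f}, std {np.std(runs):6.1f} (true {true})")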

  14. Partitioning-based mechanisms under personalized differential privacy

    PubMed Central

    Li, Haoran; Xiong, Li; Ji, Zhanglong; Jiang, Xiaoqian

    2017-01-01

    Differential privacy has recently emerged in private statistical aggregate analysis as one of the strongest privacy guarantees. A limitation of the model is that it provides the same privacy protection for all individuals in the database. However, it is common that data owners may have different privacy preferences for their data. Consequently, a global differential privacy parameter may provide excessive privacy protection for some users, while insufficient for others. In this paper, we propose two partitioning-based mechanisms, privacy-aware and utility-based partitioning, to handle personalized differential privacy parameters for each individual in a dataset while maximizing utility of the differentially private computation. The privacy-aware partitioning is to minimize the privacy budget waste, while utility-based partitioning is to maximize the utility for a given aggregate analysis. We also develop a t-round partitioning to take full advantage of remaining privacy budgets. Extensive experiments using real datasets show the effectiveness of our partitioning mechanisms. PMID:28932827

  15. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    USGS Publications Warehouse

    Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
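
    The maximum-entropy argument can be checked numerically: among distributions on x >= 0 with a fixed mean, the exponential has the largest differential entropy. The candidate families and the unit mean below are illustrative choices, not the paper's data.

      # Entropy comparison at fixed mean: the exponential should win.
      from scipy import stats

      mean = 1.0
      candidates = {
          "exponential": stats.expon(scale=mean),
          "Weibull k=2": stats.weibull_min(2.0, scale=mean / stats.weibull_min(2.0).mean()),
          "gamma a=3": stats.gamma(3.0, scale=mean / 3.0),
      }
      for name, dist in candidates.items():
          print(f"{name:12s} mean = {dist.mean():.3f}  entropy = {dist.entropy():.4f}")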

  16. Statistical mechanics framework for static granular matter.

    PubMed

    Henkes, Silke; Chakraborty, Bulbul

    2009-06-01

    The physical properties of granular materials have been extensively studied in recent years. So far, however, there exists no theoretical framework which can explain the observations in a unified manner beyond the phenomenological jamming diagram. This work focuses on the case of static granular matter, where we have constructed a statistical ensemble which mirrors equilibrium statistical mechanics. This ensemble, which is based on the conservation properties of the stress tensor, is distinct from the original Edwards ensemble and applies to packings of deformable grains. We combine it with a field theoretical analysis of the packings, where the field is the Airy stress function derived from the force and torque balance conditions. In this framework, Point J is characterized by a diverging stiffness of the pressure fluctuations. Separately, we present a phenomenological mean-field theory of the jamming transition, which incorporates the mean contact number as a variable. We link both approaches in the context of the marginal rigidity picture proposed by Wyart and others.

  17. A statistical mechanics model for free-for-all airplane passenger boarding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steffen, Jason H. (Fermilab)

    2008-08-01

    I discuss a model for free-for-all passenger boarding which is employed by some discount air carriers. The model is based on the principles of statistical mechanics, where each seat in the aircraft has an associated energy which reflects the preferences of travelers. As each passenger enters the airplane they select their seats using Boltzmann statistics, proceed to that location, load their luggage, sit down, and the partition function seen by remaining passengers is modified to reflect this fact. I discuss the various model parameters and make qualitative comparisons of this passenger boarding model with those that involve assigned seats. The model can be used to predict the probability that certain seats will be occupied at different times during the boarding process. These results might provide a useful description of this boarding method. The model is a relatively unusual application of undergraduate level physics and describes a situation familiar to many students and faculty.
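
    The abstract does not give the seat energies; the sketch below implements the general scheme with an invented energy landscape (windows preferred over aisles over middles, front rows over rear) and a single "temperature" controlling how choosy travelers are.

      # Free-for-all boarding: seats chosen with Boltzmann weights exp(-E/T).
      import numpy as np

      rng = np.random.default_rng(7)
      rows, cols = 25, 6                       # a 150-seat single-aisle cabin
      r, c = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
      energy = 0.05 * r + np.where((c == 0) | (c == 5), -0.5,        # windows favored
                          np.where((c == 2) | (c == 3), -0.3, 0.0))  # aisles next
      energy = energy.ravel()
      T = 0.5                                  # "temperature" of traveler preferences

      occupied = np.zeros(rows * cols, dtype=bool)
      order = []
      for _ in range(rows * cols):
          w = np.exp(-energy / T) * ~occupied  # Boltzmann weights over empty seats
          seat = rng.choice(rows * cols, p=w / w.sum())
          occupied[seat] = True
          order.append(seat)

      print("first 10 chosen seats (row, col):",
            [(s // cols, s % cols) for s in order[:10]])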

  18. Random walk to a nonergodic equilibrium concept

    NASA Astrophysics Data System (ADS)

    Bel, G.; Barkai, E.

    2006-01-01

    Random walk models, such as the trap model, continuous time random walks, and comb models, exhibit weak ergodicity breaking, when the average waiting time is infinite. The open question is, what statistical mechanical theory replaces the canonical Boltzmann-Gibbs theory for such systems? In this paper a nonergodic equilibrium concept is investigated, for a continuous time random walk model in a potential field. In particular we show that in the nonergodic phase the distribution of the occupation time of the particle in a finite region of space approaches U- or W-shaped distributions related to the arcsine law. We show that when conditions of detailed balance are applied, these distributions depend on the partition function of the problem, thus establishing a relation between the nonergodic dynamics and canonical statistical mechanics. In the ergodic phase the distribution function of the occupation times approaches a δ function centered on the value predicted based on standard Boltzmann-Gibbs statistics. The relation of our work to single-molecule experiments is briefly discussed.
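
    The U-shaped occupation statistics have a familiar limiting case that is easy to simulate: for a simple symmetric random walk, the fraction of time spent at x > 0 follows Levy's arcsine law, whose density piles up near 0 and 1. The sketch below (walk lengths and counts are arbitrary choices) shows that U shape.

      # Occupation fraction of a symmetric random walk: arcsine (U-shaped) law.
      import numpy as np

      rng = np.random.default_rng(8)
      steps = rng.choice([-1, 1], size=(2000, 1000))
      walks = np.cumsum(steps, axis=1)
      frac_positive = (walks > 0).mean(axis=1)     # occupation fraction per walk

      hist, _ = np.histogram(frac_positive, bins=10, range=(0, 1), density=True)
      print("occupation-fraction histogram (U-shape expected):")
      print(np.round(hist, 2))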

  19. Perceptual statistical learning over one week in child speech production.

    PubMed

    Richtsmeier, Peter T; Goffman, Lisa

    2017-07-01

    What cognitive mechanisms account for the trajectory of speech sound development, in particular, gradually increasing accuracy during childhood? An intriguing potential contributor is statistical learning, a type of learning that has been studied frequently in infant perception but less often in child speech production. To assess the relevance of statistical learning to developing speech accuracy, we carried out a statistical learning experiment with four- and five-year-olds in which statistical learning was examined over one week. Children were familiarized with and tested on word-medial consonant sequences in novel words. There was only modest evidence for statistical learning, primarily in the first few productions of the first session. This initial learning effect nevertheless aligns with previous statistical learning research. Furthermore, the overall learning effect was similar to an estimate of weekly accuracy growth based on normative studies. The results implicate other important factors in speech sound development, particularly learning via production.

  20. Evidence of the non-extensive character of Earth's ambient noise.

    NASA Astrophysics Data System (ADS)

    Koutalonis, Ioannis; Vallianatos, Filippos

    2017-04-01

    Investigation of the dynamical features of ambient seismic noise is an important scientific and practical research challenge. At the same time, there is growing interest in an approach to Earth physics based on the science of complex systems and non-extensive statistical mechanics, which is a generalization of Boltzmann-Gibbs statistical physics (Vallianatos et al., 2016). This seems to be a promising framework for studying complex systems exhibiting phenomena such as long-range interactions and memory effects. In this work we use non-extensive statistical mechanics and signal analysis methods to explore the nature of ambient noise as measured at the stations of the HSNC in the South Aegean (Chatzopoulos et al., 2016). We analyzed the de-trended increment time series of ambient seismic noise X(t), in time windows of 20 minutes to 10 seconds, within "calm time zones" where human-induced noise presents a minimum. Following the non-extensive statistical physics approach, the probability distribution function of the increments of ambient noise is investigated. Analysis of the probability density function (PDF) p(X), normalized to zero mean and unit variance, shows that the fluctuations of Earth's ambient noise follow a q-Gaussian distribution as defined in the frame of non-extensive statistical mechanics, indicating the possible existence of memory effects in Earth's ambient noise. References: F. Vallianatos, G. Papadakis, G. Michas, Generalized statistical mechanics approaches to earthquakes and tectonics, Proc. R. Soc. A, 472, 20160497, 2016. G. Chatzopoulos, I. Papadopoulos, F. Vallianatos, The Hellenic Seismological Network of Crete (HSNC): Validation and results of the 2013 aftershock, Advances in Geosciences, 41, 65-72, 2016.

  1. The Energetic Cost of Walking: A Comparison of Predictive Methods

    PubMed Central

    Kramer, Patricia Ann; Sylvester, Adam D.

    2011-01-01

    Background: The energy that animals devote to locomotion has been of intense interest to biologists for decades, and two basic methodologies have emerged to predict locomotor energy expenditure: those based on metabolic and those based on mechanical energy. Metabolic energy approaches share the perspective that prediction of locomotor energy expenditure should be based on statistically significant proxies of metabolic function, while mechanical energy approaches, which derive from many different perspectives, focus on quantifying the energy of movement. Some controversy exists as to which mechanical perspective is "best", but from first principles all mechanical methods should be equivalent if the inputs to the simulation are of similar quality. Our goals in this paper are 1) to establish the degree to which the various methods of calculating mechanical energy are correlated, and 2) to investigate to what degree the prediction methods explain the variation in energy expenditure. Methodology/Principal Findings: We use modern humans as the model organism in this experiment because their data are readily attainable, but the methodology is appropriate for use in other species. Volumetric oxygen consumption and kinematic and kinetic data were collected on 8 adults while walking at their self-selected slow, normal and fast velocities. Using hierarchical statistical modeling via ordinary least squares and maximum likelihood techniques, the predictive ability of several metabolic and mechanical approaches was assessed. We found that all approaches are correlated and that the mechanical approaches explain similar amounts of the variation in metabolic energy expenditure. Most methods predict the variation within an individual well, but are poor at accounting for variation between individuals. Conclusion: Our results indicate that the choice of predictive method is dependent on the question(s) of interest and the data available for use as inputs. Although we used modern humans as our model organism, these results can be extended to other species. PMID:21731693

  2. Multi-agent Negotiation Mechanisms for Statistical Target Classification in Wireless Multimedia Sensor Networks

    PubMed Central

    Wang, Xue; Bi, Dao-wei; Ding, Liang; Wang, Sheng

    2007-01-01

    The recent availability of low cost and miniaturized hardware has allowed wireless sensor networks (WSNs) to retrieve audio and video data in real world applications, which has fostered the development of wireless multimedia sensor networks (WMSNs). Resource constraints and challenging multimedia data volume make development of efficient algorithms to perform in-network processing of multimedia contents imperative. This paper proposes solving problems in the domain of WMSNs from the perspective of multi-agent systems. The multi-agent framework enables flexible network configuration and efficient collaborative in-network processing. The focus is placed on target classification in WMSNs where audio information is retrieved by microphones. To deal with the uncertainties related to audio information retrieval, the statistical approaches of power spectral density estimates, principal component analysis and Gaussian process classification are employed. A multi-agent negotiation mechanism is specially developed to efficiently utilize limited resources and simultaneously enhance classification accuracy and reliability. The negotiation is composed of two phases, where an auction based approach is first exploited to allocate the classification task among the agents and then individual agent decisions are combined by the committee decision mechanism. Simulation experiments with real world data are conducted and the results show that the proposed statistical approaches and negotiation mechanism not only reduce memory and computation requirements in WMSNs but also significantly enhance classification accuracy and reliability. PMID:28903223

  3. The development of ensemble theory. A new glimpse at the history of statistical mechanics

    NASA Astrophysics Data System (ADS)

    Inaba, Hajime

    2015-12-01

    This paper investigates the history of statistical mechanics from the viewpoint of the development of the ensemble theory from 1871 to 1902. In 1871, Ludwig Boltzmann introduced a prototype model of an ensemble that represents a polyatomic gas. In 1879, James Clerk Maxwell defined an ensemble as copies of systems of the same energy. Inspired by H.W. Watson, he called his approach "statistical". Boltzmann and Maxwell regarded the ensemble theory as a much more general approach than the kinetic theory. In the 1880s, influenced by Hermann von Helmholtz, Boltzmann made use of ensembles to establish thermodynamic relations. In Elementary Principles in Statistical Mechanics of 1902, Josiah Willard Gibbs tried to get his ensemble theory to mirror thermodynamics, including thermodynamic operations in its scope. Thermodynamics played the role of a "blind guide". His theory of ensembles can be characterized as more mathematically oriented than Einstein's theory proposed in the same year. Mechanical, empirical, and statistical approaches to foundations of statistical mechanics are presented. Although it was formulated in classical terms, the ensemble theory provided an infrastructure still valuable in quantum statistics because of its generality.

  4. A Mechanism for Anonymous Credit Card Systems

    NASA Astrophysics Data System (ADS)

    Tamura, Shinsuke; Yanase, Tatsuro

    This paper proposes a mechanism for anonymous credit card systems, in which each credit card holder can conceal individual transactions from the credit card company, while enabling the credit card company to calculate the total expenditures of transactions of individual card holders during specified periods, and to identify card holders who executed dishonest transactions. Based on three existing mechanisms, i.e. anonymous authentication, blind signature and secure statistical data gathering, together with implicit transaction links proposed here, the proposed mechanism enables development of anonymous credit card systems without assuming any absolutely trustworthy entity like tamper resistant devices or organizations faithful both to the credit card company and card holders.

  5. Statistical Mechanical Derivation of Jarzynski's Identity for Thermostated Non-Hamiltonian Dynamics

    NASA Astrophysics Data System (ADS)

    Cuendet, Michel A.

    2006-03-01

    The recent Jarzynski identity (JI) relates thermodynamic free energy differences to nonequilibrium work averages. Several proofs of the JI have been provided on the thermodynamic level. They rely on assumptions such as equivalence of ensembles in the thermodynamic limit or weakly coupled infinite heat baths. However, the JI is widely applied to NVT computer simulations involving finite numbers of particles, whose equations of motion are strongly coupled to a few extra degrees of freedom modeling a thermostat. In this case, the above assumptions are no longer valid. We propose a statistical mechanical approach to the JI solely based on the specific equations of motion, without any further assumption. We provide a detailed derivation for the non-Hamiltonian Nosé-Hoover dynamics, which is routinely used in computer simulations to produce canonical sampling.
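
    The record gives no equations; the identity itself is <exp(-W/k_B T)> = exp(-ΔF/k_B T) over repeated nonequilibrium realizations. The sketch below checks it for an overdamped Langevin particle dragged in a harmonic trap (a standard test case where ΔF = 0, not the paper's Nose-Hoover derivation); all parameters are illustrative.

      # Jarzynski check: particle dragged in a moving harmonic trap, Delta F = 0.
      import numpy as np

      rng = np.random.default_rng(9)
      k, gamma, kT, v = 1.0, 1.0, 1.0, 1.0   # stiffness, friction, k_B*T, drag speed
      dt, steps, ntraj = 1e-3, 2000, 20000

      x = rng.normal(0.0, np.sqrt(kT / k), size=ntraj)  # equilibrated initial states
      W = np.zeros(ntraj)
      for i in range(steps):
          t = i * dt
          W += -k * v * (x - v * t) * dt                # work increment (dV/dt) dt
          x += (-k * (x - v * t) / gamma * dt
                + np.sqrt(2 * kT / gamma * dt) * rng.normal(size=ntraj))

      print("<exp(-W/kT)> =", np.exp(-W / kT).mean(), "(Jarzynski predicts 1)")
      print("<W>/kT       =", W.mean() / kT, "(nonnegative, by the second law)")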

  6. Structural and thermomechanical properties of the zinc-blende AlX (X = P, As, Sb) compounds

    NASA Astrophysics Data System (ADS)

    Ha, Vu Thi Thanh; Hung, Vu Van; Hanh, Pham Thi Minh; Nguyen, Viet Tuyen; Hieu, Ho Khac

    2017-08-01

    The structural and thermomechanical properties of the zinc-blende aluminum class of III-V compounds have been studied based on the statistical moment method (SMM) in quantum statistical mechanics. Within the SMM scheme, we derived analytical expressions for the nearest-neighbor distance, thermal expansion coefficient, atomic mean-square displacement and elastic moduli (Young's modulus, bulk modulus and shear modulus). Numerical calculations have been performed for zinc-blende AlX (X = P, As, Sb) at ambient conditions up to a temperature of 1000 K. Our results are in good agreement with earlier measurements and can provide useful references for future experimental and theoretical works. This research presents a systematic approach to investigating the thermodynamic and mechanical properties of materials.

  7. Intuition in Business: Empirical Base

    ERIC Educational Resources Information Center

    Fomin, Eugeniy P.; Alekseev, Andrey A.; Fomina, Natalia E.; Rensh, Marina A.; Zaitseva, Ekaterina V.

    2016-01-01

    In this article, the authors propose an economic projection of the views of Daniel Kahneman on intuition. The authors believe intuition acts as an operative category in entrepreneurship. The results of the given statistical experiment demonstrate the viability of the phenomenon of intuition when making investment decisions. Two independent mechanisms for…

  8. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Sergis, Antonis; Hardalupas, Yannis

    2011-05-01

    This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.

  9. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis.

    PubMed

    Sergis, Antonis; Hardalupas, Yannis

    2011-05-19

    This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate to present objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular mechanisms proposed in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers regarding the controlling mechanisms and influential parameters of heat transfer in nanofluids.

  10. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    PubMed Central

    2011-01-01

    This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate to present objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular mechanisms proposed in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers regarding the controlling mechanisms and influential parameters of heat transfer in nanofluids. PMID:21711932

  11. Intuitive statistics by 8-month-old infants

    PubMed Central

    Xu, Fei; Garcia, Vashti

    2008-01-01

    Human learners make inductive inferences based on small amounts of data: we generalize from samples to populations and vice versa. The academic discipline of statistics formalizes these intuitive statistical inferences. What is the origin of this ability? We report six experiments investigating whether 8-month-old infants are “intuitive statisticians.” Our results showed that, given a sample, the infants were able to make inferences about the population from which the sample had been drawn. Conversely, given information about the entire population of relatively small size, the infants were able to make predictions about the sample. Our findings provide evidence that infants possess a powerful mechanism for inductive learning, either using heuristics or basic principles of probability. This ability to make inferences based on samples or information about the population develops early and in the absence of schooling or explicit teaching. Human infants may be rational learners from very early in development. PMID:18378901

  12. Econophysical visualization of Adam Smith’s invisible hand

    NASA Astrophysics Data System (ADS)

    Cohen, Morrel H.; Eliazar, Iddo I.

    2013-02-01

    Consider a complex system whose macrostate is statistically observable, but whose operating mechanism is an unknown black box. In this paper we address the problem of inferring, from the system's macrostate statistics, the intrinsic force of the system that yields the observed statistics. The inference is established via two diametrically opposite approaches which result in the very same intrinsic force: a top-down approach based on the notion of entropy, and a bottom-up approach based on the notion of Langevin dynamics. The general results established are applied to the problem of visualizing the intrinsic socioeconomic force, Adam Smith's invisible hand, shaping the distribution of wealth in human societies. Our analysis yields quantitative econophysical representations of figurative socioeconomic forces, quantitative definitions of "poor" and "rich", and a quantitative characterization of the "poor-get-poorer" and "rich-get-richer" phenomena.
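
    As a rough illustration of the bottom-up route described above, an intrinsic force can be recovered from observed macrostate statistics through the stationary Langevin relation F(x) = D d(ln p)/dx; the wealth data and the noise strength D below are hypothetical stand-ins:

    ```python
    # Infer an "intrinsic force" from a histogram of observed macrostates.
    import numpy as np

    rng = np.random.default_rng(2)
    D = 1.0                                                    # assumed noise strength
    wealth = rng.lognormal(mean=0.0, sigma=0.7, size=200_000)  # toy data

    hist, edges = np.histogram(wealth, bins=200, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mask = hist > 0

    # Finite-difference estimate of F(x) = D * d(ln p)/dx on nonempty bins.
    force = D * np.gradient(np.log(hist[mask]), centers[mask])
    print(force[:5])
    ```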

  13. When mechanism matters: Bayesian forecasting using models of ecological diffusion

    USGS Publications Warehouse

    Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.

    2017-01-01

    Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.
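
    For intuition, a 1-D sketch of the ecological-diffusion PDE du/dt = d²(μu)/dx² solved by explicit finite differences; the motility surface μ(x), grid and rates are illustrative assumptions, not the fitted chronic-wasting-disease model:

    ```python
    # Explicit finite-difference integration of du/dt = d2(mu * u)/dx2
    # with periodic boundaries; the Laplacian acts on mu*u, which
    # distinguishes ecological diffusion from plain Fickian diffusion.
    import numpy as np

    nx, dx, dt, steps = 200, 1.0, 0.2, 500
    x = np.arange(nx) * dx
    mu = 0.5 + 0.4 * np.sin(2 * np.pi * x / x[-1])  # spatially varying motility
    u = np.zeros(nx)
    u[nx // 2] = 1.0                                # point introduction

    for _ in range(steps):
        v = mu * u
        lap = (np.roll(v, 1) - 2 * v + np.roll(v, -1)) / dx**2
        u += dt * lap

    print("total abundance (conserved):", u.sum())
    ```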

  14. Statistical mechanics explanation for the structure of ocean eddies and currents

    NASA Astrophysics Data System (ADS)

    Venaille, A.; Bouchet, F.

    2010-12-01

    The equilibrium statistical mechanics of two-dimensional and geostrophic flows predicts the outcome for the large scales of the flow resulting from turbulent mixing. This theory has been successfully applied to describe detailed properties of Jupiter's Great Red Spot. We discuss the range of applicability of this theory to ocean dynamics. It is able to reproduce mesoscale structures like ocean rings. It explains, from statistical mechanics, the westward drift of rings at the speed of non-dispersive baroclinic waves, and the recently observed (Chelton et al.) slower northward drift of cyclonic eddies and southward drift of anticyclonic eddies. We also uncover relations between strong eastward mid-basin inertial jets, like the Kuroshio extension and the Gulf Stream, and statistical equilibria, and we explain under which conditions such strong mid-basin jets can be understood as statistical equilibria. We claim that these results are complementary to the classical Sverdrup-Munk theory: they explain the inertial part of the basin dynamics, and the structure and location of the jets, using very simple theoretical arguments. References: A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to J. Phys. Oceanogr.; F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, arXiv …, submitted to Physics Reports; P. Berloff, A. M. Hogg and W. Dewar, The turbulent oscillator: a mechanism of low-frequency variability of the wind-driven ocean gyres, J. Phys. Oceanogr. 37 (2007) 2363; D. B. Chelton, M. G. Schlax, R. M. Samelson and R. A. de Szoeke, Global observations of large oceanic eddies, Geophys. Res. Lett. 34 (2007) L15606. [Figure captions: (a) streamfunction predicted by statistical mechanics; (b), (c) snapshots of streamfunction and potential vorticity (red: positive values; blue: negative values) in the upper layer of a three-layer quasi-geostrophic model of a mid-latitude ocean basin (from Berloff et al.); even in an out-of-equilibrium situation like this one, equilibrium statistical mechanics predicts the overall qualitative flow structure remarkably well. A further figure shows the observed westward drift of ocean eddies, with slower northward drift of cyclones and southward drift of anticyclones (Chelton et al.), which we explain from statistical mechanics.]

  15. Effects of Peer Tutoring on Reading Self-Concept

    ERIC Educational Resources Information Center

    Flores, Marta; Duran, David

    2013-01-01

    This study investigates the development of the Reading Self-Concept and of the mechanisms underlying it, within a framework of a reading programme based on peer tutoring. The multiple methodological design adopted allowed for a quantitative approach which showed statistically significant changes in the Reading Self-Concept of those students who…

  16. Predicting Adolescent and Adult Antisocial Behavior among Adjudicated Delinquent Females

    ERIC Educational Resources Information Center

    Cernkovich, Stephen A.; Lanctot, Nadine; Giordano, Peggy C.

    2008-01-01

    Studies identifying the mechanisms underlying the causes and consequences of antisocial behavior among female delinquents as they transit to adulthood are scarce and have important limitations: Most are based on official statistics, they typically are restricted to normative samples, and rarely do they gather prospective data from samples of…

  17. Entropy Is Simple, Qualitatively.

    ERIC Educational Resources Information Center

    Lambert, Frank L.

    2002-01-01

    Suggests that, qualitatively, entropy is simple. Entropy increase, from a macro viewpoint, is a measure of the dispersal of energy from localized to spread out at a temperature T. Fundamentally based on statistical and quantum mechanics, this approach is superior to the non-fundamental "disorder" as a descriptor of entropy change. (MM)

  18. Towards a statistical mechanical theory of active fluids.

    PubMed

    Marini Bettolo Marconi, Umberto; Maggi, Claudio

    2015-12-07

    We present a stochastic description of a model of N mutually interacting active particles in the presence of external fields and characterize its steady-state behavior in the absence of currents. To reproduce the effects of the experimentally observed persistence of the trajectories of the active particles, we consider a Gaussian force having a non-vanishing correlation time τ, whose finiteness is a measure of the activity of the system. With these ingredients we show that it is possible to develop a statistical mechanical approach similar to the one employed in the study of equilibrium liquids and to obtain the explicit form of the many-particle distribution function by means of the multidimensional unified colored noise approximation. Such a distribution plays a role analogous to the Gibbs distribution in equilibrium statistical mechanics and provides complete information about the microscopic state of the system. From here we develop a method to determine the one- and two-particle distribution functions in the spirit of the Born-Green-Yvon (BGY) equations of equilibrium statistical mechanics. The resulting equations, which contain extra correlations induced by the activity, allow us to determine the stationary density profiles in the presence of external fields, the pair correlations and the pressure of active fluids. In the low-density regime we obtain the effective pair potential ϕ(r) acting between two isolated particles separated by a distance r, showing the existence of an effective attraction between them induced by activity. Based on these results, in the second half of the paper we propose a mean-field theory as an approach simpler than the BGY hierarchy and use it to derive a van der Waals expression for the equation of state.
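
    A minimal simulation sketch in the spirit of the model described above: one overdamped particle in a harmonic trap driven by a Gaussian force with correlation time τ (Ornstein-Uhlenbeck noise). Parameters are illustrative, and the analytical check in the comments is the standard result for this linear case:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    dt, steps = 1e-3, 200_000
    tau, D, k = 0.1, 1.0, 1.0     # persistence time, activity, trap stiffness

    x, eta = 0.0, 0.0
    xs = np.empty(steps)
    for i in range(steps):
        x += dt * (-k * x + eta)  # overdamped motion in the trap
        # Ornstein-Uhlenbeck update of the active force eta:
        eta += -dt * eta / tau + np.sqrt(2 * D * dt / tau**2) * rng.normal()
        xs[i] = x

    print("measured <x^2>:", xs[steps // 10:].var())
    print("exact    <x^2> = D/(k(1+k*tau)) =", D / (k * (1 + k * tau)))
    ```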

  19. From QCD-based hard-scattering to nonextensive statistical mechanical descriptions of transverse momentum spectra in high-energy $pp$ and $p\bar{p}$ collisions

    DOE PAGES

    Wong, Cheuk-Yin; Wilk, Grzegorz; Cirto, Leonardo J. L.; ...

    2015-06-22

    Transverse spectra of both jets and hadrons obtained in high-energy $pp$ and $p\bar{p}$ collisions at central rapidity exhibit power-law behavior of $1/p_T^n$ at high $p_T$. The power index $n$ is 4-5 for jet production and is slightly greater for hadron production. Furthermore, the hadron spectra spanning over 14 orders of magnitude down to the lowest $p_T$ region in $pp$ collisions at LHC can be adequately described by a single nonextensive statistical mechanical distribution that is widely used in other branches of science. This suggests indirectly the dominance of the hard-scattering process over essentially the whole $p_T$ region at central rapidity in $pp$ collisions at LHC. We show here direct evidence of such a dominance of the hard-scattering process by investigating the power index of UA1 jet spectra over an extended $p_T$ region and the two-particle correlation data of the STAR and PHENIX Collaborations in high-energy $pp$ and $p\bar{p}$ collisions at central rapidity. We then study how the showering of the hard-scattering product partons alters the power index of the hadron spectra and leads to a hadron distribution that can be cast into a single-particle nonextensive statistical mechanical distribution. Lastly, because of such a connection, the nonextensive statistical mechanical distribution can be considered a lowest-order approximation of the hard scattering of partons followed by the subsequent process of parton showering that turns the jets into hadrons, in high-energy $pp$ and $p\bar{p}$ collisions.
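
    A minimal sketch of the kind of single-particle nonextensive distribution referred to above, $f(p_T) \propto (1 + p_T/(nT))^{-n}$, which interpolates between an exponential at low $p_T$ and the power law $1/p_T^n$ at high $p_T$; the parameter values are illustrative, not fits from the paper:

    ```python
    import numpy as np

    A, T, n = 1.0, 0.15, 7.0   # normalization, temperature (GeV), power index

    def spectrum(pT):
        """Tsallis-like transverse momentum spectrum."""
        return A * (1.0 + pT / (n * T)) ** (-n)

    pT = np.array([0.5, 2.0, 10.0, 50.0])
    print(spectrum(pT))
    # At pT >> n*T the ratio of two spectrum values approaches (pT2/pT1)^n:
    print(spectrum(50.0) / spectrum(100.0), 2.0**n)
    ```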

  20. Ergodic theorem, ergodic theory, and statistical mechanics

    PubMed Central

    Moore, Calvin C.

    2015-01-01

    This perspective highlights the mean ergodic theorem established by John von Neumann and the pointwise ergodic theorem established by George Birkhoff, proofs of which were published nearly simultaneously in PNAS in 1931 and 1932. These theorems were of great significance both in mathematics and in statistical mechanics. In statistical mechanics they provided a key insight into a 60-y-old fundamental problem of the subject, namely the rationale for the hypothesis that time averages can be set equal to phase averages. The evolution of this problem is traced from the origins of statistical mechanics and Boltzmann's ergodic hypothesis to the Ehrenfests' quasi-ergodic hypothesis, and then to the ergodic theorems. We discuss communications between von Neumann and Birkhoff in the fall of 1931 leading up to the publication of these papers and related issues of priority. These ergodic theorems initiated a new field of mathematical research called ergodic theory, which has thrived ever since, and we discuss some recent developments in ergodic theory that are relevant for statistical mechanics. PMID:25691697

  1. Proof of the Spin Statistics Connection 2: Relativistic Theory

    NASA Astrophysics Data System (ADS)

    Santamato, Enrico; De Martini, Francesco

    2017-12-01

    The traditional standard theory of quantum mechanics is unable to solve the spin-statistics problem, i.e. to justify the utterly important "Pauli Exclusion Principle", except by adopting the complex standard relativistic quantum field theory. In a recent paper (Santamato and De Martini in Found Phys 45(7):858-873, 2015) we presented a proof of the spin-statistics connection in the nonrelativistic approximation on the basis of the "Conformal Quantum Geometrodynamics". In the present paper, by the same theory, the proof of the spin-statistics theorem is extended to the relativistic domain in the general scenario of curved spacetime. The relativistic approach allows us to formulate, step by step, a manifestly Weyl gauge-invariant theory and to emphasize some fundamental aspects of group theory in the demonstration. No relativistic quantum field operators are used, and the particle exchange properties are drawn from the conservation of the intrinsic helicity of elementary particles. It is therefore this property, not considered in standard quantum mechanics, which determines the correct spin-statistics connection observed in Nature (Santamato and De Martini in Found Phys 45(7):858-873, 2015). The present proof of the spin-statistics theorem is simpler than the one presented in Santamato and De Martini (Found Phys 45(7):858-873, 2015), because it is based on symmetry-group considerations only, without recourse to frames attached to the particles. Second quantization and anticommuting operators are not necessary.

  2. Tsallis non-extensive statistics and solar wind plasma complexity

    NASA Astrophysics Data System (ADS)

    Pavlos, G. P.; Iliopoulos, A. C.; Zastenker, G. N.; Zelenyi, L. M.; Karakatsanis, L. P.; Riazantseva, M. O.; Xenakis, M. N.; Pavlos, E. G.

    2015-03-01

    This article presents novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event, which took place on 26 September 2011. Solar wind plasma is a typical case of a stochastic spatiotemporal distribution of physical state variables such as force fields (B, E) and matter fields (particle and current densities or bulk plasma distributions). This study shows clearly the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. It also underlines the inefficiency of classical magnetohydrodynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), in explaining the complexity of the solar wind dynamics, since these theories assume smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian, non-extensive statistics with heavy-tailed probability distribution functions, which are related to the q-extension of the CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992).

  3. Statistical testing of association between menstruation and migraine.

    PubMed

    Barra, Mathias; Dahl, Fredrik A; Vetvik, Kjersti G

    2015-02-01

    To repair and refine a previously proposed method for the statistical analysis of association between migraine and menstruation. Menstrually related migraine (MRM) affects about 20% of female migraineurs in the general population. The pathophysiological link from menstruation to migraine is hypothesized to run through fluctuations in female reproductive hormones, but the exact mechanisms remain unknown. Therefore, the main diagnostic criterion today is concurrence of migraine attacks with menstruation. Methods aiming to exclude spurious associations are needed, so that further research into these mechanisms can be performed on a population with a true association. The statistical method is based on a simple two-parameter null model of MRM (which allows for simulation modeling) and Fisher's exact test (with mid-p correction) applied to standard 2 × 2 contingency tables derived from the patients' headache diaries. Our method is a corrected version of a previously published flawed framework. To the best of our knowledge, no other published methods for establishing a menstruation-migraine association by statistical means exist today. The probabilistic methodology shows good performance when subjected to receiver operating characteristic curve analysis. Quick-reference cutoff values for the clinical setting were tabulated for assessing association given a patient's headache history. In this paper, we correct a proposed method for establishing association between menstruation and migraine by statistical means. We conclude that the proposed standard of 3-cycle observations prior to setting an MRM diagnosis should be extended by at least one perimenstrual window to obtain sufficient information for statistical processing. © 2014 American Headache Society.
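
    A minimal sketch of the core computation, Fisher's exact test with mid-p correction on a 2 × 2 diary table; the counts below are hypothetical illustration values, not patient data:

    ```python
    # One-sided mid-p via the hypergeometric null distribution of the
    # top-left cell of a 2x2 table with fixed margins.
    import numpy as np
    from scipy.stats import hypergeom

    # Rows: migraine attack yes/no; columns: perimenstrual day yes/no.
    a, b = 9, 21    # attack days inside / outside perimenstrual windows
    c, d = 11, 79   # attack-free days inside / outside the windows

    M, n, N = a + b + c + d, a + b, a + c
    k = np.arange(a, min(n, N) + 1)
    pmf = hypergeom.pmf(k, M, n, N)

    # Mid-p: half the observed table's probability plus more extreme tables.
    mid_p = 0.5 * pmf[0] + pmf[1:].sum()
    print(f"one-sided mid-p = {mid_p:.4f}")
    ```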

  4. A statistical mechanical approach to restricted integer partition functions

    NASA Astrophysics Data System (ADS)

    Zhou, Chi-Chun; Dai, Wu-Sheng

    2018-05-01

    The main aim of this paper is twofold: (1) suggesting a statistical mechanical approach to the calculation of the generating function of restricted integer partition functions, which count the number of partitions, i.e., ways of writing an integer as a sum of other integers under certain restrictions. In this approach, the generating function of restricted integer partition functions is constructed from the canonical partition functions of various quantum gases. (2) Introducing a new type of restricted integer partition function corresponding to a general statistics that generalizes Gentile statistics in statistical mechanics; many kinds of restricted integer partition functions are special cases of this restricted integer partition function. Moreover, with statistical mechanics as a bridge, we reveal a mathematical fact: the generating function of a restricted integer partition function is just a symmetric function, i.e., a member of the class of functions invariant under the action of permutation groups. Using this approach, we provide some expressions of restricted integer partition functions as examples.
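
    To make the correspondence concrete: the generating function of partitions into distinct parts is the canonical partition function of a Fermi gas with single-particle levels 1, 2, 3, ..., while unrestricted partitions correspond to the Bose gas. A small sketch (standard polynomial dynamic programming, not the authors' derivation):

    ```python
    def partitions(n_max, distinct=False):
        """Coefficients p(0..n_max) of prod_k (1 + x^k) (distinct parts,
        'fermionic') or prod_k 1/(1 - x^k) (unrestricted, 'bosonic')."""
        p = [1] + [0] * n_max
        for k in range(1, n_max + 1):
            if distinct:   # each part used at most once: descending update
                for n in range(n_max, k - 1, -1):
                    p[n] += p[n - k]
            else:          # unlimited repetitions: ascending update
                for n in range(k, n_max + 1):
                    p[n] += p[n - k]
        return p

    print(partitions(10))                  # p(10) = 42
    print(partitions(10, distinct=True))   # 10 partitions into distinct parts
    ```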

  5. Coupled local facilitation and global hydrologic inhibition drive landscape geometry in a patterned peatland

    NASA Astrophysics Data System (ADS)

    Acharya, S.; Kaplan, D. A.; Casey, S.; Cohen, M. J.; Jawitz, J. W.

    2015-05-01

    Self-organized landscape patterning can arise in response to multiple processes. Discriminating among alternative patterning mechanisms, particularly where experimental manipulations are untenable, requires process-based models. Previous modeling studies have attributed patterning in the Everglades (Florida, USA) to sediment redistribution and anisotropic soil hydraulic properties. In this work, we tested an alternate theory, the self-organizing-canal (SOC) hypothesis, by developing a cellular automata model that simulates pattern evolution via local positive feedbacks (i.e., facilitation) coupled with a global negative feedback based on hydrology. The model is forced by global hydroperiod that drives stochastic transitions between two patch types: ridge (higher elevation) and slough (lower elevation). We evaluated model performance using multiple criteria based on six statistical and geostatistical properties observed in reference portions of the Everglades landscape: patch density, patch anisotropy, semivariogram ranges, power-law scaling of ridge areas, perimeter area fractal dimension, and characteristic pattern wavelength. Model results showed strong statistical agreement with reference landscapes, but only when anisotropically acting local facilitation was coupled with hydrologic global feedback, for which several plausible mechanisms exist. Critically, the model correctly generated fractal landscapes that had no characteristic pattern wavelength, supporting the invocation of global rather than scale-specific negative feedbacks.
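
    A toy cellular-automata sketch of the mechanism described above, coupling anisotropic local facilitation to a single global hydrologic feedback; all rates, weights and the target ridge cover are illustrative assumptions, not the calibrated Everglades model:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    L, steps = 128, 200
    grid = (rng.random((L, L)) < 0.3).astype(int)  # 1 = ridge, 0 = slough
    target = 0.5                        # hydrologic ridge-cover set point

    for _ in range(steps):
        # Anisotropic facilitation: neighbors along one axis count more.
        nbrs = (np.roll(grid, 1, 0) + np.roll(grid, -1, 0)
                + 0.3 * (np.roll(grid, 1, 1) + np.roll(grid, -1, 1)))
        fb = target - grid.mean()       # global negative feedback
        p_ridge = 0.05 + 0.10 * nbrs + 0.5 * max(fb, 0.0)
        p_slough = 0.05 + 0.5 * max(-fb, 0.0)
        r = rng.random((L, L))
        grid = np.where(grid == 0, (r < p_ridge).astype(int),
                        (r >= p_slough).astype(int))

    print("final ridge fraction:", grid.mean())
    ```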

  6. Coupled local facilitation and global hydrologic inhibition drive landscape geometry in a patterned peatland

    NASA Astrophysics Data System (ADS)

    Acharya, S.; Kaplan, D. A.; Casey, S.; Cohen, M. J.; Jawitz, J. W.

    2015-01-01

    Self-organized landscape patterning can arise in response to multiple processes. Discriminating among alternative patterning mechanisms, particularly where experimental manipulations are untenable, requires process-based models. Previous modeling studies have attributed patterning in the Everglades (Florida, USA) to sediment redistribution and anisotropic soil hydraulic properties. In this work, we tested an alternate theory, the self-organizing canal (SOC) hypothesis, by developing a cellular automata model that simulates pattern evolution via local positive feedbacks (i.e., facilitation) coupled with a global negative feedback based on hydrology. The model is forced by global hydroperiod that drives stochastic transitions between two patch types: ridge (higher elevation) and slough (lower elevation). We evaluated model performance using multiple criteria based on six statistical and geostatistical properties observed in reference portions of the Everglades landscape: patch density, patch anisotropy, semivariogram ranges, power-law scaling of ridge areas, perimeter area fractal dimension, and characteristic pattern wavelength. Model results showed strong statistical agreement with reference landscapes, but only when anisotropically acting local facilitation was coupled with hydrologic global feedback, for which several plausible mechanisms exist. Critically, the model correctly generated fractal landscapes that had no characteristic pattern wavelength, supporting the invocation of global rather than scale-specific negative feedbacks.

  7. K-nearest neighbors based methods for identification of different gear crack levels under different motor speeds and loads: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong

    2016-03-01

    Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent unexpected gear failure, because gear cracks lead to gear tooth breakage. Signal-processing-based methods mainly require expertise to interpret gear fault signatures, which is usually not easy for ordinary users. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. Previous case studies experimentally proved that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance the prediction accuracies of existing K-nearest neighbors based methods and to extend identification from 3 to 5 different gear crack levels, redundant statistical features are constructed by using the Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of the redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction is conducted to obtain new significant statistical features. At last, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads. Based on the new significant statistical features, some other popular statistical models, including linear discriminant analysis, quadratic discriminant analysis, classification and regression trees and the naive Bayes classifier, are compared with the developed method. The results show that the developed method has the highest prediction accuracies among these statistical models. Additionally, selection of the number of new significant features and parameter selection for K-nearest neighbors are thoroughly investigated.
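
    A compact sketch of the pipeline's shape only: wavelet-packet statistical features, dimensionality reduction, then K-nearest neighbors. Synthetic signals stand in for vibration data, db4 stands in for db44 (which may not be available in PyWavelets), and PCA stands in for the paper's reduction step:

    ```python
    import numpy as np
    import pywt
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)

    def wp_features(signal, level=3):
        """Simple statistics of each wavelet-packet node at `level`."""
        wp = pywt.WaveletPacket(signal, wavelet="db4", maxlevel=level)
        feats = []
        for node in wp.get_level(level, order="natural"):
            c = node.data
            feats += [c.std(), np.sqrt(np.mean(c**2)), np.abs(c).max()]
        return np.array(feats)

    # Synthetic stand-ins for 5 gear crack levels, 20 signals each.
    X = np.array([wp_features(rng.normal(scale=1 + 0.2 * lvl, size=1024))
                  for lvl in range(5) for _ in range(20)])
    y = np.repeat(np.arange(5), 20)

    X_red = PCA(n_components=10).fit_transform(X)  # drop redundant features
    clf = KNeighborsClassifier(n_neighbors=5).fit(X_red, y)
    print("training accuracy:", clf.score(X_red, y))
    ```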

  8. Composite Materials Handbook. Volume 1. Polymer Matrix Composites Guidelines for Characterization of Structural Materials

    DTIC Science & Technology

    2002-06-17

    [Fragmentary excerpts recovered from the record:] "…power law type (References 6.8.6.1(h) and (i)). Various attempts have been made to use fracture mechanics based methods for predicting failure of…"; "…participate in the MIL-HDBK-17 coordination activity."; "All information and data contained in this handbook have been coordinated with industry and the U.S…"; "…for statistically-based properties"; "2.2.3 Issues of data equivalence".

  9. Modeling of Pedestrian Flows Using Hybrid Models of Euler Equations and Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Bärwolff, Günter; Slawig, Thomas; Schwandt, Hartmut

    2007-09-01

    In recent years, various systems have been developed for controlling, planning and predicting the traffic of persons and vehicles, in particular under security aspects. Going beyond pure counting and statistical models, approaches based on well-known concepts originally developed in very different research areas, namely continuum mechanics and computer science, were found to be very adequate and accurate. In the present paper, we outline a continuum mechanical approach for the description of pedestrian flow.

  10. Statistical Physics Approaches to RNA Editing

    NASA Astrophysics Data System (ADS)

    Bundschuh, Ralf

    2012-02-01

    The central dogma of molecular biology states that DNA is transcribed base by base into RNA, which is in turn translated into proteins. However, some organisms edit their RNA before translation by inserting, deleting, or substituting individual or short stretches of bases. In many instances the mechanisms by which an organism recognizes the positions at which to edit, or by which it performs the actual editing, are unknown. One model system that stands out by its very high editing rate, on average one out of 25 bases, is the Myxomycetes, a class of slime molds. In this talk we will show how computational methods and concepts from statistical physics can be used to analyze DNA and protein sequence data to predict editing sites in these slime molds and to guide experiments that identified previously unknown types of editing as well as the complete set of editing events in the slime mold Physarum polycephalum.

  11. Continuum radiation from active galactic nuclei: A statistical study

    NASA Technical Reports Server (NTRS)

    Isobe, T.; Feigelson, E. D.; Singh, K. P.; Kembhavi, A.

    1986-01-01

    The physics of the continuum spectrum of active galactic nuclei (AGNs) was examined using a large data set and rigorous statistical methods. A database was constructed for 469 objects, including radio-selected quasars, optically selected quasars, X-ray-selected AGNs, BL Lac objects, and optically unidentified compact radio sources. Each object has measurements of its radio, optical, and X-ray core continuum luminosities, though many of these are upper limits. Since many radio sources have extended components, the core components were carefully separated out from the total radio luminosity. With survival analysis statistical methods, which can treat upper limits correctly, these data yield better statistical results than those previously obtained. A variety of statistical tests are performed, such as comparison of the luminosity functions in different subsamples, and linear regressions of luminosities in different bands. Interpretation of the results leads to the following tentative conclusions: the main emission mechanism of optically selected quasars and X-ray-selected AGNs is thermal, while that of BL Lac objects is synchrotron; radio-selected quasars may have two different emission mechanisms in the X-ray band; BL Lac objects appear to be special cases of the radio-selected quasars; some compact radio sources show the possibility of synchrotron self-Compton (SSC) emission in the optical band; and the spectral index between the optical and the X-ray bands depends on the optical luminosity.

  12. Predicting the stochastic guiding of kinesin-driven microtubules in microfabricated tracks: a statistical-mechanics-based modeling approach.

    PubMed

    Lin, Chih-Tin; Meyhofer, Edgar; Kurabayashi, Katsuo

    2010-01-01

    Directional control of microtubule shuttles via microfabricated tracks is key to the development of controlled nanoscale mass transport by kinesin motor molecules. Here we develop and test a model to quantitatively predict the stochastic guiding behavior of microtubules when they mechanically collide with the sidewalls of lithographically patterned tracks. By taking into account appropriate probability distributions of microscopic states of the microtubule system, the model allows us to theoretically analyze the roles of collision conditions and kinesin surface densities in determining how the motion of microtubule shuttles is controlled. In addition, we experimentally observe the statistics of microtubule collision events and compare our theoretical prediction with experimental data to validate our model. The model will direct the design of future hybrid nanotechnology devices that integrate nanoscale transport systems powered by kinesin-driven molecular shuttles.

  13. Co-occurrence statistics as a language-dependent cue for speech segmentation.

    PubMed

    Saksida, Amanda; Langus, Alan; Nespor, Marina

    2017-05-01

    To what extent can language acquisition be explained in terms of different associative learning mechanisms? It has been hypothesized that distributional regularities in spoken languages are strong enough to elicit statistical learning about dependencies among speech units. Distributional regularities could be a useful cue for word learning even without rich language-specific knowledge. However, it is not clear how strong and reliable the distributional cues are that humans might use to segment speech. We investigate cross-linguistic viability of different statistical learning strategies by analyzing child-directed speech corpora from nine languages and by modeling possible statistics-based speech segmentations. We show that languages vary as to which statistical segmentation strategies are most successful. The variability of the results can be partially explained by systematic differences between languages, such as rhythmical differences. The results confirm previous findings that different statistical learning strategies are successful in different languages and suggest that infants may have to primarily rely on non-statistical cues when they begin their process of speech segmentation. © 2016 John Wiley & Sons Ltd.
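
    A minimal sketch of the segmentation statistic itself on a toy syllable stream: forward transitional probability TP(xy) = count(xy)/count(x), with word boundaries posited at local TP minima (the artificial corpus is a hypothetical stand-in):

    ```python
    from collections import Counter

    stream = "golabupadotibidakugolabubidakupadoti" * 20
    sylls = [stream[i:i + 2] for i in range(0, len(stream), 2)]

    uni = Counter(sylls)
    bi = Counter(zip(sylls, sylls[1:]))
    tp = {pair: cnt / uni[pair[0]] for pair, cnt in bi.items()}

    # Posit a boundary wherever TP dips below both neighboring TPs.
    tps = [tp[(a, b)] for a, b in zip(sylls, sylls[1:])]
    boundaries = [i + 1 for i in range(1, len(tps) - 1)
                  if tps[i] < tps[i - 1] and tps[i] < tps[i + 1]]
    print("first posited boundaries (syllable indices):", boundaries[:5])
    ```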

  14. The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes

    ERIC Educational Resources Information Center

    Cartier, Stephen F.

    2011-01-01

    A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…

  15. Statistical Learning of Phonetic Categories: Insights from a Computational Approach

    ERIC Educational Resources Information Center

    McMurray, Bob; Aslin, Richard N.; Toscano, Joseph C.

    2009-01-01

    Recent evidence (Maye, Werker & Gerken, 2002) suggests that statistical learning may be an important mechanism for the acquisition of phonetic categories in the infant's native language. We examined the sufficiency of this hypothesis and its implications for development by implementing a statistical learning mechanism in a computational model…

  16. Statistical Mechanical Proof of the Second Law of Thermodynamics based on Volume Entropy

    NASA Astrophysics Data System (ADS)

    Campisi, Michele

    2007-10-01

    As pointed out in [M. Campisi, Stud. Hist. Phil. Mod. Phys. 36 (2005) 275-290], the volume entropy (that is, the logarithm of the volume of phase space enclosed by the constant-energy hyper-surface) provides a good mechanical analogue of thermodynamic entropy, because it satisfies the heat theorem and is an adiabatic invariant. This property explains the "equal" sign in Clausius' principle (S_f >= S_i) in a purely mechanical way and suggests that the volume entropy might explain the "larger than" sign (i.e., the Law of Entropy Increase) if non-adiabatic transformations are considered. Based on the principles of quantum mechanics, we prove here that, provided the initial equilibrium satisfies the natural condition of decreasing ordering of probabilities, the expectation value of the volume entropy cannot decrease for arbitrary transformations performed by external sources of work on an insulated system. This can be regarded as a rigorous quantum mechanical proof of the Second Law.
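
    For reference, the volume entropy discussed above has the standard definition below (notation assumed here: Φ(E) is the phase-space volume enclosed by the energy hyper-surface of the Hamiltonian H):

    ```latex
    S(E) \;=\; k_{B} \ln \Phi(E),
    \qquad
    \Phi(E) \;=\; \int \theta\bigl(E - H(q,p)\bigr)\, \mathrm{d}q\, \mathrm{d}p .
    ```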

  17. Avalanches, loading and finite size effects in 2D amorphous plasticity: results from a finite element model

    NASA Astrophysics Data System (ADS)

    Sandfeld, Stefan; Budrikis, Zoe; Zapperi, Stefano; Fernandez Castellanos, David

    2015-02-01

    Crystalline plasticity is strongly interlinked with dislocation mechanics and is nowadays relatively well understood. Concepts and physical models of plastic deformation in amorphous materials, on the other hand, where the concept of linear lattice defects is not applicable, are still lagging behind. We introduce an eigenstrain-based finite element lattice model for simulations of shear band formation and strain avalanches. Our model allows us to study the influence of surfaces and finite size effects on the statistics of avalanches. We find that even with relatively complex loading conditions and open boundary conditions, the critical exponents describing avalanche statistics are unchanged, which validates the use of simpler scalar lattice-based models to study these phenomena.

  18. A Geometrical Approach to Bell's Theorem

    NASA Technical Reports Server (NTRS)

    Rubincam, David Parry

    2000-01-01

    Bell's theorem can be proved through simple geometrical reasoning, without the need for the Psi function, probability distributions, or calculus. The proof is based on N. David Mermin's explication of the Einstein-Podolsky-Rosen-Bohm experiment, which involves Stern-Gerlach detectors that flash red or green lights when detecting spin-up or spin-down. The statistics of local hidden variable theories for this experiment can be arranged in colored strips from which simple inequalities can be deduced. These inequalities lead to a demonstration of Bell's theorem. Moreover, all local hidden variable theories can be graphed in such a way as to enclose their statistics in a pyramid, with the quantum-mechanical result lying a finite distance beneath the base of the pyramid.

  19. Study on Conversion Between Momentum and Contrarian Based on Fractal Game

    NASA Astrophysics Data System (ADS)

    Wu, Xu; Song, Guanghui; Deng, Yan; Xu, Lin

    2015-06-01

    Based on the fractal game performed by the majority and the minority, fractal market theory (FMT) is employed to describe the features of investors' decision-making. Accordingly, a process of fractal games is formed in order to analyze the statistical features of the conversion between momentum and contrarian strategies. The results show that, among three fractal game mechanisms, the statistical features of the simulated return series are the most similar to log returns of actual series. In addition, the conversion between momentum and contrarian is also extremely similar to the real situation, which reflects the effectiveness of using fractal games to analyze the conversion between momentum and contrarian. Moreover, this also provides a decision-making reference which helps investors develop effective investment strategies.

  20. Measuring Boltzmann's Constant with Carbon Dioxide

    ERIC Educational Resources Information Center

    Ivanov, Dragia; Nikolov, Stefan

    2013-01-01

    In this paper we present two experiments to measure Boltzmann's constant--one of the fundamental constants of modern-day physics, which lies at the base of statistical mechanics and thermodynamics. The experiments use very basic theory, simple equipment and cheap and safe materials yet provide very precise results. They are very easy and…

  1. Alternative to Proctoring in Introductory Statistics Community College Courses

    ERIC Educational Resources Information Center

    Feinman, Yalena

    2018-01-01

    The credibility of unsupervised exams, one of the biggest challenges of e-learning, is currently maintained by proctoring. However, little has been done to determine whether expensive and inconvenient proctoring is necessary. The purpose of this quantitative study was to determine whether the use of security mechanisms, based on the taxonomy of…

  2. Statistical thermodynamics unveils the dissolution mechanism of cellobiose.

    PubMed

    Nicol, Thomas W J; Isobe, Noriyuki; Clark, James H; Shimizu, Seishi

    2017-08-30

    In the study of the cellulose dissolution mechanism, opinion is still divided. Here, the solution interaction components of the most prominent hypotheses for the driving force of cellulose dissolution were evaluated quantitatively. Combining a rigorous statistical thermodynamic theory with cellobiose solubility data in the presence of chloride salts whose cations progress along the Hofmeister series (KCl, NaCl, LiCl and ZnCl2), we have shown that cellobiose solubilization is driven by the preferential accumulation of salts around the solutes, which is stronger than cellobiose hydration. Yet, contrary to the classical chaotropy hypothesis, increasing salt concentration leads to cellobiose dehydration in the presence of the strongest solubilizer, ZnCl2. However, thanks to cellobiose dehydration, the cellobiose-salt interaction still remains preferential despite the weakening salt accumulation. Based on such insights, the previous hypotheses based on hydrophobicity and polymer charging have also been evaluated quantitatively. Thus, our present study successfully paves the way toward identifying the basic driving forces for cellulose solubilization in a quantitative manner for the first time. When combined with unit additivity methods, this quantitative information could lead to a full understanding of cellulose solubility.

  3. Origin of Pareto-like spatial distributions in ecosystems.

    PubMed

    Manor, Alon; Shnerb, Nadav M

    2008-12-31

    Recent studies of cluster distribution in various ecosystems revealed Pareto statistics for the size of spatial colonies. These results were supported by cellular automata simulations that yield robust criticality for endogenous pattern formation based on positive feedback. We show that this patch statistics is a manifestation of the law of proportionate effect. Mapping the stochastic model to a Markov birth-death process, the transition rates are shown to scale linearly with cluster size. This mapping provides a connection between patch statistics and the dynamics of the ecosystem; the "first passage time" for different colonies emerges as a powerful tool that discriminates between endogenous and exogenous clustering mechanisms. Imminent catastrophic shifts (such as desertification) manifest themselves in a drastic change of the stability properties of spatial colonies.
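
    A toy birth-death sketch of the law of proportionate effect: per-cluster transition probabilities scale linearly with cluster size, which multiplicatively amplifies fluctuations and fattens the tail of the size distribution (all rates here are illustrative assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_clusters, steps, rate = 5_000, 5_000, 5e-4
    sizes = np.ones(n_clusters, dtype=int)

    for _ in range(steps):
        grow = rng.random(n_clusters) < rate * sizes     # birth rate ~ size
        shrink = rng.random(n_clusters) < rate * sizes   # death rate ~ size
        sizes += grow.astype(int) - shrink.astype(int)
        sizes = np.maximum(sizes, 1)   # re-seed extinct patches at size 1

    print("mean size:", sizes.mean(), " max size:", sizes.max())
    ```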

  4. Playing at Statistical Mechanics

    ERIC Educational Resources Information Center

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
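
    For readers wanting the formulas behind the game, a small sketch of the mean occupation numbers of the distributions named in the abstract (units with k_B = 1; reading "most probable" as the Maxwell-Boltzmann limit is an interpretive assumption):

    ```python
    import numpy as np

    def occupation(e, mu, T, kind):
        """Mean occupation of a level at energy e."""
        x = (e - mu) / T
        if kind == "fermi-dirac":
            return 1.0 / (np.exp(x) + 1.0)
        if kind == "bose-einstein":
            return 1.0 / (np.exp(x) - 1.0)   # requires e > mu
        return np.exp(-x)                    # Maxwell-Boltzmann

    e = np.array([1.0, 2.0])
    for kind in ("fermi-dirac", "bose-einstein", "maxwell-boltzmann"):
        print(kind, occupation(e, mu=0.0, T=1.0, kind=kind))
    ```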

  5. Quantum mechanics as classical statistical mechanics with an ontic extension and an epistemic restriction.

    PubMed

    Budiyono, Agung; Rohrlich, Daniel

    2017-11-03

    Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.

  6. Variational and perturbative formulations of quantum mechanical/molecular mechanical free energy with mean-field embedding and its analytical gradients.

    PubMed

    Yamamoto, Takeshi

    2008-12-28

    Conventional quantum chemical solvation theories are based on the mean-field embedding approximation. That is, the electronic wavefunction is calculated in the presence of the mean field of the environment. In this paper a direct quantum mechanical/molecular mechanical (QM/MM) analog of such a mean-field theory is formulated based on variational and perturbative frameworks. In the variational framework, an appropriate QM/MM free energy functional is defined and is minimized in terms of the trial wavefunction that best approximates the true QM wavefunction in a statistically averaged sense. An analytical free energy gradient is obtained, which takes the form of the gradient of the effective QM energy calculated in the averaged MM potential. In the perturbative framework, the above variational procedure is shown to be equivalent to the first-order expansion of the QM energy (in the exact free energy expression) about the self-consistent reference field. This helps in understanding the relation between the variational procedure and the exact QM/MM free energy as well as existing QM/MM theories. Based on this, several ways are discussed for evaluating non-mean-field effects (i.e., statistical fluctuations of the QM wavefunction) that are neglected in the mean-field calculation. As an illustration, the method is applied to an S_N2 Menshutkin reaction in water, NH3 + CH3Cl → NH3CH3(+) + Cl(-), for which free energy profiles are obtained at the Hartree-Fock, MP2, B3LYP, and BHHLYP levels by integrating the free energy gradient. Non-mean-field effects are evaluated to be <0.5 kcal/mol using a Gaussian fluctuation model for the environment, which suggests that those effects are rather small for the present reaction in water.

  7. Capillarity theory for the fly-casting mechanism

    PubMed Central

    Trizac, Emmanuel; Levy, Yaakov; Wolynes, Peter G.

    2010-01-01

    Biomolecular folding and function are often coupled. During molecular recognition events, one of the binding partners may transiently or partially unfold, allowing more rapid access to a binding site. We describe a simple model for this fly-casting mechanism based on the capillarity approximation and polymer chain statistics. The model shows that fly casting is most effective when the protein unfolding barrier is small and the part of the chain which extends toward the target is relatively rigid. These features are often seen in known examples of fly casting in protein–DNA binding. Simulations of protein–DNA binding based on well-funneled native-topology models with electrostatic forces confirm the trends of the analytical theory. PMID:20133683

  8. Linguistic Analysis of the Human Heartbeat Using Frequency and Rank Order Statistics

    NASA Astrophysics Data System (ADS)

    Yang, Albert C.-C.; Hseu, Shu-Shya; Yien, Huey-Wen; Goldberger, Ary L.; Peng, C.-K.

    2003-03-01

    Complex physiologic signals may carry unique dynamical signatures that are related to their underlying mechanisms. We present a method based on rank order statistics of symbolic sequences to investigate the profile of different types of physiologic dynamics. We apply this method to heart rate fluctuations, the output of a central physiologic control system. The method robustly discriminates patterns generated from healthy and pathologic states, as well as aging. Furthermore, we observe increased randomness in the heartbeat time series with physiologic aging and pathologic states and also uncover nonrandom patterns in the ventricular response to atrial fibrillation.
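
    A minimal sketch of the symbolic rank-order idea: encode successive inter-beat-interval changes as binary symbols, form m-symbol words, and rank the words by frequency; the RR series below is synthetic, and m = 8 is an arbitrary illustrative choice:

    ```python
    import numpy as np
    from collections import Counter

    rng = np.random.default_rng(6)
    t = np.arange(5000)
    rr = 0.8 + 0.05 * np.sin(0.1 * t) + 0.02 * rng.normal(size=t.size)  # toy RR
    symbols = (np.diff(rr) > 0).astype(int)   # 1 = interval lengthened

    m = 8
    words = [tuple(symbols[i:i + m]) for i in range(len(symbols) - m + 1)]
    ranked = Counter(words).most_common()     # rank-frequency profile
    print("top 3 words:", ranked[:3])
    ```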

  9. Statistical learning of action: the role of conditional probability.

    PubMed

    Meyer, Meredith; Baldwin, Dare

    2011-12-01

    Identification of distinct units within a continuous flow of human action is fundamental to action processing. Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements. The results of Experiment 1 showed no evidence for sensitivity to conditional probability, whereas Experiment 2 displayed learning based on joint probability. In Experiment 3, we demonstrated that additional exposure to the input failed to engender sensitivity to conditional probability. However, the results of Experiment 4 showed that a subset of adults-namely, those more successful at identifying actions that had been seen more frequently than comparison sequences-were also successful at learning conditional-probability statistics. These experiments help to clarify the mechanisms subserving processing of intentional action, and they highlight important differences from, as well as similarities to, prior studies of statistical learning in other domains, including language.

  10. The difference between a dynamic and mechanical approach to stroke treatment.

    PubMed

    Helgason, Cathy M

    2007-06-01

    The current classification of stroke is based on causation, also called pathogenesis, and relies on binary logic faithful to the Aristotelian tradition. Accordingly, a pathology is or is not the cause of the stroke, is considered independent of others, and is the target for treatment. It is the subject for large double-blind randomized clinical therapeutic trials. The scientific view behind clinical trials is the fundamental concept that information is statistical, and causation is determined by probabilities. Therefore, the cause and effect relation will be determined by probability-theory-based statistics. This is the basis of evidence-based medicine, which calls for the results of such trials to be the basis for physician decisions regarding diagnosis and treatment. However, there are problems with the methodology behind evidence-based medicine. Calculations using probability-theory-based statistics regarding cause and effect are performed within an automatic system where there are known inputs and outputs. This method of research provides a framework of certainty with no surprise elements or outcomes. However, it is not a system or method that will come up with previously unknown variables, concepts, or universal principles; it is not a method that will give a new outcome; and it is not a method that allows for creativity, expertise, or new insight for problem solving.

  11. Conformational energy calculations on polypeptides and proteins: use of a statistical mechanical procedure for evaluating structure and properties.

    PubMed

    Scheraga, H A; Paine, G H

    1986-01-01

    We are using a variety of theoretical and computational techniques to study protein structure, protein folding, and higher-order structures. Our earlier work involved treatments of liquid water and aqueous solutions of nonpolar and polar solutes, computations of the stabilities of the fundamental structures of proteins and their packing arrangements, conformations of small cyclic and open-chain peptides, structures of fibrous proteins (collagen), structures of homologous globular proteins, introduction of special procedures as constraints during energy minimization of globular proteins, and structures of enzyme-substrate complexes. Recently, we presented a new methodology for predicting polypeptide structure (described here); the method is based on the calculation of the probable and average conformation of a polypeptide chain by the application of equilibrium statistical mechanics in conjunction with an adaptive, importance sampling Monte Carlo algorithm. As a test, it was applied to Met-enkephalin.
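
    As a flavor of equilibrium conformational sampling (though not the adaptive importance-sampling algorithm or the force field of the paper), a minimal Metropolis sketch for a single dihedral angle with a toy threefold torsional energy:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    kT = 0.6                                         # kcal/mol, illustrative
    E = lambda phi: 1.0 * (1.0 + np.cos(3.0 * phi))  # toy torsional energy

    phi, samples = 0.0, []
    for _ in range(100_000):
        trial = phi + rng.normal(scale=0.3)          # random dihedral move
        # Metropolis acceptance with Boltzmann weight exp(-dE/kT):
        if rng.random() < np.exp(-(E(trial) - E(phi)) / kT):
            phi = trial
        samples.append(phi)

    samples = np.mod(np.asarray(samples), 2.0 * np.pi)
    print("mean torsional energy:", E(samples).mean())
    ```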

  12. Optimization of Sinter Plant Operating Conditions Using Advanced Multivariate Statistics: Intelligent Data Processing

    NASA Astrophysics Data System (ADS)

    Fernández-González, Daniel; Martín-Duarte, Ramón; Ruiz-Bustinza, Íñigo; Mochón, Javier; González-Gasca, Carmen; Verdeja, Luis Felipe

    2016-08-01

    Blast furnace operators expect to get sinter with homogeneous and regular properties (chemical and mechanical), necessary to ensure regular blast furnace operation. Blends for sintering also include several iron by-products and other wastes that are obtained in different processes inside the steelworks. Due to their source, the availability of such materials is not always consistent, but their total production should be consumed in the sintering process, both to save money and to recycle wastes. The main scope of this paper is to obtain the least expensive iron ore blend for the sintering process that will provide suitable chemical and mechanical features for the homogeneous and regular operation of the blast furnace. Statistical tools were systematically employed to analyze historical data, including linear and partial correlations applied to the data, and fuzzy clustering based on the Sugeno fuzzy inference system was used to establish relationships among the available variables.

  13. Multilayer Statistical Intrusion Detection in Wireless Networks

    NASA Astrophysics Data System (ADS)

    Hamdi, Mohamed; Meddeb-Makhlouf, Amel; Boudriga, Noureddine

    2008-12-01

    The rapid proliferation of mobile applications and services has introduced new vulnerabilities that do not exist in fixed wired networks. Traditional security mechanisms, such as access control and encryption, turn out to be inefficient in modern wireless networks. Given the shortcomings of these protection mechanisms, an important research focus is intrusion detection systems (IDSs). This paper proposes a multilayer statistical intrusion detection framework for wireless networks. The architecture is well suited to wireless networks because the underlying detection models rely on radio parameters and traffic models. Accurate correlation between radio and traffic anomalies enhances the efficiency of the IDS. A radio signal fingerprinting technique based on the maximal overlap discrete wavelet transform (MODWT) is developed. Moreover, a geometric clustering algorithm is presented. Depending on the characteristics of the fingerprinting technique, the clustering algorithm makes it possible to control the false positive and false negative rates. Finally, simulation experiments have been carried out to validate the proposed IDS.

  14. Can the behavioral sciences self-correct? A social epistemic study.

    PubMed

    Romero, Felipe

    2016-12-01

    Advocates of the self-corrective thesis argue that scientific method will refute false theories and find closer approximations to the truth in the long run. I discuss a contemporary interpretation of this thesis in terms of frequentist statistics in the context of the behavioral sciences. First, I identify experimental replications and systematic aggregation of evidence (meta-analysis) as the self-corrective mechanism. Then, I present a computer simulation study of scientific communities that implement this mechanism to argue that frequentist statistics may converge upon a correct estimate or not depending on the social structure of the community that uses it. Based on this study, I argue that methodological explanations of the "replicability crisis" in psychology are limited and propose an alternative explanation in terms of biases. Finally, I conclude suggesting that scientific self-correction should be understood as an interaction effect between inference methods and social structures. Copyright © 2016 Elsevier Ltd. All rights reserved.
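
    A minimal sketch of the aggregation mechanism referred to above: a fixed-effect (inverse-variance) meta-analysis pooling several replication estimates; the effect sizes and standard errors are hypothetical:

    ```python
    import numpy as np

    effects = np.array([0.42, 0.10, 0.25, 0.31, 0.05])  # per-study estimates
    ses = np.array([0.15, 0.20, 0.10, 0.25, 0.12])      # standard errors

    w = 1.0 / ses**2                         # inverse-variance weights
    pooled = np.sum(w * effects) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    print(f"pooled effect = {pooled:.3f} +/- {pooled_se:.3f}")
    ```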

  15. Cation solvation with quantum chemical effects modeled by a size-consistent multi-partitioning quantum mechanics/molecular mechanics method.

    PubMed

    Watanabe, Hiroshi C; Kubillus, Maximilian; Kubař, Tomáš; Stach, Robert; Mizaikoff, Boris; Ishikita, Hiroshi

    2017-07-21

    In the condensed phase, quantum chemical properties such as many-body effects and intermolecular charge fluctuations are critical determinants of the solvation structure and dynamics. Thus, a quantum mechanical (QM) molecular description is required for both solute and solvent to incorporate these properties. However, it is challenging to conduct molecular dynamics (MD) simulations for condensed systems of sufficient scale when adopting QM potentials. To overcome this problem, we recently developed the size-consistent multi-partitioning (SCMP) quantum mechanics/molecular mechanics (QM/MM) method and realized stable and accurate MD simulations by applying the QM potential to a benchmark system. In the present study, as the first application of the SCMP method, we have investigated the structures and dynamics of Na+, K+, and Ca2+ solutions based on nanosecond-scale sampling, 100 times longer than conventional QM-based sampling. We have also evaluated two dynamic properties, the diffusion coefficient and difference spectra, with high statistical certainty; the calculation of these properties has not previously been possible within the conventional QM/MM framework. Based on our analysis, we have quantitatively evaluated the quantum chemical solvation effects, which show distinct differences between the cations.
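
    As background on one of the dynamic properties mentioned, a diffusion coefficient can be estimated from a trajectory via the Einstein relation (MSD = 6Dt in three dimensions). The sketch below uses a synthetic random walk in place of the cation MD trajectory; the time step and units are assumptions.

    ```python
    # Diffusion coefficient from the slope of the mean squared displacement.
    import numpy as np

    rng = np.random.default_rng(2)
    dt = 1e-3                                                   # ns per frame (assumed)
    traj = np.cumsum(rng.normal(0, 0.05, (10000, 3)), axis=0)   # positions in nm

    lags = np.arange(1, 200)
    msd = np.array([np.mean(np.sum((traj[lag:] - traj[:-lag]) ** 2, axis=1))
                    for lag in lags])
    D = np.polyfit(lags * dt, msd, 1)[0] / 6.0                  # MSD = 6 D t
    print(f"D ≈ {D:.3f} nm^2/ns")
    ```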

  16. Computational Modeling of Statistical Learning: Effects of Transitional Probability versus Frequency and Links to Word Learning

    ERIC Educational Resources Information Center

    Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.

    2010-01-01

    Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…

  17. Statistical Analysis on the Mechanical Properties of Magnesium Alloys

    PubMed Central

    Liu, Ruoyu; Jiang, Xianquan; Zhang, Hongju; Zhang, Dingfei; Wang, Jingfeng; Pan, Fusheng

    2017-01-01

    Knowledge of the statistical characteristics of mechanical properties is very important for the practical application of structural materials. Unfortunately, the scatter characteristics of the mechanical performance of magnesium alloys have remained poorly understood until now. In this study, the mechanical reliability of magnesium alloys is systematically estimated using Weibull statistical analysis. Interestingly, the Weibull modulus, m, of strength for magnesium alloys is as high as that for aluminum alloys and steels, confirming the very high reliability of magnesium alloys. The high predictability of the tensile strength of magnesium alloys represents the capability of preventing catastrophic premature failure during service, which is essential for safety and reliability assessment. PMID:29113116
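
    A minimal sketch of the Weibull analysis referred to above, fitting the modulus m by the standard median-rank linearization ln ln[1/(1-F)] versus ln σ; the strength values are made up for illustration.

    ```python
    # Weibull modulus and scale from a small set of tensile strengths.
    import numpy as np

    strengths = np.sort(np.array([231.0, 245.0, 250.0, 258.0, 262.0,
                                  270.0, 274.0, 281.0, 290.0, 301.0]))  # MPa, hypothetical
    n = strengths.size
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank failure probabilities
    x, y = np.log(strengths), np.log(-np.log(1.0 - F))
    m, c = np.polyfit(x, y, 1)                    # slope of the Weibull plot = modulus m
    print(f"m ≈ {m:.1f}, characteristic strength ≈ {np.exp(-c / m):.0f} MPa")
    ```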

  18. A statistical framework for protein quantitation in bottom-up MS-based proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas

    2009-08-15

    Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative, missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used to illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al. (2008)) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu

  19. Dependency properties of the amorphous alloy Co58Ni10Fe5Si11B16 on technological parameters of spinning

    NASA Astrophysics Data System (ADS)

    Frolov, A. M.; Tkachev, V. V.; Fedorets, A. N.; Pustovalov, E. V.; Kraynova, G. S.; Dolzhikov, S. V.; Ilin, N. V.; Tsesarskaya, A. K.

    2017-09-01

    The tapes are rapidly quenched onto a rotating drum. The structure and the mechanical and physical properties are studied as functions of the spinning parameters. An approach is proposed for classifying the obtained ribbons based on the statistics of the microrelief of their surfaces.

  20. Schrödinger equation revisited

    PubMed Central

    Schleich, Wolfgang P.; Greenberger, Daniel M.; Kobe, Donald H.; Scully, Marlan O.

    2013-01-01

    The time-dependent Schrödinger equation is a cornerstone of quantum physics and governs all phenomena of the microscopic world. However, despite its importance, its origin is still not widely appreciated and properly understood. We obtain the Schrödinger equation from a mathematical identity by a slight generalization of the formulation of classical statistical mechanics based on the Hamilton–Jacobi equation. This approach brings out most clearly the fact that the linearity of quantum mechanics is intimately connected to the strong coupling between the amplitude and phase of a quantum wave. PMID:23509260

  1. Memory matters: influence from a cognitive map on animal space use.

    PubMed

    Gautestad, Arild O

    2011-10-21

    A vertebrate individual's cognitive map provides a capacity for site fidelity and long-distance returns to favorable patches. Fractal-geometrical analysis of individual space use based on collections of telemetry fixes makes it possible to verify the influence of a cognitive map on the spatial scatter of habitat use and also to what extent space use has been of a scale-specific versus a scale-free kind. This approach rests on a statistical mechanical level of system abstraction, where micro-scale details of behavioral interactions are coarse-grained to macro-scale observables like the fractal dimension of space use. In this manner, the magnitude of the fractal dimension becomes a proxy variable for distinguishing between main classes of habitat exploration and site fidelity, like memory-less (Markovian) Brownian motion and Lévy walk and memory-enhanced space use like Multi-scaled Random Walk (MRW). In this paper, previous analyses are extended by exploring MRW simulations under three scenarios: (1) central place foraging, (2) behavioral adaptation to resource depletion (avoidance of latest visited locations) and (3) transition from MRW towards Lévy walk by narrowing memory capacity to a trailing time window. A generalized statistical-mechanical theory with the power to model cognitive map influence on individual space use will be important for statistical analyses of animal habitat preferences and the mechanics behind site fidelity and home ranges. Copyright © 2011 Elsevier Ltd. All rights reserved.
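
    A box-counting estimate of the fractal dimension of a set of telemetry fixes, of the kind the analysis above relies on; here a synthetic Brownian path stands in for real fixes, and the range of box sizes is an arbitrary choice.

    ```python
    # Box-counting dimension of a 2-D point set (telemetry fixes).
    import numpy as np

    rng = np.random.default_rng(3)
    fixes = np.cumsum(rng.normal(0, 1, (5000, 2)), axis=0)
    fixes = (fixes - fixes.min(0)) / (fixes.max(0) - fixes.min(0))  # unit square

    sizes = 2.0 ** -np.arange(1, 7)   # box side lengths
    counts = [len(set(map(tuple, np.floor(fixes / s).astype(int)))) for s in sizes]
    D = -np.polyfit(np.log(sizes), np.log(counts), 1)[0]
    print(f"box-counting dimension ≈ {D:.2f}")   # lower D signals clumped space use
    ```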

  2. Australian Apprentice & Trainee Statistics: Mechanical Engineering and Fabrication Trades, 1995-1999. Australian Vocational Education & Training.

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research, Leabrook (Australia).

    Statistics regarding Australians participating in apprenticeships and traineeships in the mechanical engineering and fabrication trades in 1995-1999 were reviewed to provide an indication of where skill shortages may be occurring or will likely occur in relation to the following occupations: mechanical engineering trades; fabrication engineering…

  3. Application of the quantum spin glass theory to image restoration.

    PubMed

    Inoue, J I

    2001-04-01

    Quantum fluctuation is introduced into the Markov random-field model for image restoration in the context of a Bayesian approach. We investigate how the quality of a black and white image restoration depends on the quantum fluctuation by making use of statistical mechanics. We find that the maximum posterior marginal (MPM) estimate based on the quantum fluctuation gives a finer restoration than the maximum a posteriori estimate or the thermal-fluctuation-based MPM estimate.

  4. miRNet - dissecting miRNA-target interactions and functional associations through network-based visual analysis

    PubMed Central

    Fan, Yannan; Siklenka, Keith; Arora, Simran K.; Ribeiro, Paula; Kimmins, Sarah; Xia, Jianguo

    2016-01-01

    MicroRNAs (miRNAs) can regulate nearly all biological processes and their dysregulation is implicated in various complex diseases and pathological conditions. Recent years have seen a growing number of functional studies of miRNAs using high-throughput experimental technologies, which have produced a large amount of high-quality data regarding miRNA target genes and their interactions with small molecules, long non-coding RNAs, epigenetic modifiers, disease associations, etc. These rich sets of information have enabled the creation of comprehensive networks linking miRNAs with various biologically important entities to shed light on their collective functions and regulatory mechanisms. Here, we introduce miRNet, an easy-to-use web-based tool that offers statistical, visual and network-based approaches to help researchers understand miRNAs functions and regulatory mechanisms. The key features of miRNet include: (i) a comprehensive knowledge base integrating high-quality miRNA-target interaction data from 11 databases; (ii) support for differential expression analysis of data from microarray, RNA-seq and quantitative PCR; (iii) implementation of a flexible interface for data filtering, refinement and customization during network creation; (iv) a powerful fully featured network visualization system coupled with enrichment analysis. miRNet offers a comprehensive tool suite to enable statistical analysis and functional interpretation of various data generated from current miRNA studies. miRNet is freely available at http://www.mirnet.ca. PMID:27105848

  5. Statistical study of ductility-dip cracking induced plastic deformation in polycrystalline laser 3D printed Ni-based superalloy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Dan; Xue, Jiawei; Zhang, Anfeng

    Ductility-dip cracking in Ni-based superalloys, resulting from heat treatment, is known to cause disastrous failure, but its mechanism is still not completely clear. The cracking behavior as a function of crystal orientation in a laser 3D-printed DL125L Ni-based superalloy polycrystal is investigated statistically here using synchrotron X-ray microdiffraction. The dislocation slip system in each of the forty crystal grains adjacent to the 300 μm long crack has been analyzed through Laue diffraction peak shapes. In all these grains, edge-type geometrically necessary dislocations (GNDs) dominate, and their dislocation line directions are almost parallel to the crack plane. Based on Schmid's law, the equivalent uniaxial tensile force direction is revealed to be normal to the trace of the crack. A qualitative mechanism is thus proposed. Thermal tensile stress perpendicular to the laser scanning direction is elevated due to a significant temperature gradient, and thus locations in the material where the thermal stress exceeds the yield stress undergo plastic deformation mediated by GND activation. As the dislocations slip inside the crystal grains and pile up at the grain boundaries, the local strain/stress keeps increasing until the material in these regions fails to sustain further deformation, leading to void formation and crack propagation.

  6. Statistical study of ductility-dip cracking induced plastic deformation in polycrystalline laser 3D printed Ni-based superalloy

    DOE PAGES

    Qian, Dan; Xue, Jiawei; Zhang, Anfeng; ...

    2017-06-06

    Ductility-dip cracking in Ni-based superalloys, resulting from heat treatment, is known to cause disastrous failure, but its mechanism is still not completely clear. The cracking behavior as a function of crystal orientation in a laser 3D-printed DL125L Ni-based superalloy polycrystal is investigated statistically here using synchrotron X-ray microdiffraction. The dislocation slip system in each of the forty crystal grains adjacent to the 300 μm long crack has been analyzed through Laue diffraction peak shapes. In all these grains, edge-type geometrically necessary dislocations (GNDs) dominate, and their dislocation line directions are almost parallel to the crack plane. Based on Schmid's law, the equivalent uniaxial tensile force direction is revealed to be normal to the trace of the crack. A qualitative mechanism is thus proposed. Thermal tensile stress perpendicular to the laser scanning direction is elevated due to a significant temperature gradient, and thus locations in the material where the thermal stress exceeds the yield stress undergo plastic deformation mediated by GND activation. As the dislocations slip inside the crystal grains and pile up at the grain boundaries, the local strain/stress keeps increasing until the material in these regions fails to sustain further deformation, leading to void formation and crack propagation.

  7. Evaluation of mechanical and thermal properties of commonly used denture base resins.

    PubMed

    Phoenix, Rodney D; Mansueto, Michael A; Ackerman, Neal A; Jones, Robert E

    2004-03-01

    The purpose of this investigation was to evaluate and compare the mechanical and thermal properties of 6 commonly used polymethyl methacrylate denture base resins. Sorption, solubility, color stability, adaptation, flexural stiffness, and hardness were assessed to determine compliance with ADA Specification No. 12. Thermal assessments were performed using differential scanning calorimetry and dynamic mechanical analysis. Results were assessed using statistical and observational analyses. All materials satisfied ADA requirements for sorption, solubility, and color stability. Adaptation testing indicated that microwave-activated systems provided better adaptation to associated casts than conventional heat-activated resins. According to flexural testing results, microwaveable resins were relatively stiff, while rubber-modified resins were more flexible. Differential scanning calorimetry indicated that microwave-activated systems were more completely polymerized than conventional heat-activated materials. The microwaveable resins displayed better adaptation, greater stiffness, and greater surface hardness than other denture base resins included in this investigation. Elastomeric toughening agents yielded decreased stiffness, decreased surface hardness, and decreased glass transition temperatures.

  8. Power Law Patch Scaling and Lack of Characteristic Wavelength Suggest "Scale-Free" Processes Drive Pattern Formation in the Florida Everglades

    NASA Astrophysics Data System (ADS)

    Kaplan, D. A.; Casey, S. T.; Cohen, M. J.; Acharya, S.; Jawitz, J. W.

    2016-12-01

    A century of hydrologic modification has altered the physical and biological drivers of landscape processes in the Everglades (Florida, USA). Restoring the ridge-slough patterned landscape, a dominant feature of the historical system, is a priority, but requires an understanding of pattern genesis and degradation mechanisms. Physical experiments to evaluate alternative pattern formation mechanisms are limited by the long time scales of peat accumulation and loss, necessitating model-based comparisons, where support for a particular mechanism is based on model replication of extant patterning and trajectories of degradation. However, multiple mechanisms yield patch elongation in the direction of historical flow (a central feature of ridge-slough patterning), limiting the utility of that characteristic for discriminating among alternatives. Using data from vegetation maps, we investigated the statistical features of ridge-slough spatial patterning (ridge density, patch perimeter, elongation, patch-size distributions, and spatial periodicity) to establish more rigorous criteria for evaluating model performance and to inform controls on pattern variation across the contemporary system. Two independent analyses (2-D periodograms and patch size distributions) provide strong evidence against regular patterning, with the landscape exhibiting neither a characteristic wavelength nor a characteristic patch size, both of which are expected under conditions that produce regular patterns. Rather, landscape properties suggest robust scale-free patterning, indicating genesis from the coupled effects of local facilitation and a global negative feedback operating uniformly at the landscape-scale. This finding challenges widespread invocation of scale-dependent negative feedbacks for explaining ridge-slough pattern origins. These results help discern among genesis mechanisms and provide an improved statistical description of the landscape that can be used to compare among model outputs, as well as to assess the success of future restoration projects.

  9. Mechanical properties and radiopacity of experimental glass-silica-metal hybrid composites.

    PubMed

    Jandt, Klaus D; Al-Jasser, Abdullah M O; Al-Ateeq, Khalid; Vowles, Richard W; Allen, Geoff C

    2002-09-01

    Experimental glass-silica-metal hybrid composites (polycomposites) were developed and tested mechanically and radiographically in this fundamental pilot study. To determine whether mechanical properties of a glass-silica filled two-paste dental composite based on a Bis-GMA/polyglycol dimethacrylate blend could be improved through the incorporation of titanium (Ti) particles (particle size ranging from 1 to 3 microm) or silver-tin-copper (Ag-Sn-Cu) particles (particle size ranging from 1 to 50 microm) we measured the diametral tensile strength, fracture toughness and radiopacity of five composites. The five materials were: I, the original unmodified composite (control group); II, as group I but containing 5% (wt/wt) of Ti particles; III, as group II but with Ti particles treated with 4-methacryloyloxyethyl trimellitate anhydride (4-META) to promote Ti-resin bonding; IV, as group I but containing 5% (wt/wt) of Ag-Sn-Cu particles; and V, as group IV but with the metal particles treated with 4-META. Ten specimens of each group were tested in a standard diametral tensile strength test and a fracture toughness test using a single-edge notched sample design and five specimens of each group were tested using a radiopacity test. The diametral tensile strength increased statistically significantly after incorporation of Ti treated with 4-META, as tested by ANOVA (P=0.004) and Fisher's LSD test. A statistically significant increase of fracture toughness was observed between the control group and groups II, III and V as tested by ANOVA (P=0.003) and Fisher's LSD test. All other groups showed no statistically significant increase in diametral tensile strength and fracture toughness respectively when compared to their control groups. No statistically significant increase in radiopacity was found between the control group and the Ti filled composite, whereas a statistically significant increase in radiopacity was found between the control group and the Ag-Sn-Cu filled composite as tested by ANOVA (P=0.000) and Fisher's LSD procedure. The introduction of titanium and silver-tin-copper fillers has potential as added components in composites to provide increased mechanical strength and radiopacity, for example for use in core materials.

  10. Observers Exploit Stochastic Models of Sensory Change to Help Judge the Passage of Time

    PubMed Central

    Ahrens, Misha B.; Sahani, Maneesh

    2011-01-01

    Summary Sensory stimulation can systematically bias the perceived passage of time [1–5], but why and how this happens is mysterious. In this report, we provide evidence that such biases may ultimately derive from an innate and adaptive use of stochastically evolving dynamic stimuli to help refine estimates derived from internal timekeeping mechanisms [6–15]. A simplified statistical model based on probabilistic expectations of stimulus change derived from the second-order temporal statistics of the natural environment [16, 17] makes three predictions. First, random noise-like stimuli whose statistics violate natural expectations should induce timing bias. Second, a previously unexplored obverse of this effect is that similar noise stimuli with natural statistics should reduce the variability of timing estimates. Finally, this reduction in variability should scale with the interval being timed, so as to preserve the overall Weber law of interval timing. All three predictions are borne out experimentally. Thus, in the context of our novel theoretical framework, these results suggest that observers routinely rely on sensory input to augment their sense of the passage of time, through a process of Bayesian inference based on expectations of change in the natural environment. PMID:21256018

  11. Neurophysiological Markers of Statistical Learning in Music and Language: Hierarchy, Entropy, and Uncertainty.

    PubMed

    Daikoku, Tatsuya

    2018-06-19

    Statistical learning (SL) is a method of learning based on the transitional probabilities embedded in sequential phenomena such as music and language. It has been considered an implicit and domain-general mechanism that is innate in the human brain and that functions independently of intention to learn and awareness of what has been learned. SL is an interdisciplinary notion that incorporates information technology, artificial intelligence, musicology, and linguistics, as well as psychology and neuroscience. A body of recent studies suggests that SL can be reflected in neurophysiological responses based on the framework of information theory. This paper reviews a range of work on SL in adults and children that suggests overlapping and independent neural correlates in music and language, and that indicates how SL can be impaired. Furthermore, this article discusses the relationships between the order of transitional probabilities (TPs) (i.e., the hierarchy of local statistics) and entropy (i.e., global statistics) with regard to SL strategies in human brains; argues for the importance of information-theoretical approaches to understanding domain-general, higher-order, and global SL covering both real-world music and language; and proposes promising approaches for the application of therapy and pedagogy from various perspectives of psychology, neuroscience, computational studies, musicology, and linguistics.

  12. Manifold parametrization of the left ventricle for a statistical modelling of its complete anatomy

    NASA Astrophysics Data System (ADS)

    Gil, D.; Garcia-Barnes, J.; Hernández-Sabate, A.; Marti, E.

    2010-03-01

    Distortion of the Left Ventricle (LV) external anatomy is related to some dysfunctions, such as hypertrophy. The architecture of myocardial fibers determines LV electromechanical activation patterns as well as mechanics. Thus, their joint modelling would allow the design of specific interventions (such as pacemaker implantation and LV remodelling) and therapies (such as resynchronization). On one hand, accurate modelling of external anatomy requires either a dense sampling or a continuous infinite-dimensional approach, which requires non-Euclidean statistics. On the other hand, computation of fiber models requires statistics on Riemannian spaces. Most approaches compute separate statistical models for external anatomy and fiber architecture. In this work we propose a general mathematical framework based on differential geometry concepts for computing a statistical model including both external and fiber anatomy. Our framework provides a continuous approach to external anatomy supporting standard statistics. We also provide a straightforward formula for the computation of the Riemannian fiber statistics. We have applied our methodology to the computation of a complete anatomical atlas of canine hearts from diffusion tensor studies. The orientation of fibers over the average external geometry agrees with the segmental description of orientations reported in the literature.

  13. The boundary is mixed

    NASA Astrophysics Data System (ADS)

    Bianchi, Eugenio; Haggard, Hal M.; Rovelli, Carlo

    2017-08-01

    We show that in Oeckl's boundary formalism the boundary vectors that do not have a tensor form represent, in a precise sense, statistical states. Therefore the formalism incorporates quantum statistical mechanics naturally. We formulate general-covariant quantum statistical mechanics in this language. We illustrate the formalism by showing how it accounts for the Unruh effect. We observe that the distinction between pure and mixed states weakens in the general covariant context, suggesting that local gravitational processes are naturally statistical without a sharp quantal versus probabilistic distinction.

  14. Learning Predictive Statistics: Strategies and Brain Mechanisms.

    PubMed

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-08-30

    When immersed in a new environment, we are challenged to decipher initially incomprehensible streams of sensory information. However, quite rapidly, the brain finds structure and meaning in these incoming signals, helping us to predict and prepare ourselves for future actions. This skill relies on extracting the statistics of event streams in the environment that contain regularities of variable complexity from simple repetitive patterns to complex probabilistic combinations. Here, we test the brain mechanisms that mediate our ability to adapt to the environment's statistics and predict upcoming events. By combining behavioral training and multisession fMRI in human participants (male and female), we track the corticostriatal mechanisms that mediate learning of temporal sequences as they change in structure complexity. We show that learning of predictive structures relates to individual decision strategy; that is, selecting the most probable outcome in a given context (maximizing) versus matching the exact sequence statistics. These strategies engage distinct human brain regions: maximizing engages dorsolateral prefrontal, cingulate, sensory-motor regions, and basal ganglia (dorsal caudate, putamen), whereas matching engages occipitotemporal regions (including the hippocampus) and basal ganglia (ventral caudate). Our findings provide evidence for distinct corticostriatal mechanisms that facilitate our ability to extract behaviorally relevant statistics to make predictions. SIGNIFICANCE STATEMENT Making predictions about future events relies on interpreting streams of information that may initially appear incomprehensible. Past work has studied how humans identify repetitive patterns and associative pairings. However, the natural environment contains regularities that vary in complexity from simple repetition to complex probabilistic combinations. Here, we combine behavior and multisession fMRI to track the brain mechanisms that mediate our ability to adapt to changes in the environment's statistics. We provide evidence for an alternate route for learning complex temporal statistics: extracting the most probable outcome in a given context is implemented by interactions between executive and motor corticostriatal mechanisms compared with visual corticostriatal circuits (including hippocampal cortex) that support learning of the exact temporal statistics. Copyright © 2017 Wang et al.

  15. Fuzzy model-based fault detection and diagnosis for a pilot heat exchanger

    NASA Astrophysics Data System (ADS)

    Habbi, Hacene; Kidouche, Madjid; Kinnaert, Michel; Zelmat, Mimoun

    2011-04-01

    This article addresses the design and real-time implementation of a fuzzy model-based fault detection and diagnosis (FDD) system for a pilot co-current heat exchanger. The design method is based on a three-step procedure which involves the identification of data-driven fuzzy rule-based models, the design of a fuzzy residual generator and the evaluation of the residuals for fault diagnosis using statistical tests. The fuzzy FDD mechanism has been implemented and validated on the real co-current heat exchanger, and has been proven to be efficient in detecting and isolating process, sensor and actuator faults.

  16. Dynamically biased statistical model for the ortho/para conversion in the H2 + H3+ → H3+ + H2 reaction.

    PubMed

    Gómez-Carrasco, Susana; González-Sánchez, Lola; Aguado, Alfredo; Sanz-Sanz, Cristina; Zanchet, Alexandre; Roncero, Octavio

    2012-09-07

    In this work we present a dynamically biased statistical model to describe the evolution of the title reaction from a statistical to a more direct mechanism, using quasi-classical trajectories (QCT). The method is based on the one previously proposed by Park and Light [J. Chem. Phys. 126, 044305 (2007)]. A recent global potential energy surface is used here to calculate the capture probabilities, instead of the long-range ion-induced dipole interactions. The dynamical constraints are introduced by considering a scrambling matrix which depends on energy and determines the probability of the identity/hop/exchange mechanisms. These probabilities are calculated using QCT. It is found that the high zero-point energy of the fragments is transferred to the rest of the degrees of freedom, which shortens the lifetime of H5+ complexes and, as a consequence, the exchange mechanism occurs in lower proportion. The zero-point energy (ZPE) is not properly described in quasi-classical trajectory calculations, and an approximation is made in which the initial ZPE of the reactants is reduced in the QCT calculations to obtain a new ZPE-biased scrambling matrix. This reduction of the ZPE is explained by the need to correct the purely classical level number of the H5+ complex, as done in classical simulations of unimolecular processes and to obtain equivalent quantum and classical rate constants using Rice-Ramsperger-Kassel-Marcus theory. This matrix allows one to obtain a ratio of hop/exchange mechanisms, α(T), in rather good agreement with recent experimental results by Crabtree et al. [J. Chem. Phys. 134, 194311 (2011)] at room temperature. At lower temperatures, however, the present simulations predict ratios that are too high because the biased scrambling matrix is not statistical enough. This demonstrates the importance of applying quantum methods to simulate this reaction at the low temperatures of astrophysical interest.

  17. Dynamically biased statistical model for the ortho/para conversion in the H2 + H3+ → H3+ + H2 reaction

    NASA Astrophysics Data System (ADS)

    Gómez-Carrasco, Susana; González-Sánchez, Lola; Aguado, Alfredo; Sanz-Sanz, Cristina; Zanchet, Alexandre; Roncero, Octavio

    2012-09-01

    In this work we present a dynamically biased statistical model to describe the evolution of the title reaction from a statistical to a more direct mechanism, using quasi-classical trajectories (QCT). The method is based on the one previously proposed by Park and Light [J. Chem. Phys. 126, 044305 (2007), 10.1063/1.2430711]. A recent global potential energy surface is used here to calculate the capture probabilities, instead of the long-range ion-induced dipole interactions. The dynamical constraints are introduced by considering a scrambling matrix which depends on energy and determines the probability of the identity/hop/exchange mechanisms. These probabilities are calculated using QCT. It is found that the high zero-point energy of the fragments is transferred to the rest of the degrees of freedom, which shortens the lifetime of H5+ complexes and, as a consequence, the exchange mechanism occurs in lower proportion. The zero-point energy (ZPE) is not properly described in quasi-classical trajectory calculations, and an approximation is made in which the initial ZPE of the reactants is reduced in the QCT calculations to obtain a new ZPE-biased scrambling matrix. This reduction of the ZPE is explained by the need to correct the purely classical level number of the H5+ complex, as done in classical simulations of unimolecular processes and to obtain equivalent quantum and classical rate constants using Rice-Ramsperger-Kassel-Marcus theory. This matrix allows one to obtain a ratio of hop/exchange mechanisms, α(T), in rather good agreement with recent experimental results by Crabtree et al. [J. Chem. Phys. 134, 194311 (2011), 10.1063/1.3587246] at room temperature. At lower temperatures, however, the present simulations predict ratios that are too high because the biased scrambling matrix is not statistical enough. This demonstrates the importance of applying quantum methods to simulate this reaction at the low temperatures of astrophysical interest.

  18. Optimization of space system development resources

    NASA Astrophysics Data System (ADS)

    Kosmann, William J.; Sarkani, Shahram; Mazzuchi, Thomas

    2013-06-01

    NASA has had a decades-long problem with cost growth during the development of space science missions. Numerous agency-sponsored studies have produced average mission-level cost growths ranging from 23% to 77%. A new study of 26 historical NASA science instrument set developments that used expert judgment to reallocate key development resources shows an average cost growth of 73.77%. Twice in history, a barter-based mechanism has been used to reallocate key development resources during instrument development; the mean instrument set development cost growth was -1.55%. A bivariate inference on the means of these two distributions provides statistical evidence to support the claim that using a barter-based mechanism to reallocate key instrument development resources results in a lower expected cost growth than the expert judgment approach. Agent-based discrete event simulation is the natural way to model a trade environment, so a NetLogo agent-based barter-based simulation of science instrument development was created. The agent-based model was validated against the Cassini historical example, as the starting and ending instrument development conditions are available. The resulting validated agent-based barter-based science instrument resource reallocation simulation was used to perform 300 instrument development simulations, using barter to reallocate development resources; the mean cost growth was -3.365%. A bivariate inference on the means was again performed, providing additional significant statistical evidence that barter-based resource reallocation results in lower expected cost growth than the historical expert judgment approach. Barter-based key development resource reallocation should work on spacecraft development as well as it has worked on instrument development. A new study of 28 historical NASA science spacecraft developments shows an average cost growth of 46.04%. As barter-based key development resource reallocation has never been tried in a spacecraft development, no historical results exist, and a simulation of the approach must be developed. The instrument development simulation should be modified to account for differences among spacecraft development market participants. The resulting agent-based barter-based spacecraft resource reallocation simulation would then be used to determine whether significant statistical evidence exists for the claim that barter-based resource reallocation results in lower expected cost growth.

  19. Autonomous Congestion Control in Delay-Tolerant Networks

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott; Jennings, Esther; Schoolcraft, Joshua

    2006-01-01

    Congestion control is an important feature that directly affects network performance. Network congestion may cause loss of data or long delays. Although this problem has been studied extensively in the Internet, the solutions for Internet congestion control do not apply readily to challenged network environments such as Delay Tolerant Networks (DTN), where end-to-end connectivity may not exist continuously and latency can be high. In DTN, end-to-end rate control is not feasible. This calls for congestion control mechanisms in which decisions can be made autonomously with local information only. We use an economic pricing model and propose a rule-based congestion control mechanism in which each router can autonomously decide whether to accept a bundle (data) based on local information such as available storage and the value and risk of accepting the bundle (derived from historical statistics). Preliminary experimental results show that this congestion control mechanism can protect routers from resource depletion without loss of data.
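
    A toy rendering of such a rule: a router accepts a bundle only when its expected value outweighs a storage-based congestion risk. The value/risk formulas and thresholds below are illustrative assumptions, not the mechanism from the paper.

    ```python
    # Autonomous accept/reject decision from purely local state.
    from dataclasses import dataclass

    @dataclass
    class Router:
        capacity: int   # bytes of bundle storage
        used: int = 0

        def accept(self, size: int, value: float, delivery_rate: float) -> bool:
            """Accept if the expected value exceeds the risk of storing the bundle."""
            occupancy = (self.used + size) / self.capacity
            if occupancy > 1.0:
                return False          # no room at all
            risk = occupancy ** 2     # risk grows steeply as storage fills (assumed)
            if value * delivery_rate > risk:
                self.used += size
                return True
            return False

    r = Router(capacity=1_000_000)
    print(r.accept(size=200_000, value=0.9, delivery_rate=0.8))  # True: mostly empty
    r.used = 900_000
    print(r.accept(size=90_000, value=0.3, delivery_rate=0.5))   # False: nearly full
    ```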

  20. Theory of the Decoherence Effect in Finite and Infinite Open Quantum Systems Using the Algebraic Approach

    NASA Astrophysics Data System (ADS)

    Blanchard, Philippe; Hellmich, Mario; Ługiewicz, Piotr; Olkiewicz, Robert

    Quantum mechanics is the greatest revision of our conception of the character of the physical world since Newton. Consequently, David Hilbert was very interested in quantum mechanics. He and John von Neumann discussed it frequently during von Neumann's residence in Göttingen. In 1932 von Neumann published his book Mathematical Foundations of Quantum Mechanics, which in Hilbert's opinion was the first exposition of quantum mechanics in a mathematically rigorous way. The pioneers of quantum mechanics, Heisenberg and Dirac, had neither use for rigorous mathematics nor much interest in it. Conceptually, quantum theory as developed by Bohr and Heisenberg is based on the positivism of Mach, as it describes only observable quantities. It first emerged as a result of experimental data in the form of statistical observations of quantum noise, the basic concept of quantum probability.

  1. The Einstein equivalence principle, intrinsic spin and the invariance of constitutive equations in continuum mechanics

    NASA Technical Reports Server (NTRS)

    Speziale, Charles G.

    1988-01-01

    The invariance of constitutive equations in continuum mechanics is examined from a basic theoretical standpoint. It is demonstrated that constitutive equations which are not form-invariant under arbitrary translational accelerations of the reference frame are in violation of the Einstein equivalence principle. Furthermore, by making use of an analysis based on statistical mechanics, it is argued that any frame-dependent terms in constitutive equations must arise from the intrinsic spin tensor and are negligible provided that the ratio of microscopic to macroscopic time scales is extremely small. The consistency of these results with existing constitutive theories is discussed in detail along with possible avenues of future research.

  2. Statistical methods and neural network approaches for classification of data from multiple sources

    NASA Technical Reports Server (NTRS)

    Benediktsson, Jon Atli; Swain, Philip H.

    1990-01-01

    Statistical methods for the classification of data from multiple data sources are investigated and compared to neural network models. A general problem with conventional multivariate statistical approaches to classifying data of multiple types is that a multivariate distribution cannot be assumed for the classes in the data sources. Another common problem with statistical classification methods is that the data sources are not equally reliable. This means that the data sources need to be weighted according to their reliability, but most statistical classification methods have no mechanism for this. This research focuses first on statistical methods which can overcome these problems: a method of statistical multisource analysis and consensus theory. Reliability measures for weighting the data sources in these methods are suggested and investigated. Second, this research focuses on neural network models. The neural networks are distribution-free, since no prior knowledge of the statistical distribution of the data is needed; this is an obvious advantage over most statistical classification methods. The neural networks also automatically take care of how much weight each data source should have. On the other hand, their training process is iterative and can take a very long time. Methods to speed up the training procedure are introduced and investigated. Experimental results of classification using both neural network models and statistical methods are given, and the approaches are compared based on these results.
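
    A minimal sketch of the consensus-theoretic idea: per-source class probabilities are fused with reliability weights (a linear opinion pool). The two sources, two classes, and all numbers are invented for illustration.

    ```python
    # Reliability-weighted fusion of class probabilities from two data sources.
    import numpy as np

    p_source1 = np.array([0.70, 0.30])      # e.g., multispectral classifier output
    p_source2 = np.array([0.40, 0.60])      # e.g., radar classifier output
    reliability = np.array([0.8, 0.2])      # source weights, summing to 1 (assumed)

    consensus = reliability[0] * p_source1 + reliability[1] * p_source2
    print(consensus, "-> class", consensus.argmax())
    ```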

  3. Statistical mechanics in the context of special relativity.

    PubMed

    Kaniadakis, G

    2002-11-01

    In Ref. [Physica A 296, 405 (2001)], starting from the one-parameter deformation of the exponential function $\exp_\kappa(x)=(\sqrt{1+\kappa^2 x^2}+\kappa x)^{1/\kappa}$, a statistical mechanics has been constructed which reduces to the ordinary Boltzmann-Gibbs statistical mechanics as the deformation parameter $\kappa$ approaches zero. The distribution $f=\exp_\kappa(-\beta E+\beta\mu)$ obtained within this statistical mechanics shows a power-law tail and depends on the unspecified parameter $\beta$, containing all the information about the temperature of the system. On the other hand, the entropic form $S_\kappa=\int d^3p\,(c_\kappa f^{1+\kappa}+c_{-\kappa} f^{1-\kappa})$, which after maximization produces the distribution $f$ and reduces to the standard Boltzmann-Shannon entropy $S_0$ as $\kappa\to 0$, contains the coefficient $c_\kappa$ whose expression involves, besides the Boltzmann constant, another unspecified parameter $\alpha$. In the present effort we show that $S_\kappa$ is the unique existing entropy obtained by a continuous deformation of $S_0$ that preserves unaltered its fundamental properties of concavity, additivity, and extensivity. These properties of $S_\kappa$ permit the unequivocal determination of the values of the above-mentioned parameters $\beta$ and $\alpha$. Subsequently, we explain the origin of the deformation mechanism introduced by $\kappa$ and show that this deformation emerges naturally within Einstein's special relativity. Furthermore, we extend the theory in order to treat statistical systems in a time-dependent and relativistic context. We then show that it is possible to determine, in a self-consistent scheme within special relativity, the value of the free parameter $\kappa$, which turns out to depend on the speed of light $c$ and which reduces to zero as $c\to\infty$, recovering in this way the ordinary statistical mechanics and thermodynamics. The statistical mechanics presented here contains no free parameters, preserves unaltered the mathematical and epistemological structure of ordinary statistical mechanics, and is suitable for describing a very large class of experimentally observed phenomena in low- and high-energy physics and in the natural, economic, and social sciences. Finally, in order to test the correctness and predictability of the theory, as a working example we consider the cosmic ray spectrum, which spans 13 decades in energy and 33 decades in flux, finding high-quality agreement between our predictions and the observed data.
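
    A quick numerical check of the deformed exponential defined above: it should reduce to the ordinary exponential as κ → 0, while exhibiting the power-law behavior responsible for the distribution's heavy tail at finite κ.

    ```python
    # The kappa-exponential and its kappa -> 0 limit.
    import numpy as np

    def exp_kappa(x, kappa):
        if kappa == 0.0:
            return np.exp(x)
        return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

    for kappa in (0.5, 0.1, 0.01, 0.0):
        print(kappa, exp_kappa(2.0, kappa))   # converges to exp(2) ≈ 7.389
    ```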

  4. Sharpening method of satellite thermal image based on the geographical statistical model

    NASA Astrophysics Data System (ADS)

    Qi, Pengcheng; Hu, Shixiong; Zhang, Haijun; Guo, Guangmeng

    2016-04-01

    To improve the effectiveness of thermal sharpening in mountainous regions while paying closer attention to the laws of land-surface energy balance, a thermal sharpening method based on a geographical statistical model (GSM) is proposed. Explanatory variables were selected from the processes of the land-surface energy budget and thermal infrared electromagnetic radiation transmission; high-spatial-resolution (57 m) raster layers were then generated for these variables by spatial simulation or by using other raster data as proxies. Based on this, the locally adaptive statistical relationship between brightness temperature (BT) and the explanatory variables, i.e., the GSM, was built at 1026-m resolution using the method of multivariate adaptive regression splines. Finally, the GSM was applied to the high-resolution (57-m) explanatory variables, yielding the high-resolution (57-m) BT image. This method produced a sharpening result with low error and good visual quality. The method avoids a blind choice of explanatory variables and removes the dependence on synchronous imagery at visible and near-infrared bands. The influences of the explanatory variable combination, the sampling method, and the residual error correction on the sharpening results were analyzed deliberately, and their influence mechanisms are reported herein.
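
    A schematic of the two-resolution workflow described above: fit a statistical model of BT against explanatory variables at coarse resolution, then apply it to the fine-resolution layers. A random forest stands in for the multivariate adaptive regression splines used in the paper, and all arrays are synthetic placeholders.

    ```python
    # Coarse-to-fine statistical downscaling of brightness temperature.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(7)
    X_coarse = rng.random((2000, 3))   # e.g., vegetation index, elevation, solar flux
    bt_coarse = (290 + 15 * X_coarse[:, 0] - 8 * X_coarse[:, 1]
                 + rng.normal(0, 0.5, 2000))   # synthetic 1026-m BT

    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X_coarse, bt_coarse)

    X_fine = rng.random((50000, 3))    # the same variables at 57-m resolution
    bt_fine = model.predict(X_fine)    # sharpened BT; residual correction would follow
    print(bt_fine[:5])
    ```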

  5. Modeling the Development of Audiovisual Cue Integration in Speech Perception

    PubMed Central

    Getz, Laura M.; Nordeen, Elke R.; Vrabic, Sarah C.; Toscano, Joseph C.

    2017-01-01

    Adult speech perception is generally enhanced when information is provided from multiple modalities. In contrast, infants do not appear to benefit from combining auditory and visual speech information early in development. This is true despite the fact that both modalities are important to speech comprehension even at early stages of language acquisition. How then do listeners learn how to process auditory and visual information as part of a unified signal? In the auditory domain, statistical learning processes provide an excellent mechanism for acquiring phonological categories. Is this also true for the more complex problem of acquiring audiovisual correspondences, which require the learner to integrate information from multiple modalities? In this paper, we present simulations using Gaussian mixture models (GMMs) that learn cue weights and combine cues on the basis of their distributional statistics. First, we simulate the developmental process of acquiring phonological categories from auditory and visual cues, asking whether simple statistical learning approaches are sufficient for learning multi-modal representations. Second, we use this time course information to explain audiovisual speech perception in adult perceivers, including cases where auditory and visual input are mismatched. Overall, we find that domain-general statistical learning techniques allow us to model the developmental trajectory of audiovisual cue integration in speech, and in turn, allow us to better understand the mechanisms that give rise to unified percepts based on multiple cues. PMID:28335558
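
    A minimal sketch of the GMM approach: learn two categories from bivariate (auditory, visual) cue distributions and query a mismatched token, which yields a graded, cue-weighted percept. The cue distributions here are synthetic stand-ins for the paper's training data.

    ```python
    # Two-category GMM over auditory and visual cues.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(4)
    cat_a = rng.normal([0.0, 0.0], [1.0, 0.5], (500, 2))   # e.g., low VOT, closed lips
    cat_b = rng.normal([3.0, 2.0], [1.0, 0.5], (500, 2))   # high VOT, open lips
    tokens = np.vstack([cat_a, cat_b])

    gmm = GaussianMixture(n_components=2, random_state=0).fit(tokens)
    # A mismatched token: the auditory cue favors A, the visual cue favors B.
    print(gmm.predict_proba([[0.0, 2.0]]))
    ```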

  6. Modeling the Development of Audiovisual Cue Integration in Speech Perception.

    PubMed

    Getz, Laura M; Nordeen, Elke R; Vrabic, Sarah C; Toscano, Joseph C

    2017-03-21

    Adult speech perception is generally enhanced when information is provided from multiple modalities. In contrast, infants do not appear to benefit from combining auditory and visual speech information early in development. This is true despite the fact that both modalities are important to speech comprehension even at early stages of language acquisition. How then do listeners learn how to process auditory and visual information as part of a unified signal? In the auditory domain, statistical learning processes provide an excellent mechanism for acquiring phonological categories. Is this also true for the more complex problem of acquiring audiovisual correspondences, which require the learner to integrate information from multiple modalities? In this paper, we present simulations using Gaussian mixture models (GMMs) that learn cue weights and combine cues on the basis of their distributional statistics. First, we simulate the developmental process of acquiring phonological categories from auditory and visual cues, asking whether simple statistical learning approaches are sufficient for learning multi-modal representations. Second, we use this time course information to explain audiovisual speech perception in adult perceivers, including cases where auditory and visual input are mismatched. Overall, we find that domain-general statistical learning techniques allow us to model the developmental trajectory of audiovisual cue integration in speech, and in turn, allow us to better understand the mechanisms that give rise to unified percepts based on multiple cues.

  7. Infant Statistical Learning

    PubMed Central

    Saffran, Jenny R.; Kirkham, Natasha Z.

    2017-01-01

    Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812

  8. Statistical inference approach to structural reconstruction of complex networks from binary time series

    NASA Astrophysics Data System (ADS)

    Ma, Chuang; Chen, Han-Shuang; Lai, Ying-Cheng; Zhang, Hai-Feng

    2018-02-01

    Complex networks hosting binary-state dynamics arise in a variety of contexts. In spite of previous works, to fully reconstruct the network structure from observed binary data remains challenging. We articulate a statistical inference based approach to this problem. In particular, exploiting the expectation-maximization (EM) algorithm, we develop a method to ascertain the neighbors of any node in the network based solely on binary data, thereby recovering the full topology of the network. A key ingredient of our method is the maximum-likelihood estimation of the probabilities associated with actual or nonexistent links, and we show that the EM algorithm can distinguish the two kinds of probability values without any ambiguity, insofar as the length of the available binary time series is reasonably long. Our method does not require any a priori knowledge of the detailed dynamical processes, is parameter-free, and is capable of accurate reconstruction even in the presence of noise. We demonstrate the method using combinations of distinct types of binary dynamical processes and network topologies, and provide a physical understanding of the underlying reconstruction mechanism. Our statistical inference based reconstruction method contributes an additional piece to the rapidly expanding "toolbox" of data based reverse engineering of complex networked systems.
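
    A toy illustration of the separation step at the heart of the method: estimated link probabilities cluster into two groups (actual versus nonexistent links), and a two-component EM tells them apart without a hand-set threshold. The "estimates" below are synthetic, and the 1-D Gaussian mixture is a simplification of the paper's full EM over binary time series.

    ```python
    # EM on a two-component 1-D Gaussian mixture separating link probabilities.
    import numpy as np

    rng = np.random.default_rng(5)
    est = np.concatenate([rng.normal(0.05, 0.02, 40),   # nonexistent links
                          rng.normal(0.45, 0.05, 10)])  # actual links

    mu, var, w = np.array([0.1, 0.3]), np.array([0.01, 0.01]), np.array([0.5, 0.5])
    for _ in range(100):
        pdf = np.exp(-(est[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        resp = w * pdf
        resp /= resp.sum(1, keepdims=True)                # E-step: responsibilities
        nk = resp.sum(0)
        mu = (resp * est[:, None]).sum(0) / nk            # M-step: means
        var = np.maximum((resp * (est[:, None] - mu) ** 2).sum(0) / nk, 1e-6)
        w = nk / est.size

    links = resp[:, mu.argmax()] > 0.5
    print(mu, "->", links.sum(), "inferred links")        # ~10 actual links recovered
    ```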

  9. Statistical inference approach to structural reconstruction of complex networks from binary time series.

    PubMed

    Ma, Chuang; Chen, Han-Shuang; Lai, Ying-Cheng; Zhang, Hai-Feng

    2018-02-01

    Complex networks hosting binary-state dynamics arise in a variety of contexts. In spite of previous works, to fully reconstruct the network structure from observed binary data remains challenging. We articulate a statistical inference based approach to this problem. In particular, exploiting the expectation-maximization (EM) algorithm, we develop a method to ascertain the neighbors of any node in the network based solely on binary data, thereby recovering the full topology of the network. A key ingredient of our method is the maximum-likelihood estimation of the probabilities associated with actual or nonexistent links, and we show that the EM algorithm can distinguish the two kinds of probability values without any ambiguity, insofar as the length of the available binary time series is reasonably long. Our method does not require any a priori knowledge of the detailed dynamical processes, is parameter-free, and is capable of accurate reconstruction even in the presence of noise. We demonstrate the method using combinations of distinct types of binary dynamical processes and network topologies, and provide a physical understanding of the underlying reconstruction mechanism. Our statistical inference based reconstruction method contributes an additional piece to the rapidly expanding "toolbox" of data based reverse engineering of complex networked systems.

  10. Genetic algorithm dynamics on a rugged landscape

    NASA Astrophysics Data System (ADS)

    Bornholdt, Stefan

    1998-04-01

    The genetic algorithm is an optimization procedure motivated by biological evolution and is successfully applied to optimization problems in different areas. A statistical mechanics model for its dynamics is proposed based on the parent-child fitness correlation of the genetic operators, making it applicable to general fitness landscapes. It is compared to a recent model based on a maximum entropy ansatz. Finally it is applied to modeling the dynamics of a genetic algorithm on the rugged fitness landscape of the NK model.

  11. Evolution of the use of noninvasive mechanical ventilation in chronic obstructive pulmonary disease in a Spanish region, 1997-2010.

    PubMed

    Carpe-Carpe, Bienvenida; Hernando-Arizaleta, Lauro; Ibáñez-Pérez, M Carmen; Palomar-Rodríguez, Joaquín A; Esquinas-Rodríguez, Antonio M

    2013-08-01

    Noninvasive mechanical ventilation (NIV) appeared in the 1980s as an alternative to invasive mechanical ventilation (IMV) in patients with acute respiratory failure. We evaluated the introduction of NIV and the results in patients with acute exacerbation of chronic obstructive pulmonary disease in the Region of Murcia (Spain). A retrospective observational study was conducted based on the minimum basic hospital discharge data of all patients hospitalised for this pathology in all public hospitals in the region between 1997 and 2010. We performed a time trend analysis of hospital attendance, the use of each ventilatory intervention, and hospital mortality through joinpoint regression. We identified 30,027 hospital discharges. Joinpoint analysis: downward trend in attendance (annual percentage change [APC]=-3.4, 95% CI: -4.8; -2.0, P<.05) and in the group without ventilatory intervention (APC=-4.2, -5.6; -2.8, P<.05); upward trend in the use of NIV (APC=16.4, 12.0; 20.9, P<.05), and a downward trend that was not statistically significant in IMV (APC=-4.5, -10.3; 1.7). We observed an upward trend without statistical significance in overall mortality (APC=0.5, -1.3; 2.4) and in the group without intervention (APC=0.1, -1.6; 1.9); a statistically significant downward trend in the NIV group (APC=-7.1, -11.7; -2.2, P<.05) and a non-significant downward trend in the IMV group (APC=-0.8, -6.1; 4.8). The mean stay did not change substantially. The introduction of NIV has reduced the group of patients not receiving assisted ventilation. No improvement in results was found in terms of mortality or length of stay. Copyright © 2012 SEPAR. Published by Elsevier España. All rights reserved.

  12. Exploring the Connection Between Sampling Problems in Bayesian Inference and Statistical Mechanics

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew

    2006-01-01

    The Bayesian and statistical mechanical communities often share the same objective in their work - estimating and integrating probability distribution functions (pdfs) describing stochastic systems, models or processes. Frequently, these pdfs are complex functions of random variables exhibiting multiple, well separated local minima. Conventional strategies for sampling such pdfs are inefficient, sometimes leading to an apparent non-ergodic behavior. Several recently developed techniques for handling this problem have been successfully applied in statistical mechanics. In the multicanonical and Wang-Landau Monte Carlo (MC) methods, the correct pdfs are recovered from uniform sampling of the parameter space by iteratively establishing proper weighting factors connecting these distributions. Trivial generalizations allow for sampling from any chosen pdf. The closely related transition matrix method relies on estimating transition probabilities between different states. All these methods have proved to generate estimates of pdfs with high statistical accuracy. In another MC technique, parallel tempering, several random walks, each corresponding to a different value of a parameter (e.g. "temperature"), are generated and occasionally exchanged using the Metropolis criterion. This method can be considered a statistically correct version of simulated annealing. An alternative approach is to represent the set of independent variables as a Hamiltonian system. Considerable progress has been made in understanding how to ensure that the system obeys the equipartition theorem or, equivalently, that coupling between the variables is correctly described. Then a host of techniques developed for dynamical systems can be used. Among them, probably the most powerful is the Adaptive Biasing Force method, in which thermodynamic integration and biased sampling are combined to yield very efficient estimates of pdfs. The third class of methods deals with transitions between states described by rate constants. These problems are isomorphic with chemical kinetics problems. Recently, several efficient techniques for this purpose have been developed based on the approach originally proposed by Gillespie. Although the utility of the techniques mentioned above for Bayesian problems has not been determined, further research along these lines is warranted.
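
    A minimal parallel-tempering sketch in the spirit of the method described above: replicas at several temperatures run Metropolis updates on a bimodal target, and occasional neighbor swaps let the cold chain cross between modes. The target density and temperature ladder are toy choices.

    ```python
    # Replica-exchange (parallel tempering) Metropolis sampling of a bimodal pdf.
    import numpy as np

    rng = np.random.default_rng(6)
    def log_p(x):
        return np.logaddexp(-0.5 * (x - 4) ** 2, -0.5 * (x + 4) ** 2)

    temps = np.array([1.0, 2.0, 4.0, 8.0])
    x = np.zeros(len(temps))
    samples = []
    for _ in range(20000):
        for i, T in enumerate(temps):          # Metropolis move within each replica
            prop = x[i] + rng.normal(0, 1.0)
            if np.log(rng.random()) < (log_p(prop) - log_p(x[i])) / T:
                x[i] = prop
        i = rng.integers(len(temps) - 1)       # attempt one neighbor swap
        delta = (1 / temps[i] - 1 / temps[i + 1]) * (log_p(x[i + 1]) - log_p(x[i]))
        if np.log(rng.random()) < delta:
            x[i], x[i + 1] = x[i + 1], x[i]
        samples.append(x[0])

    print(np.mean(np.array(samples[5000:]) > 0))   # ≈ 0.5: cold chain visits both modes
    ```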

  13. A quantum framework for likelihood ratios

    NASA Astrophysics Data System (ADS)

    Bond, Rachael L.; He, Yang-Hui; Ormerod, Thomas C.

    The ability to calculate precise likelihood ratios is fundamental to science, from Quantum Information Theory through to Quantum State Estimation. However, there is no assumption-free statistical methodology to achieve this. For instance, in the absence of data relating to covariate overlap, the widely used Bayes’ theorem either defaults to the marginal-probability-driven “naive Bayes” classifier, or requires the use of compensatory expectation-maximization techniques. This paper takes an information-theoretic approach in developing a new statistical formula for the calculation of likelihood ratios based on the principles of quantum entanglement, and demonstrates that Bayes’ theorem is a special case of a more general quantum mechanical expression.

  14. Layered Manufacturing of Dental Ceramics: Fracture Mechanics, Microstructure, and Elemental Composition of Lithography-Sintered Ceramic.

    PubMed

    Uçar, Yurdanur; Aysan Meriç, İpek; Ekren, Orhun

    2018-02-11

    To compare the fracture mechanics, microstructure, and elemental composition of lithography-based ceramic manufacturing with pressing and CAD/CAM. Disc-shaped specimens (16 mm diameter, 1.2 mm thick) were used for mechanical testing (n = 10/group). Biaxial flexural strength of three groups (In-Ceram alumina [ICA], lithography-based alumina, ZirkonZahn) was determined using the "piston on 3-ball" technique as suggested in test standard ISO-6872. The Vickers hardness test was performed. Fracture toughness was calculated using fractography. Results were statistically analyzed using the Kruskal-Wallis test followed by Dunnett T3 (α = 0.05). Weibull analysis was conducted. Polished and fracture surface characterization was made using a scanning electron microscope (SEM). Energy dispersive spectroscopy (EDS) was used for elemental analysis. Biaxial flexural strengths of ICA, LCM alumina (LCMA), and ZirkonZahn were 147 ± 43 MPa, 490 ± 44 MPa, and 709 ± 94 MPa, respectively, and were statistically different (P ≤ 0.05). The Vickers hardness number of ICA was 850 ± 41, whereas hardness values for LCMA and ZirkonZahn were 1581 ± 144 and 1249 ± 57, respectively, and were statistically different (P ≤ 0.05). A statistically significant difference was found between the fracture toughness of ICA (2 ± 0.4 MPa·m^1/2), LCMA (6.5 ± 1.5 MPa·m^1/2), and ZirkonZahn (7.7 ± 1 MPa·m^1/2) (P ≤ 0.05). The Weibull modulus was highest for LCMA (m = 11.43), followed by ZirkonZahn (m = 8.16) and ICA (m = 5.21). Unlike the LCMA and ZirkonZahn groups, a homogeneous microstructure was not observed for ICA. EDS results supported the SEM images. Within the limitations of this in vitro study, it can be concluded that LCM seems to be a promising technique for final ceramic object manufacturing in dental applications. Both the manufacturing method and the material used should be improved. © 2018 by the American College of Prosthodontists.
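
    A Weibull analysis of the kind reported above can be reproduced from raw strength data in a few lines. A minimal sketch, assuming the common median-rank linearization; the strength values are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical biaxial flexural strengths (MPa) for one group of n = 10 discs.
strengths = np.sort(np.array([420., 445., 460., 470., 485.,
                              495., 505., 520., 540., 560.]))
n = len(strengths)

# Median-rank failure probabilities and the linearized Weibull form:
#   ln(-ln(1 - F)) = m*ln(sigma) - m*ln(sigma_0)
F = (np.arange(1, n + 1) - 0.5) / n
y = np.log(-np.log(1.0 - F))
lx = np.log(strengths)

m, c = np.polyfit(lx, y, 1)     # slope m is the Weibull modulus
sigma0 = np.exp(-c / m)         # characteristic strength (63.2% failure)
print(f"m = {m:.2f}, sigma_0 = {sigma0:.0f} MPa")
```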

  15. Stochastic mechanics of reciprocal diffusions

    NASA Astrophysics Data System (ADS)

    Levy, Bernard C.; Krener, Arthur J.

    1996-02-01

    The dynamics and kinematics of reciprocal diffusions were examined in a previous paper [J. Math. Phys. 34, 1846 (1993)], where it was shown that reciprocal diffusions admit a chain of conservation laws, which close after the first two laws for two disjoint subclasses of reciprocal diffusions, the Markov and quantum diffusions. For the case of quantum diffusions, the conservation laws are equivalent to Schrödinger's equation. The Markov diffusions were employed by Schrödinger [Sitzungsber. Preuss. Akad. Wiss. Phys. Math Kl. 144 (1931); Ann. Inst. H. Poincaré 2, 269 (1932)], Nelson [Dynamical Theories of Brownian Motion (Princeton University, Princeton, NJ, 1967); Quantum Fluctuations (Princeton University, Princeton, NJ, 1985)], and other researchers to develop stochastic formulations of quantum mechanics, called stochastic mechanics. We propose here an alternative version of stochastic mechanics based on quantum diffusions. A procedure is presented for constructing the quantum diffusion associated to a given wave function. It is shown that quantum diffusions satisfy the uncertainty principle, and have a locality property, whereby given two dynamically uncoupled but statistically correlated particles, the marginal statistics of each particle depend only on the local fields to which the particle is subjected. However, like Wigner's joint probability distribution for the position and momentum of a particle, the finite joint probability densities of quantum diffusions may take negative values.

  16. From Mechanical Motion to Brownian Motion, Thermodynamics and Particle Transport Theory

    ERIC Educational Resources Information Center

    Bringuier, E.

    2008-01-01

    The motion of a particle in a medium is dealt with either as a problem of mechanics or as a transport process in non-equilibrium statistical physics. The two kinds of approach are often unrelated as they are taught in different textbooks. The aim of this paper is to highlight the link between the mechanical and statistical treatments of particle…

  17. Effect of Correlated Rotational Noise

    NASA Astrophysics Data System (ADS)

    Hancock, Benjamin; Wagner, Caleb; Baskaran, Aparna

    The traditional model of a self-propelled particle (SPP) is one where the body axis along which the particle travels reorients itself through rotational diffusion. If the reorientation process is instead driven by colored noise, rather than the standard Gaussian white noise, the resulting statistical mechanics cannot be accessed through conventional methods. In this talk we present results comparing three methods of deriving the statistical mechanics of an SPP with a reorientation process driven by colored noise. We illustrate the differences and similarities in the resulting statistical mechanics by their ability to accurately capture the particle's response to external aligning fields.
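
    The colored-noise reorientation process described above can also be simulated directly, which gives a useful baseline for the analytical methods compared in the talk. A minimal sketch, assuming Ornstein-Uhlenbeck (exponentially correlated) angular noise with correlation time tau; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

v0, tau, D = 1.0, 2.0, 0.5        # speed, noise correlation time, noise strength
dt, steps = 0.01, 50_000

theta, eta = 0.0, 0.0
pos = np.zeros((steps, 2))
for t in range(1, steps):
    # Ornstein-Uhlenbeck (colored) noise drives the reorientation angle.
    eta += -(eta / tau) * dt + (np.sqrt(2.0 * D * dt) / tau) * rng.normal()
    theta += eta * dt
    pos[t] = pos[t - 1] + v0 * dt * np.array([np.cos(theta), np.sin(theta)])

# The mean-squared displacement of pos exhibits the ballistic-to-diffusive
# crossover whose details depend on the noise correlations.
```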

  18. A method for predicting DCT-based denoising efficiency for grayscale images corrupted by AWGN and additive spatially correlated noise

    NASA Astrophysics Data System (ADS)

    Rubel, Aleksey S.; Lukin, Vladimir V.; Egiazarian, Karen O.

    2015-03-01

    Results of denoising based on the discrete cosine transform (DCT) for a wide class of images corrupted by additive noise are obtained. Three types of noise are analyzed: additive white Gaussian noise and additive spatially correlated Gaussian noise with middle and high correlation levels. The TID2013 image database and some additional images are taken as test images. A conventional DCT filter and BM3D are used as denoising techniques. Denoising efficiency is described by the PSNR and PSNR-HVS-M metrics. Within the hard-thresholding denoising mechanism, DCT-spectrum coefficient statistics are used to characterize images and, subsequently, the denoising efficiency expected for them. Denoising efficiency results are fitted against these statistics, and efficient approximations are obtained. It is shown that the obtained approximations provide high accuracy of prediction of denoising efficiency.
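
    For reference, the hard-thresholding DCT mechanism that the prediction is built on can be sketched as follows; this is a simplified non-overlapping-block version with an assumed threshold of 2.6*sigma, not the exact filter used in the paper.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_hard_threshold(img, sigma, block=8, beta=2.6):
    """Blockwise DCT denoising: zero small spectrum coefficients, keep DC."""
    out = np.zeros_like(img, dtype=float)
    thr = beta * sigma
    for i in range(0, img.shape[0] - block + 1, block):
        for j in range(0, img.shape[1] - block + 1, block):
            spec = dctn(img[i:i + block, j:j + block], norm='ortho')
            dc = spec[0, 0]
            spec[np.abs(spec) < thr] = 0.0   # hard thresholding of AC terms
            spec[0, 0] = dc                  # preserve the block mean
            out[i:i + block, j:j + block] = idctn(spec, norm='ortho')
    return out
```

    The statistics of the `spec` coefficients collected over many blocks are exactly the kind of image descriptor the paper feeds into its efficiency prediction.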

  19. A detailed heterogeneous agent model for a single asset financial market with trading via an order book.

    PubMed

    Mota Navarro, Roberto; Larralde, Hernán

    2017-01-01

    We present an agent-based model of a single asset financial market that is capable of replicating most of the non-trivial statistical properties observed in real financial markets, generically referred to as stylized facts. In our model agents employ strategies inspired by those used in real markets, and a realistic trade mechanism based on a double-auction order book. We study the role of the distinct types of trader in the return statistics: specifically, correlation properties (or lack thereof), volatility clustering, heavy tails, and the degree to which the distribution can be described by a log-normal. Further, by introducing the practice of "profit taking", our model is also capable of replicating the stylized fact related to an asymmetry in the distribution of losses and gains.
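
    The core of such a model is the matching engine. A minimal price-time-priority sketch of a double-auction book is given below; it is a generic illustration, not the authors' implementation, and all names are ours.

```python
import heapq

bids, asks, trades = [], [], []      # bids keyed on -price, asks on price

def submit(side, price, qty, t):
    """Match an incoming limit order; rest any unfilled remainder in the book."""
    if side == 'buy':
        while qty > 0 and asks and asks[0][0] <= price:
            p, t0, q = heapq.heappop(asks)
            fill = min(qty, q)
            trades.append((t, p, fill))
            qty -= fill
            if q > fill:
                heapq.heappush(asks, (p, t0, q - fill))
        if qty > 0:
            heapq.heappush(bids, (-price, t, qty))
    else:
        while qty > 0 and bids and -bids[0][0] >= price:
            np_, t0, q = heapq.heappop(bids)
            fill = min(qty, q)
            trades.append((t, -np_, fill))
            qty -= fill
            if q > fill:
                heapq.heappush(bids, (np_, t0, q - fill))
        if qty > 0:
            heapq.heappush(asks, (price, t, qty))

# Returns computed from the trade-price series of such a book are where the
# stylized facts (fat tails, volatility clustering) are measured.
```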

  20. A detailed heterogeneous agent model for a single asset financial market with trading via an order book

    PubMed Central

    2017-01-01

    We present an agent-based model of a single asset financial market that is capable of replicating most of the non-trivial statistical properties observed in real financial markets, generically referred to as stylized facts. In our model agents employ strategies inspired by those used in real markets, and a realistic trade mechanism based on a double-auction order book. We study the role of the distinct types of trader in the return statistics: specifically, correlation properties (or lack thereof), volatility clustering, heavy tails, and the degree to which the distribution can be described by a log-normal. Further, by introducing the practice of “profit taking”, our model is also capable of replicating the stylized fact related to an asymmetry in the distribution of losses and gains. PMID:28245251

  1. Parametric study of the dynamic JWL-EOS for detonation products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urtiew, P.A.; Hayes, B.

    1990-03-01

    The JWL equation of state describing the adiabatic expansion of detonation products is revisited to complete the description of the principal eigenvalue, to reset the secondary eigenvalue to produce a well-behaved adiabatic gamma profile, and to normalize the characteristic equation of state in terms of conventional parameters having a clear experimental interpretation. This is accomplished by interjecting a dynamic flow condition concerning the value of the relative specific volume when the particle velocity of the detonation products is zero. In addition, a set of generic parameters based on the statistical distribution of the primary explosives making up the available data base is presented. Unlike theoretical and statistical mechanical models, the adiabatic gamma function for these materials is seen to have a positive initial slope in accord with experimental findings. 10 refs., 4 figs.
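
    For readers who want to reproduce an adiabatic gamma profile of the kind discussed here, the JWL principal isentrope and its logarithmic slope can be evaluated in a few lines. The parameters below are textbook TNT-like values used purely for illustration; the paper's generic parameter set differs.

```python
import numpy as np

# JWL principal isentrope: Ps(v) = A e^{-R1 v} + B e^{-R2 v} + C v^{-(1+w)},
# with v the relative specific volume. Illustrative TNT-like values (GPa).
A, B, C = 371.2, 3.231, 1.045
R1, R2, w = 4.15, 0.95, 0.30

def p_isentrope(v):
    return A * np.exp(-R1 * v) + B * np.exp(-R2 * v) + C * v ** (-(1.0 + w))

def adiabatic_gamma(v, dv=1e-6):
    # gamma(v) = -d ln P / d ln v along the isentrope, by central differences
    dlnp = np.log(p_isentrope(v + dv)) - np.log(p_isentrope(v - dv))
    return -v * dlnp / (2.0 * dv)

v = np.linspace(0.6, 7.0, 200)
gamma = adiabatic_gamma(v)   # the profile whose initial slope the paper examines
```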

  2. Identifying Structure-Property Relationships Through DREAM.3D Representative Volume Elements and DAMASK Crystal Plasticity Simulations: An Integrated Computational Materials Engineering Approach

    NASA Astrophysics Data System (ADS)

    Diehl, Martin; Groeber, Michael; Haase, Christian; Molodov, Dmitri A.; Roters, Franz; Raabe, Dierk

    2017-05-01

    Predicting, understanding, and controlling the mechanical behavior is the most important task when designing structural materials. Modern alloy systems—in which multiple deformation mechanisms, phases, and defects are introduced to overcome the inverse strength-ductility relationship—give rise to multiple possibilities for modifying the deformation behavior, rendering traditional, exclusively experimentally-based alloy development workflows inappropriate. For fast and efficient alloy design, it is therefore desirable to predict the mechanical performance of candidate alloys by simulation studies to replace time- and resource-consuming mechanical tests. Simulation tools suitable for this task need to correctly predict the mechanical behavior in dependence of alloy composition, microstructure, texture, phase fractions, and processing history. Here, an integrated computational materials engineering approach based on the open source software packages DREAM.3D and DAMASK (Düsseldorf Advanced Materials Simulation Kit) that enables such virtual material development is presented. More specifically, our approach consists of the following three steps: (1) acquire statistical quantities that describe a microstructure, (2) build a representative volume element based on these quantities employing DREAM.3D, and (3) evaluate the representative volume using a predictive crystal plasticity material model provided by DAMASK. Exemplarily, these steps are here conducted for a high-manganese steel.

  3. Probabilistic finite elements for fatigue and fracture analysis

    NASA Astrophysics Data System (ADS)

    Belytschko, Ted; Liu, Wing Kam

    Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variance, and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.

  4. Probabilistic finite elements for fatigue and fracture analysis

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Liu, Wing Kam

    1992-01-01

    Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variance, and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.

  5. Mechanism-based risk assessment strategy for drug-induced cholestasis using the transcriptional benchmark dose derived by toxicogenomics.

    PubMed

    Kawamoto, Taisuke; Ito, Yuichi; Morita, Osamu; Honda, Hiroshi

    2017-01-01

    Cholestasis is one of the major causes of drug-induced liver injury (DILI), which can result in withdrawal of approved drugs from the market. Early identification of cholestatic drugs is difficult due to the complex mechanisms involved. In order to develop a strategy for mechanism-based risk assessment of cholestatic drugs, we analyzed gene expression data obtained from the livers of rats that had been orally administered 12 known cholestatic compounds repeatedly for 28 days at three dose levels. Qualitative analyses were performed using two statistical approaches (hierarchical clustering and principal component analysis), in addition to pathway analysis. The transcriptional benchmark dose (tBMD) and tBMD 95% lower limit (tBMDL) were used for quantitative analyses, which revealed three compound sub-groups that produced different types of differential gene expression; these groups of genes were mainly involved in inflammation, cholesterol biosynthesis, and oxidative stress. Furthermore, the tBMDL values for each test compound were in good agreement with the relevant no-observed-adverse-effect level. These results indicate that our novel strategy for drug safety evaluation using mechanism-based classification and tBMDL would facilitate the application of toxicogenomics for risk assessment of cholestatic DILI.

  6. Simple measurement-based admission control for DiffServ access networks

    NASA Astrophysics Data System (ADS)

    Lakkakorpi, Jani

    2002-07-01

    In order to provide good Quality of Service (QoS) in a Differentiated Services (DiffServ) network, a dynamic admission control scheme is definitely needed as an alternative to overprovisioning. In this paper, we present a simple measurement-based admission control (MBAC) mechanism for DiffServ-based access networks. Instead of using active measurements only or doing purely static bookkeeping with parameter-based admission control (PBAC), the admission control decisions are based on bandwidth reservations and periodically measured, exponentially averaged link loads. If any link load on the path between two endpoints is over the applicable threshold, access is denied. Link loads are periodically sent to the Bandwidth Broker (BB) of the routing domain, which makes the admission control decisions. The information needed for calculating the link loads is retrieved from the router statistics. The proposed admission control mechanism is verified through simulations. Our results show that it is possible to achieve very high bottleneck link utilization levels and still maintain good QoS.
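
    The admission test described above reduces to a few lines of logic. A minimal sketch with illustrative smoothing weight and threshold values; all names are ours, not from the paper.

```python
ALPHA = 0.25                   # smoothing weight for periodic load samples
THRESHOLD = 0.85               # maximum allowed fraction of link capacity

avg_load = {}                  # link id -> exponentially averaged utilisation

def update_measurement(link, measured_util):
    """Exponential averaging of the periodically reported link load."""
    prev = avg_load.get(link, measured_util)
    avg_load[link] = ALPHA * measured_util + (1 - ALPHA) * prev

def admit(path_links, requested_bw, capacity):
    """Bandwidth-broker style check: smoothed load + reservation vs. threshold."""
    for link in path_links:
        if avg_load.get(link, 0.0) + requested_bw / capacity[link] > THRESHOLD:
            return False       # any overloaded link on the path denies access
    return True
```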

  7. Research on the correlation between corona current spectrum and audible noise spectrum of HVDC transmission line

    NASA Astrophysics Data System (ADS)

    Liu, Yingyi; Zhou, Lijuan; Liu, Yuanqing; Yuan, Haiwen; Ji, Liang

    2017-11-01

    Audible noise is closely related to corona current on a high voltage direct current (HVDC) transmission line. In this paper, we measured a large number of audible noise and corona current waveforms simultaneously on the largest outdoor HVDC corona cage in the world. By analyzing the experimental data, the statistical regularities relating the corona current spectrum to the audible noise spectrum were obtained. Furthermore, the generation mechanism of audible noise was analyzed theoretically, and a mathematical expression relating the audible noise spectrum to the corona current spectrum, valid for all of the measuring points in space, was established based on electro-acoustic conversion theory. Finally, combined with the obtained mathematical relation, the underlying reasons for the statistical regularities observed in the measured corona current and audible noise data were explained. The results of this paper not only present the statistical association between the corona current spectrum and the audible noise spectrum on an HVDC transmission line, but also reveal the inherent reasons for these associated rules.

  8. Two Distinct Moral Mechanisms for Ascribing and Denying Intentionality.

    PubMed

    Ngo, Lawrence; Kelly, Meagan; Coutlee, Christopher G; Carter, R McKell; Sinnott-Armstrong, Walter; Huettel, Scott A

    2015-12-04

    Philosophers and legal scholars have long theorized about how intentionality serves as a critical input for morality and culpability, but the emerging field of experimental philosophy has revealed a puzzling asymmetry. People judge actions leading to negative consequences as being more intentional than those leading to positive ones. The implications of this asymmetry remain unclear because there is no consensus regarding the underlying mechanism. Based on converging behavioral and neural evidence, we demonstrate that there is no single underlying mechanism. Instead, two distinct mechanisms together generate the asymmetry. Emotion drives ascriptions of intentionality for negative consequences, while the consideration of statistical norms leads to the denial of intentionality for positive consequences. We employ this novel two-mechanism model to illustrate that morality can paradoxically shape judgments of intentionality. This is consequential for mens rea in legal practice and arguments in moral philosophy pertaining to terror bombing, abortion, and euthanasia among others.

  9. Two Distinct Moral Mechanisms for Ascribing and Denying Intentionality

    PubMed Central

    Ngo, Lawrence; Kelly, Meagan; Coutlee, Christopher G.; Carter, R. McKell; Sinnott-Armstrong, Walter; Huettel, Scott A.

    2015-01-01

    Philosophers and legal scholars have long theorized about how intentionality serves as a critical input for morality and culpability, but the emerging field of experimental philosophy has revealed a puzzling asymmetry. People judge actions leading to negative consequences as being more intentional than those leading to positive ones. The implications of this asymmetry remain unclear because there is no consensus regarding the underlying mechanism. Based on converging behavioral and neural evidence, we demonstrate that there is no single underlying mechanism. Instead, two distinct mechanisms together generate the asymmetry. Emotion drives ascriptions of intentionality for negative consequences, while the consideration of statistical norms leads to the denial of intentionality for positive consequences. We employ this novel two-mechanism model to illustrate that morality can paradoxically shape judgments of intentionality. This is consequential for mens rea in legal practice and arguments in moral philosophy pertaining to terror bombing, abortion, and euthanasia among others. PMID:26634909

  10. Cumulative (Dis)Advantage and the Matthew Effect in Life-Course Analysis

    PubMed Central

    Bask, Miia; Bask, Mikael

    2015-01-01

    To foster a deeper understanding of the mechanisms behind inequality in society, it is crucial to work with well-defined concepts associated with such mechanisms. The aim of this paper is to define cumulative (dis)advantage and the Matthew effect. We argue that cumulative (dis)advantage is an intra-individual micro-level phenomenon, that the Matthew effect is an inter-individual macro-level phenomenon and that an appropriate measure of the Matthew effect focuses on the mechanism or dynamic process that generates inequality. The Matthew mechanism is, therefore, a better name for the phenomenon, where we provide a novel measure of the mechanism, including a proof-of-principle analysis using disposable personal income data. Finally, because socio-economic theory should be able to explain cumulative (dis)advantage and the Matthew mechanism when they are detected in data, we discuss the types of models that may explain the phenomena. We argue that interactions-based models in the literature traditions of analytical sociology and statistical mechanics serve this purpose. PMID:26606386

  11. Structural differences in interictal migraine attack after epilepsy: A diffusion tensor imaging analysis.

    PubMed

    Huang, Qi; Lv, Xin; He, Yushuang; Wei, Xing; Ma, Meigang; Liao, Yuhan; Qin, Chao; Wu, Yuan

    2017-12-01

    Patients with epilepsy (PWE) are more likely to suffer from migraine attacks, and aberrant white matter (WM) organization may be the mechanism underlying this phenomenon. This study aimed to use the diffusion tensor imaging (DTI) technique to quantify WM structural differences in PWE with interictal migraine. Diffusion tensor imaging data were acquired in 13 PWE with migraine and 12 PWE without migraine. Diffusion metrics were analyzed using tract-atlas-based spatial statistics analysis. Atlas-based and tract-based spatial statistical analyses were conducted for robustness analysis. Correlation was explored between altered DTI metrics and clinical parameters. The main results are as follows: (i) Axonal damage plays a key role in PWE with interictal migraine. (ii) Significant diffusion alterations included higher fractional anisotropy (FA) in the fornix; higher mean diffusivity (MD) in the middle cerebellar peduncle (CP), left superior CP, and right uncinate fasciculus; and higher axial diffusivity (AD) in the middle CP and right medial lemniscus. (iii) DTI metrics tended to correlate with seizure/migraine type and duration. The results indicate that characteristic structural impairments exist in PWE with interictal migraine. Epilepsy may contribute to migraine by altering WM in the brain stem. White matter tracts in the fornix and right uncinate fasciculus also mediate migraine after epilepsy. This finding may improve our understanding of the pathological mechanisms underlying migraine attack after epilepsy. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Do we treat our patients or rather periodontal microbes with adjunctive antibiotics in periodontal therapy? A 16S rDNA microbial community analysis.

    PubMed

    Hagenfeld, Daniel; Koch, Raphael; Jünemann, Sebastian; Prior, Karola; Harks, Inga; Eickholz, Peter; Hoffmann, Thomas; Kim, Ti-Sun; Kocher, Thomas; Meyle, Jörg; Kaner, Doğan; Schlagenhauf, Ulrich; Ehmke, Benjamin; Harmsen, Dag

    2018-01-01

    Empiric antibiotics are often used in combination with mechanical debridement to treat patients suffering from periodontitis and to eliminate disease-associated pathogens. Until now, only a few next generation sequencing 16S rDNA amplicon based publications with rather small sample sizes studied the effect of those interventions on the subgingival microbiome. Therefore, we studied subgingival samples of 89 patients with chronic periodontitis (solely non-smokers) before and two months after therapy. Forty-seven patients received mechanical periodontal therapy only, whereas 42 patients additionally received oral administered amoxicillin plus metronidazole (500 and 400 mg, respectively; 3x/day for 7 days). Samples were sequenced with Illumina MiSeq 300 base pairs paired end technology (V3 and V4 hypervariable regions of the 16S rDNA). Inter-group differences before and after therapy of clinical variables (percentage of sites with pocket depth ≥ 5mm, percentage of sites with bleeding on probing) and microbiome variables (diversity, richness, evenness, and dissimilarity) were calculated, a principal coordinate analysis (PCoA) was conducted, and differential abundance of agglomerated ribosomal sequence variants (aRSVs) classified on genus level was calculated using a negative binomial regression model. We found statistically noticeable decreased richness, and increased dissimilarity in the antibiotic, but not in the placebo group after therapy. The PCoA revealed a clear compositional separation of microbiomes after therapy in the antibiotic group, which could not be seen in the group receiving mechanical therapy only. This difference was even more pronounced on aRSV level. Here, adjunctive antibiotics were able to induce a microbiome shift by statistically noticeably reducing aRSVs belonging to genera containing disease-associated species, e.g., Porphyromonas, Tannerella, Treponema, and Aggregatibacter, and by noticeably increasing genera containing health-associated species. Mechanical therapy alone did not statistically noticeably affect any disease-associated taxa. Despite the difference in microbiome modulation both therapies improved the tested clinical parameters after two months. These results cast doubt on the relevance of the elimination and/or reduction of disease-associated taxa as a main goal of periodontal therapy.

  13. Risk assessment of maintenance operations: the analysis of performing task and accident mechanism.

    PubMed

    Carrillo-Castrillo, Jesús A; Rubio-Romero, Juan Carlos; Guadix, Jose; Onieva, Luis

    2015-01-01

    Maintenance operations cover a great number of occupations. Most small and medium-sized enterprises lack the appropriate information to conduct risk assessments of maintenance operations. The objective of this research is to provide a method, based on the concepts of task and accident mechanism, for an initial risk assessment that takes into consideration the prevalence and severity of the maintenance accidents reported. Data were gathered from 11,190 reported accidents in maintenance operations in the manufacturing sector of Andalusia from 2003 to 2012. Using a semi-quantitative methodology, likelihood and severity were evaluated based on the actual distribution of accident mechanisms in each of the tasks. Accident mechanisms and tasks were identified using the variables included in the European Statistics of Accidents at Work methodology. The main results are that the estimated risk of the most frequent accident mechanisms identified for each of the analysed tasks is low; the only accident mechanisms with medium risk are physical stress on the musculoskeletal system when lifting or pushing in tasks involving carrying, and impacts against objects after slipping or stumbling in tasks involving movements. Prioritisation of public preventive actions for the accident mechanisms with a higher estimated risk is highly recommended.
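
    The semi-quantitative logic maps directly onto a small risk-matrix computation. The sketch below uses invented counts; the likelihood and severity scales and the cut-off are illustrative assumptions, not the study's calibration.

```python
# (task, mechanism) -> (reported accidents, serious outcomes); invented counts.
accidents = {
    ('carrying', 'physical stress when lifting/pushing'): (1200, 30),
    ('movements', 'impact after slipping/stumbling'):      (900, 9),
}
total = sum(count for count, _ in accidents.values())

def risk_level(task, mechanism, cutoff=1e-2):
    count, serious = accidents[(task, mechanism)]
    likelihood = count / total        # prevalence among all reported accidents
    severity = serious / count        # fraction of accidents that were serious
    return 'medium' if likelihood * severity > cutoff else 'low'

for key in accidents:
    print(key, risk_level(*key))
```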

  14. Territorial Developments Based on Graffiti: a Statistical Mechanics Approach

    DTIC Science & Technology

    2011-10-28

    …defined on a lattice. We introduce a two-gang Hamiltonian model where agents have red or blue affiliation but are otherwise indistinguishable. … ramifications of our results. Keywords: Territorial Formation, Spin Systems, Phase Transitions. … Lattice models have been extensively used in … inconsequential. In short, lattice models have proved extremely useful in the context of the physical, biological and even chemical sciences. …

  15. Capturing the Interaction Potential of Amyloidogenic Proteins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Javid, Nadeem; Vogtt, Karsten; Winter, Roland

    2007-07-13

    Experimentally derived static structure factors obtained for the aggregation-prone protein insulin were analyzed with a statistical mechanical model based on the Derjaguin-Landau-Verwey-Overbeek potential. The data reveal that the protein self-assembles into equilibrium clusters even at low concentrations. Furthermore, striking differences are found in the interaction forces between aggregation-prone proteins such as insulin in the preaggregated regime and natively stable globular proteins.

  16. Interior Noise

    NASA Technical Reports Server (NTRS)

    Mixson, John S.; Wilby, John F.

    1991-01-01

    The generation and control of flight vehicle interior noise is discussed. Emphasis is placed on the mechanisms of transmission through airborne and structure-borne paths and the control of cabin noise by path modification. Techniques for identifying the relative contributions of the various source-path combinations are also discussed along with methods for the prediction of aircraft interior noise such as those based on the general modal theory and statistical energy analysis.

  17. Mechanical parameters and flight phase characteristics in aquatic plyometric jumping.

    PubMed

    Louder, Talin J; Searle, Cade J; Bressel, Eadric

    2016-09-01

    Plyometric jumping is a commonly prescribed method of training focused on the development of reactive strength and high-velocity concentric power. Literature suggests that aquatic plyometric training may be a low-impact, effective supplement to land-based training. The purpose of the present study was to quantify acute, biomechanical characteristics of the take-off and flight phase for plyometric movements performed in the water. Kinetic force platform data from 12 young, male adults were collected for counter-movement jumps performed on land and in water at two different immersion depths. The specificity of jumps between environmental conditions was assessed using kinetic measures, temporal characteristics, and an assessment of the statistical relationship between take-off velocity and time in the air. Greater peak mechanical power was observed for jumps performed in the water, and was influenced by immersion depth. Additionally, the data suggest that, in the water, the statistical relationship between take-off velocity and time in air is quadratic. Results highlight the potential application of aquatic plyometric training as a cross-training tool for improving mechanical power and suggest that water immersion depth and fluid drag play key roles in the specificity of the take-off phase for jumping movements performed in the water.

  18. Statistical Mechanics of Disordered Systems - Series: Cambridge Series in Statistical and Probabilistic Mathematics (No. 18)

    NASA Astrophysics Data System (ADS)

    Bovier, Anton

    2006-06-01

    Our mathematical understanding of the statistical mechanics of disordered systems is going through a period of stunning progress. This self-contained book is a graduate-level introduction for mathematicians and for physicists interested in the mathematical foundations of the field, and can be used as a textbook for a two-semester course on mathematical statistical mechanics. It assumes only basic knowledge of classical physics and, on the mathematics side, a good working knowledge of graduate-level probability theory. The book starts with a concise introduction to statistical mechanics, proceeds to disordered lattice spin systems, and concludes with a presentation of the latest developments in the mathematical understanding of mean-field spin glass models. In particular, recent progress towards a rigorous understanding of the replica symmetry-breaking solutions of the Sherrington-Kirkpatrick spin glass models, due to Guerra, Aizenman-Sims-Starr and Talagrand, is reviewed in some detail. Comprehensive introduction to an active and fascinating area of research. Clear exposition that builds to the state of the art in the mathematics of spin glasses. Written by a well-known and active researcher in the field.

  19. Competitive Processes in Cross-Situational Word Learning

    PubMed Central

    Yurovsky, Daniel; Yu, Chen; Smith, Linda B.

    2013-01-01

    Cross-situational word learning, like any statistical learning problem, involves tracking the regularities in the environment. But the information that learners pick up from these regularities is dependent on their learning mechanism. This paper investigates the role of one type of mechanism in statistical word learning: competition. Competitive mechanisms would allow learners to find the signal in noisy input, and would help to explain the speed with which learners succeed in statistical learning tasks. Because cross-situational word learning provides information at multiple scales – both within and across trials/situations – learners could implement competition at either or both of these scales. A series of four experiments demonstrate that cross-situational learning involves competition at both levels of scale, and that these mechanisms interact to support rapid learning. The impact of both of these mechanisms is then considered from the perspective of a process-level understanding of cross-situational learning. PMID:23607610

  20. Competitive processes in cross-situational word learning.

    PubMed

    Yurovsky, Daniel; Yu, Chen; Smith, Linda B

    2013-07-01

    Cross-situational word learning, like any statistical learning problem, involves tracking the regularities in the environment. However, the information that learners pick up from these regularities is dependent on their learning mechanism. This article investigates the role of one type of mechanism in statistical word learning: competition. Competitive mechanisms would allow learners to find the signal in noisy input and would help to explain the speed with which learners succeed in statistical learning tasks. Because cross-situational word learning provides information at multiple scales – both within and across trials/situations – learners could implement competition at either or both of these scales. A series of four experiments demonstrate that cross-situational learning involves competition at both levels of scale, and that these mechanisms interact to support rapid learning. The impact of both of these mechanisms is considered from the perspective of a process-level understanding of cross-situational learning. Copyright © 2013 Cognitive Science Society, Inc.

  1. Small Systems and Limitations on the Use of Chemical Thermodynamics

    NASA Astrophysics Data System (ADS)

    Tovbin, Yu. K.

    2018-01-01

    Limitations on using chemical thermodynamics to describe small systems are formulated. These limitations follow from statistical mechanics for equilibrium and nonequilibrium processes and reflect (1) differences between characteristic relaxation times in momentum, energy, and mass transfer in different aggregate states of the investigated systems; and (2) achievements of statistical mechanics that allow us to determine criteria for the size of the smallest region in which thermodynamics can be applied and the scale of the emergence of a new phase, along with criteria for the conditions under which local equilibrium is violated. Based on this analysis, the main thermodynamic results are clarified: the phase rule for distorted interfaces, the sense and area of applicability of Gibbs's concept of passive forces, and the artificiality of Kelvin's equation as a consequence of the limitations of the thermodynamic approach to small bodies. It is also shown to be incorrect to introduce molecular parameters into thermodynamic derivations, or the activity coefficient of an activated complex into the expression for a reaction rate constant.

  2. Surface kinetic roughening caused by dental erosion: An atomic force microscopy study

    NASA Astrophysics Data System (ADS)

    Quartarone, Eliana; Mustarelli, Piercarlo; Poggio, Claudio; Lombardini, Marco

    2008-05-01

    Surface kinetic roughening takes place both in growth and in erosion processes. Tooth surfaces are eroded by contact with acid drinks, such as those used to supplement mineral salts during sporting activities. Calcium-phosphate-based (CPP-ACP) pastes are known to reduce the erosion process and to favour enamel remineralization. In this study we used atomic force microscopy (AFM) to investigate surface roughening during dental erosion, and the mechanisms underlying the protective role exerted by a commercial CPP-ACP paste. We found a statistically significant difference (p<0.01) in the roughness of surfaces exposed and not exposed to the acid solutions. Treatment with the CPP-ACP paste produced a statistically significant reduction of the roughness values. By interpreting the AFM results in terms of fractal scaling concepts and continuum stochastic equations, we showed that the protection mechanism of the paste depends on the chemical properties of the acid solution.

  3. Fracture of disordered solids in compression as a critical phenomenon. I. Statistical mechanics formalism.

    PubMed

    Toussaint, Renaud; Pride, Steven R

    2002-09-01

    This is the first of a series of three articles that treats fracture localization as a critical phenomenon. This first article establishes a statistical mechanics based on ensemble averages when fluctuations through time play no role in defining the ensemble. Ensembles are obtained by dividing a huge rock sample into many mesoscopic volumes. Because rocks are a disordered collection of grains in cohesive contact, we expect that once shear strain is applied and cracks begin to arrive in the system, the mesoscopic volumes will have a wide distribution of different crack states. These mesoscopic volumes are the members of our ensembles. We determine the probability of observing a mesoscopic volume to be in a given crack state by maximizing Shannon's measure of the emergent-crack disorder subject to constraints coming from the energy balance of brittle fracture. The laws of thermodynamics, the partition function, and the quantification of temperature are obtained for such cracking systems.
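
    In outline, the entropy-maximization step described above yields a Boltzmann-like distribution over crack states. A sketch under a simplified constraint set (normalization plus one mean-energy constraint; the paper's actual constraints come from the energy balance of brittle fracture):

```latex
\max_{\{p_j\}} \; S = -\sum_j p_j \ln p_j
\quad \text{subject to} \quad \sum_j p_j = 1, \qquad \sum_j p_j E_j = \bar{E},
```

    and introducing Lagrange multipliers gives

```latex
p_j = \frac{e^{-E_j/T}}{Z}, \qquad Z = \sum_j e^{-E_j/T},
```

    where the multiplier conjugate to the energy constraint quantifies the temperature of the cracking system and Z is the partition function mentioned in the abstract.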

  4. An argument for mechanism-based statistical inference in cancer

    PubMed Central

    Ochs, Michael; Price, Nathan D.; Tomasetti, Cristian; Younes, Laurent

    2015-01-01

    Cancer is perhaps the prototypical systems disease, and as such has been the focus of extensive study in quantitative systems biology. However, translating these programs into personalized clinical care remains elusive and incomplete. In this perspective, we argue that realizing this agenda—in particular, predicting disease phenotypes, progression and treatment response for individuals—requires going well beyond standard computational and bioinformatics tools and algorithms. It entails designing global mathematical models over network-scale configurations of genomic states and molecular concentrations, and learning the model parameters from limited available samples of high-dimensional and integrative omics data. As such, any plausible design should accommodate: biological mechanism, necessary for both feasible learning and interpretable decision making; stochasticity, to deal with uncertainty and observed variation at many scales; and a capacity for statistical inference at the patient level. This program, which requires a close, sustained collaboration between mathematicians and biologists, is illustrated in several contexts, including learning bio-markers, metabolism, cell signaling, network inference and tumorigenesis. PMID:25381197

  5. A Conditional Curie-Weiss Model for Stylized Multi-group Binary Choice with Social Interaction

    NASA Astrophysics Data System (ADS)

    Opoku, Alex Akwasi; Edusei, Kwame Owusu; Ansah, Richard Kwame

    2018-04-01

    This paper proposes a conditional Curie-Weiss model as a model for decision making in a stylized society made up of binary decision makers that face a dichotomous choice between two options. Following Brock and Durlauf (Discrete Choice with Social Interactions I: Theory, 1995), we set up both socio-economic and statistical mechanical models for the choice problem. We point out when the socio-economic and statistical mechanical models give rise to the same self-consistent equilibrium mean choice level(s). The phase diagram of the associated statistical mechanical model and its socio-economic implications are discussed.
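
    In the standard Curie-Weiss mean-field setting, the self-consistent mean choice level mentioned above is a root of m = tanh(beta (J m + h)); a minimal fixed-point sketch follows, with parameter names of our choosing.

```python
import numpy as np

def mean_choice(beta, J=1.0, h=0.0, m0=0.5, tol=1e-12, iters=100_000):
    """Iterate the Curie-Weiss self-consistency equation m = tanh(beta(J m + h))."""
    m = m0
    for _ in range(iters):
        m_new = np.tanh(beta * (J * m + h))
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

# Below beta*J = 1 there is a single equilibrium mean choice level; above it,
# starting the iteration at +/-0.5 finds the two coexisting levels.
print(mean_choice(0.8), mean_choice(1.5, m0=0.5), mean_choice(1.5, m0=-0.5))
```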

  6. Ballistic and diffusive dynamics in a two-dimensional ideal gas of macroscopic chaotic Faraday waves.

    PubMed

    Welch, Kyle J; Hastings-Hauss, Isaac; Parthasarathy, Raghuveer; Corwin, Eric I

    2014-04-01

    We have constructed a macroscopic driven system of chaotic Faraday waves whose statistical mechanics, we find, are surprisingly simple, mimicking those of a thermal gas. We use real-time tracking of a single floating probe, energy equipartition, and the Stokes-Einstein relation to define and measure a pseudotemperature and diffusion constant and then self-consistently determine a coefficient of viscous friction for a test particle in this pseudothermal gas. Because of its simplicity, this system can serve as a model for direct experimental investigation of nonequilibrium statistical mechanics, much as the ideal gas epitomizes equilibrium statistical mechanics.
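
    The pseudotemperature and friction estimates in this record follow from two standard relations; a minimal sketch, assuming probe velocity samples vx, vy and an MSD curve are available from tracking (names and units are ours).

```python
import numpy as np

kB = 1.380649e-23   # the "temperature" obtained is an effective pseudotemperature

def pseudo_temperature(vx, vy, m):
    """Equipartition in 2-D: (1/2) m <vx^2 + vy^2> = kB * T_eff."""
    return m * np.mean(vx**2 + vy**2) / (2.0 * kB)

def diffusion_from_msd(lags, msd):
    """Diffusive regime in 2-D: MSD(t) = 4 D t, fitted over long-time lags."""
    slope, _ = np.polyfit(lags, msd, 1)
    return slope / 4.0

def friction_coefficient(T_eff, D):
    """Stokes-Einstein closes the loop self-consistently: gamma = kB T_eff / D."""
    return kB * T_eff / D
```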

  7. Sources and characteristics of acoustic emissions from mechanically stressed geologic granular media — A review

    NASA Astrophysics Data System (ADS)

    Michlmayr, Gernot; Cohen, Denis; Or, Dani

    2012-05-01

    The formation of cracks and the emergence of shearing planes and other modes of rapid macroscopic failure in geologic granular media involve numerous grain-scale mechanical interactions, often generating high-frequency (kHz) elastic waves referred to as acoustic emissions (AE). These acoustic signals have been used primarily for monitoring and characterizing fatigue and progressive failure in engineered systems, with only a few applications concerning geologic granular media reported in the literature. Similar to the monitoring of seismic events preceding an earthquake, AE may offer a means for non-invasive, in-situ assessment of mechanical precursors associated with imminent landslides or other types of rapid mass movements (debris flows, rock falls, snow avalanches, glacier stick-slip events). Despite diverse applications and potential usefulness, a systematic description of the AE method and its relevance to mechanical processes in Earth sciences is lacking. This review aims to provide a sound foundation for linking observed AE with various micro-mechanical failure events in geologic granular materials, not only for monitoring of triggering events preceding mass mobilization, but also as a non-invasive tool in its own right for probing the rich spectrum of mechanical processes at scales ranging from a single grain to a hillslope. We first review studies reporting the use of AE for monitoring of failure in various geologic materials, and describe AE-generating source mechanisms in mechanically stressed geologic media (e.g., frictional sliding, micro-cracking, particle collisions, rupture of water bridges, etc.), including AE statistical features such as frequency content and occurrence probabilities. We summarize available AE sensors and measurement principles. The high sampling rates of advanced AE systems enable detection of numerous discrete failure events within a volume and thus provide access to statistical descriptions of the progressive collapse of systems with many interacting mechanical elements, such as the fiber bundle model (FBM), as sketched below. We highlight intrinsic links between AE characteristics and established statistical models often used in structural engineering and material sciences, and outline potential applications for failure prediction and early warning using the AE method in combination with the FBM. The biggest challenge to field application of the AE method is strong signal attenuation. We provide an outlook for overcoming such limitations, considering the emergence of a class of fiber-optic based distributed AE sensors and the deployment of acoustic waveguides as part of monitoring networks.
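
    As a pointer to how the FBM connection works in practice, the following sketch simulates an equal-load-sharing fiber bundle and records avalanche sizes, the analogue of discrete AE events; the threshold distribution and bundle size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Equal-load-sharing fiber bundle model: N fibers with random strength
# thresholds; each failure redistributes stress onto the survivors and can
# trigger an avalanche of further failures.
N = 100_000
thr = np.sort(rng.uniform(0.0, 1.0, N))

avalanches, broken = [], 0
while broken < N:
    F = thr[broken] * (N - broken)      # quasistatic drive: load the weakest survivor
    start = broken
    while broken < N and thr[broken] * (N - broken) <= F:
        broken += 1                     # cascade of failures under fixed total load F
    avalanches.append(broken - start)

# The avalanche-size histogram has the power-law tail (exponent -5/2 for this
# model) that is often invoked when interpreting AE amplitude statistics.
```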

  8. Statistical use of argonaute expression and RISC assembly in microRNA target identification.

    PubMed

    Stanhope, Stephen A; Sengupta, Srikumar; den Boon, Johan; Ahlquist, Paul; Newton, Michael A

    2009-09-01

    MicroRNAs (miRNAs) posttranscriptionally regulate targeted messenger RNAs (mRNAs) by inducing cleavage or otherwise repressing their translation. We address the problem of detecting m/miRNA targeting relationships in Homo sapiens from microarray data by developing statistical models that are motivated by the biological mechanisms used by miRNAs. The focus of our modeling is the construction, activity, and mediation of RNA-induced silencing complexes (RISCs) competent for targeted mRNA cleavage. We demonstrate that regression models accommodating RISC abundance and controlling for other mediating factors fit the expression profiles of known target pairs substantially better than models based on m/miRNA expressions alone, and lead to verifications of computational target pair predictions that are more sensitive than those based on marginal expression levels. Because our models are fully independent of exogenous results from sequence-based computational methods, they are appropriate for use as either a primary or secondary source of information regarding m/miRNA target pair relationships, especially in conjunction with high-throughput expression studies.

  9. Statistical-mechanical predictions and Navier-Stokes dynamics of two-dimensional flows on a bounded domain.

    PubMed

    Brands, H; Maassen, S R; Clercx, H J

    1999-09-01

    In this paper the applicability of a statistical-mechanical theory to freely decaying two-dimensional (2D) turbulence on a bounded domain is investigated. We consider an ensemble of direct numerical simulations in a square box with stress-free boundaries, with a Reynolds number that is of the same order as in experiments on 2D decaying Navier-Stokes turbulence. The results of these simulations are compared with the corresponding statistical equilibria, calculated from different stages of the evolution. It is shown that the statistical equilibria calculated from early times of the Navier-Stokes evolution do not correspond to the dynamical quasistationary states. At best, the global topological structure is correctly predicted from a relatively late time in the Navier-Stokes evolution, when the quasistationary state has almost been reached. This failure of the (basically inviscid) statistical-mechanical theory is related to viscous dissipation and net leakage of vorticity in the Navier-Stokes dynamics at moderate values of the Reynolds number.

  10. A laboratory evaluation of the influence of weighing gauges performance on extreme events statistics

    NASA Astrophysics Data System (ADS)

    Colli, Matteo; Lanza, Luca

    2014-05-01

    The effects of inaccurate ground-based rainfall measurements on the information derived from rain records are not yet well documented in the literature. La Barbera et al. (2002) investigated the propagation of the systematic mechanical errors of tipping-bucket rain gauges (TBR) into the most common statistics of rainfall extremes, e.g. in the assessment of the return period T (or the related non-exceedance probability) of short-duration/high-intensity events. Colli et al. (2012) and Lanza et al. (2012) extended the analysis to a 22-year long precipitation data set obtained from a virtual weighing-type gauge (WG). The artificial WG time series was obtained from real precipitation data measured at the meteo-station of the University of Genova, modelling the weighing gauge output as a linear dynamic system. This approximation was previously validated with dedicated laboratory experiments and is based on the evidence that the accuracy of WG measurements under real-world, time-varying rainfall conditions is mainly affected by the dynamic response of the gauge (as revealed during the last WMO Field Intercomparison of Rainfall Intensity Gauges). The investigation is now completed by analyzing actual measurements performed by two common weighing gauges, the OTT Pluvio2 load-cell gauge and the GEONOR T-200 vibrating-wire gauge, since both instruments demonstrated very good performance in previous constant flow rate calibrations. A laboratory dynamic rainfall generation system was arranged and validated in order to simulate a number of precipitation events with variable reference intensities. The artificial events were generated from real-world rainfall intensity (RI) records obtained from the meteo-station of the University of Genova, so that the statistical structure of the time series is preserved. The influence of the WG RI measurement accuracy on the associated extreme events statistics is analyzed by comparing the original intensity-duration-frequency (IDF) curves with those obtained from the measurements of the simulated rain events. References: Colli, M., L.G. Lanza, and P. La Barbera (2012). Weighing gauges measurement errors and the design rainfall for urban scale applications, 9th International Workshop on Precipitation in Urban Areas, 6-9 December 2012, St. Moritz, Switzerland. Lanza, L.G., M. Colli, and P. La Barbera (2012). On the influence of rain gauge performance on extreme events statistics: the case of weighing gauges, EGU General Assembly 2012, April 22, Wien, Austria. La Barbera, P., L.G. Lanza, and L. Stagi (2002). Influence of systematic mechanical errors of tipping-bucket rain gauges on the statistics of rainfall extremes. Water Sci. Techn., 45(2), 1-9.

  11. Hybrid PSO-ASVR-based method for data fitting in the calibration of infrared radiometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Sen; Li, Chengwei, E-mail: heikuanghit@163.com

    2016-06-15

    The present paper describes a hybrid particle swarm optimization-adaptive support vector regression (PSO-ASVR)-based method for data fitting in the calibration of an infrared radiometer. The proposed hybrid PSO-ASVR-based method combines PSO with Adaptive Processing and Support Vector Regression (SVR). The optimization technique involves setting the parameters in the ASVR fitting procedure, which significantly improves the fitting accuracy. However, its use in the calibration of infrared radiometers has not yet been widely explored. Bearing this in mind, the PSO-ASVR-based method, which is grounded in statistical learning theory, is successfully used here to obtain the relationship between the radiation of a standard source and the response of an infrared radiometer. The main advantages of this method are the flexible adjustment mechanism in data processing and the optimization mechanism for the kernel parameter setting of the SVR. Numerical examples and applications to the calibration of an infrared radiometer are presented to verify the performance of the PSO-ASVR-based method compared to conventional data fitting methods.

  12. Optimization of metabolite detection by quantum mechanics simulations in magnetic resonance spectroscopy.

    PubMed

    Gambarota, Giulio

    2017-07-15

    Magnetic resonance spectroscopy (MRS) is a well-established modality for investigating tissue metabolism in vivo. In recent years, many efforts by the scientific community have been directed towards the improvement of metabolite detection and quantitation. Quantum mechanics simulations allow for investigations of the MR signal behaviour of metabolites; thus, they provide an essential tool in the optimization of metabolite detection. In this review, we examine quantum mechanics simulations based on the density matrix formalism. The density matrix was introduced by von Neumann in 1927 to take into account statistical effects within the theory of quantum mechanics. We discuss the main steps of the density matrix simulation of an arbitrary spin system and show some examples for the strongly coupled two-spin system. Copyright © 2016 Elsevier Inc. All rights reserved.
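
    As a pointer to what such density-matrix simulations involve, the sketch below propagates a single spin-1/2 through a hard 90-degree pulse and free precession; units of hbar = 1 and the offset frequency are illustrative assumptions, and real metabolite simulations apply the same machinery to coupled multi-spin systems.

```python
import numpy as np

# Spin-1/2 operators (hbar = 1).
Ix = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
Iy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)
Iz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

def evolve(rho, H, t):
    """Unitary evolution rho -> U rho U^dagger with U = exp(-i H t)."""
    w, V = np.linalg.eigh(H)
    U = V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T
    return U @ rho @ U.conj().T

rho = Iz.copy()                              # equilibrium (deviation) state
rho = evolve(rho, (np.pi / 2) * Ix, 1.0)     # hard 90-degree pulse about x

omega = 2 * np.pi * 100.0                    # chemical-shift offset (rad/s)
dt, npts = 1e-4, 512
fid = np.empty(npts, dtype=complex)
for k in range(npts):
    fid[k] = np.trace(rho @ (Ix + 1j * Iy))  # detected transverse magnetization
    rho = evolve(rho, omega * Iz, dt)        # free precession step

# np.fft.fft(fid) gives the (single-line) spectrum; strongly coupled systems
# use Kronecker products of these operators and a coupling term in H.
```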

  13. On System Engineering a Barter-Based Re-allocation of Space System Key Development Resources

    NASA Astrophysics Data System (ADS)

    Kosmann, William J.

    NASA has had a decades-long problem with cost growth during the development of space science missions. Numerous agency-sponsored studies have produced average mission level development cost growths ranging from 23 to 77%. A new study of 26 historical NASA science instrument set developments using expert judgment to re-allocate key development resources has an average cost growth of 73.77%. Twice in history, during the Cassini and EOS-Terra science instrument developments, a barter-based mechanism has been used to re-allocate key development resources. The mean instrument set development cost growth was -1.55%. Performing a bivariate inference on the means of these two distributions, there is statistical evidence to support the claim that using a barter-based mechanism to re-allocate key instrument development resources will result in a lower expected cost growth than using the expert judgment approach. Agent-based discrete event simulation is the natural way to model a trade environment. A NetLogo agent-based barter-based simulation of science instrument development was created. The agent-based model was validated against the Cassini historical example, as the starting and ending instrument development conditions are available. The resulting validated agent-based barter-based science instrument resource re-allocation simulation was used to perform 300 instrument development simulations, using barter to re-allocate development resources. The mean cost growth was -3.365%. A bivariate inference on the means was performed to determine that additional significant statistical evidence exists to support a claim that using barter-based resource re-allocation will result in lower expected cost growth, with respect to the historical expert judgment approach. Barter-based key development resource re-allocation should work on science spacecraft development as well as it has worked on science instrument development. A new study of 28 historical NASA science spacecraft developments has an average cost growth of 46.04%. As barter-based key development resource re-allocation has never been tried in a spacecraft development, no historical results exist, and an inference on the means test is not possible. A simulation of using barter-based resource re-allocation should be developed. The NetLogo instrument development simulation should be modified to account for spacecraft development market participant differences. The resulting agent-based barter-based spacecraft resource re-allocation simulation would then be used to determine if significant statistical evidence exists to prove a claim that using barter-based resource re-allocation will result in lower expected cost growth.

  14. Drug-Induced Dental Caries: A Disproportionality Analysis Using Data from VigiBase.

    PubMed

    de Campaigno, Emilie Patras; Kebir, Inès; Montastruc, Jean-Louis; Rueter, Manuela; Maret, Delphine; Lapeyre-Mestre, Maryse; Sallerin, Brigitte; Despas, Fabien

    2017-12-01

    Dental caries is defined as a pathological breakdown of the tooth. It is an infectious phenomenon involving a multifactorial aetiology. The impact of drugs on cariogenic risk has been poorly investigated. In this study, we identified drugs suspected to induce dental caries as adverse drug reactions (ADRs) and then studied a possible pathogenic mechanism for each drug that had a statistically significant disproportionality. We extracted individual case safety reports of dental caries associated with drugs from VigiBase® (the World Health Organization global individual case safety report database). We calculated disproportionality for each drug with a reporting odds ratio (ROR) and 99% confidence interval. We analysed the pharmacodynamics of each drug that had a statistically significant disproportionality. In VigiBase®, 5229 safety reports for dental caries concerning 733 drugs were identified. Among these drugs, 88 had a significant ROR, and for 65 of them (73.9%), no information about dental caries was found in the summaries of the product characteristics, the Micromedex® DRUGDEX, or the Martindale databases. Regarding the pharmacological classes of drugs involved in dental caries, we identified bisphosphonates, atropinic drugs, antidepressants, corticoids, immunomodulating drugs, antipsychotics, antiepileptics, opioids and β2-adrenoreceptor agonist drugs. Regarding possible pathogenic mechanisms for these drugs, we identified changes in salivary flow/composition for 54 drugs (61.4%), bone metabolism changes for 31 drugs (35.2%), hyperglycaemia for 32 drugs (36.4%) and/or immunosuppression for 23 drugs (26.1%). For nine drugs (10.2%), the mechanism was unclear. We identified 88 drugs with a significant positive disproportionality for dental caries. Special attention has to be paid to bisphosphonates, atropinic drugs, immunosuppressants and drugs causing hyperglycaemia.
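
    The ROR with its 99% confidence interval is computed from the standard 2x2 disproportionality table; a minimal sketch with invented counts (not VigiBase figures).

```python
import numpy as np

# 2x2 disproportionality table (counts invented for illustration):
#                       caries reports   all other reports
#   drug of interest          a                  b
#   all other drugs           c                  d
a, b, c, d = 40, 9_600, 5_189, 10_000_000

ror = (a / b) / (c / d)
se_log = np.sqrt(1/a + 1/b + 1/c + 1/d)
lo, hi = np.exp(np.log(ror) + np.array([-1.0, 1.0]) * 2.576 * se_log)  # 99% CI
print(f"ROR = {ror:.1f}, 99% CI = ({lo:.1f}, {hi:.1f}), signal = {lo > 1.0}")
```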

  15. Bag-breakup control of surface drag in hurricanes

    NASA Astrophysics Data System (ADS)

    Troitskaya, Yuliya; Zilitinkevich, Sergej; Kandaurov, Alexander; Ermakova, Olga; Kozlov, Dmitry; Sergeev, Daniil

    2016-04-01

    Air-sea interaction at extreme winds is of special interest now in connection with the problem of sea surface drag reduction at wind speeds exceeding 30-35 m/s. This phenomenon, predicted by Emanuel (1995) and confirmed by a number of field (e.g., Powell et al., 2003) and laboratory (Donelan et al., 2004) experiments, still awaits a physical explanation. Several papers attributed the drag reduction to spume droplets - spray torn off the crests of breaking waves (e.g., Kudryavtsev and Makin, 2011; Bao et al., 2011). The fluxes associated with the spray are determined by the rate of droplet production at the surface, quantified by the sea spray generation function (SSGF), defined as the number of spray particles of radius r produced from a unit area of water surface in unit time. However, the mechanism of spume droplet formation is unknown and empirical estimates of the SSGF vary over six orders of magnitude; therefore, the production rate of large sea spray droplets is not adequately described and there are significant uncertainties in estimates of exchange processes in hurricanes. It also remains unknown what the air-sea interface looks like and how water is fragmented into spray at hurricane winds. Using high-speed video filming, we observed the mechanisms of spume droplet production at strong winds, investigated their statistics and compared their efficiency. Experiments showed that the generation of spume droplets near the wave crest is caused by the following events: bursting of submerged bubbles, generation and breakup of "projections", and "bag breakup" - the inflating and subsequent bursting of short-lived, sail-like pieces of the water-surface film ("bags"). Statistical analysis of these experiments showed that at hurricane winds the main spray-generation mechanism is bag breakup. On the basis of general principles of statistical physics (the model of a canonical ensemble), we developed statistics of the bag-breakup events: their number and the statistical distribution of their geometrical parameters depending on wind speed. Based on the developed statistics, we estimated the surface stress caused by bags as the average sum of the stresses caused by individual bags, depending on their geometrical parameters. The resulting stress is subject to counteracting impacts of increasing wind speed: the increasing number of bags versus their decreasing sizes and lifetimes; the balance yields a peaking dependence of the bag resistance on wind speed, with the share of bag stress peaking at U10 ≈ 35 m/s and then decreasing. This peaking of the surface stress associated with bag breakup explains the seemingly paradoxical non-monotonic wind dependence of the surface drag coefficient, which peaks at winds of about 35 m/s. This work was supported by the Russian Foundation for Basic Research (14-05-91767, 13-05-12093, 16-05-00839, 16-55-52025, 15-35-20953); the experiments and equipment were supported by the Russian Science Foundation (Agreements 14-17-00667 and 15-17-20009, respectively). Yu. Troitskaya, A. Kandaurov and D. Sergeev were partially supported by FP7 Collaborative Project No. 612610.

  16. [Cases and duration of mechanical ventilation in German hospitals : An analysis of DRG incentives and developments in respiratory medicine].

    PubMed

    Biermann, A; Geissler, A

    2016-09-01

    Diagnosis-related groups (DRGs) have been used to reimburse hospital services in Germany since 2003/04. Like any other reimbursement system, DRGs offer specific incentives for hospitals that may lead to unintended consequences for patients. In the German context, specific procedures and their documentation are suspected of being performed primarily to increase hospital revenues. Mechanical ventilation of patients, and particularly the duration of ventilation, which is an important variable for DRG classification, is often discussed as being among these procedures. The aim of this study was to examine the incentives created by the German DRG-based payment system with regard to mechanical ventilation and to identify factors that explain the considerable increase in mechanically ventilated patients in recent years. Moreover, the assumption that hospitals perform mechanical ventilation in order to gain economic benefits was examined. In order to gain insights into the development of the number of mechanically ventilated patients, patient-level data provided by the German Federal Statistical Office and the German Institute for the Hospital Remuneration System were analyzed. The type of ventilation performed, the total number of ventilation hours, the age distribution, mortality and the DRG distribution for mechanical ventilation were calculated using methods of descriptive and inferential statistics. Furthermore, changes in DRG definitions and developments in respiratory medicine were compared for the years 2005-2012. Since the introduction of the DRG-based payment system in Germany, the hours of ventilation and the number of mechanically ventilated patients have increased substantially, while mortality has decreased. During the same period there has been a shift to less invasive ventilation methods, and the age distribution has shifted to higher age groups. A ventilation duration determined by DRG definitions could not be found. Due to advances in respiratory medicine, new ventilation methods have been introduced that are less prone to complications; this development has simultaneously improved survival rates. There was no evidence supporting the assumption that the duration of mechanical ventilation is influenced by the time intervals relevant for DRG grouping. However, operational routines, such as staff availability within early and late shifts of the hospital, presumably have a significant impact on the termination of mechanical ventilation.

  17. Treatment of missing data in follow-up studies of randomised controlled trials: A systematic review of the literature.

    PubMed

    Sullivan, Thomas R; Yelland, Lisa N; Lee, Katherine J; Ryan, Philip; Salter, Amy B

    2017-08-01

    After completion of a randomised controlled trial, an extended follow-up period may be initiated to learn about longer-term impacts of the intervention. Since extended follow-up studies often involve additional eligibility restrictions and consent processes for participation, and a longer duration of follow-up entails a greater risk of participant attrition, missing data can be a considerable threat in this setting. As a potential source of bias, it is critical that missing data are appropriately handled in the statistical analysis, yet little is known about the treatment of missing data in extended follow-up studies. The aims of this review were to summarise the extent of missing data in extended follow-up studies and the use of statistical approaches to address this potentially serious problem. We performed a systematic literature search in PubMed to identify extended follow-up studies published from January to June 2015. Studies were eligible for inclusion if the original randomised controlled trial results were also published and if the main objective of extended follow-up was to compare the original randomised groups. We recorded information on the extent of missing data and the approach used to treat missing data in the statistical analysis of the primary outcome of the extended follow-up study. Of the 81 studies included in the review, 36 (44%) reported additional eligibility restrictions and 24 (30%) consent processes for entry into extended follow-up. Data were collected at a median of 7 years after randomisation. Excluding 28 studies with a time-to-event primary outcome, 51/53 studies (96%) reported missing data on the primary outcome. The median percentage of randomised participants with complete data on the primary outcome was just 66% in these studies. The most common statistical approach to address missing data was complete-case analysis (51% of studies), while likelihood-based analyses were also well represented (25%). Sensitivity analyses around the missing data mechanism were rarely performed (25% of studies), and when they were, they often involved unrealistic assumptions about the mechanism. Despite missing data being a serious problem in extended follow-up studies, statistical approaches to addressing missing data were often inadequate. We recommend researchers clearly specify all sources of missing data in follow-up studies and use statistical methods that are valid under a plausible assumption about the missing data mechanism. Sensitivity analyses should also be undertaken to assess the robustness of findings to assumptions about the missing data mechanism.
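
    As a hedged illustration of the recommended practice, the sketch below contrasts a complete-case estimate with a simple delta-adjustment sensitivity analysis on synthetic trial data; the missingness rate and delta values are assumptions, not drawn from the reviewed studies.

    ```python
    # Complete-case estimate of a treatment effect, then a delta-adjustment
    # sensitivity analysis for a possibly non-random missingness mechanism.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 400
    treat = rng.integers(0, 2, n)                 # randomised arm
    y = 1.0 * treat + rng.normal(size=n)          # outcome with true effect 1.0
    missing = rng.random(n) < 0.34                # ~66% complete, as in the review
    y_obs = np.where(missing, np.nan, y)

    # Complete-case analysis: difference in arm means among completers
    cc = np.nanmean(y_obs[treat == 1]) - np.nanmean(y_obs[treat == 0])

    # Delta adjustment: assume missing outcomes are worse by delta on average
    for delta in (0.0, 0.5, 1.0):
        y_imp = np.where(missing, np.nanmean(y_obs) - delta, y_obs)
        est = y_imp[treat == 1].mean() - y_imp[treat == 0].mean()
        print(f"delta={delta}: effect estimate {est:.2f} (complete-case {cc:.2f})")
    ```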

  18. Scout trajectory error propagation computer program

    NASA Technical Reports Server (NTRS)

    Myler, T. R.

    1982-01-01

    Since 1969, flight experience has been used as the basis for predicting Scout orbital accuracy. The data used for calculating the accuracy consist of errors in the trajectory parameters (altitude, velocity, etc.) at stage burnout as observed on Scout flights. Approximately 50 sets of errors are used in a Monte Carlo analysis to generate error statistics in the trajectory parameters. A covariance matrix is formed which may be propagated in time. The mechanization of this process resulted in the computer program Scout Trajectory Error Propagation (STEP), which is described herein. STEP may be used in conjunction with the Statistical Orbital Analysis Routine to generate accuracy in the orbit parameters (apogee, perigee, inclination, etc.) based upon flight experience.
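
    A minimal sketch of the covariance workflow described here, assuming a toy two-state (altitude, velocity) linearisation in place of the actual Scout dynamics; the 50 error vectors are synthetic.

    ```python
    # Build a covariance matrix from ~50 observed burnout-error vectors, then
    # propagate it with a state transition matrix Phi: P(t) = Phi P(0) Phi^T.
    import numpy as np

    rng = np.random.default_rng(2)
    errors = rng.normal(size=(50, 2))        # 50 flights x (altitude, velocity) errors
    P0 = np.cov(errors, rowvar=False)        # sample covariance at burnout

    dt = 60.0                                # propagate 60 s
    Phi = np.array([[1.0, dt],               # altitude error grows with velocity error
                    [0.0, 1.0]])
    P_t = Phi @ P0 @ Phi.T
    print(P_t)
    ```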

  19. Deterministic Impulsive Vacuum Foundations for Quantum-Mechanical Wavefunctions

    NASA Astrophysics Data System (ADS)

    Valentine, John S.

    2013-09-01

    By assuming that a fermion de-constitutes immediately at its source, that its constituents, as bosons, propagate uniformly as scalar vacuum terms with phase (radial) symmetry, and that fermions are unique solutions for specific phase conditions, we find a model that self-quantizes matter from continuous waves, unifying boson and fermion ontologies in a single basis, in a constitution-invariant process. Vacuum energy has a wavefunction context, as a mass-energy term that enables wave collapse and increases its amplitude, with the gravitational field as the gradient of the flux density. Gravitational and charge-based force effects emerge as statistics without special treatment. Confinement, entanglement, vacuum statistics, forces, and wavefunction terms emerge from the model's deterministic foundations.

  20. Dynamic principle for ensemble control tools.

    PubMed

    Samoletov, A; Vasiev, B

    2017-11-28

    Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
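
    As one concrete instance of a stochastic thermostat covered by such principles, the sketch below implements a plain Langevin scheme (not the authors' proposed tools) and checks equipartition; units and parameters are arbitrary, with k_B = m = 1.

    ```python
    # Euler-Maruyama discretisation of Langevin dynamics in a harmonic
    # potential; the continuous-time scheme leaves the canonical measure
    # invariant, and <v^2> should approach the target temperature T.
    import numpy as np

    rng = np.random.default_rng(3)
    gamma, T, dt = 1.0, 0.5, 1e-3
    x, v = 1.0, 0.0
    samples = []
    for step in range(200_000):
        f = -x                                   # force from U(x) = x^2 / 2
        v += (f - gamma * v) * dt + np.sqrt(2 * gamma * T * dt) * rng.normal()
        x += v * dt
        samples.append(v)

    # Equipartition check after discarding a burn-in period
    print(np.mean(np.square(samples[10_000:])))  # ~0.5
    ```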

  1. Statistical mechanics of multipartite entanglement

    NASA Astrophysics Data System (ADS)

    Facchi, P.; Florio, G.; Marzolino, U.; Parisi, G.; Pascazio, S.

    2009-02-01

    We characterize the multipartite entanglement of a system of n qubits in terms of the distribution function of the bipartite purity over all balanced bipartitions. We search for those (maximally multipartite entangled) states whose purity is minimum for all bipartitions and recast this optimization problem into a problem of statistical mechanics.
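
    A minimal sketch of the purity computation underlying this characterization, assuming a random state and one balanced bipartition; it is an illustration, not the authors' optimization procedure.

    ```python
    # Purity of one balanced bipartition of a random n-qubit pure state.
    # Reshaping the state vector into a 2^(n/2) x 2^(n/2) matrix M gives the
    # reduced density matrix rho_A = M M^dagger, and purity = Tr(rho_A^2).
    import numpy as np

    rng = np.random.default_rng(4)
    n = 8                                         # even number of qubits
    psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
    psi /= np.linalg.norm(psi)

    M = psi.reshape(2**(n // 2), 2**(n // 2))     # split A | B
    rho_A = M @ M.conj().T
    purity = np.trace(rho_A @ rho_A).real
    print(purity)  # 1/2^(n/2) <= purity <= 1; smaller means more entangled
    ```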

  2. An Experimental Approach to Teaching and Learning Elementary Statistical Mechanics

    ERIC Educational Resources Information Center

    Ellis, Frank B.; Ellis, David C.

    2008-01-01

    Introductory statistical mechanics is studied for a simple two-state system using an inexpensive and easily built apparatus. A large variety of demonstrations, suitable for students in high school and introductory university chemistry courses, are possible. This article details demonstrations for exothermic and endothermic reactions, the dynamic…

  3. Orbital Roof Fractures as an Indicator for Concomitant Ocular Injury

    DTIC Science & Technology

    2017-11-12

    Comparisons between these two groups used Pearson's χ² test or Fisher's exact test to identify statistically significant differences in mechanism of injury, subjective symptoms, and CT and exam findings.

  4. Uniting statistical and individual-based approaches for animal movement modelling.

    PubMed

    Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel

    2014-01-01

    The dynamic nature of their internal states and the environment directly shape animals' spatial behaviours and give rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, due to the practical impossibility of accessing internal states through fieldwork and the inability of current statistical models to produce dynamic outputs. To address these issues, we developed a robust method that combines statistical and individual-based models (IBMs). Using a statistical technique for forward modelling of the IBM has the advantage of being faster to parameterize than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation and was tested under two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provide projections of future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems.

  5. Uniting Statistical and Individual-Based Approaches for Animal Movement Modelling

    PubMed Central

    Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel

    2014-01-01

    The dynamic nature of their internal states and the environment directly shape animals' spatial behaviours and give rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, due to the practical impossibility of accessing internal states through fieldwork and the inability of current statistical models to produce dynamic outputs. To address these issues, we developed a robust method that combines statistical and individual-based models (IBMs). Using a statistical technique for forward modelling of the IBM has the advantage of being faster to parameterize than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation and was tested under two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provide projections of future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems. PMID:24979047

  6. Discrete-element modeling of nacre-like materials: Effects of random microstructures on strain localization and mechanical performance

    NASA Astrophysics Data System (ADS)

    Abid, Najmul; Mirkhalaf, Mohammad; Barthelat, Francois

    2018-03-01

    Natural materials such as nacre, collagen, and spider silk are composed of staggered stiff and strong inclusions in a softer matrix. This type of hybrid microstructure results in remarkable combinations of stiffness, strength, and toughness and it now inspires novel classes of high-performance composites. However, the analytical and numerical approaches used to predict and optimize the mechanics of staggered composites often neglect statistical variations and inhomogeneities, which may have significant impacts on modulus, strength, and toughness. Here we present an analysis of localization using small representative volume elements (RVEs) and large-scale statistical volume elements (SVEs) based on the discrete element method (DEM). DEM is an efficient numerical method which enabled the evaluation of more than 10,000 microstructures in this study, each including about 5,000 inclusions. The models explore the combined effects of statistics, inclusion arrangement, and interface properties. We find that statistical variations have a negative effect on all properties, in particular on the ductility and energy absorption because randomness precipitates the localization of deformations. However, the results also show that the negative effects of random microstructures can be offset by interfaces with large strain at failure accompanied by strain hardening. More specifically, this quantitative study reveals an optimal range of interface properties where the interfaces are the most effective at delaying localization. These findings show how carefully designed interfaces in bioinspired staggered composites can offset the negative effects of microstructural randomness, which is inherent to most current fabrication methods.

  7. Nonequilibrium electromagnetics: Local and macroscopic fields and constitutive relationships

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker-Jarvis, James; Kabos, Pavel; Holloway, Christopher L.

    We study the electrodynamics of materials using a Liouville-Hamiltonian-based statistical-mechanical theory. Our goal is to develop electrodynamics from an ensemble-average viewpoint that is valid for microscopic and nonequilibrium systems at molecular to submolecular scales. This approach is not based on a Taylor series expansion of the charge density to obtain the multipoles. Instead, expressions of the molecular multipoles are used in an inverse problem to obtain the averaging statistical-density function that is used to obtain the macroscopic fields. The advantages of this method are that the averaging function is constructed in a self-consistent manner and the molecules can either be treated as point multipoles or contain more microstructure. Expressions for the local and macroscopic fields are obtained, and evolution equations for the constitutive parameters are developed. We derive equations for the local field as functions of the applied, polarization, magnetization, strain density, and macroscopic fields.

  8. Inter-occurrence times and universal laws in finance, earthquakes and genomes

    NASA Astrophysics Data System (ADS)

    Tsallis, Constantino

    2016-07-01

    A plethora of natural, artificial and social systems exist which do not belong to the Boltzmann-Gibbs (BG) statistical-mechanical world, based on the standard additive entropy $S_{BG}$ and its associated exponential BG factor. Frequent behaviors in such complex systems have been shown to be closely related to $q$-statistics instead, based on the nonadditive entropy $S_q$ (with $S_1=S_{BG}$), and its associated $q$-exponential factor which generalizes the usual BG one. In fact, a wide range of phenomena of quite different nature exist which can be described and, in the simplest cases, understood through analytic (and explicit) functions and probability distributions which exhibit some universal features. Universality classes are concomitantly observed which can be characterized through indices such as $q$. We will exhibit here some such cases, namely concerning the distribution of inter-occurrence (or inter-event) times in the areas of finance, earthquakes and genomes.
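
    For concreteness, a small sketch of the q-exponential factor mentioned above; the decay-scale parameter is arbitrary.

    ```python
    # The q-exponential that generalises the BG exponential:
    # e_q(x) = [1 + (1 - q) x]_+^(1/(1-q)), recovering exp(x) as q -> 1.
    # For q > 1 and negative arguments it decays as a power law, giving the
    # fat-tailed inter-occurrence distributions described above.
    import numpy as np

    def q_exponential(x, q):
        if np.isclose(q, 1.0):
            return np.exp(x)
        base = 1.0 + (1.0 - q) * np.asarray(x, dtype=float)
        return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)

    tau = np.linspace(0, 50, 6)
    for q in (1.0, 1.2, 1.5):
        print(q, q_exponential(-tau / 5.0, q))
    ```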

  9. Integrated Assessment and Improvement of the Quality Assurance System for the Cosworth Casting Process

    NASA Astrophysics Data System (ADS)

    Yousif, Dilon

    The purpose of this study was to improve the Quality Assurance (QA) system at the Nemak Windsor Aluminum Plant (WAP). The project used the Six Sigma method based on Define, Measure, Analyze, Improve, and Control (DMAIC). Analysis of the in-process melt at WAP was based on chemical, thermal, and mechanical testing. The control limits for the W319 Al alloy were statistically recalculated using the composition measured under stable conditions. The "Chemistry Viewer" software was developed for statistical analysis of alloy composition; it features the Silicon Equivalency (SiBQ) developed by the IRC. The Melt Sampling Device (MSD) was designed and evaluated at WAP to overcome traditional sampling limitations. The Thermal Analysis "Filters" software was developed for cooling-curve analysis of the 3XX Al alloy(s) using IRC techniques. The impact of low-melting-point impurities on the start of melting was evaluated using the Universal Metallurgical Simulator and Analyzer (UMSA).

  10. Automatic stage identification of Drosophila egg chamber based on DAPI images

    PubMed Central

    Jia, Dongyu; Xu, Qiuping; Xie, Qian; Mio, Washington; Deng, Wu-Min

    2016-01-01

    The Drosophila egg chamber, whose development is divided into 14 stages, is a well-established model for developmental biology. However, visual stage determination can be a tedious, subjective and time-consuming task prone to errors. Our study presents an objective, reliable and repeatable automated method for quantifying cell features and classifying egg chamber stages based on DAPI images. The proposed approach is composed of two steps: 1) a feature extraction step and 2) a statistical modeling step. The egg chamber features used are egg chamber size, oocyte size, egg chamber ratio and distribution of follicle cells. Methods for determining the onset of the polytene stage and centripetal migration are also discussed. The statistical model uses linear and ordinal regression to explore the stage-feature relationships and classify egg chamber stages. Combined with machine learning, our method has great potential to enable discovery of hidden developmental mechanisms. PMID:26732176
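
    As a hedged stand-in for the statistical modeling step (the paper uses linear and ordinal regression), the sketch below classifies stages with a multinomial logistic model on synthetic features shaped like those named above (egg chamber size, oocyte size and their ratio).

    ```python
    # Stage classification from morphometric features; data are synthetic
    # placeholders, not measurements from DAPI images.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(5)
    n = 300
    stage = rng.integers(1, 15, n)                  # stages 1..14
    chamber = 20 * stage + rng.normal(0, 5, n)      # chamber size grows with stage
    oocyte = 5 * stage + rng.normal(0, 2, n)        # oocyte size grows with stage
    X = np.column_stack([chamber, oocyte, oocyte / chamber])

    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    print(cross_val_score(clf, X, stage, cv=5).mean())   # mean accuracy
    ```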

  11. Mechanisms of ketamine on mice hippocampi shown by gas chromatography-mass spectrometry-based metabolomic analysis.

    PubMed

    Lian, Bin; Xia, Jinjun; Yang, Xun; Zhou, Chanjuan; Gong, Xue; Gui, Siwen; Mao, Qiang; Wang, Ling; Li, Pengfei; Huang, Cheng; Qi, Xunzhong; Xie, Peng

    2018-06-13

    In the present study, we used a gas chromatography-mass spectrometry-based metabolomics method to evaluate the effects of ketamine on mouse hippocampi. Multivariate statistical analysis and ingenuity pathway analysis were then used to identify and explore the potential mechanisms and biofunction of ketamine. Compared with the control (CON) group, 14 differential metabolites were identified, involving amino acid metabolism, energy metabolism, and oxidative stress metabolism. After combination with 2,3-dihydroxy-6-nitro-7-sulfamoyl-benzo[f]quinoxaline-2,3-dione (NBQX) administration, six of the 14 metabolites remained significantly differentially expressed between the ketamine (KET) and KET+NBQX groups, including glycine, alanine, glutamine, aspartic acid, myoinositol, and ascorbate, whereas no difference was found in the levels of the other eight metabolites between the KET and KET+NBQX groups, including phosphate, 4-aminobutyric acid, urea, creatine, L-malic acid, galactinol, inosine, and aminomalonic acid. Our findings indicate that ketamine exerts antidepressant effects through both an α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid inhibition-dependent mechanism and a mechanism not affected by α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid inhibition, which provides further insight into the therapeutic mechanisms of ketamine in the hippocampus.

  12. Performance evaluation of the machine learning algorithms used in inference mechanism of a medical decision support system.

    PubMed

    Bal, Mert; Amasyali, M Fatih; Sever, Hayri; Kose, Guven; Demirhan, Ayse

    2014-01-01

    Decision support systems increasingly support decision-making processes in cases of uncertainty or lack of information, and they are widely used in various fields such as engineering, finance, and medicine. Medical decision support systems help healthcare personnel to select the optimal method during the treatment of patients. Decision support systems are intelligent software systems that support decision makers in their decisions. The design of decision support systems consists of four main components: the inference mechanism, the knowledge base, the explanation module, and the active memory. The inference mechanism constitutes the basis of decision support systems. Various methods can be used in these mechanisms, including decision trees, artificial neural networks, statistical methods, and rule-based methods. In decision support systems, these methods can be used separately, combined in a hybrid system, or used as a combination of several methods. In this study, synthetic data sets with 10, 100, 1000, and 2000 records were produced to reflect the probabilities of the ALARM network. The accuracy of 11 machine learning methods for the inference mechanism of a medical decision support system is compared on these data sets.
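
    A minimal sketch of this kind of comparison, assuming scikit-learn classifiers and synthetic two-class data in place of the ALARM-network samples; the three methods shown are a small stand-in subset of the study's eleven.

    ```python
    # Cross-validated accuracy of several candidate inference-mechanism
    # algorithms on synthetic data sets of increasing size.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neural_network import MLPClassifier
    from sklearn.tree import DecisionTreeClassifier

    for n in (100, 1000, 2000):
        X, y = make_classification(n_samples=n, n_features=10, random_state=0)
        for name, clf in [("tree", DecisionTreeClassifier(random_state=0)),
                          ("neural net", MLPClassifier(max_iter=2000, random_state=0)),
                          ("naive Bayes", GaussianNB())]:
            acc = cross_val_score(clf, X, y, cv=5).mean()
            print(f"n={n:5d} {name:12s} accuracy={acc:.3f}")
    ```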

  13. Rigorous force field optimization principles based on statistical distance minimization

    DOE PAGES

    Vlcek, Lukas; Chialvo, Ariel A.

    2015-10-12

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model's static measurable properties to those of the target. Here we exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
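
    A minimal sketch of the statistical-distance idea under strong simplifications: one model parameter is tuned by minimising the arccosine of the Bhattacharyya coefficient between binned target and model distributions. The Gaussian target and model are placeholders for the reference and force-field ensembles, not the authors' formulation.

    ```python
    # Tune a model width sigma so its binned distribution is least
    # distinguishable from binned "target" samples.
    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import norm

    rng = np.random.default_rng(6)
    target = rng.normal(loc=1.0, scale=0.5, size=20_000)   # reference samples
    bins = np.linspace(-2, 4, 61)
    p, _ = np.histogram(target, bins=bins, density=True)
    p = p / p.sum()

    def statistical_distance(sigma):
        q = np.diff(norm.cdf(bins, loc=1.0, scale=sigma))  # model bin probabilities
        q = q / q.sum()
        bc = np.sum(np.sqrt(p * q))                        # Bhattacharyya coefficient
        return np.arccos(np.clip(bc, 0.0, 1.0))

    res = minimize_scalar(statistical_distance, bounds=(0.1, 2.0), method="bounded")
    print(res.x)   # recovers a sigma close to the target's 0.5
    ```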

  14. Interactions between statistical and semantic information in infant language development

    PubMed Central

    Lany, Jill; Saffran, Jenny R.

    2013-01-01

    Infants can use statistical regularities to form rudimentary word categories (e.g. noun, verb), and to learn the meanings common to words from those categories. Using an artificial language methodology, we probed the mechanisms by which two types of statistical cues (distributional and phonological regularities) affect word learning. Because linking distributional cues versus phonological information to semantics makes different computational demands on learners, we also tested whether their use is related to language proficiency. We found that 22-month-old infants with smaller vocabularies generalized using phonological cues; however, infants with larger vocabularies showed the opposite pattern of results, generalizing based on distributional cues. These findings suggest that both phonological and distributional cues marking word categories promote early word learning. Moreover, while correlations between these cues are important to forming word categories, we found that infants' weighting of these cues in subsequent word-learning tasks changes over the course of early language development. PMID:21884336

  15. [Micro-simulation of firms' heterogeneity on pollution intensity and regional characteristics].

    PubMed

    Zhao, Nan; Liu, Yi; Chen, Ji-Ning

    2009-11-01

    In the same industrial sector, heterogeneity of pollution intensity exists among firms. Errors arise if a sector's average pollution intensity, calculated from the limited number of firms in the environmental statistics database, is used to represent the sector's regional economic-environmental status. Based on a production function that includes environmental depletion as an input, a micro-simulation model of firms' operational decision making is proposed, with which the heterogeneity of firms' pollution intensity can be described mechanistically. Taking the machinery manufacturing sector of Deyang City in 2005 as the case, the model's parameters were estimated, and the actual COD emission intensities of the firms in the environmental statistics database were properly matched by the simulation. The model's results also show that the regional average COD emission intensity calculated from the environmental statistics firms (0.002 6 t per 10 000 yuan fixed asset, 0.001 5 t per 10 000 yuan production value) is lower than the regional average intensity calculated from all the firms in the region (0.003 0 t per 10 000 yuan fixed asset, 0.002 3 t per 10 000 yuan production value). The differences among average intensities in the six counties are significant as well. These regional characteristics of pollution intensity are attributable to the sector's internal structure (firms' scale distribution, technology distribution) and its spatial variation.

  16. A novel complete-case analysis to determine statistical significance between treatments in an intention-to-treat population of randomized clinical trials involving missing data.

    PubMed

    Liu, Wei; Ding, Jinhui

    2018-04-01

    The application of the principle of the intention-to-treat (ITT) to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because there are complicated reactions in the human body to drugs due to the presence of complex biological networks, leading to data missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.

  17. A statistical framework for applying RNA profiling to chemical hazard detection.

    PubMed

    Kostich, Mitchell S

    2017-12-01

    Use of 'omics technologies in environmental science is expanding. However, application is mostly restricted to characterizing molecular steps leading from toxicant interaction with molecular receptors to apical endpoints in laboratory species. Use in environmental decision-making is limited, due to difficulty in elucidating mechanisms in sufficient detail to make quantitative outcome predictions in any single species or in extending predictions to aquatic communities. Here we introduce a mechanism-agnostic statistical approach, supplementing mechanistic investigation by allowing probabilistic outcome prediction even when understanding of molecular pathways is limited, and facilitating extrapolation from results in laboratory test species to predictions about aquatic communities. We use concepts familiar to environmental managers, supplemented with techniques employed for clinical interpretation of 'omics-based biomedical tests. We describe the framework in step-wise fashion, beginning with single test replicates of a single RNA variant, then extending to multi-gene RNA profiling, collections of test replicates, and integration of complementary data. In order to simplify the presentation, we focus on using RNA profiling for distinguishing presence versus absence of chemical hazards, but the principles discussed can be extended to other types of 'omics measurements, multi-class problems, and regression. We include a supplemental file demonstrating many of the concepts using the open source R statistical package.

  18. A DFT and semiempirical model-based study of opioid receptor affinity and selectivity in a group of molecules with a morphine structural core.

    PubMed

    Bruna-Larenas, Tamara; Gómez-Jeria, Juan S

    2012-01-01

    We report the results of a search for model-based relationships between mu, delta, and kappa opioid receptor binding affinity and molecular structure for a group of molecules having in common a morphine structural core. The wave functions and local reactivity indices were obtained at the ZINDO/1 and B3LYP/6-31G(∗∗) levels of theory for comparison. New developments in the expression for the drug-receptor interaction energy allowed several local atomic reactivity indices to be included, such as local electronic chemical potential, local hardness, and local electrophilicity. These indices, together with a new proposal for the ordering of the independent variables, were incorporated in the statistical study. We found and discussed several statistically significant relationships for mu, delta, and kappa opioid receptor binding affinity at both levels of theory. Some of the new local reactivity indices incorporated in the theory appear in several equations for the first time in the history of model-based equations. Interaction pharmacophores were generated for mu, delta, and kappa receptors. We discuss possible differences regulating binding and selectivity in opioid receptor subtypes. This study, in contrast to purely statistical ones, provides microscopic insight into the mechanisms involved in the binding process.

  19. STEP-TRAMM - A modeling interface for simulating localized rainfall induced shallow landslides and debris flow runout pathways

    NASA Astrophysics Data System (ADS)

    Or, D.; von Ruette, J.; Lehmann, P.

    2017-12-01

    Landslides and subsequent debris flows initiated by rainfall represent a common natural hazard in mountainous regions. We integrated a landslide hydro-mechanical triggering model with a simple model for debris-flow runout pathways and developed a graphical user interface (GUI) to represent these natural hazards at catchment scale at any location. The STEP-TRAMM GUI provides process-based estimates of the initiation locations and sizes of landslide patterns based on digital elevation models (SRTM) linked with high-resolution global soil maps (SoilGrids, 250 m resolution) and satellite-based information on rainfall statistics for the selected region. In the preprocessing phase the STEP-TRAMM model estimates soil depth distribution to supplement other soil information for delineating key hydrological and mechanical properties relevant to representing local soil failure. We illustrate this publicly available GUI and modeling platform by simulating effects of deforestation on landslide hazards in several regions and comparing model outcomes with satellite-based information.

  20. Quantum mechanics of black holes.

    PubMed

    Witten, Edward

    2012-08-03

    The popular conception of black holes reflects the behavior of the massive black holes found by astronomers and described by classical general relativity. These objects swallow up whatever comes near and emit nothing. Physicists who have tried to understand the behavior of black holes from a quantum mechanical point of view, however, have arrived at quite a different picture. The difference is analogous to the difference between thermodynamics and statistical mechanics. The thermodynamic description is a good approximation for a macroscopic system, but statistical mechanics describes what one will see if one looks more closely.

  1. In vivo serial MRI-based models and statistical methods to quantify sensitivity and specificity of mechanical predictors for carotid plaque rupture: location and beyond.

    PubMed

    Wu, Zheyang; Yang, Chun; Tang, Dalin

    2011-06-01

    It has been hypothesized that mechanical risk factors may be used to predict future atherosclerotic plaque rupture. Truly predictive methods for plaque rupture and methods to identify the best predictor(s) from all the candidates are lacking in the literature. A novel combination of computational and statistical models based on serial magnetic resonance imaging (MRI) was introduced to quantify sensitivity and specificity of mechanical predictors to identify the best candidate for plaque rupture site prediction. Serial in vivo MRI data of carotid plaque from one patient was acquired with follow-up scan showing ulceration. 3D computational fluid-structure interaction (FSI) models using both baseline and follow-up data were constructed and plaque wall stress (PWS) and strain (PWSn) and flow maximum shear stress (FSS) were extracted from all 600 matched nodal points (100 points per matched slice, baseline matching follow-up) on the lumen surface for analysis. Each of the 600 points was marked "ulcer" or "nonulcer" using follow-up scan. Predictive statistical models for each of the seven combinations of PWS, PWSn, and FSS were trained using the follow-up data and applied to the baseline data to assess their sensitivity and specificity using the 600 data points for ulcer predictions. Sensitivity of prediction is defined as the proportion of the true positive outcomes that are predicted to be positive. Specificity of prediction is defined as the proportion of the true negative outcomes that are correctly predicted to be negative. Using probability 0.3 as a threshold to infer ulcer occurrence at the prediction stage, the combination of PWS and PWSn provided the best predictive accuracy with (sensitivity, specificity) = (0.97, 0.958). Sensitivity and specificity given by PWS, PWSn, and FSS individually were (0.788, 0.968), (0.515, 0.968), and (0.758, 0.928), respectively. The proposed computational-statistical process provides a novel method and a framework to assess the sensitivity and specificity of various risk indicators and offers the potential to identify the optimized predictor for plaque rupture using serial MRI with follow-up scan showing ulceration as the gold standard for method validation. While serial MRI data with actual rupture are hard to acquire, this single-case study suggests that combination of multiple predictors may provide potential improvement to existing plaque assessment schemes. With large-scale patient studies, this predictive modeling process may provide more solid ground for rupture predictor selection strategies and methods for image-based plaque vulnerability assessment.
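
    A minimal sketch of the sensitivity/specificity bookkeeping defined above, using hypothetical predicted probabilities for the 600 matched points and the study's 0.3 threshold.

    ```python
    # Sensitivity = true positives / all positives;
    # specificity = true negatives / all negatives.
    import numpy as np

    rng = np.random.default_rng(7)
    truth = rng.random(600) < 0.1                  # "ulcer" labels from follow-up
    prob = np.where(truth, rng.beta(5, 2, 600),    # higher probabilities at ulcer points
                           rng.beta(1, 8, 600))
    pred = prob > 0.3                              # the study's probability threshold

    tp = np.sum(pred & truth)
    tn = np.sum(~pred & ~truth)
    sensitivity = tp / truth.sum()
    specificity = tn / (~truth).sum()
    print(f"sensitivity={sensitivity:.3f}, specificity={specificity:.3f}")
    ```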

  2. A Revelation: Quantum-Statistics and Classical-Statistics are Analytic-Geometry Conic-Sections and Numbers/Functions: Euler, Riemann, Bernoulli Generating-Functions: Conics to Numbers/Functions Deep Subtle Connections

    NASA Astrophysics Data System (ADS)

    Descartes, R.; Rota, G.-C.; Euler, L.; Bernoulli, J. D.; Siegel, Edward Carl-Ludwig

    2011-03-01

    Quantum-statistics dichotomy: Fermi-Dirac (FDQS) versus Bose-Einstein (BEQS), respectively with contact-repulsion/non-condensation (FDCR) versus attraction/condensation (BEC), are manifestly demonstrated by Taylor-expansion ONLY of their denominator exponential, identified BOTH as Descartes analytic-geometry conic-sections: FDQS as Ellipse (homotopy to rectangle FDQS distribution-function), VIA Maxwell-Boltzmann classical-statistics (MBCS) to Parabola MORPHISM, VS. BEQS to Hyperbola, Archimedes' HYPERBOLICITY INEVITABILITY, and as well generating-functions [Abramowitz-Stegun, Handbook of Mathematical Functions, p. 804], respectively of Euler-numbers/functions (via Riemann zeta-function domination of quantum-statistics: [Pathria, Statistical-Mechanics; Huang, Statistical-Mechanics]) VS. Bernoulli-numbers/functions. Much can be learned about statistical-physics from Euler-numbers/functions via Riemann zeta-function(s) VS. Bernoulli-numbers/functions [Conway-Guy, Book of Numbers], and about Euler-numbers/functions, via Riemann zeta-function(s) MORPHISM, VS. Bernoulli-numbers/functions, vice versa. Ex.: Riemann-hypothesis PHYSICS proof PARTLY as BEQS BEC/BEA.

  3. Order statistics inference for describing topological coupling and mechanical symmetry breaking in multidomain proteins

    NASA Astrophysics Data System (ADS)

    Kononova, Olga; Jones, Lee; Barsegov, V.

    2013-09-01

    Cooperativity is a hallmark of proteins, many of which show a modular architecture comprising discrete structural domains. Detecting and describing dynamic couplings between structural regions is difficult in view of the many-body nature of protein-protein interactions. By utilizing GPU-based computational acceleration, we carried out simulations of forced protein unfolding for the WW-WW dimer of all-β-sheet WW domains, used as a model multidomain protein. We found that while the physically non-interacting identical protein domains (WW) show nearly symmetric mechanical properties at low tension, reflected, e.g., in the similarity of their distributions of unfolding times, these properties become distinctly different when tension is increased. Moreover, the uncorrelated unfolding transitions at a low pulling force become increasingly more correlated (dependent) at higher forces. Hence, the applied force not only breaks "the mechanical symmetry" but also couples the physically non-interacting protein domains forming a multidomain protein; we call this effect "topological coupling." We developed a new theory, inspired by order statistics, to characterize protein-protein interactions in multidomain proteins. The method utilizes the squared-Gaussian model, but it can also be used in conjunction with other parametric models for the distribution of unfolding times. The formalism can be taken to the single-molecule experimental lab to probe mechanical cooperativity and domain communication in multidomain proteins.

  4. Cortical mechanisms for the segregation and representation of acoustic textures.

    PubMed

    Overath, Tobias; Kumar, Sukhbinder; Stewart, Lauren; von Kriegstein, Katharina; Cusack, Rhodri; Rees, Adrian; Griffiths, Timothy D

    2010-02-10

    Auditory object analysis requires two fundamental perceptual processes: the definition of the boundaries between objects, and the abstraction and maintenance of an object's characteristic features. Although it is intuitive to assume that the detection of the discontinuities at an object's boundaries precedes the subsequent precise representation of the object, the specific underlying cortical mechanisms for segregating and representing auditory objects within the auditory scene are unknown. We investigated the cortical bases of these two processes for one type of auditory object, an "acoustic texture," composed of multiple frequency-modulated ramps. In these stimuli, we independently manipulated the statistical rules governing (1) the frequency-time space within individual textures (comprising ramps with a given spectrotemporal coherence) and (2) the boundaries between textures (adjacent textures with different spectrotemporal coherences). Using functional magnetic resonance imaging, we show mechanisms defining boundaries between textures with different coherences in primary and association auditory cortices, whereas texture coherence is represented only in association cortex. Furthermore, participants' superior detection of boundaries across which texture coherence increased (as opposed to decreased) was reflected in a greater neural response in auditory association cortex at these boundaries. The results suggest a hierarchical mechanism for processing acoustic textures that is relevant to auditory object analysis: boundaries between objects are first detected as a change in statistical rules over frequency-time space, before a representation that corresponds to the characteristics of the perceived object is formed.

  5. Neuroimaging in epilepsy.

    PubMed

    Sidhu, Meneka Kaur; Duncan, John S; Sander, Josemir W

    2018-05-17

    Epilepsy neuroimaging is important for detecting the seizure onset zone, predicting and preventing deficits from surgery and illuminating mechanisms of epileptogenesis. An aspiration is to integrate imaging and genetic biomarkers to enable personalized epilepsy treatments. The ability to detect lesions, particularly focal cortical dysplasia and hippocampal sclerosis, is increased using ultra-high-field imaging and postprocessing techniques such as automated volumetry, T2 relaxometry, voxel-based morphometry and surface-based techniques. Statistical analysis of PET and single photon emission computed tomography (STATISCOM) is superior to qualitative analysis alone in identifying focal abnormalities in MRI-negative patients. These methods have also been used to study mechanisms of epileptogenesis and pharmacoresistance. Recent language fMRI studies aim to localize, and also lateralize, language functions. Memory fMRI has been recommended to lateralize mnemonic function and predict outcome after surgery in temporal lobe epilepsy. Combinations of structural, functional and post-processing methods have been used in multimodal and machine learning models to improve the identification of the seizure onset zone and increase understanding of mechanisms underlying structural and functional aberrations in epilepsy.

  6. Prediction of the dollar to the ruble rate. A system-theoretic approach

    NASA Astrophysics Data System (ADS)

    Borodachev, Sergey M.

    2017-07-01

    We propose a simple state-space model of dollar rate formation based on changes in oil prices and mechanisms of money transfer between the monetary and stock markets. A comparison of predictions from an input-output model and the state-space model is made. We conclude that, with proper use of statistical data (via the Kalman filter), the second approach provides more adequate predictions of the dollar rate.
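
    A minimal scalar Kalman filter sketch consistent with such a state-space formulation; the coefficients, noise levels and oil-price inputs are illustrative assumptions, not the paper's estimates.

    ```python
    # Latent state x_t (the "fair" rate) driven by an oil-price input u_t,
    # observed as a noisy market rate y_t; the filter alternates predict
    # and update steps.
    import numpy as np

    rng = np.random.default_rng(8)
    a, b, q, r = 0.98, -0.3, 0.05, 0.2       # dynamics, oil sensitivity, noises
    u = rng.normal(size=100)                 # oil-price changes (placeholder)
    x_true, y = 60.0, []
    for t in range(100):
        x_true = a * x_true + b * u[t] + rng.normal(0, np.sqrt(q))
        y.append(x_true + rng.normal(0, np.sqrt(r)))

    x_hat, P = 60.0, 1.0                     # filter state and variance
    for t in range(100):
        x_hat, P = a * x_hat + b * u[t], a * a * P + q      # predict
        K = P / (P + r)                                     # Kalman gain
        x_hat, P = x_hat + K * (y[t] - x_hat), (1 - K) * P  # update
    print(x_hat, x_true)                     # estimate tracks the latent rate
    ```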

  7. Response of noctilucent cloud brightness to daily solar variations

    NASA Astrophysics Data System (ADS)

    Dalin, P.; Pertsev, N.; Perminov, V.; Dubietis, A.; Zadorozhny, A.; Zalcik, M.; McEachran, I.; McEwan, T.; Černis, K.; Grønne, J.; Taustrup, T.; Hansen, O.; Andersen, H.; Melnikov, D.; Manevich, A.; Romejko, V.; Lifatova, D.

    2018-04-01

    For the first time, long-term data sets of ground-based observations of noctilucent clouds (NLC) around the globe have been analyzed in order to investigate a response of NLC to solar UV irradiance variability on a day-to-day scale. NLC brightness has been considered versus variations of solar Lyman-alpha flux. We have found that day-to-day solar variability, whose effect is generally masked in the natural NLC variability, has a statistically significant effect when considering large statistics for more than ten years. Average increase in day-to-day solar Lyman-α flux results in average decrease in day-to-day NLC brightness that can be explained by robust physical mechanisms taking place in the summer mesosphere. Average time lags between variations of Lyman-α flux and NLC brightness are short (0-3 days), suggesting a dominant role of direct solar heating and of the dynamical mechanism compared to photodissociation of water vapor by solar Lyman-α flux. All found regularities are consistent between various ground-based NLC data sets collected at different locations around the globe and for various time intervals. Signatures of a 27-day periodicity seem to be present in the NLC brightness for individual summertime intervals; however, this oscillation cannot be unambiguously retrieved due to inevitable periods of tropospheric cloudiness.

  8. Learning-dependent plasticity with and without training in the human brain.

    PubMed

    Zhang, Jiaxiang; Kourtzi, Zoe

    2010-07-27

    Long-term experience through development and evolution and shorter-term training in adulthood have both been suggested to contribute to the optimization of visual functions that mediate our ability to interpret complex scenes. However, the brain plasticity mechanisms that mediate the detection of objects in cluttered scenes remain largely unknown. Here, we combine behavioral and functional MRI (fMRI) measurements to investigate the human-brain mechanisms that mediate our ability to learn statistical regularities and detect targets in clutter. We show two different routes to visual learning in clutter with discrete brain plasticity signatures. Specifically, opportunistic learning of regularities typical in natural contours (i.e., collinearity) can occur simply through frequent exposure, generalize across untrained stimulus features, and shape processing in occipitotemporal regions implicated in the representation of global forms. In contrast, learning to integrate discontinuities (i.e., elements orthogonal to contour paths) requires task-specific training (bootstrap-based learning), is stimulus-dependent, and enhances processing in intraparietal regions implicated in attention-gated learning. We propose that long-term experience with statistical regularities may facilitate opportunistic learning of collinear contours, whereas learning to integrate discontinuities entails bootstrap-based training for the detection of contours in clutter. These findings provide insights in understanding how long-term experience and short-term training interact to shape the optimization of visual recognition processes.

  9. Evaluation of high-resolution sea ice models on the basis of statistical and scaling properties of Arctic sea ice drift and deformation

    NASA Astrophysics Data System (ADS)

    Girard, L.; Weiss, J.; Molines, J. M.; Barnier, B.; Bouillon, S.

    2009-08-01

    Sea ice drift and deformation from models are evaluated on the basis of statistical and scaling properties. These properties are derived from two observation data sets: the RADARSAT Geophysical Processor System (RGPS) and buoy trajectories from the International Arctic Buoy Program (IABP). Two simulations obtained with the Louvain-la-Neuve Ice Model (LIM) coupled to a high-resolution ocean model and a simulation obtained with the Los Alamos Sea Ice Model (CICE) were analyzed. Model ice drift compares well with observations in terms of large-scale velocity field and distributions of velocity fluctuations although a significant bias on the mean ice speed is noted. On the other hand, the statistical properties of ice deformation are not well simulated by the models: (1) The distributions of strain rates are incorrect: RGPS distributions of strain rates are power law tailed, i.e., exhibit "wild randomness," whereas models distributions remain in the Gaussian attraction basin, i.e., exhibit "mild randomness." (2) The models are unable to reproduce the spatial and temporal correlations of the deformation fields: In the observations, ice deformation follows spatial and temporal scaling laws that express the heterogeneity and the intermittency of deformation. These relations do not appear in simulated ice deformation. Mean deformation in models is almost scale independent. The statistical properties of ice deformation are a signature of the ice mechanical behavior. The present work therefore suggests that the mechanical framework currently used by models is inappropriate. A different modeling framework based on elastic interactions could improve the representation of the statistical and scaling properties of ice deformation.
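
    As a sketch of the "wild versus mild randomness" diagnostic described above, the code below compares the tail of a synthetic power-law sample with a Gaussian one via the complementary CDF on log-log axes; a straight-line tail indicates power-law behaviour. Both samples are synthetic stand-ins for strain rates.

    ```python
    # Crude tail-exponent estimate from a log-log fit of the empirical CCDF.
    import numpy as np

    rng = np.random.default_rng(9)
    gaussian = np.abs(rng.normal(size=100_000))       # "model-like" rates
    power_law = rng.pareto(2.5, size=100_000) + 1.0   # "RGPS-like" rates

    for name, x in (("gaussian", gaussian), ("power-law", power_law)):
        xs = np.sort(x)
        ccdf = 1.0 - np.arange(1, len(xs) + 1) / len(xs)
        mask = (xs > np.quantile(xs, 0.9)) & (ccdf > 0)   # fit over the top decile
        slope = np.polyfit(np.log(xs[mask]), np.log(ccdf[mask]), 1)[0]
        print(f"{name}: tail slope {slope:.2f}")  # ~-2.5 for the power law
    ```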

  10. Statistical learning using real-world scenes: extracting categorical regularities without conscious intent.

    PubMed

    Brady, Timothy F; Oliva, Aude

    2008-07-01

    Recent work has shown that observers can parse streams of syllables, tones, or visual shapes and learn statistical regularities in them without conscious intent (e.g., learn that A is always followed by B). Here, we demonstrate that these statistical-learning mechanisms can operate at an abstract, conceptual level. In Experiments 1 and 2, observers incidentally learned which semantic categories of natural scenes covaried (e.g., kitchen scenes were always followed by forest scenes). In Experiments 3 and 4, category learning with images of scenes transferred to words that represented the categories. In each experiment, the category of the scenes was irrelevant to the task. Together, these results suggest that statistical-learning mechanisms can operate at a categorical level, enabling generalization of learned regularities using existing conceptual knowledge. Such mechanisms may guide learning in domains as disparate as the acquisition of causal knowledge and the development of cognitive maps from environmental exploration.

  11. Implicit Statistical Learning and Language Skills in Bilingual Children

    ERIC Educational Resources Information Center

    Yim, Dongsun; Rudoy, John

    2013-01-01

    Purpose: Implicit statistical learning in 2 nonlinguistic domains (visual and auditory) was used to investigate (a) whether linguistic experience influences the underlying learning mechanism and (b) whether there are modality constraints in predicting implicit statistical learning with age and language skills. Method: Implicit statistical learning…

  12. Reverse engineering systems models of regulation: discovery, prediction and mechanisms.

    PubMed

    Ashworth, Justin; Wurtmann, Elisabeth J; Baliga, Nitin S

    2012-08-01

    Biological systems can now be understood in comprehensive and quantitative detail using systems biology approaches. Putative genome-scale models can be built rapidly based upon biological inventories and strategic system-wide molecular measurements. Current models combine statistical associations, causative abstractions, and known molecular mechanisms to explain and predict quantitative and complex phenotypes. This top-down 'reverse engineering' approach generates useful organism-scale models despite noise and incompleteness in data and knowledge. Here we review and discuss the reverse engineering of biological systems using top-down data-driven approaches, in order to improve discovery, hypothesis generation, and the inference of biological properties.

  13. Infinite-mode squeezed coherent states and non-equilibrium statistical mechanics (phase-space-picture approach)

    NASA Technical Reports Server (NTRS)

    Yeh, Leehwa

    1993-01-01

    The phase-space-picture approach to quantum non-equilibrium statistical mechanics via the characteristic function of infinite-mode squeezed coherent states is introduced. We use quantum Brownian motion as an example to show how this approach provides an interesting geometrical interpretation of quantum non-equilibrium phenomena.

  14. Hydrologic controls on aperiodic spatial organization of the ridge-slough patterned landscape

    NASA Astrophysics Data System (ADS)

    Casey, Stephen T.; Cohen, Matthew J.; Acharya, Subodh; Kaplan, David A.; Jawitz, James W.

    2016-11-01

    A century of hydrologic modification has altered the physical and biological drivers of landscape processes in the Everglades (Florida, USA). Restoring the ridge-slough patterned landscape, a dominant feature of the historical system, is a priority but requires an understanding of pattern genesis and degradation mechanisms. Physical experiments to evaluate alternative pattern formation mechanisms are limited by the long timescales of peat accumulation and loss, necessitating model-based comparisons, where support for a particular mechanism is based on model replication of extant patterning and trajectories of degradation. However, multiple mechanisms yield a central feature of ridge-slough patterning (patch elongation in the direction of historical flow), limiting the utility of that characteristic for discriminating among alternatives. Using data from vegetation maps, we investigated the statistical features of ridge-slough spatial patterning (ridge density, patch perimeter, elongation, patch size distributions, and spatial periodicity) to establish more rigorous criteria for evaluating model performance and to inform controls on pattern variation across the contemporary system. Mean water depth explained significant variation in ridge density, total perimeter, and length:width ratios, illustrating an important pattern response to existing hydrologic gradients. Two independent analyses (2-D periodograms and patch size distributions) provide strong evidence against regular patterning, with the landscape exhibiting neither a characteristic wavelength nor a characteristic patch size, both of which are expected under conditions that produce regular patterns. Rather, landscape properties suggest robust scale-free patterning, indicating genesis from the coupled effects of local facilitation and a global negative feedback operating uniformly at the landscape scale. Critically, this challenges widespread invocation of scale-dependent negative feedbacks for explaining ridge-slough pattern origins. These results help discern among genesis mechanisms and provide an improved statistical description of the landscape that can be used to compare among model outputs, as well as to assess the success of future restoration projects.
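
    A radially averaged 2-D periodogram is one of the two diagnostics mentioned above. The sketch below shows the basic computation on a synthetic binary map; the array size and random field are placeholders for the rasterized vegetation data. A regular (periodic) pattern would produce a clear interior peak in the radial spectrum, whereas scale-free patterning yields a smooth, roughly power-law decay with no preferred wavelength.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic binary landscape (1 = ridge, 0 = slough); a real analysis would
# use the rasterized vegetation map instead of random noise.
field = (rng.random((256, 256)) < 0.5).astype(float)

# 2-D periodogram: squared magnitude of the centered Fourier transform.
F = np.fft.fftshift(np.fft.fft2(field - field.mean()))
power = np.abs(F) ** 2

# Radial average of the spectrum. A characteristic wavelength (regular
# pattern) shows up as an interior peak; scale-free patterns decay smoothly.
ny, nx = power.shape
ky, kx = np.indices(power.shape)
r = np.hypot(ky - ny // 2, kx - nx // 2).astype(int)
sums = np.bincount(r.ravel(), weights=power.ravel())
counts = np.bincount(r.ravel())
radial = sums[: ny // 2] / counts[: ny // 2]   # keep well-sampled radii only

peak = 1 + np.argmax(radial[1:])               # skip the zero-frequency bin
print("dominant radial wavenumber:", peak)
print("peak prominence vs. spectrum mean:", radial[peak] / radial[1:].mean())
```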

  15. Quantum Mechanics From the Cradle?

    ERIC Educational Resources Information Center

    Martin, John L.

    1974-01-01

    States that the major problem in learning quantum mechanics is often the student's ignorance of classical mechanics and that one conceptual hurdle in quantum mechanics is its statistical nature, in contrast to the determinism of classical mechanics. (MLH)

  16. Identifying reprioritization response shift in a stroke caregiver population: a comparison of missing data methods.

    PubMed

    Sajobi, Tolulope T; Lix, Lisa M; Singh, Gurbakhshash; Lowerison, Mark; Engbers, Jordan; Mayo, Nancy E

    2015-03-01

    Response shift (RS) is an important phenomenon that influences the assessment of longitudinal changes in health-related quality of life (HRQOL) studies. Given that RS effects are often small, missing data due to attrition or item non-response can contribute to failure to detect RS effects. Since missing data are often encountered in longitudinal HRQOL data, effective strategies to deal with missing data are important to consider. This study aims to compare different imputation methods in the detection of reprioritization RS in the HRQOL of caregivers of stroke survivors. Data were from a Canadian multi-center longitudinal study of caregivers of stroke survivors over a one-year period. The Stroke Impact Scale physical function score at baseline, with a cutoff of 75, was used to measure patient stroke severity for the reprioritization RS analysis. Mean imputation, likelihood-based expectation-maximization (EM) imputation, and multiple imputation methods were compared in test procedures based on changes in relative importance weights to detect RS in SF-36 domains over a 6-month period. Monte Carlo simulation methods were used to compare the statistical power of relative importance test procedures for detecting RS in incomplete longitudinal data under different missing data mechanisms and imputation methods. Of the 409 caregivers, 15.9% and 31.3% had missing data at baseline and 6 months, respectively. There were no statistically significant changes in relative importance weights on any of the domains when complete-case analysis was adopted, but statistically significant changes were detected on the physical functioning and/or vitality domains when mean imputation or EM imputation was adopted. There were also statistically significant changes in relative importance weights for the physical functioning, mental health, and vitality domains when the multiple imputation method was adopted. Our simulations revealed that relative importance test procedures were least powerful under the complete-case analysis method and most powerful when a mean imputation or multiple imputation method was adopted for missing data, regardless of the missing data mechanism and proportion of missing data. Test procedures based on relative importance measures are sensitive to the type and amount of missing data and to the imputation method. Relative importance test procedures based on mean imputation and multiple imputation are recommended for detecting RS in incomplete data.
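
    As a minimal illustration of why the choice of imputation method matters here, the sketch below imposes missing-at-random follow-up data on a toy longitudinal score and contrasts single mean imputation with multiple imputation pooled by Rubin's rules; all variable names and distributions are hypothetical, not the SF-36 data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 409  # cohort size from the study, used here only to scale the toy data

# Toy longitudinal HRQOL score: correlated baseline and 6-month follow-up.
base = rng.normal(50, 10, n)
follow = 0.8 * base + rng.normal(10, 6, n)      # true mean of follow is 50

# MAR missingness: follow-up more likely missing for low baseline scores.
p_miss = 1 / (1 + np.exp((base - 40) / 5))
miss = rng.random(n) < p_miss
follow_obs = np.where(miss, np.nan, follow)

# (a) Mean imputation: one constant fill, inherits the MAR selection bias.
mean_imp = np.where(miss, np.nanmean(follow_obs), follow_obs)

# (b) Multiple imputation: draw from the regression of follow on base,
# then pool the m completed-data estimates with Rubin's rules.
ok = ~miss
slope, intercept, *_ = stats.linregress(base[ok], follow_obs[ok])
resid_sd = np.std(follow_obs[ok] - (intercept + slope * base[ok]), ddof=2)

m = 20
means, variances = [], []
for _ in range(m):
    completed = follow_obs.copy()
    completed[miss] = (intercept + slope * base[miss]
                       + rng.normal(0, resid_sd, miss.sum()))
    means.append(completed.mean())
    variances.append(completed.var(ddof=1) / n)   # variance of the mean

qbar = np.mean(means)                        # pooled estimate
within, between = np.mean(variances), np.var(means, ddof=1)
total_var = within + (1 + 1 / m) * between   # Rubin's total variance
print(f"mean imputation estimate:     {mean_imp.mean():.2f}")
print(f"multiple imputation estimate: {qbar:.2f} +/- {np.sqrt(total_var):.2f}")
```

    With this data-generating model, mean imputation stays biased toward the observed (healthier) cases, while the pooled multiple-imputation estimate recovers the true mean with an honest standard error.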

  17. Impact of Injury Mechanisms on Patterns and Management of Facial Fractures.

    PubMed

    Greathouse, S Travis; Adkinson, Joshua M; Garza, Ramon; Gilstrap, Jarom; Miller, Nathan F; Eid, Sherrine M; Murphy, Robert X

    2015-07-01

    Mechanisms causing facial fractures have evolved over time and may be predictive of the types of injuries sustained. The objective of this study is to examine the impact of mechanisms of injury on the type and management of facial fractures at our Level 1 Trauma Center. The authors performed an Institutional Review Board-approved review of our network's trauma registry from 2006 to 2010, documenting age, sex, mechanism, Injury Severity Score, Glasgow Coma Scale, facial fracture patterns (nasal, maxillary/malar, orbital, mandible), and reconstructions. Mechanism rates were compared using a Pearson χ2 test. The database identified 23,318 patients, including 1686 patients with facial fractures and a subset of 1505 patients sustaining 2094 fractures by motor vehicle collision (MVC), fall, or assault. Nasal fractures were the most common injuries sustained by all mechanisms. MVCs were most likely to cause nasal and malar/maxillary fractures (P < 0.01). Falls were the least likely and assaults the most likely to cause mandible fractures (P < 0.001), the most common injury leading to surgical intervention (P < 0.001). Although not statistically significant, fractures sustained in MVCs were the most likely overall to undergo surgical intervention. Age, number of fractures, and alcohol level were statistically significant variables associated with operative management. Age and number of fractures sustained were associated with operative intervention. Although there is a statistically significant correlation between mechanism of injury and type of facial fracture sustained, none of the mechanisms evaluated herein are statistically associated with surgical intervention. Clinical Question/Level of Evidence: Therapeutic, III.

  18. Retrocausal Effects As A Consequence of Orthodox Quantum Mechanics Refined To Accommodate The Principle Of Sufficient Reason

    NASA Astrophysics Data System (ADS)

    Stapp, Henry P.

    2011-11-01

    The principle of sufficient reason asserts that anything that happens does so for a reason: no definite state of affairs can come into being unless there is a sufficient reason why that particular thing should happen. This principle is usually attributed to Leibniz, although the first recorded Western philosopher to use it was Anaximander of Miletus. The demand that nature be rational, in the sense that it be compatible with the principle of sufficient reason, conflicts with a basic feature of contemporary orthodox physical theory, namely the notion that nature's response to the probing action of an observer is determined by pure chance, and hence on the basis of absolutely no reason at all. This appeal to pure chance can be deemed to have no rational fundamental place in reason-based Western science. It is argued here, on the basis of the other basic principles of quantum physics, that in a world that conforms to the principle of sufficient reason, the usual quantum statistical rules will naturally emerge at the pragmatic level, in cases where the reason behind nature's choice of response is unknown, but that the usual statistics can become biased in an empirically manifest way when the reason for the choice is empirically identifiable. It is shown here that if the statistical laws of quantum mechanics were to be biased in this way then the basically forward-in-time unfolding of empirical reality described by orthodox quantum mechanics would generate the appearances of backward-time-effects of the kind that have been reported in the scientific literature.

  19. A statistical model for Windstorm Variability over the British Isles based on Large-scale Atmospheric and Oceanic Mechanisms

    NASA Astrophysics Data System (ADS)

    Kirchner-Bossi, Nicolas; Befort, Daniel J.; Wild, Simon B.; Ulbrich, Uwe; Leckebusch, Gregor C.

    2016-04-01

    Time-clustered winter storms are responsible for a majority of the wind-induced losses in Europe. In recent years, different large-scale atmospheric and oceanic mechanisms, such as the North Atlantic Oscillation (NAO) or the Meridional Overturning Circulation (MOC), have been shown to drive a significant portion of the windstorm variability over Europe. In this work we systematically investigate the influence of different large-scale natural variability modes: more than 20 indices related to mechanisms with proven or potential influence on windstorm frequency variability over Europe - mostly SST- or pressure-based - are derived from the ECMWF ERA-20C reanalysis for the last century (1902-2009) and compared to the windstorm variability for the European winter (DJF). Windstorms are defined and tracked as in Leckebusch et al. (2008). The derived indices are then employed in a statistical procedure combining a stepwise Multiple Linear Regression (MLR) and an Artificial Neural Network (ANN), aiming to hindcast the inter-annual (DJF) regional windstorm frequency variability in a case study for the British Isles. This case study reveals 13 indices with a statistically significant coupling to seasonal windstorm counts. The Scandinavian Pattern (SCA) shows the strongest correlation (0.61), followed by the NAO (0.48) and the Polar/Eurasia Pattern (0.46). The standard-normalised indices are selected as predictors for a windstorm variability hindcast model applied to the British Isles. First, a stepwise linear regression is performed to identify which mechanisms best explain windstorm variability. The indices retained by the stepwise regression are then used to develop a multilayer-perceptron ANN that hindcasts seasonal windstorm frequency and clustering. Eight indices (SCA, NAO, EA, PDO, W.NAtl.SST, AMO (unsmoothed), EA/WR and Trop.N.Atl SST) are retained by the stepwise regression. Among them, SCA shows the highest linear coefficient, followed by SST in the western Atlantic, AMO and NAO. The explanatory regression model (considering all time steps) yields a coefficient of determination (R^2) of 0.75. A predictive version of the linear model using leave-one-out cross-validation (LOOCV) shows an R^2 of 0.56 and a relative RMSE of 4.67 counts/season. An ANN-based nonlinear hindcast model for seasonal windstorm frequency is developed with the aim of improving on the stepwise hindcast skill and thus better predicting a time-clustered season over the case study. With a 7-node hidden layer, the LOOCV procedure yields an R^2 of 0.71; in comparison to the stepwise MLR, the RMSE is reduced by 20%. This work shows that, for the British Isles case study, most of the interannual variability can be explained by certain large-scale mechanisms when nonlinear effects (ANN) are also considered. This makes it possible to distinguish a time-clustered season from a non-clustered one - a key issue for applications in, e.g., the (re)insurance industry.
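
    A minimal sketch of the linear half of such a pipeline (forward stepwise predictor selection followed by leave-one-out cross-validation) is given below, with synthetic stand-ins for the climate indices and storm counts; the ANN stage and the actual index definitions are omitted, and the stopping criterion is a simplification of the authors' procedure.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(3)

# Toy stand-ins: 108 winters (1902-2009) of standard-normalised climate
# indices and seasonal windstorm counts; real inputs would be SCA, NAO, etc.
n_years, n_indices = 108, 20
X = rng.normal(size=(n_years, n_indices))
y = 10 + 3 * X[:, 0] + 2 * X[:, 1] + rng.normal(0, 2, n_years)  # counts-like

# Forward stepwise selection on in-sample R^2 gain (simplified criterion).
selected, remaining, best_r2 = [], list(range(n_indices)), 0.0
while remaining:
    gains = []
    for j in remaining:
        Xj = X[:, selected + [j]]
        gains.append((LinearRegression().fit(Xj, y).score(Xj, y), j))
    r2, j = max(gains)
    if r2 - best_r2 < 0.01:        # stop when the gain is negligible
        break
    selected.append(j); remaining.remove(j); best_r2 = r2

# Leave-one-out cross-validated skill of the selected linear model.
pred = cross_val_predict(LinearRegression(), X[:, selected], y, cv=LeaveOneOut())
r2_loocv = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("selected indices:", selected, " LOOCV R^2:", round(r2_loocv, 2))
```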

  20. Tsallis statistics and neurodegenerative disorders

    NASA Astrophysics Data System (ADS)

    Iliopoulos, Aggelos C.; Tsolaki, Magdalini; Aifantis, Elias C.

    2016-08-01

    In this paper, we perform statistical analysis of time series deriving from four neurodegenerative disorders, namely epilepsy, amyotrophic lateral sclerosis (ALS), Parkinson's disease (PD), and Huntington's disease (HD). The time series are concerned with electroencephalograms (EEGs) of healthy and epileptic states, as well as gait dynamics (in particular stride intervals) of ALS, PD and HD patients. We study data concerning one subject for each neurodegenerative disorder and one healthy control. The analysis is based on Tsallis non-extensive statistical mechanics, and in particular on the estimation of the Tsallis q-triplet, namely {qstat, qsen, qrel}. The deviation of the Tsallis q-triplet from unity indicates non-Gaussian statistics and long-range dependencies for all time series considered. In addition, the results reveal the efficiency of Tsallis statistics in capturing differences in brain dynamics between healthy and epileptic states, as well as differences of ALS, PD and HD patients from healthy control subjects. The results indicate that estimates of the Tsallis q-indices could be used as possible biomarkers, along with others, for improving classification and prediction of epileptic seizures, as well as for studying the complex gait dynamics of various diseases, providing new insights into severity, medications and fall risk, and improving therapeutic interventions.

  1. Grid indentation analysis of mechanical properties of composite electrodes in Li-ion batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasconcelos, Luize Scalco de; Xu, Rong; Li, Jianlin

    We report that electrodes in commercial rechargeable batteries are microscopically heterogeneous materials. The constituent components, including active materials, polymeric binders, and porous conductive matrix, often have large variation in their mechanical properties, making the mechanical characterization of composite electrodes a challenging task. In a model system of LiNi0.5Mn0.3Co0.2O2 cathode, we employ instrumented grid indentation to determine the elastic modulus and hardness of the constituent phases. The approach relies on a large array of nanoindentation experiments and statistical analysis of the resulting data, provided that the maximum indentation depth is carefully chosen. The statistically extracted properties of the active particles and the surrounding medium are in good agreement with tests of targeted indentation at selected sites. Lastly, the combinatory technique of grid indentation and statistical deconvolution represents a fast and reliable route to quantify the mechanical properties of composite electrodes, which feed the parametric input for mechanics models.

  2. Grid indentation analysis of mechanical properties of composite electrodes in Li-ion batteries

    DOE PAGES

    Vasconcelos, Luize Scalco de; Xu, Rong; Li, Jianlin; ...

    2016-03-09

    We report that electrodes in commercial rechargeable batteries are microscopically heterogeneous materials. The constituent components, including active materials, polymeric binders, and porous conductive matrix, often have large variation in their mechanical properties, making the mechanical characterization of composite electrodes a challenging task. In a model system of LiNi0.5Mn0.3Co0.2O2 cathode, we employ instrumented grid indentation to determine the elastic modulus and hardness of the constituent phases. The approach relies on a large array of nanoindentation experiments and statistical analysis of the resulting data, provided that the maximum indentation depth is carefully chosen. The statistically extracted properties of the active particles and the surrounding medium are in good agreement with tests of targeted indentation at selected sites. Lastly, the combinatory technique of grid indentation and statistical deconvolution represents a fast and reliable route to quantify the mechanical properties of composite electrodes, which feed the parametric input for mechanics models.

  3. Turking Statistics: Student-Generated Surveys Increase Student Engagement and Performance

    ERIC Educational Resources Information Center

    Whitley, Cameron T.; Dietz, Thomas

    2018-01-01

    Thirty years ago, Hubert M. Blalock Jr. published an article in "Teaching Sociology" about the importance of teaching statistics. We honor Blalock's legacy by assessing how using Amazon Mechanical Turk (MTurk) in statistics classes can enhance student learning and increase statistical literacy among social science graduate students. In…

  4. Determining Functional Reliability of Pyrotechnic Mechanical Devices

    NASA Technical Reports Server (NTRS)

    Bement, Laurence J.; Multhaup, Herbert A.

    1997-01-01

    This paper describes a new approach for evaluating mechanical performance and predicting the mechanical functional reliability of pyrotechnic devices. Not included are other possible failure modes, such as the initiation of the pyrotechnic energy source. The generally accepted go/no-go statistical approach, which requires hundreds or thousands of consecutive successful tests on identical components for reliability predictions, routinely ignores the physics of failure. The approach described in this paper begins with measuring, understanding and controlling mechanical performance variables. Then, the energy required to accomplish the function is compared to that delivered by the pyrotechnic energy source to determine the mechanical functional margin. Finally, the data collected in establishing functional margin are analyzed to predict mechanical functional reliability, using small-sample statistics. A careful application of this approach can provide considerable cost savings and insight compared with go/no-go statistics. Performance and the effects of variables can be defined, and reliability predictions can be made by evaluating 20 or fewer units. The application of this approach to a pin puller used on a successful NASA mission is provided as an example.
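
    The core computation, comparing delivered to required energy and converting the margin into a small-sample reliability estimate, can be sketched as follows; the energy values are hypothetical, and the lower bound uses a rough normal-theory approximation rather than the formal tolerance-limit treatment a real qualification program would use.

```python
import numpy as np
from scipy import stats

# Hypothetical small-sample test data (mJ): energy delivered by the
# pyrotechnic source vs. energy required to accomplish the function.
delivered = np.array([118., 121., 115., 120., 119., 117., 122., 116.])
required  = np.array([110., 108., 112., 109., 111., 107., 113., 110.])

# Functional margin, treating delivered - required as roughly normal.
mu = delivered.mean() - required.mean()
sd = np.sqrt(delivered.var(ddof=1) + required.var(ddof=1))
k = mu / sd                       # margin in standard-deviation units
print(f"functional margin: {mu:.1f} mJ ({k:.2f} sigma)")

# Point estimate of functional reliability under the normal assumption,
print("reliability (point):", stats.norm.cdf(k))

# and a crude 95% lower bound that penalizes the small sample size
# (delta-method approximation; a one-sided tolerance limit would be the
# rigorous small-sample treatment).
n = len(delivered)
k_low = k - 1.645 * np.sqrt(2 / n + k**2 / (4 * n))
print("reliability (95% lower bound):", stats.norm.cdf(k_low))
```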

  5. An adaptive optimal control for smart structures based on the subspace tracking identification technique

    NASA Astrophysics Data System (ADS)

    Ripamonti, Francesco; Resta, Ferruccio; Borroni, Massimo; Cazzulani, Gabriele

    2014-04-01

    A new method for the real-time identification of mechanical system modal parameters is used to design different adaptive control logics aimed at reducing vibrations in a carbon fiber plate smart structure. The plate is instrumented with three piezoelectric actuators, three accelerometers and three strain gauges. The real-time identification is based on a recursive subspace tracking algorithm whose outputs are processed by an ARMA model. A statistical approach is finally applied to choose the correct values of the modal parameters. These are given as input to model-based control logics such as a gain-scheduling and an adaptive LQR control.

  6. Detwinning through migration of twin boundaries in nanotwinned Cu films under in situ ion irradiation

    PubMed Central

    Du, Jinlong; Wu, Zaoming; Fu, Engang; Liang, Yanxiang; Wang, Xingjun; Wang, Peipei; Yu, Kaiyuan; Ding, Xiangdong; Li, Meimei; Kirk, Marquis

    2018-01-01

    The mechanism of radiation-induced detwinning is different from that of deformation detwinning, as the former is dominated by supersaturated radiation-induced defects while the latter is usually triggered by global stress. In situ Kr ion irradiation was performed to study the detwinning mechanism of nanotwinned Cu films with various twin thicknesses. Two types of incoherent twin boundaries (ITBs), so-called fixed ITBs and free ITBs, are characterized based on their structural features, and the difference in their migration behavior is investigated. It is observed that detwinning during radiation is attributed to the frequent migration of free ITBs, while the migration of fixed ITBs is absent. Statistical analysis shows that the migration distance of free ITBs depends on twin thickness and dose. Potential migration mechanisms are discussed. PMID:29535796

  7. Detwinning through migration of twin boundaries in nanotwinned Cu films under in situ ion irradiation.

    PubMed

    Du, Jinlong; Wu, Zaoming; Fu, Engang; Liang, Yanxiang; Wang, Xingjun; Wang, Peipei; Yu, Kaiyuan; Ding, Xiangdong; Li, Meimei; Kirk, Marquis

    2018-01-01

    The mechanism of radiation-induced detwinning is different from that of deformation detwinning, as the former is dominated by supersaturated radiation-induced defects while the latter is usually triggered by global stress. In situ Kr ion irradiation was performed to study the detwinning mechanism of nanotwinned Cu films with various twin thicknesses. Two types of incoherent twin boundaries (ITBs), so-called fixed ITBs and free ITBs, are characterized based on their structural features, and the difference in their migration behavior is investigated. It is observed that detwinning during radiation is attributed to the frequent migration of free ITBs, while the migration of fixed ITBs is absent. Statistical analysis shows that the migration distance of free ITBs depends on twin thickness and dose. Potential migration mechanisms are discussed.

  8. A thermomechanical constitutive model for cemented granular materials with quantifiable internal variables. Part I-Theory

    NASA Astrophysics Data System (ADS)

    Tengattini, Alessandro; Das, Arghya; Nguyen, Giang D.; Viggiani, Gioacchino; Hall, Stephen A.; Einav, Itai

    2014-10-01

    This is the first of two papers introducing a novel thermomechanical continuum constitutive model for cemented granular materials. Here, we establish the theoretical foundations of the model, and highlight its novelties. At the limit of no cement, the model is fully consistent with the original Breakage Mechanics model. An essential ingredient of the model is the use of measurable and micro-mechanics based internal variables, describing the evolution of the dominant inelastic processes. This imposes a link between the macroscopic mechanical behavior and the statistically averaged evolution of the microstructure. As a consequence this model requires only a few physically identifiable parameters, including those of the original breakage model and new ones describing the cement: its volume fraction, its critical damage energy and bulk stiffness, and the cohesion.

  9. Does children's energy intake at one meal influence their intake at subsequent meals? Or do we just think it does?

    PubMed

    Hanley, James A; Hutcheon, Jennifer A

    2010-05-01

    It is widely believed that young children are able to adjust their energy intake across successive meals to compensate for higher or lower intakes at a given meal. This conclusion is based on past observations that although children's intake at individual meals is highly variable, total daily intakes are relatively constant. We investigated how much of this reduction in variability could be explained by the statistical phenomenon of the variability of individual components (each meal) always being relatively larger than the variability of their sum (total daily intake), independent of any physiological compensatory mechanism. We calculated, theoretically and by simulation, how variable a child's daily intake would be if there was no correlation between intakes at individual meals. We simulated groups of children with meal/snack intakes and variability in meal/snack intakes based on previously published values. Most importantly, we assumed that there was no correlation between intakes on successive meals. In both approaches, the coefficient of variation of the daily intakes was roughly 15%, considerably less than the 34% for individual meals. Thus, most of the reduction in variability found in past studies was explained without positing strong 'compensation'. Although children's daily energy intakes are indeed considerably less variable than their individual components, this phenomenon was observed even when intakes at each meal were simulated to be totally independent. We conclude that the commonly held belief that young children have a strong physiological compensatory mechanism to adjust intake at one meal based on intake at prior meals is likely to be based on flawed statistical reasoning.
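
    The statistical point is easy to reproduce: summing independent components shrinks the relative variability by roughly the square root of the number of meals. A minimal simulation, with illustrative meal sizes and the ~34% per-meal coefficient of variation cited above, is sketched below.

```python
import numpy as np

rng = np.random.default_rng(4)

# Five eating occasions per day; each meal's intake varies independently
# with a coefficient of variation (CV) of ~34%, as reported for children.
mean_meal = np.array([350., 150., 450., 150., 400.])  # kcal, illustrative
cv_meal = 0.34
days = 100_000

meals = rng.normal(mean_meal, cv_meal * mean_meal, size=(days, 5))
daily = meals.sum(axis=1)

print("meal CV: ", round(np.mean(meals.std(axis=0) / meals.mean(axis=0)), 3))
print("daily CV:", round(daily.std() / daily.mean(), 3))
# The daily CV falls to ~0.16 here (roughly cv_meal / sqrt(5)) even though
# the meals are statistically independent -- no compensation is involved.
```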

  10. Reward-Guided Learning with and without Causal Attribution

    PubMed Central

    Jocham, Gerhard; Brodersen, Kay H.; Constantinescu, Alexandra O.; Kahn, Martin C.; Ianni, Angela M.; Walton, Mark E.; Rushworth, Matthew F.S.; Behrens, Timothy E.J.

    2016-01-01

    When an organism receives a reward, it is crucial to know which of many candidate actions caused this reward. However, recent work suggests that learning is possible even when this most fundamental assumption is not met. We used novel reward-guided learning paradigms in two fMRI studies to show that humans deploy separable learning mechanisms that operate in parallel. While behavior was dominated by precise contingent learning, it also revealed hallmarks of noncontingent learning strategies. These learning mechanisms were separable behaviorally and neurally. Lateral orbitofrontal cortex supported contingent learning and reflected contingencies between outcomes and their causal choices. Amygdala responses around reward times related to statistical patterns of learning. Time-based heuristic mechanisms were related to activity in sensorimotor corticostriatal circuitry. Our data point to the existence of several learning mechanisms in the human brain, of which only one relies on applying known rules about the causal structure of the task. PMID:26971947

  11. Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model

    NASA Astrophysics Data System (ADS)

    Yuan, Zhongda; Deng, Junxiang; Wang, Dawei

    2018-02-01

    An aero-engine is a complex mechanical and electronic system, and in the reliability analysis of such systems the Weibull distribution model plays an irreplaceable role. To date, only the two-parameter and three-parameter Weibull distribution models are widely used. Because of the diversity of engine failure modes, a single Weibull distribution model carries a large error; by contrast, a mixed Weibull distribution model can take a variety of engine failure modes into account, making it a good statistical analysis model. Besides introducing the concept of a dynamic weight coefficient, a three-parameter correlation-coefficient optimization method is applied to enhance the Weibull distribution model and make the reliability estimates more accurate, so that the precision of the mixed-distribution reliability model is greatly improved. All of this is advantageous for popularizing the mixed Weibull distribution model in engineering applications.
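
    A minimal sketch of fitting a two-component mixed Weibull model by maximum likelihood is given below; the failure-time data, component parameters and optimizer settings are illustrative, and the dynamic-weight and correlation-coefficient refinements described in the abstract are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Hypothetical failure times from two failure modes (early failures vs.
# wear-out), which a single Weibull cannot capture well.
t = np.concatenate([
    weibull_min.rvs(0.8, scale=200, size=150, random_state=1),   # early failures
    weibull_min.rvs(3.5, scale=1200, size=350, random_state=2),  # wear-out
])

def neg_log_lik(params):
    """Two-component mixed Weibull: weight w, shapes k1,k2, scales s1,s2."""
    w = 1 / (1 + np.exp(-params[0]))        # keep the weight in (0, 1)
    k1, s1, k2, s2 = np.exp(params[1:])     # keep shapes/scales positive
    pdf = (w * weibull_min.pdf(t, k1, scale=s1)
           + (1 - w) * weibull_min.pdf(t, k2, scale=s2))
    return -np.sum(np.log(pdf + 1e-300))

res = minimize(neg_log_lik,
               x0=[0.0, 0.0, np.log(300), 0.0, np.log(1000)],
               method="Nelder-Mead", options={"maxiter": 5000})
w = 1 / (1 + np.exp(-res.x[0]))
k1, s1, k2, s2 = np.exp(res.x[1:])
print(f"weight={w:.2f}  mode1: k={k1:.2f}, scale={s1:.0f}"
      f"  mode2: k={k2:.2f}, scale={s2:.0f}")
```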

  12. Autonomous Congestion Control in Delay-Tolerant Networks

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott C.; Jennings, Esther H.

    2005-01-01

    Congestion control is an important feature that directly affects network performance. Network congestion may cause loss of data or long delays. Although this problem has been studied extensively in the Internet, the solutions for Internet congestion control do not apply readily to challenged network environments such as Delay Tolerant Networks (DTN) where end-to-end connectivity may not exist continuously and latency can be high. In DTN, end-to-end rate control is not feasible. This calls for congestion control mechanisms where the decisions can be made autonomously with local information only. We use an economic pricing model and propose a rule-based congestion control mechanism where each router can autonomously decide on whether to accept a bundle (data) based on local information such as available storage and the value and risk of accepting the bundle (derived from historical statistics).
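
    A toy version of such a local accept/reject rule is sketched below; the RouterState fields, the risk formula and all numbers are hypothetical illustrations of "value versus risk from local statistics", not the actual pricing mechanism proposed by the authors.

```python
from dataclasses import dataclass

@dataclass
class RouterState:
    """Local information available to a DTN router (all fields hypothetical)."""
    storage_free: float        # bytes of free bundle storage
    storage_total: float
    mean_delivery_gain: float  # historical 'value' of delivered bundles
    mean_storage_cost: float   # historical cost of holding one byte

def accept_bundle(state: RouterState, bundle_size: float,
                  bundle_value: float) -> bool:
    """Rule-based accept/reject decision using only local statistics.

    The bundle is accepted when its expected value exceeds the risk-adjusted
    cost of storing it, where the risk grows as free storage shrinks.
    """
    if bundle_size > state.storage_free:
        return False                       # no room at all
    utilisation = 1.0 - state.storage_free / state.storage_total
    risk = state.mean_storage_cost / max(1e-9, 1.0 - utilisation)
    return bundle_value * state.mean_delivery_gain > risk * bundle_size

state = RouterState(storage_free=4e6, storage_total=16e6,
                    mean_delivery_gain=1.0, mean_storage_cost=1e-7)
print(accept_bundle(state, bundle_size=2e5, bundle_value=0.8))  # True here
```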

  13. Physics of base-pairing dynamics in DNA

    NASA Astrophysics Data System (ADS)

    Manghi, Manoel; Destainville, Nicolas

    2016-05-01

    As a key molecule of life, Deoxyribo-Nucleic Acid (DNA) is the focus of numerous investigations with the help of biological, chemical and physical techniques. From a physical point of view, both experimental and theoretical works have brought quantitative insights into DNA base-pairing dynamics, which we review in this Report, putting emphasis on theoretical developments. We discuss the dynamics at the base-pair scale and its pivotal coupling with the polymer one, with a polymerization index running from a few nucleotides to tens of kilo-bases. This includes opening and closure of short hairpins and oligomers as well as zipping and unwinding of long macromolecules. We review how different physical mechanisms are either used by Nature or utilized in biotechnological processes to separate the two intertwined DNA strands, with emphasis on quantitative results. They range from thermally assisted denaturation-bubble nucleation to force- or torque-driven mechanisms. We show that the helical character of the molecule, possibly supercoiled, can play a key role in many denaturation and renaturation processes. We categorize the mechanisms according to the relative timescales associated with base-pairing and with chain orientational degrees of freedom such as bending and torsional elastic ones. In some specific situations, these chain orientational degrees of freedom can be integrated out and the quasi-static approximation is valid; the complex dynamics then reduces to diffusion in a low-dimensional free-energy landscape. In contrast, some important cases of experimental interest necessarily appeal to far-from-equilibrium statistical mechanics and hydrodynamics.

  14. Repairability of CAD/CAM high-density PMMA- and composite-based polymers.

    PubMed

    Wiegand, Annette; Stucki, Lukas; Hoffmann, Robin; Attin, Thomas; Stawarczyk, Bogna

    2015-11-01

    The study aimed to analyse the shear bond strength of computer-aided design and computer-aided manufacturing (CAD/CAM) polymethyl methacrylate (PMMA)- and composite-based polymer materials repaired with a conventional methacrylate-based composite after different surface pretreatments. Forty-eight specimens each were prepared from six different CAD/CAM polymer materials (Ambarino high-class, artBloc Temp, CAD-Temp, Lava Ultimate, Telio CAD, Everest C-Temp) and a conventional dimethacrylate-based composite (Filtek Supreme XTE, control) and aged by thermal cycling (5000 cycles, 5-55 °C). The surfaces were left untreated or were pretreated by mechanical roughening, aluminium oxide air abrasion or silica coating/silanization (each subgroup n = 12). The surfaces were further conditioned with an etch&rinse adhesive (OptiBond FL) before the repair composite (Filtek Supreme XTE) was adhered to the surface. After further thermal cycling, shear bond strength was tested, and failure modes were assessed. Shear bond strength was statistically analysed by two- and one-way ANOVAs and Weibull statistics, and failure mode by a χ2 test (p ≤ 0.05). Shear bond strength was highest for silica coating/silanization > aluminium oxide air abrasion = mechanical roughening > no surface pretreatment. Independently of the repair pretreatment, the highest bond strength values were observed in the control group and for the composite-based Everest C-Temp and Ambarino high-class, while PMMA-based materials (artBloc Temp, CAD-Temp and Telio CAD) presented significantly lower values. For all materials, repair without any surface pretreatment resulted in adhesive failures only, which were mostly reduced when surface pretreatment was performed. Repair of CAD/CAM high-density polymers requires surface pretreatment prior to adhesive and composite application. However, four out of six of the tested CAD/CAM materials did not achieve the repair bond strength of a conventional dimethacrylate-based composite. Repair of PMMA- and composite-based polymers can be achieved by surface pretreatment followed by application of an adhesive and a conventional methacrylate-based composite.

  15. Molecular representation of molar domain (volume), evolution equations, and linear constitutive relations for volume transport.

    PubMed

    Eu, Byung Chan

    2008-09-07

    In the traditional theories of irreversible thermodynamics and fluid mechanics, the specific volume and molar volume have been interchangeably used for pure fluids, but in this work we show that they should be distinguished from each other and given distinctive statistical mechanical representations. In this paper, we present a general formula for the statistical mechanical representation of molecular domain (volume or space) by using the Voronoi volume and its mean value that may be regarded as molar domain (volume) and also the statistical mechanical representation of volume flux. By using their statistical mechanical formulas, the evolution equations of volume transport are derived from the generalized Boltzmann equation of fluids. Approximate solutions of the evolution equations of volume transport provide kinetic theory formulas for the molecular domain, the constitutive equations for molar domain (volume) and volume flux, and the dissipation of energy associated with volume transport. Together with the constitutive equation for the mean velocity of the fluid obtained in a previous paper, the evolution equations for volume transport not only shed a fresh light on, and insight into, irreversible phenomena in fluids but also can be applied to study fluid flow problems in a manner hitherto unavailable in fluid dynamics and irreversible thermodynamics. Their roles in the generalized hydrodynamics will be considered in the sequel.

  16. Material Phase Causality or a Dynamics-Statistical Interpretation of Quantum Mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koprinkov, I. G.

    2010-11-25

    The internal phase dynamics of a quantum system interacting with an electromagnetic field is revealed in detail. Theoretical and experimental evidence of a causal relation of the phase of the wave function to the dynamics of the quantum system is presented systematically for the first time. A dynamics-statistical interpretation of quantum mechanics is introduced.

  17. Non-equilibrium dog-flea model

    NASA Astrophysics Data System (ADS)

    Ackerson, Bruce J.

    2017-11-01

    We develop the open dog-flea model to serve as a check of proposed non-equilibrium theories of statistical mechanics. The model is developed in detail. Then it is applied to four recent models for non-equilibrium statistical mechanics. Comparison of the dog-flea solution with these different models allows checking claims and giving a concrete example of the theoretical models.
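
    For readers unfamiliar with the model: in the classic closed dog-flea (Ehrenfest) problem, N fleas hop at random between two dogs, giving a simple, exactly analyzable relaxation to equilibrium. A minimal simulation of the closed case is sketched below; the paper's open variant, which also exchanges fleas with the surroundings, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(6)

# Closed dog-flea (Ehrenfest) model: N fleas on two dogs; at each step one
# flea, chosen uniformly at random, jumps to the other dog.
N, steps = 50, 20_000
n_a = N  # start far from equilibrium: all fleas on dog A

trajectory = []
for _ in range(steps):
    if rng.random() < n_a / N:
        n_a -= 1   # the chosen flea sat on dog A and jumps to B
    else:
        n_a += 1   # it sat on dog B and jumps to A
    trajectory.append(n_a)

late = trajectory[steps // 2:]
print("late-time mean:", np.mean(late))  # relaxes to N/2 = 25
print("late-time std: ", np.std(late))   # equilibrium fluctuations ~ sqrt(N)/2
```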

  18. Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems

    NASA Astrophysics Data System (ADS)

    Gogolin, Christian; Eisert, Jens

    2016-05-01

    We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.

  19. Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems.

    PubMed

    Gogolin, Christian; Eisert, Jens

    2016-05-01

    We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.

  20. Domain generality vs. modality specificity: The paradox of statistical learning

    PubMed Central

    Frost, Ram; Armstrong, Blair C.; Siegelman, Noam; Christiansen, Morten H.

    2015-01-01

    Statistical learning is typically considered to be a domain-general mechanism by which cognitive systems discover the underlying distributional properties of the input. Recent studies examining whether there are commonalities in the learning of distributional information across different domains or modalities consistently reveal, however, modality and stimulus specificity. An important question is, therefore, how and why a hypothesized domain-general learning mechanism systematically produces such effects. We offer a theoretical framework according to which statistical learning is not a unitary mechanism, but a set of domain-general computational principles, that operate in different modalities and therefore are subject to the specific constraints characteristic of their respective brain regions. This framework offers testable predictions and we discuss its computational and neurobiological plausibility. PMID:25631249

  1. Songs as an aid for language acquisition.

    PubMed

    Schön, Daniele; Boyer, Maud; Moreno, Sylvain; Besson, Mireille; Peretz, Isabelle; Kolinsky, Régine

    2008-02-01

    In previous research, Saffran and colleagues [Saffran, J. R., Aslin, R. N., & Newport, E. L. (1996). Statistical learning by 8-month-old infants. Science, 274, 1926-1928; Saffran, J. R., Newport, E. L., & Aslin, R. N. (1996). Word segmentation: The role of distributional cues. Journal of Memory and Language, 35, 606-621] have shown that adults and infants can use the statistical properties of syllable sequences to extract words from continuous speech. They also showed that a similar learning mechanism operates with musical stimuli [Saffran, J. R., Johnson, E. K., Aslin, R. N., & Newport, E. L. (1999). Statistical learning of tone sequences by human infants and adults. Cognition, 70, 27-52]. In this work we combined linguistic and musical information and compared language learning based on speech sequences to language learning based on sung sequences. We hypothesized that, compared to speech sequences, a consistent mapping of linguistic and musical information would enhance learning. Results confirmed the hypothesis, showing a strong learning facilitation for song compared to speech. Most importantly, the present results show that learning a new language, especially in the first learning phase wherein one needs to segment new words, may largely benefit from the motivational and structuring properties of music in song.

  2. Fractal planetary rings: Energy inequalities and random field model

    NASA Astrophysics Data System (ADS)

    Malyarenko, Anatoliy; Ostoja-Starzewski, Martin

    2017-12-01

    This study is motivated by a recent observation, based on photographs from the Cassini mission, that Saturn's rings have a fractal structure in the radial direction. Accordingly, two questions are considered: (1) What Newtonian-mechanics argument in support of such a fractal structure of planetary rings is possible? (2) What kinematics model of such fractal rings can be formulated? Both challenges are based on taking the planetary rings' spatial structure as being statistically stationary in time and statistically isotropic in space, but statistically nonstationary in space. An answer to the first challenge is given through an energy analysis of circular rings having a self-generated, noninteger-dimensional mass distribution [V. E. Tarasov, Int. J. Mod. Phys. B 19, 4103 (2005)]. The second issue is approached by taking the random field of the angular velocity vector of a rotating particle of the ring as a random section of a special vector bundle. Using the theory of group representations, we prove that such a field is completely determined by a sequence of continuous positive-definite matrix-valued functions defined on the Cartesian square F^2 of the radial cross-section F of the rings, where F is a fat fractal.

  3. [Evaluation of occupational medicine service tasks in the context of the Occupational Medicine Service Act, article 12, on the basis of statistical indicators in the Pomorskie voivodship].

    PubMed

    Parszuto, Jacek; Jaremin, Bogdan; Tukalska-Parszuto, Maria

    2009-01-01

    Occupational health service is based on legal regulations. We attempted to assess the implementation of the tasks resulting from article 12 of the Occupational Medicine Service Act introduced in 1998. In this paper we analyzed statistical data concerning the number of prophylactic health contracts, economic entities and health insurance payers. The data come from the Nofer Institute of Occupational Medicine, the Central Statistical Office and the Social Insurance Institution. The Contract Coverage Rate (CCR) was introduced for the purpose of this research. The data show that in 2007 the CCR for the Pomorskie voivodeship (province) was 45.7%, with a median value of 14.4% across all voivodeships in Poland. Based on the gathered statistical data, it should be concluded that the implementation of article 12 is insufficient. The amendment to the Act introducing the provision on written contracts is an opportunity to provide an effective mechanism by which the present situation can be improved and the rates raised to a satisfactory level.

  4. Feedback Controlled Colloidal Assembly at Fluid Interfaces

    NASA Astrophysics Data System (ADS)

    Bevan, Michael

    The autonomous and reversible assembly of colloidal nano- and micro- scale components into ordered configurations is often suggested as a scalable process capable of manufacturing meta-materials with exotic electromagnetic properties. As a result, there is strong interest in understanding how thermal motion, particle interactions, patterned surfaces, and external fields can be optimally coupled to robustly control the assembly of colloidal components into hierarchically structured functional meta-materials. We approach this problem by directly relating equilibrium and dynamic colloidal microstructures to kT-scale energy landscapes mediated by colloidal forces, physically and chemically patterned surfaces, multiphase fluid interfaces, and electromagnetic fields. 3D colloidal trajectories are measured in real-space and real-time with nanometer resolution using an integrated suite of evanescent wave, video, and confocal microscopy methods. Equilibrium structures are connected to energy landscapes via statistical mechanical models. The dynamic evolution of initially disordered colloidal fluid configurations into colloidal crystals in the presence of tunable interactions (electromagnetic field mediated interactions, particle-interface interactions) is modeled using a novel approach based on fitting the Fokker-Planck equation to experimental microscopy and computer simulated assembly trajectories. This approach is based on the use of reaction coordinates that capture important microstructural features of crystallization processes and quantify both statistical mechanical (free energy) and fluid mechanical (hydrodynamic) contributions. Ultimately, we demonstrate real-time control of assembly, disassembly, and repair of colloidal crystals using both open loop and closed loop control to produce perfectly ordered colloidal microstructures. This approach is demonstrated for close packed colloidal crystals of spherical particles at fluid-solid interfaces and is being extended to anisotropic particles and multiphase fluid interfaces.

  5. The influence of polishing techniques on pre-polymerized CAD\\CAM acrylic resin denture bases

    PubMed Central

    Alammari, Manal Rahma

    2017-01-01

    Background: Lately, computer-aided design and computer-aided manufacturing (CAD/CAM) has been broadly and successfully employed in dentistry. CAD/CAM systems have recently become commercially available for the fabrication of complete dentures and are considered an alternative technique to conventionally processed acrylic resin bases. However, they have not yet been fully investigated. Objective: The purpose of this study was to inspect the effects of mechanical polishing and chemical polishing on the surface roughness (Ra) and contact angle (wettability) of heat-cured, auto-cured and CAD/CAM denture base acrylic resins. Methods: This study was conducted at the Advanced Dental Research Laboratory Center of King Abdulaziz University from March to June 2017. Three denture base materials were selected: heat-cured poly-methylmethacrylate resin, thermoplastic (polyamide) resin and CAD\CAM denture base resin. Sixty specimens were prepared and divided into three groups, twenty in each. Each group was divided according to the polishing technique into (Mech P) and (Chem P), ten specimens in each; surface roughness and wettability were investigated. Data were analyzed by SPSS version 22, using one-way ANOVA and the Pearson coefficient. Results: One-way analysis of variance (ANOVA) and post hoc tests were used for comparing the surface roughness values between the three groups, which revealed a statistically significant difference between them (p < 0.001). The heat-cured denture base material (Group I) showed the highest mean surface roughness value in both methods (2.44±0.07 and 2.72±0.09 for Mech P and Chem P, respectively), while the CAD\CAM denture base material (Group III) showed the lowest mean values (1.08±0.23 and 1.39±0.31 for Mech P and Chem P, respectively). CAD/CAM showed the smallest contact angle in both polishing methods, statistically significant at the 5% level (p=0.034 and p<0.001). Conclusion: Mechanical polishing produced lower surface roughness of the CAD\CAM denture base resin, with a superior smooth surface compared to chemical polishing. Mechanical polishing is considered the most effective polishing technique. CAD/CAM denture base material should be considered the material of choice for complete denture construction in the near future, especially for older dental patients with altered salivary functions, because of its wettability. PMID:29238483

  6. The influence of polishing techniques on pre-polymerized CAD\\CAM acrylic resin denture bases.

    PubMed

    Alammari, Manal Rahma

    2017-10-01

    Lately, computer-aided design and computer-aided manufacturing (CAD/CAM) has been broadly and successfully employed in dentistry. CAD/CAM systems have recently become commercially available for the fabrication of complete dentures and are considered an alternative technique to conventionally processed acrylic resin bases. However, they have not yet been fully investigated. The purpose of this study was to inspect the effects of mechanical polishing and chemical polishing on the surface roughness (Ra) and contact angle (wettability) of heat-cured, auto-cured and CAD/CAM denture base acrylic resins. This study was conducted at the Advanced Dental Research Laboratory Center of King Abdulaziz University from March to June 2017. Three denture base materials were selected: heat-cured poly-methylmethacrylate resin, thermoplastic (polyamide) resin and CAD\CAM denture base resin. Sixty specimens were prepared and divided into three groups, twenty in each. Each group was divided according to the polishing technique into (Mech P) and (Chem P), ten specimens in each; surface roughness and wettability were investigated. Data were analyzed by SPSS version 22, using one-way ANOVA and the Pearson coefficient. One-way analysis of variance (ANOVA) and post hoc tests were used for comparing the surface roughness values between the three groups, which revealed a statistically significant difference between them (p < 0.001). The heat-cured denture base material (Group I) showed the highest mean surface roughness value in both methods (2.44±0.07 and 2.72±0.09 for Mech P and Chem P, respectively), while the CAD\CAM denture base material (Group III) showed the lowest mean values (1.08±0.23 and 1.39±0.31 for Mech P and Chem P, respectively). CAD/CAM showed the smallest contact angle in both polishing methods, statistically significant at the 5% level (p=0.034 and p<0.001). Mechanical polishing produced lower surface roughness of the CAD\CAM denture base resin, with a superior smooth surface compared to chemical polishing. Mechanical polishing is considered the most effective polishing technique. CAD/CAM denture base material should be considered the material of choice for complete denture construction in the near future, especially for older dental patients with altered salivary functions, because of its wettability.

  7. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    PubMed

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time (AFT) model with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. The testing procedures discussed in this article can all be performed using readily available software such as R; the R code is provided as supplemental material.
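
    To make the censoring idea concrete, the sketch below fits a left-censored normal model to log-intensities (equivalent to a log-normal AFT model with no covariates) and compares two groups with a likelihood-ratio test; the detection limit, sample sizes and distributions are hypothetical, and the test is a simplified stand-in for the procedures evaluated in the paper.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(7)

# Toy log-peak-intensities for one protein in two groups; in a real
# pipeline, values below the detection limit would arrive as missing.
lod = 1.0                        # log-scale limit of detection
g1 = rng.normal(1.5, 1.0, 30)
g2 = rng.normal(2.3, 1.0, 30)

def cens_nll(params, x):
    """Negative log-likelihood of a normal with left-censoring at lod."""
    mu, log_sd = params
    sd = np.exp(log_sd)
    obs = x[x > lod]
    n_cens = np.sum(x <= lod)    # treated as 'below detection', value unknown
    ll = stats.norm.logpdf(obs, mu, sd).sum()
    ll += n_cens * stats.norm.logcdf(lod, mu, sd)   # censored contribution
    return -ll

def min_nll(x):
    return optimize.minimize(cens_nll, x0=[x.mean(), 0.0], args=(x,)).fun

# Likelihood-ratio test: one common (mu, sd) vs. group-specific (mu, sd),
# so the difference is 2 parameters.
ll_sep = -(min_nll(g1) + min_nll(g2))
ll_null = -min_nll(np.concatenate([g1, g2]))
lr = 2 * (ll_sep - ll_null)
print("p-value ~", stats.chi2.sf(lr, df=2))
```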

  8. Potential Mediators in Parenting and Family Intervention: Quality of Mediation Analyses

    PubMed Central

    Patel, Chandni C.; Fairchild, Amanda J.; Prinz, Ronald J.

    2017-01-01

    Parenting and family interventions have repeatedly shown effectiveness in preventing and treating a range of youth outcomes. Accordingly, investigators in this area have conducted a number of studies using statistical mediation to examine some of the potential mechanisms of action by which these interventions work. This review examined, from a methodological perspective, in what ways and how well family-based intervention studies tested statistical mediation. A systematic search identified 73 published outcome studies that tested mediation for family-based interventions across a wide range of child and adolescent outcomes (i.e., externalizing, internalizing, and substance-abuse problems; high-risk sexual activity; and academic achievement), for putative mediators pertaining to positive and negative parenting, family functioning, youth beliefs and coping skills, and peer relationships. Taken as a whole, the studies used designs that adequately addressed temporal precedence. The majority of studies used the product-of-coefficients approach to mediation, which is preferred and less limiting than the causal-steps approach. Statistical significance testing did not always make use of the most recently developed approaches, which would better accommodate small sample sizes and more complex functions. Specific recommendations are offered for future mediation studies in this area with respect to full longitudinal design, mediation approach, significance testing method, documentation and reporting of statistics, testing of multiple mediators, and control for Type I error. PMID:28028654
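
    A minimal sketch of the preferred product-of-coefficients approach, with a percentile-bootstrap significance test of the indirect effect a*b, is given below; the data-generating model and all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy trial data: treatment X (0/1), putative mediator M (e.g., positive
# parenting), outcome Y (e.g., child externalizing problems).
n = 300
x = rng.integers(0, 2, n)
m = 0.5 * x + rng.normal(0, 1, n)
y = -0.4 * m + 0.1 * x + rng.normal(0, 1, n)

def ab_path(x, m, y):
    """Product-of-coefficients indirect effect a*b via two regressions."""
    a = np.polyfit(x, m, 1)[0]                      # X -> M path
    Xmat = np.column_stack([np.ones(len(x)), m, x])
    b = np.linalg.lstsq(Xmat, y, rcond=None)[0][1]  # M -> Y path, X-adjusted
    return a * b

# Percentile bootstrap CI: recommended over normal-theory tests because the
# sampling distribution of a*b is not normal, especially in small samples.
boot = np.array([ab_path(x[idx], m[idx], y[idx])
                 for idx in rng.integers(0, n, (2000, n))])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {ab_path(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```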

  9. Bayesian approach to inverse statistical mechanics.

    PubMed

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.

  10. Bayesian approach to inverse statistical mechanics

    NASA Astrophysics Data System (ADS)

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.
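
    The flavor of the approach can be conveyed with the temperature-estimation example mentioned in the abstract, specialized to a case where the partition function is actually tractable: a 1-D Ising chain with free ends, for which Z(beta) = 2(2 cosh beta)^(N-1). This grid-based posterior is a deliberately simplified stand-in for the paper's sequential Monte Carlo treatment of intractable Z.

```python
import numpy as np

rng = np.random.default_rng(9)

# Draw spin configurations from a 1-D Ising chain at a "true" inverse
# temperature, then recover beta from its posterior.
N, n_samples, beta_true = 50, 200, 0.7

def sample_chain(beta):
    """Exact sampling: s_1 uniform, then P(s_{i+1} = s_i) = e^b / (2 cosh b)."""
    s = np.empty(N)
    s[0] = rng.choice([-1, 1])
    p_same = np.exp(beta) / (2 * np.cosh(beta))
    for i in range(1, N):
        s[i] = s[i - 1] if rng.random() < p_same else -s[i - 1]
    return s

data = [sample_chain(beta_true) for _ in range(n_samples)]
sum_bonds = sum(np.sum(s[:-1] * s[1:]) for s in data)

# Posterior over a grid (flat prior): log p(beta | data) up to a constant,
# using log Z = const + (N - 1) * log(2 cosh beta).
betas = np.linspace(0.01, 2.0, 400)
loglik = betas * sum_bonds - n_samples * (N - 1) * np.log(2 * np.cosh(betas))
post = np.exp(loglik - loglik.max())
post /= post.sum()
print("posterior mean beta:", np.sum(betas * post))  # close to 0.7
```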

  11. Renormalization Group Tutorial

    NASA Technical Reports Server (NTRS)

    Bell, Thomas L.

    2004-01-01

    Complex physical systems sometimes have statistical behavior characterized by power-law dependence on the parameters of the system and spatial variability with no particular characteristic scale as the parameters approach critical values. The renormalization group (RG) approach was developed in the fields of statistical mechanics and quantum field theory to derive quantitative predictions of such behavior in cases where conventional methods of analysis fail. Techniques based on these ideas have since been extended to treat problems in many different fields, and in particular, the behavior of turbulent fluids. This lecture will describe a relatively simple but nontrivial example of the RG approach applied to the diffusion of photons out of a stellar medium when the photons have wavelengths near that of an emission line of atoms in the medium.
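
    For a concrete, minimal instance of an RG calculation, the classic decimation of the zero-field 1-D Ising chain can be iterated in a few lines; this textbook example is included only as an illustration of an RG flow and is unrelated to the photon-diffusion application described above.

```python
import numpy as np

# Decimation RG for the zero-field 1-D Ising chain: summing out every other
# spin maps the coupling K to K' with tanh(K') = tanh(K)**2. Iterating the
# map shows the flow to the trivial fixed point K = 0, i.e., no
# finite-temperature phase transition in one dimension.
K = 2.0  # strong initial coupling (low temperature)
for step in range(8):
    K = np.arctanh(np.tanh(K) ** 2)
    print(f"step {step + 1}: K = {K:.4f}")
```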

  12. A statistical model of brittle fracture by transgranular cleavage

    NASA Astrophysics Data System (ADS)

    Lin, Tsann; Evans, A. G.; Ritchie, R. O.

    A model for brittle fracture by transgranular cleavage cracking is presented based on the application of weakest link statistics to the critical microstructural fracture mechanisms. The model permits prediction of the macroscopic fracture toughness, K_Ic, in single-phase microstructures containing a known distribution of particles, and defines the critical distance from the crack tip at which the initial cracking event is most probable. The model is developed for unstable fracture ahead of a sharp crack considering both linear elastic and nonlinear elastic ("elastic/plastic") crack tip stress fields. Predictions are evaluated by comparison with experimental results on the low temperature flow and fracture behavior of a low carbon mild steel with a simple ferrite/grain boundary carbide microstructure.

  13. Introduction of statistical information in a syntactic analyzer for document image recognition

    NASA Astrophysics Data System (ADS)

    Maroneze, André O.; Coüasnon, Bertrand; Lemaitre, Aurélie

    2011-01-01

    This paper presents an improvement to document layout analysis systems, offering a possible solution to Sayre's paradox (which states that an element "must be recognized before it can be segmented; and it must be segmented before it can be recognized"). This improvement, based on stochastic parsing, allows integration of statistical information, obtained from recognizers, during syntactic layout analysis. We present how this fusion of numeric and symbolic information in a feedback loop can be applied to syntactic methods to improve document description expressiveness. To limit combinatorial explosion during exploration of solutions, we devised an operator that allows optional activation of the stochastic parsing mechanism. Our evaluation on 1250 handwritten business letters shows this method allows the improvement of global recognition scores.

  14. The role of internal duplication in the evolution of multi-domain proteins.

    PubMed

    Nacher, J C; Hayashida, M; Akutsu, T

    2010-08-01

    Many proteins consist of several structural domains. These multi-domain proteins have likely been generated by selective genome growth dynamics during evolution to perform new functions as well as to create structures that fold on a biologically feasible time scale. Domain units frequently evolved through a variety of genetic shuffling mechanisms. Here we examine the protein domain statistics of more than 1000 organisms, including eukaryotic, archaeal and bacterial species. The analysis extends earlier findings on asymmetric statistical laws for the proteome to a wider variety of species. While proteins are composed of a wide range of domains, displaying a power-law decay, the computation of domain families for each protein reveals an exponential distribution, characterizing a protein universe composed of a small number of unique families. Structural studies in proteomics have shown that domain repeats, or internally duplicated domains, represent a small but significant fraction of the genome. In spite of its importance, this observation has been largely overlooked until recently. We model the evolutionary dynamics of the proteome and demonstrate that these distinct distributions are in fact rooted in an internal duplication mechanism. This process generates the contemporary protein structural domain universe, determines its limited size, and tames its growth. These findings have important implications, ranging from protein interaction network modeling to evolutionary studies based on fundamental mechanisms governing genome expansion.

  15. University of California Conference on Statistical Mechanics (4th) Held March 26-28, 1990

    DTIC Science & Technology

    1990-03-28

    and S. Lago, Chem. Phys., Z, 5750 (1983) Shear Viscosity Calculation via Equilibrium Molecular Dynamics: Einsteinian vs. Green-Kubo Formalism by Adel A...through the application of the Green-Kubo approach. Although the theoretical equivalence between both formalisms was demonstrated by Helfand [3], their...like equations and of different expressions based on the Green-Kubo formalism. In contrast to Hoheisel and Vogelsang's conclusions [2], we find that

  16. A Role for Chunk Formation in Statistical Learning of Second Language Syntax

    ERIC Educational Resources Information Center

    Hamrick, Phillip

    2014-01-01

    Humans are remarkably sensitive to the statistical structure of language. However, different mechanisms have been proposed to account for such statistical sensitivities. The present study compared adult learning of syntax and the ability of two models of statistical learning to simulate human performance: Simple Recurrent Networks, which learn by…

  17. eLearning course may shorten the duration of mechanical restraint among psychiatric inpatients: a cluster-randomized trial.

    PubMed

    Kontio, Raija; Pitkänen, Anneli; Joffe, Grigori; Katajisto, Jouko; Välimäki, Maritta

    2014-10-01

    The management of psychiatric inpatients exhibiting severely disturbed and aggressive behaviour is an important educational topic. Well-structured, IT-based educational programmes (eLearning) often ensure quality and may make training more affordable and accessible. The aim of this study was to explore the impact of an eLearning course for personnel on the rates and duration of seclusion and mechanical restraint among psychiatric inpatients. In a cluster-randomized intervention trial, the nursing personnel on 10 wards were randomly assigned to eLearning (intervention) or training-as-usual (control) groups. The eLearning course comprised six modules with specific topics (legal and ethical issues, behaviour-related factors, therapeutic relationship and self-awareness, teamwork and integrating knowledge with practice) and specific learning methods. The rates (incidents per 1000 occupied bed days) and durations of the coercion incidents were examined before and after the course. A total of 1283 coercion incidents (1143 seclusions [89%] and 140 incidents involving the use of mechanical restraints [11%]) were recorded on the study wards during the data collection period. On the intervention wards, there were no statistically significant changes in the rates of seclusion and mechanical restraint. However, the duration of incidents involving mechanical restraints shortened from 36.0 to 4.0 h (median) (P < 0.001). No statistically significant changes occurred on the control wards. After our eLearning course, the duration of incidents involving the use of mechanical restraints decreased. However, more studies are needed to ensure that the content of the course focuses on the most important factors associated with seclusion and restraint. The eLearning course deserves further development and study. The duration of coercion incidents merits attention in future research.

  18. Development and Validation of a Statistical Shape Modeling-Based Finite Element Model of the Cervical Spine Under Low-Level Multiple Direction Loading Conditions

    PubMed Central

    Bredbenner, Todd L.; Eliason, Travis D.; Francis, W. Loren; McFarland, John M.; Merkle, Andrew C.; Nicolella, Daniel P.

    2014-01-01

    Cervical spine injuries are a significant concern in all trauma cases. Recent military conflicts have demonstrated the substantial risk of spinal injury for the modern warfighter. Finite element models used to investigate injury mechanisms often fail to examine the effects of variation in geometry or material properties on mechanical behavior. The goals of this study were to model geometric variation for a set of cervical spines, to extend this model to a parametric finite element model, and, as a first step, to validate the parametric model against experimental data for low-level loading conditions. Individual finite element models were created using cervical spine (C3–T1) computed tomography data for five male cadavers. Statistical shape modeling (SSM) was used to generate a parametric finite element model incorporating variability of spine geometry, and soft-tissue material property variation was also included. The probabilistic loading response of the parametric model was determined under flexion-extension, axial rotation, and lateral bending and validated by comparison to experimental data. Based on qualitative and quantitative comparison of the experimental loading response and model simulations, we suggest that the model performs adequately under relatively low-level loading conditions in multiple loading directions. In conclusion, SSM methods coupled with finite element analyses within a probabilistic framework, along with the ability to statistically validate overall model performance, provide innovative and important steps toward describing the differences in vertebral morphology, spinal curvature, and variation in material properties. We suggest that these methods, with additional investigation and validation under injurious loading conditions, will lead to understanding and mitigating the risks of injury in the spine and other musculoskeletal structures. PMID:25506051
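
    The record does not give the authors' SSM implementation, but the core of a PCA-based statistical shape model can be sketched as follows. The landmark data are random stand-ins for registered vertebral surface points; a real pipeline would add rigid alignment and correspondence steps before the decomposition.

        # PCA-based statistical shape model: mean shape + modes of variation.
        import numpy as np

        rng = np.random.default_rng(2)
        n_subjects, n_landmarks = 5, 100              # e.g., points on C3-T1 surfaces
        shapes = rng.normal(size=(n_subjects, n_landmarks * 3))  # stand-in data

        mean_shape = shapes.mean(axis=0)
        X = shapes - mean_shape
        U, s, Vt = np.linalg.svd(X, full_matrices=False)  # rows of Vt: shape modes
        stdevs = s / np.sqrt(n_subjects - 1)              # per-mode standard deviations

        # Generate a new, statistically plausible geometry within +/- 2 SD.
        b = rng.uniform(-2, 2, size=stdevs.size) * stdevs
        new_shape = (mean_shape + b @ Vt).reshape(n_landmarks, 3)
        print(new_shape.shape)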

  19. Exploratory plasma proteomic analysis in a randomized crossover trial of aspirin among healthy men and women.

    PubMed

    Wang, Xiaoliang; Shojaie, Ali; Zhang, Yuzheng; Shelley, David; Lampe, Paul D; Levy, Lisa; Peters, Ulrike; Potter, John D; White, Emily; Lampe, Johanna W

    2017-01-01

    Long-term use of aspirin is associated with lower risk of colorectal cancer and other cancers; however, the mechanism of the chemopreventive effect of aspirin is not fully understood. Animal studies suggest that COX-2, NFκB signaling and Wnt/β-catenin pathways may play a role, but no clinical trials have systematically evaluated the biological response to aspirin in healthy humans. Using a high-density antibody array, we assessed the difference in plasma protein levels after 60 days of regular-dose aspirin (325 mg/day) compared to placebo in a randomized double-blinded crossover trial of 44 healthy non-smoking men and women, aged 21-45 years. The plasma proteome was analyzed on an antibody microarray with ~3,300 full-length antibodies, printed in triplicate. Moderated paired t-tests were performed on individual antibodies, and gene-set analyses were performed based on KEGG and GO pathways. Among the 3,000 antibodies analyzed, statistically significant differences in plasma protein levels were observed for nine antibodies after adjusting for false discoveries (FDR-adjusted p-value < 0.1). The most significant protein was succinate dehydrogenase subunit C (SDHC), part of a key enzyme complex of the mitochondrial tricarboxylic acid (TCA) cycle. The other statistically significant proteins (NR2F1, MSI1, MYH1, FOXO1, KHDRBS3, NFKBIE, LYZ and IKZF1) are involved in multiple pathways, including DNA base-pair repair, inflammation and oncogenic pathways. None of the 258 KEGG and 1,139 GO pathways was found to be statistically significant after FDR adjustment. This study suggests several chemopreventive mechanisms of aspirin in humans, which have previously been reported to play a role in anti- or pro-carcinogenesis in cell systems; however, larger, confirmatory studies are needed.
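
    The abstract does not name the exact false-discovery adjustment; assuming the standard Benjamini-Hochberg procedure, the screening step can be sketched as follows with simulated p-values.

        # Benjamini-Hochberg FDR adjustment for per-antibody p-values.
        import numpy as np

        def bh_adjust(p):
            """Return BH-adjusted p-values (q-values)."""
            p = np.asarray(p, dtype=float)
            n = p.size
            order = np.argsort(p)
            ranked = p[order] * n / np.arange(1, n + 1)
            # enforce monotonicity from the largest p-value downwards
            q = np.minimum.accumulate(ranked[::-1])[::-1].clip(max=1.0)
            out = np.empty(n)
            out[order] = q
            return out

        rng = np.random.default_rng(3)
        pvals = np.concatenate([rng.uniform(size=2990), rng.uniform(0, 1e-4, 10)])
        print((bh_adjust(pvals) < 0.1).sum(), "antibodies pass FDR < 0.1")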

  20. Double-slit experiment with single wave-driven particles and its relation to quantum mechanics.

    PubMed

    Andersen, Anders; Madsen, Jacob; Reichelt, Christian; Rosenlund Ahl, Sonja; Lautrup, Benny; Ellegaard, Clive; Levinsen, Mogens T; Bohr, Tomas

    2015-07-01

    In a thought-provoking paper, Couder and Fort [Phys. Rev. Lett. 97, 154101 (2006)] describe a version of the famous double-slit experiment performed with droplets bouncing on a vertically vibrated fluid surface. In the experiment, an interference pattern in the single-particle statistics is found even though it is possible to determine unambiguously which slit the walking droplet passes. Here we argue, however, that the single-particle statistics in such an experiment will be fundamentally different from the single-particle statistics of quantum mechanics. Quantum mechanical interference takes place between different classical paths with precise amplitude and phase relations. In the double-slit experiment with walking droplets, these relations are lost since one of the paths is singled out by the droplet. To support our conclusions, we have carried out our own double-slit experiment, and our results, in particular the long and variable slit passage times of the droplets, cast strong doubt on the feasibility of the interference claimed by Couder and Fort. To understand theoretically the limitations of wave-driven particle systems as analogs to quantum mechanics, we introduce a Schrödinger equation with a source term originating from a localized particle that generates a wave while being simultaneously guided by it. We show that the ensuing particle-wave dynamics can capture some characteristics of quantum mechanics such as orbital quantization. However, the particle-wave dynamics cannot reproduce quantum mechanics in general, and we show that the single-particle statistics for our model in a double-slit experiment with an additional splitter plate differs qualitatively from that of quantum mechanics.

  1. Coherent instability in wall-bounded turbulence

    NASA Astrophysics Data System (ADS)

    Hack, M. J. Philipp

    2017-11-01

    Hairpin vortices are commonly considered one of the major classes of coherent fluid motions in shear layers, even as their significance in the grand scheme of turbulence has remained an openly debated question. The statistical prevalence of the dynamic process that gives rise to the hairpins across different types of flows suggests an origin in a robust common mechanism triggered by conditions widespread in wall-bounded shear layers. This study seeks to shed light on the physical process which drives the generation of hairpin vortices. It is primarily facilitated through an algorithm based on concepts developed in the field of computer vision which allows the topological identification and analysis of coherent flow processes across multiple scales. Application to direct numerical simulations of boundary layers enables the time-resolved sampling and exploration of the hairpin process in natural flow. The analysis yields rich statistical results which lead to a refined characterization of the hairpin process. Linear stability theory offers further insight into the flow physics and especially into the connection between the hairpin and exponential amplification mechanisms. The results also provide a sharpened understanding of the underlying causality of events.

  2. Self-Organization: Complex Dynamical Systems in the Evolution of Speech

    NASA Astrophysics Data System (ADS)

    Oudeyer, Pierre-Yves

    Human vocalization systems are characterized by complex structural properties. They are combinatorial, based on the systematic reuse of phonemes, and the set of repertoires in human languages is characterized by both strong statistical regularities—universals—and a great diversity. Besides, they are conventional codes culturally shared in each community of speakers. What are the origins of the forms of speech? What are the mechanisms that permitted their evolution in the course of phylogenesis and cultural evolution? How can a shared speech code be formed in a community of individuals? This chapter focuses on the way the concept of self-organization, and its interaction with natural selection, can throw light on these three questions. In particular, a computational model is presented which shows that a basic neural equipment for adaptive holistic vocal imitation, coupling directly motor and perceptual representations in the brain, can generate spontaneously shared combinatorial systems of vocalizations in a society of babbling individuals. Furthermore, we show how morphological and physiological innate constraints can interact with these self-organized mechanisms to account for both the formation of statistical regularities and diversity in vocalization systems.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Kandler; Shi, Ying; Santhanagopalan, Shriram

    Predictive models of Li-ion battery lifetime must consider a multiplicity of electrochemical, thermal, and mechanical degradation modes experienced by batteries in application environments. To complicate matters, Li-ion batteries can experience different degradation trajectories that depend on storage and cycling history of the application environment. Rates of degradation are controlled by factors such as temperature history, electrochemical operating window, and charge/discharge rate. We present a generalized battery life prognostic model framework for battery systems design and control. The model framework consists of trial functions that are statistically regressed to Li-ion cell life datasets wherein the cells have been aged under different levels of stress. Degradation mechanisms and rate laws dependent on temperature, storage, and cycling condition are regressed to the data, with multiple model hypotheses evaluated and the best model down-selected based on statistics. The resulting life prognostic model, implemented in state variable form, is extensible to arbitrary real-world scenarios. The model is applicable in real-time control algorithms to maximize battery life and performance. We discuss efforts to reduce lifetime prediction error and accommodate its inevitable impact in controller design.
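
    A sketch of the trial-function regression idea: a hypothetical capacity-fade model with square-root-in-time calendar fade plus linear cycling fade is regressed to synthetic aging data by least squares. The functional form is a common choice in the degradation literature and is assumed here; it is not the report's actual model library.

        # Least-squares regression of a capacity-fade trial function.
        import numpy as np
        from scipy.optimize import curve_fit

        def q_model(X, a, b):
            t, N = X                       # days in storage, equivalent full cycles
            return 1.0 - a * np.sqrt(t) - b * N

        rng = np.random.default_rng(4)
        t = np.linspace(0, 400, 50)
        N = 2.0 * t                        # assumed two cycles per day
        q_obs = q_model((t, N), 2e-3, 5e-5) + rng.normal(0, 2e-3, t.size)

        (a, b), _ = curve_fit(q_model, (t, N), q_obs, p0=(1e-3, 1e-5))
        print(f"a = {a:.2e} per sqrt(day), b = {b:.2e} per cycle")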

  4. Generalized statistical mechanics of cosmic rays: Application to positron-electron spectral indices.

    PubMed

    Yalcin, G Cigdem; Beck, Christian

    2018-01-29

    Cosmic ray energy spectra exhibit power law distributions over many orders of magnitude that are very well described by the predictions of q-generalized statistical mechanics, based on a q-generalized Hagedorn theory for transverse momentum spectra and hard QCD scattering processes. QCD at the largest center-of-mass energies predicts the entropic index to be q = 11/9. Here we show that the escort duality of the nonextensive thermodynamic formalism predicts an energy split of the effective temperature given by Δ[Formula: see text] MeV, where T_H is the Hagedorn temperature. We carefully analyse the measured data of the AMS-02 collaboration and provide evidence that the predicted temperature split is indeed observed, leading to a different energy dependence of the e+ and e- spectral indices. We also observe a distinguished energy scale E* ≈ 50 GeV where the e+ and e- spectral indices differ the most. Linear combinations of the escort and non-escort q-generalized canonical distributions yield excellent agreement with the measured AMS-02 data in the entire energy range.

  5. Microwave-assisted synthesis and micellization behavior of soy-based copoly(2-oxazoline)s.

    PubMed

    Hoogenboom, Richard; Leenen, Mark A M; Huang, Haiying; Fustin, Charles-André; Gohy, Jean-François; Schubert, Ulrich S

    2006-01-01

    Polymers based on renewable resources are promising candidates for replacing common organic polymers, and thus, for reducing oil consumption. In this contribution we report the microwave-assisted synthesis of block and statistical copolymers from 2-ethyl-2-oxazoline and 2-"soy alkyl"-2-oxazoline via a cationic ring-opening polymerization mechanism. The synthesized copolymers were characterized by gel permeation chromatography and 1H NMR spectroscopy. The micellization of these amphiphilic copolymers was investigated by dynamic light scattering and atomic force microscopy to examine the effect of hydrophobic block length and monomer distribution on the resulting micellar characteristics.

  6. Strange non-chaotic attractors in a state controlled-cellular neural network-based quasiperiodically forced MLC circuit

    NASA Astrophysics Data System (ADS)

    Ezhilarasu, P. Megavarna; Inbavalli, M.; Murali, K.; Thamilmaran, K.

    2018-07-01

    In this paper, we report the dynamical transitions to strange non-chaotic attractors in a quasiperiodically forced state controlled-cellular neural network (SC-CNN)-based MLC circuit via two different mechanisms, namely the Heagy-Hammel route and the gradual fractalisation route. These transitions were observed through numerical simulations and hardware experiments and confirmed using statistical tools, such as the maximal Lyapunov exponent spectrum and its variance, and singular continuous spectral analysis. We find a remarkable agreement between the results of the numerical simulations and those of the hardware experiments.

  7. Workflow based framework for life science informatics.

    PubMed

    Tiwari, Abhishek; Sekhar, Arvind K T

    2007-10-01

    Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications and different services) which facilitate knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas like drug discovery, genomics, large-scale gene expression analysis, proteomics, and system biology. In this article, we have discussed the existing workflow systems and the trends in applications of workflow based systems.

  8. Analysis of Longitudinal Outcome Data with Missing Values in Total Knee Arthroplasty.

    PubMed

    Kang, Yeon Gwi; Lee, Jang Taek; Kang, Jong Yeal; Kim, Ga Hye; Kim, Tae Kyun

    2016-01-01

    We sought to determine the influence of missing data on statistical results and to identify which statistical method is most appropriate for the analysis of longitudinal outcome data of TKA with missing values, among repeated measures ANOVA, generalized estimating equations (GEE), and mixed effects model repeated measures (MMRM). Data sets with missing values were generated with different proportions of missing data, sample sizes, and missing-data generation mechanisms. Each data set was analyzed with the three statistical methods. The influence of missing data was greater with a higher proportion of missing data and a smaller sample size. MMRM tended to show the least change in the statistics. When missing values were generated by a 'missing not at random' mechanism, no statistical method could fully avoid deviations in the results. Copyright © 2016 Elsevier Inc. All rights reserved.
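
    A sketch of the comparison's key ingredient: a likelihood-based mixed model fitted to longitudinal scores with missing values. A random-intercept model is used as a simplified stand-in for a full MMRM with unstructured covariance; the MAR dropout mechanism and all parameters are illustrative.

        # Mixed model on longitudinal data with MAR dropout (simplified MMRM).
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(5)
        n, visits = 100, [0, 1, 2, 3]
        subj = np.repeat(np.arange(n), len(visits))
        time = np.tile(visits, n)
        u = np.repeat(rng.normal(0, 1.0, n), len(visits))   # subject effects
        score = 50 + 3 * time + u + rng.normal(0, 1.0, subj.size)
        df = pd.DataFrame({"subj": subj, "time": time, "score": score})

        # MAR dropout: later visits missing for subjects with low baseline scores.
        base = df.groupby("subj")["score"].transform("first")
        df.loc[(df["time"] > 0) & (base < 49.5), "score"] = np.nan

        m = smf.mixedlm("score ~ time", df.dropna(subset=["score"]),
                        groups="subj").fit()
        print(m.params["time"])   # slope recovered from the observed data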

  9. Entropy-based separation of yeast cells using a microfluidic system of conjoined spheres

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Kai-Jian; Qin, S.-J., E-mail: shuijie.qin@gmail.com; Bai, Zhong-Chen

    2013-11-21

    A physical model is derived to create a biological cell separator that is based on controlling the entropy in a microfluidic system having conjoined spherical structures. A one-dimensional simplified model of this three-dimensional problem in terms of the corresponding effects of entropy on the Brownian motion of particles is presented. This dynamic mechanism is based on the Langevin equation from statistical thermodynamics and takes advantage of the characteristics of the Fokker-Planck equation. This mechanism can be applied to manipulate biological particles inside a microfluidic system with identical, conjoined, spherical compartments. This theoretical analysis is verified by performing a rapid and simple technique for separating yeast cells in these conjoined, spherical microfluidic structures. The experimental results largely match our theoretical model, and we further analyze the parameters that can be used to control this separation mechanism. Both numerical simulations and experimental results show that the motion of the particles depends on the geometrical boundary conditions of the microfluidic system and the initial concentration of the diffusing material. This theoretical model can be implemented in future biophysics devices for the optimized design of passive cell sorters.
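
    A sketch of the one-dimensional reduction described above, assuming a Fick-Jacobs-type entropic potential A(x) = -kT ln w(x) for a periodic channel of conjoined spheres; the channel profile and parameters are illustrative, not those of the paper.

        # Overdamped Langevin dynamics in an effective entropic potential.
        import numpy as np

        rng = np.random.default_rng(6)
        kT, gamma, dt, L = 1.0, 1.0, 1e-3, 1.0

        def width(x):                      # periodic "conjoined spheres" profile
            return 0.2 + 0.8 * np.abs(np.sin(np.pi * x / L))

        def force(x, h=1e-5):              # -dA/dx with A(x) = -kT ln w(x)
            return kT * (np.log(width(x + h)) - np.log(width(x - h))) / (2 * h)

        x = np.zeros(1000)                 # ensemble of particles at a bottleneck
        for _ in range(20000):             # Euler-Maruyama integration
            x += force(x) / gamma * dt \
                 + np.sqrt(2 * kT * dt / gamma) * rng.normal(size=x.size)
        print("mean-squared displacement:", np.mean(x ** 2))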

  10. A framework for medical image retrieval using machine learning and statistical similarity matching techniques with relevance feedback.

    PubMed

    Rahman, Md Mahmudur; Bhattacharya, Prabir; Desai, Bipin C

    2007-01-01

    A content-based image retrieval (CBIR) framework for diverse collections of medical images of different imaging modalities, anatomic regions with different orientations, and biological systems is proposed. Organization of images in such a database (DB) is well defined with predefined semantic categories; hence, it can be useful for category-specific searching. The proposed framework consists of machine learning methods for image prefiltering, similarity matching using statistical distance measures, and a relevance feedback (RF) scheme. To narrow the semantic gap and increase retrieval efficiency, we investigate both supervised and unsupervised learning techniques to associate low-level global image features (e.g., color, texture, and edge) in the projected PCA-based eigenspace with their high-level semantic and visual categories. Specifically, we explore the use of a probabilistic multiclass support vector machine (SVM) and fuzzy c-means (FCM) clustering for categorization and prefiltering of images to reduce the search space. A category-specific statistical similarity matching is then performed at a finer level on the prefiltered images. To better incorporate perceptual subjectivity, an RF mechanism is also added to update the query parameters dynamically and adjust the proposed matching functions. Experiments are based on a ground-truth DB consisting of 5000 diverse medical images in 20 predefined categories. Analysis of results based on cross-validation (CV) accuracy and precision-recall for image categorization and retrieval is reported. It demonstrates the improvement, effectiveness, and efficiency achieved by the proposed framework.

  11. A statistical mechanics approach to autopoietic immune networks

    NASA Astrophysics Data System (ADS)

    Barra, Adriano; Agliari, Elena

    2010-07-01

    In this work we aim to bridge theoretical immunology and disordered statistical mechanics. We introduce a model for the behavior of B-cells which naturally merges the clonal selection theory and the autopoietic network theory as a whole. From the analysis of its features we recover several basic phenomena such as low-dose tolerance, dynamical memory of antigens and self/non-self discrimination.

  12. Metamodelling Messages Conveyed in Five Statistical Mechanical Textbooks from 1936 to 2001

    ERIC Educational Resources Information Center

    Niss, Martin

    2009-01-01

    Modelling is a significant aspect of doing physics and it is important how this activity is taught. This paper focuses on the explicit or implicit messages about modelling conveyed to the student in the treatments of phase transitions in statistical mechanics textbooks at beginning graduate level. Five textbooks from the 1930s to the present are…

  13. Statistical Characterization of the Mechanical Parameters of Intact Rock Under Triaxial Compression: An Experimental Proof of the Jinping Marble

    NASA Astrophysics Data System (ADS)

    Jiang, Quan; Zhong, Shan; Cui, Jie; Feng, Xia-Ting; Song, Leibo

    2016-12-01

    We investigated the statistical characteristics and probability distributions of the mechanical parameters of natural rock using triaxial compression tests. Twenty cores of Jinping marble were tested at each of five levels of confining stress (i.e., 5, 10, 20, 30, and 40 MPa). From these full stress-strain data, we summarized the numerical characteristics and determined the probability distribution forms of several important mechanical parameters, including deformational parameters, characteristic strengths, characteristic strains, and failure angle. The statistical proofs relating to the mechanical parameters of rock presented new information about the marble's probabilistic distribution characteristics. The normal and log-normal distributions were appropriate for describing the random strengths of rock; the coefficients of variation of the peak strengths had no relationship to the confining stress; the only acceptable random distribution for both Young's modulus and Poisson's ratio was the log-normal function; and the cohesive strength had a different probability distribution pattern than the frictional angle. The triaxial tests and statistical analysis also provided experimental evidence for deciding the minimum reliable number of experimental samples and for picking appropriate parameter distributions to use in reliability calculations for rock engineering.
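
    A sketch of the distribution-selection step: fitting normal and log-normal models to twenty peak strengths at one confining stress and checking each with a Kolmogorov-Smirnov test (KS p-values computed with fitted parameters are optimistic, so they serve only as a rough screen). The strength values are simulated stand-ins, not the Jinping data.

        # Compare normal and log-normal fits to a small strength sample.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        strengths = rng.lognormal(mean=np.log(160.0), sigma=0.08, size=20)  # MPa

        mu, sd = stats.norm.fit(strengths)
        shape, loc, scale = stats.lognorm.fit(strengths, floc=0)

        print("normal    KS p =", stats.kstest(strengths, "norm", args=(mu, sd)).pvalue)
        print("lognormal KS p =", stats.kstest(strengths, "lognorm",
                                               args=(shape, loc, scale)).pvalue)
        print("CV of peak strength:", strengths.std(ddof=1) / strengths.mean())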

  14. Statistical Use of Argonaute Expression and RISC Assembly in microRNA Target Identification

    PubMed Central

    Stanhope, Stephen A.; Sengupta, Srikumar; den Boon, Johan; Ahlquist, Paul; Newton, Michael A.

    2009-01-01

    MicroRNAs (miRNAs) posttranscriptionally regulate targeted messenger RNAs (mRNAs) by inducing cleavage or otherwise repressing their translation. We address the problem of detecting m/miRNA targeting relationships in Homo sapiens from microarray data by developing statistical models that are motivated by the biological mechanisms used by miRNAs. The focus of our modeling is the construction, activity, and mediation of RNA-induced silencing complexes (RISCs) competent for targeted mRNA cleavage. We demonstrate that regression models accommodating RISC abundance and controlling for other mediating factors fit the expression profiles of known target pairs substantially better than models based on m/miRNA expressions alone, and lead to verifications of computational target pair predictions that are more sensitive than those based on marginal expression levels. Because our models are fully independent of exogenous results from sequence-based computational methods, they are appropriate for use as either a primary or secondary source of information regarding m/miRNA target pair relationships, especially in conjunction with high-throughput expression studies. PMID:19779550

  15. Statistical mechanics of economics I

    NASA Astrophysics Data System (ADS)

    Kusmartsev, F. V.

    2011-02-01

    We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical inference. This results in a probabilistic description of the market and gives capital, money, income, wealth and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws which govern the correlations between the amount of money, debt, product, prices and number of retailers. We applied the relations found to a study of the evolution of the US economy between 1996 and 2008 and observe that over that time the income of the majority of the population is well described by the Bose-Einstein distribution, whose parameters differ from year to year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates the past crises and predicts the future one.

  16. Single-Case Experimental Designs to Evaluate Novel Technology-Based Health Interventions

    PubMed Central

    Cassidy, Rachel N; Raiff, Bethany R

    2013-01-01

    Technology-based interventions to promote health are expanding rapidly. Assessing the preliminary efficacy of these interventions can be achieved by employing single-case experiments (sometimes referred to as n-of-1 studies). Although single-case experiments are often misunderstood, they offer excellent solutions to address the challenges associated with testing new technology-based interventions. This paper provides an introduction to single-case techniques and highlights advances in developing and evaluating single-case experiments, which help ensure that treatment outcomes are reliable, replicable, and generalizable. These advances include quality control standards, heuristics to guide visual analysis of time-series data, effect size calculations, and statistical analyses. They also include experimental designs to isolate the active elements in a treatment package and to assess the mechanisms of behavior change. The paper concludes with a discussion of issues related to the generality of findings derived from single-case research and how generality can be established through replication and through analysis of behavioral mechanisms. PMID:23399668

  17. Physical mechanism and numerical simulation of the inception of the lightning upward leader

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Qingmin; Lu Xinchang; Shi Wei

    2012-12-15

    The upward leader is a key physical process of the leader progression model of lightning shielding. The inception mechanism and criterion of the upward leader need further understanding and clarification. Based on leader discharge theory, this paper proposes the critical electric field intensity of the stable upward leader (CEFISUL) and characterizes it by the valve electric field intensity on the conductor surface, E_L, which is the basis of a new inception criterion for the upward leader. Through numerical simulation under various physical conditions, we verified that E_L is mainly related to the conductor radius, and data fitting yields the mathematical expression of E_L. We further establish a computational model for the lightning shielding performance of transmission lines based on the proposed CEFISUL criterion, which reproduces the shielding failure rate of typical UHV transmission lines. The model-based calculation results agree well with statistical data from on-site operations, which shows the effectiveness and validity of the CEFISUL criterion.

  18. Biomechanics and energetics of walking in powered ankle exoskeletons using myoelectric control versus mechanically intrinsic control.

    PubMed

    Koller, Jeffrey R; Remy, C David; Ferris, Daniel P

    2018-05-25

    Controllers for assistive robotic devices can be divided into two main categories: controllers using neural signals and controllers using mechanically intrinsic signals. Both approaches are prevalent in research devices, but a direct comparison between the two could provide insight into their relative advantages and disadvantages. We studied subjects walking with robotic ankle exoskeletons using two different control modes: dynamic gain proportional myoelectric control based on soleus muscle activity (neural signal), and timing-based mechanically intrinsic control based on gait events (mechanically intrinsic signal). We hypothesized that subjects would have different measures of metabolic work rate between the two controllers, as we predicted subjects would use each controller in a unique manner due to one being dependent on muscle recruitment and the other not. The two controllers had the same average actuation signal, as we used the control signals from walking with the myoelectric controller to shape the mechanically intrinsic control signal. The difference was that the myoelectric controller allowed step-to-step variation in the actuation signals controlled by the user's soleus muscle recruitment, while the timing-based controller had the same actuation signal with each step regardless of muscle recruitment. We observed no statistically significant difference in metabolic work rate between the two controllers. Subjects walked with 11% less soleus activity during mid and late stance and significantly less peak soleus recruitment when using the timing-based controller than when using the myoelectric controller. While walking with the myoelectric controller, subjects walked with significantly higher average positive and negative total ankle power compared to walking with the timing-based controller. We interpret the reduced ankle power and muscle activity with the timing-based controller relative to the myoelectric controller to result from greater slacking effects. Subjects were able to be less engaged on a muscle level when using a controller driven by mechanically intrinsic signals than when using a controller driven by neural signals, but this had no effect on their metabolic work rate. These results suggest that the type of controller (neural vs. mechanical) is likely to affect how individuals use robotic exoskeletons for therapeutic rehabilitation or human performance augmentation.

  19. On the connection between financial processes with stochastic volatility and nonextensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Queirós, S. M. D.; Tsallis, C.

    2005-11-01

    The GARCH algorithm is the most renowned generalisation of Engle's original proposal for modelling returns, the ARCH process. Both cases are characterised by presenting a time-dependent and correlated variance or volatility. Besides a memory parameter, b (present in ARCH), and an independent and identically distributed noise, ω, GARCH involves another parameter, c, such that, for c = 0, the standard ARCH process is reproduced. In this manuscript we use a generalised noise following a distribution characterised by an index q_n, such that q_n = 1 recovers the Gaussian distribution. Matching the low statistical moments of the GARCH distribution for returns with a q-Gaussian distribution obtained by maximising the entropy S_q = (1 - Σ_i p_i^q)/(q - 1), the basis of nonextensive statistical mechanics, we obtain a sole analytical connection between q and (b, c, q_n) which turns out to be remarkably good when compared with computational simulations. With this result we also derive an analytical approximation for the stationary distribution of the (squared) volatility. Using a generalised Kullback-Leibler relative entropy form based on S_q, we also analyse the degree of dependence between successive returns, z_t and z_{t+1}, of GARCH(1,1) processes. This degree of dependence is quantified by an entropic index, q_op. Our analysis points to the existence of a unique relation between the three entropic indexes q_op, q and q_n of the problem, independent of the value of (b, c).
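
    A sketch of the underlying process in the manuscript's notation, with Gaussian noise (q_n = 1): simulating a GARCH(1,1) series and measuring the excess kurtosis that motivates the q-Gaussian matching. The parameter values are arbitrary but satisfy the stationarity condition b + c < 1; setting c = 0 recovers ARCH(1).

        # GARCH(1,1): sigma_t^2 = a + b*z_{t-1}^2 + c*sigma_{t-1}^2.
        import numpy as np

        rng = np.random.default_rng(8)
        a, b, c, T = 0.1, 0.2, 0.7, 100000
        z = np.zeros(T)
        sig2 = a / (1 - b - c)             # start at the stationary variance
        for t in range(1, T):
            sig2 = a + b * z[t - 1] ** 2 + c * sig2
            z[t] = np.sqrt(sig2) * rng.normal()

        m2, m4 = np.mean(z ** 2), np.mean(z ** 4)
        print("excess kurtosis:", m4 / m2 ** 2 - 3)   # > 0: fatter tails than Gaussian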

  20. Mechanics and statistics of the worm-like chain

    NASA Astrophysics Data System (ADS)

    Marantan, Andrew; Mahadevan, L.

    2018-02-01

    The worm-like chain model is a simple continuum model for the statistical mechanics of a flexible polymer subject to an external force. We offer a tutorial introduction to it using three approaches. First, we use a mesoscopic view, treating a long polymer (in two dimensions) as though it were made of many groups of correlated links or "clinks," allowing us to calculate its average extension as a function of the external force via scaling arguments. We then provide a standard statistical mechanics approach, obtaining the average extension by two different means: the equipartition theorem and the partition function. Finally, we work in a probabilistic framework, taking advantage of the Gaussian properties of the chain in the large-force limit to improve upon the previous calculations of the average extension.
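
    For orientation, the standard three-dimensional worm-like chain results behind the large-force limit discussed above, quoted here as a worked equation (the tutorial's two-dimensional mesoscopic treatment changes the numerical prefactors):

        \frac{\langle x \rangle}{L} \simeq 1 - \sqrt{\frac{k_B T}{4 f \ell_p}}, \qquad \frac{f \ell_p}{k_B T} \gg 1,

    which is the high-force limit of the Marko-Siggia interpolation formula

        \frac{f \ell_p}{k_B T} = \frac{1}{4}\left(1 - \frac{\langle x \rangle}{L}\right)^{-2} - \frac{1}{4} + \frac{\langle x \rangle}{L},

    where L is the contour length, \ell_p the persistence length, and f the applied force.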

  1. Certification Can Count: The Case of Aircraft Mechanics. Issues in Labor Statistics. Summary 02-03.

    ERIC Educational Resources Information Center

    Bureau of Labor Statistics, Washington, DC.

    This document is a summary of aerospace industry technician statistics gathered by the Occupational Employment Statistics Survey for the year 2000 by the Department of Labor, Bureau of Labor Statistics. The data includes the following: (1) a comparison of wages earned by Federal Aviation Administration (FAA) certified and non-FAA certified…

  2. High accuracy in knee alignment and implant placement in unicompartmental medial knee replacement when using patient-specific instrumentation.

    PubMed

    Volpi, P; Prospero, E; Bait, C; Cervellin, M; Quaglia, A; Redaelli, A; Denti, M

    2015-05-01

    The influence of patient-specific instrumentations on the accuracy of unicompartmental medial knee replacement remains unclear. The goal of this study was to examine the ability of patient-specific instrumentation to accurately reproduce postoperatively what the surgeon had planned preoperatively. Twenty consecutive patients (20 knees) who suffered from isolated unicompartmental medial osteoarthritis of the knee and underwent medial knee replacement using newly introduced magnetic resonance imaging-based patient-specific instrumentation were assessed. This assessment recorded the following parameters: (1) the planned and the postoperative mechanical axis acquired through long-leg AP view radiographies; (2) the planned and the postoperative tibial slope acquired by means of standard AP and lateral view radiographies; and (3) the postoperative fit of the implanted components to the bone in coronal and sagittal planes. The hypothesis of the study was that there was no statistically significant difference between postoperative results and preoperatively planned values. The study showed that (1) the difference between the postoperative mechanical axis (mean 1.9° varus ± 1.2° SD) and the planned mechanical axis (mean 1.8° varus ± 1.2° SD) was not statistically significant; (2) the difference between the postoperative tibial slope (mean 5.2° ± 0.6° SD) and the planned tibial slope (mean 5.4° ± 0.6° SD) was statistically significant (p = 0.008); and (3) the postoperative component fit to bone in the coronal and sagittal planes was accurate in all cases; nevertheless, in one knee, all components were implanted one size smaller than preoperatively planned. Moreover, in two additional cases, one size thinner and one size thicker of the polyethylene insert were used. This study suggests that overall patient-specific instrumentation was highly accurate in reproducing postoperatively what the surgeon had planned preoperatively in terms of mechanical axis, tibial slope and component fit to bone. IV.

  3. A statistical mechanics approach to computing rare transitions in multi-stable turbulent geophysical flows

    NASA Astrophysics Data System (ADS)

    Laurie, J.; Bouchet, F.

    2012-04-01

    Many turbulent flows undergo sporadic random transitions after long periods of apparent statistical stationarity. For instance, paths of the Kuroshio [1], the Earth's magnetic field reversal, atmospheric flows [2], MHD experiments [3], 2D turbulence experiments [4,5], and 3D flows [6] show this kind of behavior. Understanding this phenomenon is extremely difficult due to the complexity, the large number of degrees of freedom, and the non-equilibrium nature of these turbulent flows. It is, however, a key issue for many geophysical problems. A straightforward study of these transitions, through a direct numerical simulation of the governing equations, is nearly always impracticable. This is mainly a complexity problem, due to the large number of degrees of freedom involved for genuine turbulent flows, and the extremely long time between two transitions. In this talk, we consider two-dimensional and geostrophic turbulent models, with stochastic forces. We consider regimes where two or more attractors coexist. As an alternative to direct numerical simulation, we propose a non-equilibrium statistical mechanics approach to the computation of this phenomenon. Our strategy is based on large deviation theory [7], derived from a path integral representation of the stochastic process. Among the trajectories connecting two non-equilibrium attractors, we determine the most probable one. Moreover, we also determine the transition rates, and in which cases this most probable trajectory is a typical one. Interestingly, we prove that in the class of models we consider, a mechanism exists for diffusion over sets of connected attractors. For the type of stochastic forces that allows this diffusion, the transition between attractors is not a rare event. It is then very difficult to characterize the flow as bistable. However, for another class of stochastic forces, this diffusion mechanism is prevented, and genuine bistability or multi-stability is observed. We discuss how these results are probably connected to the long-debated existence of multi-stability in the atmosphere and oceans.

  4. Statistical Tools And Artificial Intelligence Approaches To Predict Fracture In Bulk Forming Processes

    NASA Astrophysics Data System (ADS)

    Di Lorenzo, R.; Ingarao, G.; Fonti, V.

    2007-05-01

    The crucial task in the prevention of ductile fracture is the availability of a tool for the prediction of such defect occurrence. The technical literature presents a wide investigation of this topic, and contributions have been made by many authors following different approaches. The main class of approaches concerns the development of fracture criteria: generally, such criteria are expressed by determining a critical value of a damage function which depends on stress and strain paths; ductile fracture is assumed to occur when this critical value is reached during the analysed process. There is a relevant drawback in the utilization of ductile fracture criteria: each criterion usually performs well in the prediction of fracture for particular stress-strain paths, i.e. it works very well for certain processes but may give poor results for others. On the other hand, approaches based on damage mechanics formulations are very effective from a theoretical point of view, but they are complex and their proper calibration is quite difficult. In this paper, two different approaches are investigated to predict fracture occurrence in cold forming operations. The final aim of the proposed method is the achievement of a tool with general reliability, i.e. one able to predict fracture for different forming processes. The proposed approach represents a step forward within a research project focused on the utilization of innovative predictive tools for ductile fracture. The paper presents a comparison between an artificial neural network design procedure and an approach based on statistical tools; both approaches aim to predict fracture occurrence or absence based on a set of stress and strain path data. The approach utilizes experimental data available, for a given material, on fracture occurrence in different processes. More specifically, it consists of the analysis of experimental tests in which fracture occurs, followed by numerical simulations of such processes in order to track the stress-strain paths in the workpiece region where fracture is expected. These data are used to build up a data set employed both to train an artificial neural network and to perform a statistical analysis aimed at predicting fracture occurrence. The statistical tool is properly designed and optimized to recognize fracture occurrence. The reliability and predictive capability of the statistical method were compared with those obtained from an artificial neural network developed to predict fracture occurrence. Moreover, the approach is also validated on forming processes characterized by complex fracture mechanics.

  5. Understanding the drug release mechanism from a montmorillonite matrix and its binary mixture with a hydrophilic polymer using a compartmental modelling approach

    NASA Astrophysics Data System (ADS)

    Choiri, S.; Ainurofiq, A.

    2018-03-01

    Drug release from a montmorillonite (MMT) matrix is a complex process controlled by the swelling of MMT and the interaction between drug and MMT. The aim of this research was to identify a suitable model of the drug release mechanism from MMT and from its binary mixture with a hydrophilic polymer in a controlled release formulation, based on a compartmental modelling approach. Theophylline was used as a model drug and incorporated into MMT, and into a binary mixture with hydroxypropyl methylcellulose (HPMC) as a hydrophilic polymer, by a kneading method. The dissolution test was performed, and the modelling of drug release was assisted by the WinSAAM software. Two models were proposed, based on compartments for the swelling capability and the basal spacing of MMT. Model evaluation was carried out using goodness of fit and statistical parameters, and the models were validated by a cross-validation technique. Drug release from the MMT matrix was regulated by a burst release of unloaded drug, the swelling ability, the basal-spacing compartment of MMT, and the equilibrium between the basal-spacing and swelling compartments. Furthermore, the addition of HPMC to the MMT system altered the presence of the swelling compartment and the equilibrium between the swelling and basal-spacing compartment systems. In addition, the hydrophilic polymer reduced the burst release of unloaded drug.
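
    A sketch of a compartmental release model in the spirit described above: a burst fraction released immediately, a basal-spacing compartment emptying into a swelling compartment, and first-order release from the latter. The topology and rate constants are illustrative assumptions, not the fitted WinSAAM model.

        # Two-compartment drug release with an initial burst fraction.
        import numpy as np
        from scipy.integrate import solve_ivp

        f_burst, k_bs, k_rel = 0.15, 0.05, 0.25    # fraction, 1/h, 1/h

        def rhs(t, y):
            basal, swell = y
            return [-k_bs * basal, k_bs * basal - k_rel * swell]

        t = np.linspace(0, 24, 97)                 # hours (0.25 h grid)
        sol = solve_ivp(rhs, (0, 24), [1.0 - f_burst, 0.0], t_eval=t)
        released = 1.0 - sol.y[0] - sol.y[1]       # burst + released from swelling
        for h in (1, 6, 12, 24):
            print(f"t = {h:2d} h: released fraction = {released[t == h][0]:.2f}")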

  6. SurfKin: an ab initio kinetic code for modeling surface reactions.

    PubMed

    Le, Thong Nguyen-Minh; Liu, Bin; Huynh, Lam K

    2014-10-05

    In this article, we describe a C/C++ program called SurfKin (Surface Kinetics) to construct microkinetic mechanisms for modeling gas-surface reactions. Thermodynamic properties of reaction species are estimated based on density functional theory calculations and statistical mechanics. Rate constants for elementary steps (including adsorption, desorption, and chemical reactions on surfaces) are calculated using the classical collision theory and transition state theory. Methane decomposition and water-gas shift reaction on Ni(111) surface were chosen as test cases to validate the code implementations. The good agreement with literature data suggests this is a powerful tool to facilitate the analysis of complex reactions on surfaces, and thus it helps to effectively construct detailed microkinetic mechanisms for such surface reactions. SurfKin also opens a possibility for designing nanoscale model catalysts. Copyright © 2014 Wiley Periodicals, Inc.
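
    A sketch of the transition-state-theory expression underlying such rate-constant calculations, k = (kB*T/h) * exp(-dG_act/(R*T)). The activation free energy below is a placeholder value; this illustrates the formula only and is not SurfKin code.

        # Eyring transition-state-theory rate constant.
        import numpy as np

        KB = 1.380649e-23    # J/K
        H = 6.62607015e-34   # J*s
        R = 8.314462618      # J/(mol*K)

        def tst_rate(T, dG_act):
            """Rate constant (1/s) for an activation free energy in J/mol."""
            return KB * T / H * np.exp(-dG_act / (R * T))

        for T in (500.0, 700.0, 900.0):
            print(f"{T:.0f} K: k = {tst_rate(T, 100e3):.3e} 1/s")  # dG = 100 kJ/mol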

  7. ABS-FishCount: An Agent-Based Simulator of Underwater Sensors for Measuring the Amount of Fish.

    PubMed

    García-Magariño, Iván; Lacuesta, Raquel; Lloret, Jaime

    2017-11-13

    Underwater sensors provide one of the possibilities to explore oceans, seas, rivers, fish farms and dams, which all together cover most of our planet's area. Simulators can be helpful to test and discover some possible strategies before implementing these in real underwater sensors. This speeds up the development of research theories so that these can be implemented later. In this context, the current work presents an agent-based simulator for defining and testing strategies for measuring the amount of fish by means of underwater sensors. The current approach is illustrated with the definition and assessment of two strategies for measuring fish. One of these two corresponds to a simple control mechanism, while the other is an experimental strategy and includes an implicit coordination mechanism. The experimental strategy showed a statistically significant improvement over the control one in the reduction of errors with a large Cohen's d effect size of 2.55.

  8. Effects of different mechanized soil fertilization methods on corn nutrient accumulation and yield

    NASA Astrophysics Data System (ADS)

    Shi, Qingwen; Bai, Chunming; Wang, Huixin; Wu, Di; Song, Qiaobo; Dong, Zengqi; Gao, Depeng; Dong, Qiping; Cheng, Xin; Zhang, Yahao; Mu, Jiahui; Chen, Qinghong; Liao, Wenqing; Qu, Tianru; Zhang, Chunling; Zhang, Xinyu; Liu, Yifei; Han, Xiaori

    2017-05-01

    Aim: Experiments on mechanized corn soil fertilization were conducted in the Faku demonstration zone. On this basis, we studied the effects of different mechanized soil fertilization measures on corn nutrient accumulation and yield traits in brown soil regions. We also evaluated and optimized the regulation effects of mechanized soil fertilization for the purpose of increasing crop yield and improving production efficiency. Method: Based on a survey of soil background values in the demonstration zone, we collected plant samples during different corn growth periods for measurement and statistical analysis. Conclusions: Mechanically broadcast decomposed cow manure remarkably increased the nitrogen and potassium accumulation of corn at the ripe stage. Crushed stalk returning combined with deep tillage remarkably increased the phosphorus accumulation of corn plants and, compared with top application, remarkably increased corn thousand kernel weight (TKW). Mechanized broadcasting of granular organic fertilizer and crushed stalk returning combined with deep tillage, compared with surface application, boosted corn yield in the demonstration zone.

  9. Retrocausal Effects as a Consequence of Quantum Mechanics Refined to Accommodate the Principle of Sufficient Reason

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stapp, Henry P.

    2011-05-10

    The principle of sufficient reason asserts that anything that happens does so for a reason: no definite state of affairs can come into being unless there is a sufficient reason why that particular thing should happen. This principle is usually attributed to Leibniz, although the first recorded Western philosopher to use it was Anaximander of Miletus. The demand that nature be rational, in the sense that it be compatible with the principle of sufficient reason, conflicts with a basic feature of contemporary orthodox physical theory, namely the notion that nature's response to the probing action of an observer is determined by pure chance, and hence on the basis of absolutely no reason at all. This appeal to pure chance can be deemed to have no rational fundamental place in reason-based Western science. It is argued here, on the basis of the other basic principles of quantum physics, that in a world that conforms to the principle of sufficient reason, the usual quantum statistical rules will naturally emerge at the pragmatic level, in cases where the reason behind nature's choice of response is unknown, but that the usual statistics can become biased in an empirically manifest way when the reason for the choice is empirically identifiable. It is shown here that if the statistical laws of quantum mechanics were to be biased in this way then the basically forward-in-time unfolding of empirical reality described by orthodox quantum mechanics would generate the appearances of backward-time-effects of the kind that have been reported in the scientific literature.

  10. Neonatal Risk Factors for Treatment-Demanding Retinopathy of Prematurity: A Danish National Study.

    PubMed

    Slidsborg, Carina; Jensen, Aksel; Forman, Julie Lyng; Rasmussen, Steen; Bangsgaard, Regitze; Fledelius, Hans Callø; Greisen, Gorm; la Cour, Morten

    2016-04-01

    One goal of the study was to identify "new" statistically independent risk factors for treatment-demanding retinopathy of prematurity (ROP). Another goal was to evaluate whether any new risk factors could explain the increase in the incidence of treatment-demanding ROP over time in Denmark. A retrospective, register-based cohort study. The study included premature infants (n = 6490) born in Denmark from 1997 to 2008. The study sample and the 31 candidate risk factors were identified in 3 national registers. Data were linked through a unique civil registration number. Each of the 31 candidate risk factors was evaluated in univariate analyses, adjusting for known risk factors (i.e., gestational age [GA] at delivery, small for gestational age [SGA], multiple births, and male sex). Significant outcomes were thereafter analyzed in a backward selection multiple logistic regression model. Treatment-demanding ROP and its associations with candidate risk factors. Mechanical ventilation (odds ratio [OR], 2.84; 95% confidence interval [CI], 1.99-4.08; P < 0.01) and blood transfusion (OR, 1.97; 95% CI, 1.20-3.14; P = 0.01) were the only new statistically independent risk factors, in addition to GA at delivery, SGA, multiple births, and male sex. Modification in these prognostic factors for ROP did not cause an increase in treatment-demanding ROP. In a large study population, blood transfusion and mechanical ventilation were the only new statistically independent risk factors to predict the development of treatment-demanding ROP. Modification in the neonatal treatment with mechanical ventilation or blood transfusion did not cause the observed increase in the incidence of preterm infants with treatment-demanding ROP during a recent birth period (2003-2008). Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  11. Harmonic wavelet packet transform for on-line system health diagnosis

    NASA Astrophysics Data System (ADS)

    Yan, Ruqiang; Gao, Robert X.

    2004-07-01

    This paper presents a new approach to on-line health diagnosis of mechanical systems, based on the wavelet packet transform. Specifically, signals acquired from vibration sensors are decomposed into sub-bands by means of the discrete harmonic wavelet packet transform (DHWPT). Based on the Fisher linear discriminant criterion, features in the selected sub-bands are then used as inputs to three classifiers (one nearest-neighbor-based and two neural-network-based) for system health condition assessment. Experimental results have confirmed that, compared to the conventional approach in which statistical parameters of the raw signals are used, the presented approach enables a higher signal-to-noise ratio for more effective and intelligent use of the sensory information, thus leading to more accurate system health diagnosis.
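
    A minimal sketch of the sub-band feature extraction idea, using PyWavelets' generic discrete wavelet packet transform as a stand-in (pywt does not provide the harmonic wavelet packet transform itself, and the signal below is a placeholder):

    ```python
    import numpy as np
    import pywt

    signal = np.random.default_rng(0).standard_normal(1024)  # placeholder vibration record
    wp = pywt.WaveletPacket(signal, wavelet="db4", maxlevel=4)
    nodes = wp.get_level(4, order="freq")        # sub-bands ordered by frequency
    energies = np.array([np.sum(np.asarray(n.data) ** 2) for n in nodes])
    features = energies / energies.sum()         # normalized sub-band energies
    # In the paper's scheme, sub-bands would next be ranked by a Fisher
    # discriminant criterion before feeding the classifiers.
    ```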

  12. ZERODUR strength modeling with Weibull statistical distributions

    NASA Astrophysics Data System (ADS)

    Hartmann, Peter

    2016-07-01

    The decisive influence on the breakage strength of brittle materials such as the low-expansion glass ceramic ZERODUR is the surface condition. For polished or etched surfaces, what matters is whether micro cracks are present and how deep they are. Ground surfaces have many micro cracks caused by the generation process, so only the depths of the micro cracks are relevant. In any case, the presence and depths of micro cracks are statistical in nature. The Weibull distribution, based on the weakest-link ansatz, is the model traditionally used for the representation of such data sets. Whether the two- or three-parameter Weibull distribution should be used for data representation and reliability prediction depends on the underlying crack generation mechanisms. Before choosing the model for a specific evaluation, some checks should be done. Is there only one mechanism present, or is it to be expected that an additional mechanism might contribute deviating results? For ground surfaces, the main mechanism is the diamond grains' action on the surface. However, grains breaking free from their bonding might be moved by the tool across the surface, introducing a slightly deeper crack. These scratches cannot be expected to follow the same statistical distribution as the grinding process, so describing them with the same distribution parameters is not adequate; a dedicated discussion should be performed before including them. If additional information is available that influences the selection of the model, for example the existence of a maximum crack depth, this should also be taken into account. Micro cracks introduced by small diamond grains on tools working with limited forces cannot be arbitrarily deep, so for data obtained with such surfaces the existence of a threshold breakage stress should be part of the hypothesis. This leads to the use of the three-parameter Weibull distribution. A differentiation based on the data set alone, without preexisting information, is possible but requires a large data set: with only 20 specimens per sample such differentiation is not possible; it requires 100 or more specimens per set, the more the better. The validity of the statistical evaluation methods is discussed with several examples. These considerations are of special importance because of their consequences for the prognosis methods and results. In particular, the use of the two-parameter Weibull distribution for high-strength surfaces has led to unrealistic results. Extrapolation down to a low acceptable probability of failure covers a wide range in which no data points exist and is mainly influenced by the slope determined by the high-strength specimens. In the past this misconception has prevented the use of brittle materials for stress loads which they could have endured easily.
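
    As a sketch of the two- versus three-parameter choice discussed above (synthetic stresses; in scipy's weibull_min the loc parameter plays the role of the threshold stress, and with small samples the threshold is poorly identified):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    stress = stats.weibull_min.rvs(c=5.0, loc=40.0, scale=60.0,
                                   size=100, random_state=rng)  # MPa, synthetic

    c2, _, s2 = stats.weibull_min.fit(stress, floc=0)   # two-parameter fit
    c3, thr, s3 = stats.weibull_min.fit(stress)         # three-parameter fit
    print(f"2p: shape={c2:.2f}, scale={s2:.1f}")
    print(f"3p: shape={c3:.2f}, threshold={thr:.1f}, scale={s3:.1f}")
    ```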

  13. A Methodology for Conducting Space Utilization Studies within Department of Defense Medical Facilities

    DTIC Science & Technology

    1992-07-01

    database programs, such as dBase or Microsoft Excel, to yield statistical reports that can profile the health care facility. Ladeen (1989) feels that the...service specific space status report would be beneficial to the specific service(s) under study, it would not provide sufficient data for facility-wide...change in the Master Space Plan. The revised methodology also provides a mechanism and forum for space management education within the facility. The

  14. Statistical-Mechanics-Inspired Optimization of Sensor Field Configuration for Detection of Mobile Targets (PREPRINT)

    DTIC Science & Technology

    2010-11-01

    expected target motion. Along this line, Wettergren [5] analyzed the performance of track-before-detect schemes for sensor networks. Furthermore...addressed by Baumgartner and Ferrari [11] for the reorganization of the sensor field to achieve maximum coverage. The track-before-detect-based optimal...confirming a target. In accordance with the track-before-detect paradigm [4], a moving target is detected if kd (typically kd = 3 or 4) sensors detect

  15. Thermodynamic evolution far from equilibrium

    NASA Astrophysics Data System (ADS)

    Khantuleva, Tatiana A.

    2018-05-01

    The presented model of thermodynamic evolution of an open system far from equilibrium is based on the modern results of nonequilibrium statistical mechanics, the nonlocal theory of nonequilibrium transport developed by the author and the Speed Gradient principle introduced in the theory of adaptive control. Transition to a description of the system internal structure evolution at the mesoscopic level allows a new insight at the stability problem of non-equilibrium processes. The new model is used in a number of specific tasks.

  16. Unbiased estimators for spatial distribution functions of classical fluids

    NASA Astrophysics Data System (ADS)

    Adib, Artur B.; Jarzynski, Christopher

    2005-01-01

    We use a statistical-mechanical identity closely related to the familiar virial theorem, to derive unbiased estimators for spatial distribution functions of classical fluids. In particular, we obtain estimators for both the fluid density ρ(r) in the vicinity of a fixed solute and the pair correlation g(r) of a homogeneous classical fluid. We illustrate the utility of our estimators with numerical examples, which reveal advantages over traditional histogram-based methods of computing such distributions.
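
    For contrast, a minimal version of the traditional histogram estimator of g(r) that the paper's virial-based estimators improve upon (random ideal-gas coordinates in a periodic box, so g(r) ≈ 1 by construction):

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    N, L = 200, 10.0
    pos = rng.random((N, 3)) * L                 # ideal-gas configuration

    d = pos[:, None, :] - pos[None, :, :]
    d -= L * np.round(d / L)                     # minimum-image convention
    r = np.linalg.norm(d, axis=-1)[np.triu_indices(N, k=1)]

    edges = np.linspace(0.1, L / 2, 50)
    hist, edges = np.histogram(r, bins=edges)
    rc = 0.5 * (edges[1:] + edges[:-1])
    shell = 4 * np.pi * rc**2 * np.diff(edges)   # shell volumes
    g = hist / (0.5 * N * (N / L**3) * shell)    # normalized pair counts, ~1 here
    ```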

  17. Factors that enable and hinder the implementation of projects in the alcohol and other drug field.

    PubMed

    MacLean, Sarah; Berends, Lynda; Hunter, Barbara; Roberts, Bridget; Mugavin, Janette

    2012-02-01

    Few studies systematically explore elements of successful project implementation across a range of alcohol and other drug (AOD) activities. This paper provides an evidence base to inform project implementation in the AOD field. We accessed records for 127 completed projects funded by the Alcohol, Education and Rehabilitation Foundation from 2002 to 2008. An adapted realist synthesis methodology enabled us to develop categories of enablers and barriers to successful project implementation, and to identify factors statistically associated with successful project implementation, defined as meeting all funding objectives. Thematic analysis of eight case study projects allowed detailed exploration of findings. Nine enabler and 10 barrier categories were identified. Those most frequently reported as both barriers and enablers concerned partnerships with external agencies and communities, staffing and project design. Achieving supportive relationships with partner agencies and communities, employing skilled staff and implementing consumer or participant input mechanisms were statistically associated with successful project implementation. The framework described here will support development of evidence-based project funding guidelines and project performance indicators. The study provides evidence that investing project hours and resources to develop robust relationships with project partners and communities, implementing mechanisms for consumer or participant input and attracting skilled staff are legitimate and important activities, not just in themselves but because they potentially influence achievement of project funding objectives. © 2012 The Authors. ANZJPH © 2012 Public Health Association of Australia.

  18. Do the methods used to analyse missing data really matter? An examination of data from an observational study of Intermediate Care patients.

    PubMed

    Kaambwa, Billingsley; Bryan, Stirling; Billingham, Lucinda

    2012-06-27

    Missing data are a common statistical problem in healthcare datasets from populations of older people. Some argue that arbitrarily assuming the mechanism responsible for the missingness, and therefore the method for dealing with it, is not the best option, but is this always true? This paper explores what happens when extra information that suggests that a particular mechanism is responsible for missing data is disregarded and methods for dealing with the missing data are chosen arbitrarily. Regression models based on 2,533 intermediate care (IC) patients from the largest evaluation of IC done and published in the UK to date were used to explain variation in costs, EQ-5D and Barthel index. Three methods for dealing with missingness were utilised, each assuming a different mechanism as being responsible for the missing data: complete case analysis (assuming data are missing completely at random, MCAR), multiple imputation (assuming data are missing at random, MAR) and a Heckman selection model (assuming data are missing not at random, MNAR). Differences in results were gauged by examining the signs of coefficients as well as the sizes of both coefficients and associated standard errors. Extra information strongly suggested that missing cost data were MCAR. The results show that MCAR- and MAR-based methods yielded similar results, with the sizes of most coefficients and standard errors differing by less than 3.4%, while those based on MNAR methods were statistically different (up to 730% bigger). Significant variables in all regression models also had the same direction of influence on costs. All three mechanisms of missingness were shown to be potential causes of the missing EQ-5D and Barthel data. The method chosen to deal with missing data did not seem to have any significant effect on the results for these data, as the methods led to broadly similar conclusions, with sizes of coefficients and standard errors differing by less than 54% and 322%, respectively. Arbitrary selection of methods to deal with missing data should be avoided. Using extra information gathered during the data collection exercise about the cause of missingness to guide this selection would be more appropriate.
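
    A small sketch of two of the three analyses on synthetic data: complete-case regression (an MCAR assumption) versus multiple imputation with statsmodels' MICE (a MAR assumption); the Heckman selection step is not shown:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.imputation import mice

    rng = np.random.default_rng(2)
    n = 500
    x = rng.standard_normal(n)
    y = 2.0 + 1.5 * x + rng.standard_normal(n)
    y[rng.random(n) < 0.3] = np.nan      # 30% of y missing, completely at random
    df = pd.DataFrame({"y": y, "x": x})

    # Complete-case analysis (valid under MCAR): drop incomplete rows.
    cc = df.dropna()
    print(sm.OLS(cc["y"], sm.add_constant(cc["x"])).fit().params)

    # Multiple imputation (valid under MAR): pool over imputed datasets.
    imp = mice.MICEData(df)
    mi = mice.MICE("y ~ x", sm.OLS, imp).fit(n_burnin=10, n_imputations=10)
    print(mi.summary())
    ```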

  19. Journey Through Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Yang, C. N.

    2013-05-01

    My first involvement with statistical mechanics and the many body problem was when I was a student at The National Southwest Associated University in Kunming during the war. At that time Professor Wang Zhu-Xi had just come back from Cambridge, England, where he was a student of Fowler, and his thesis was on phase transitions, a hot topic at that time, and still a very hot topic today...

  20. Grand canonical validation of the bipartite international trade network.

    PubMed

    Straka, Mika J; Caldarelli, Guido; Saracco, Fabio

    2017-08-01

    Devising strategies for economic development in a globally competitive landscape requires a solid and unbiased understanding of countries' technological advancements and similarities among export products. Both can be addressed through the bipartite representation of the International Trade Network. In this paper, we apply the recently proposed grand canonical projection algorithm to uncover country and product communities. Contrary to past endeavors, our methodology, based on information theory, creates monopartite projections in an unbiased and analytically tractable way. Single links between countries or products represent statistically significant signals, which are not accounted for by null models such as the bipartite configuration model. We find stable country communities reflecting the socioeconomic distinction in developed, newly industrialized, and developing countries. Furthermore, we observe product clusters based on the aforementioned country groups. Our analysis reveals the existence of a complicated structure in the bipartite International Trade Network: apart from the diversification of export baskets from the most basic to the most exclusive products, we observe a statistically significant signal of an export specialization mechanism towards more sophisticated products.

  1. Grand canonical validation of the bipartite international trade network

    NASA Astrophysics Data System (ADS)

    Straka, Mika J.; Caldarelli, Guido; Saracco, Fabio

    2017-08-01

    Devising strategies for economic development in a globally competitive landscape requires a solid and unbiased understanding of countries' technological advancements and similarities among export products. Both can be addressed through the bipartite representation of the International Trade Network. In this paper, we apply the recently proposed grand canonical projection algorithm to uncover country and product communities. Contrary to past endeavors, our methodology, based on information theory, creates monopartite projections in an unbiased and analytically tractable way. Single links between countries or products represent statistically significant signals, which are not accounted for by null models such as the bipartite configuration model. We find stable country communities reflecting the socioeconomic distinction in developed, newly industrialized, and developing countries. Furthermore, we observe product clusters based on the aforementioned country groups. Our analysis reveals the existence of a complicated structure in the bipartite International Trade Network: apart from the diversification of export baskets from the most basic to the most exclusive products, we observe a statistically significant signal of an export specialization mechanism towards more sophisticated products.
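
    A much-simplified flavor of statistically validated projection (co-occurrence tested against a hypergeometric null with a fixed threshold), standing in for the paper's grand canonical (BiCM) algorithm, which it does not reproduce:

    ```python
    import numpy as np
    from scipy.stats import hypergeom

    rng = np.random.default_rng(3)
    B = rng.random((20, 50)) < 0.2          # binary country x product matrix, synthetic
    n_products = B.shape[1]

    def validated_link(i, j, alpha=0.01):
        ki, kj = int(B[i].sum()), int(B[j].sum())       # export degrees
        co = int(np.logical_and(B[i], B[j]).sum())      # shared products
        # P(overlap >= observed) if ki and kj products were placed at random
        p = hypergeom.sf(co - 1, n_products, ki, kj)
        return p < alpha

    print(validated_link(0, 1))
    ```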

  2. Lévy meets poisson: a statistical artifact may lead to erroneous recategorization of Lévy walk as Brownian motion.

    PubMed

    Gautestad, Arild O

    2013-03-01

    The flow of GPS data on animal space is challenging old paradigms, such as the issue of the scale-free Lévy walk versus scale-specific Brownian motion. Since these movement classes often require different protocols with respect to ecological analyses, further theoretical development in this field is important. I describe central concepts such as scale-specific versus scale-free movement and the difference between mechanistic and statistical-mechanical levels of analysis. Next, I report how a specific sampling scheme may have produced much confusion: a Lévy walk may be wrongly categorized as Brownian motion if the duration of a move, or bout, is used as a proxy for step length and a move is subjectively defined. Hence, the categorization and recategorization of movement class compliance surrounding the Lévy walk controversy may have been based on a statistical artifact. This issue may be avoided by collecting relocations at a fixed rate at a temporal scale that minimizes over- and undersampling.
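
    A crude numerical illustration of how a sampling protocol can erase Lévy signatures (here fixed-rate oversampling of a constant-speed Lévy walk, which chops every long move into near-constant v·dt displacements; the abstract's specific artifact concerns move durations used as step-length proxies):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    v, dt = 1.0, 1.0                          # speed and sampling interval
    durations = rng.pareto(1.5, 5000) + 1.0   # heavy-tailed move durations
    angles = rng.uniform(0.0, 2 * np.pi, durations.size)

    # Positions at move boundaries, then sampled on a fixed time grid.
    t_end = np.cumsum(durations)
    t_start = t_end - durations
    x_start = np.concatenate(([0.0], np.cumsum(v * durations * np.cos(angles))[:-1]))
    y_start = np.concatenate(([0.0], np.cumsum(v * durations * np.sin(angles))[:-1]))
    t_grid = np.arange(0.0, t_end[-1], dt)
    k = np.searchsorted(t_end, t_grid, side="right")   # active move at each time
    xs = x_start[k] + v * (t_grid - t_start[k]) * np.cos(angles[k])
    ys = y_start[k] + v * (t_grid - t_start[k]) * np.sin(angles[k])

    steps = np.hypot(np.diff(xs), np.diff(ys))
    # True move lengths are heavy-tailed; fixed-rate steps are capped at v * dt.
    print("longest true move:", v * durations.max(), "longest sampled step:", steps.max())
    ```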

  3. Superstatistical Energy Distributions of an Ion in an Ultracold Buffer Gas

    NASA Astrophysics Data System (ADS)

    Rouse, I.; Willitsch, S.

    2017-04-01

    An ion in a radio frequency ion trap interacting with a buffer gas of ultracold neutral atoms is a driven dynamical system which has been found to develop a nonthermal energy distribution with a power law tail. The exact analytical form of this distribution is unknown, but has often been represented empirically by q-exponential (Tsallis) functions. Based on the concepts of superstatistics, we introduce a framework for the statistical mechanics of an ion trapped in an rf field subject to collisions with a buffer gas. We derive analytic ion secular energy distributions from first principles both neglecting and including the effects of the thermal energy of the buffer gas. For a buffer gas with a finite temperature, we prove that Tsallis statistics emerges from the combination of a constant heating term and multiplicative energy fluctuations. We show that the resulting distributions essentially depend on experimentally controllable parameters paving the way for an accurate control of the statistical properties of ion-atom hybrid systems.
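
    The q-exponential (Tsallis) form mentioned above, as it is typically written; q → 1 recovers the Boltzmann exponential, while q > 1 gives a power-law tail (parameter values below are illustrative):

    ```python
    import numpy as np

    def q_exponential(E, q, T):
        """Unnormalized Tsallis weight exp_q(-E/T); q = 1 is the Boltzmann case."""
        if np.isclose(q, 1.0):
            return np.exp(-E / T)
        base = 1.0 + (q - 1.0) * E / T        # = 1 - (1 - q) * (-E / T)
        return np.clip(base, 0.0, None) ** (1.0 / (1.0 - q))

    E = np.linspace(0.0, 50.0, 501)
    boltzmann = q_exponential(E, 1.0, 5.0)
    heavy_tail = q_exponential(E, 1.2, 5.0)   # decays as E ** (-1 / (q - 1))
    ```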

  4. Learning Across Senses: Cross-Modal Effects in Multisensory Statistical Learning

    PubMed Central

    Mitchel, Aaron D.; Weiss, Daniel J.

    2014-01-01

    It is currently unknown whether statistical learning is supported by modality-general or modality-specific mechanisms. One issue within this debate concerns the independence of learning in one modality from learning in other modalities. In the present study, the authors examined the extent to which statistical learning across modalities is independent by simultaneously presenting learners with auditory and visual streams. After establishing baseline rates of learning for each stream independently, they systematically varied the amount of audiovisual correspondence across 3 experiments. They found that learners were able to segment both streams successfully only when the boundaries of the audio and visual triplets were in alignment. This pattern of results suggests that learners are able to extract multiple statistical regularities across modalities provided that there is some degree of cross-modal coherence. They discuss the implications of their results in light of recent claims that multisensory statistical learning is guided by modality-independent mechanisms. PMID:21574745

  5. Statistical process control applied to mechanized peanut sowing as a function of soil texture.

    PubMed

    Zerbato, Cristiano; Furlani, Carlos Eduardo Angeli; Ormond, Antonio Tassio Santana; Gírio, Lucas Augusto da Silva; Carneiro, Franciele Morlin; da Silva, Rouverson Pereira

    2017-01-01

    The successful establishment of agricultural crops depends on sowing quality, machinery performance, soil type and conditions, among other factors. This study evaluates the operational quality of mechanized peanut sowing in three soil types (sand, silt, and clay) with variable moisture contents. The experiment was conducted in three locations in the state of São Paulo, Brazil. The track-sampling scheme was used for 80 sampling locations of each soil type. Descriptive statistics and statistical process control (SPC) were used to evaluate the quality indicators of mechanized peanut sowing. The variables had normal distributions and were stable from the viewpoint of SPC. The best performance for peanut sowing density, normal spacing, and the initial seedling growing stand was found for clayey soil followed by sandy soil and then silty soil. Sandy or clayey soils displayed similar results regarding sowing depth, which was deeper than in the silty soil. Overall, the texture and the moisture of clayey soil provided the best operational performance for mechanized peanut sowing.
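
    A minimal individuals-chart check of the kind SPC relies on, with synthetic sowing-depth data (the moving-range estimate of sigma and the 3-sigma limits are the textbook construction, not the paper's exact charts):

    ```python
    import numpy as np

    depth = np.random.default_rng(5).normal(5.0, 0.4, 80)   # sowing depth, cm
    center = depth.mean()
    sigma_hat = np.abs(np.diff(depth)).mean() / 1.128       # moving-range estimate (d2 for n=2)
    ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat
    out = np.flatnonzero((depth > ucl) | (depth < lcl))
    print(f"UCL={ucl:.2f} cm, LCL={lcl:.2f} cm, out-of-control points: {out}")
    ```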

  6. Fundamentals of poly(lactic acid) microstructure, crystallization behavior, and properties

    NASA Astrophysics Data System (ADS)

    Kang, Shuhui

    Poly(lactic acid) is an environmentally-benign biodegradable and sustainable thermoplastic material, which has found broad applications as food packaging films and as non-woven fibers. The crystallization and deformation mechanisms of the polymer are largely determined by the distribution of conformation and configuration. Knowledge of these mechanisms is needed to understand the mechanical and thermal properties on which processing conditions mainly depend. In conjunction with laser light scattering, Raman spectroscopy and normal coordinate analysis are used in this thesis to elucidate these properties. Vibrational spectroscopic theory, Flory's rotational isomeric state (RIS) theory, Gaussian chain statistics and statistical mechanics are used to relate experimental data to molecular chain structure. A refined RIS model is proposed, chain rigidity recalculated and chain statistics discussed. A Raman spectroscopic characterization method for crystalline and amorphous phase orientation has been developed. A shrinkage model is also proposed to interpret the dimensional stability for fibers and uni- or biaxially stretched films. A study of stereocomplexation formed by poly(l-lactic acid) and poly(d-lactic acid) is also presented.

  7. Statistical process control applied to mechanized peanut sowing as a function of soil texture

    PubMed Central

    Furlani, Carlos Eduardo Angeli; da Silva, Rouverson Pereira

    2017-01-01

    The successful establishment of agricultural crops depends on sowing quality, machinery performance, soil type and conditions, among other factors. This study evaluates the operational quality of mechanized peanut sowing in three soil types (sand, silt, and clay) with variable moisture contents. The experiment was conducted in three locations in the state of São Paulo, Brazil. The track-sampling scheme was used for 80 sampling locations of each soil type. Descriptive statistics and statistical process control (SPC) were used to evaluate the quality indicators of mechanized peanut sowing. The variables had normal distributions and were stable from the viewpoint of SPC. The best performance for peanut sowing density, normal spacing, and the initial seedling growing stand was found for clayey soil followed by sandy soil and then silty soil. Sandy or clayey soils displayed similar results regarding sowing depth, which was deeper than in the silty soil. Overall, the texture and the moisture of clayey soil provided the best operational performance for mechanized peanut sowing. PMID:28742095

  8. A practical approach for the scale-up of roller compaction process.

    PubMed

    Shi, Weixian; Sprockel, Omar L

    2016-09-01

    An alternative approach to the scale-up of ribbon formation during roller compaction was investigated, requiring only one batch at the commercial scale to set the operational conditions. The scale-up of ribbon formation was based on a probability method, which was sufficient to describe the mechanism of ribbon formation at both scales. In this method, a statistical relationship between roller compaction parameters and ribbon attributes (thickness and density) was first defined with a DoE using a pilot Alexanderwerk WP120 roller compactor. While the milling speed was included in the design, it had no practical effect on granule properties within the study range despite its statistical significance. The statistical relationship was then adapted to a commercial Alexanderwerk WP200 roller compactor with one experimental run, which served as a calibration of the statistical model parameters. The proposed transfer method was then confirmed by conducting a mapping study on the Alexanderwerk WP200 using a factorial DoE, which showed a match between the predictions and the verification experiments. The study demonstrates the applicability of the roller compaction transfer method using the statistical model from the development scale calibrated with one experimental point at the commercial scale. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Fourth-power law structure of the shock wave fronts in metals and ceramics

    NASA Astrophysics Data System (ADS)

    Bayandin, Yuriy; Naimark, Oleg; Saveleva, Natalia

    2017-06-01

    Plate impact experiments have been performed on solids for the last fifty years. It has been established that the dependence between the strain rate and the shock wave amplitude for metals and ceramics is expressed by a fourth-power law. The present study focuses on the theoretical investigation and numerical simulation of plane shock wave propagation in metals and ceramics. A statistically based constitutive model of a solid with defects (microcracks and microshears) was developed to provide the relation between damage-induced mechanisms of structural relaxation, thermally activated plastic flow, and material reactions under extreme loading conditions. An original approach based on wide-range constitutive equations was proposed for the numerical simulation of multiscale damage-failure transition mechanisms and plane shock wave propagation in solids with defects in the strain-rate range 10³-10⁸ s⁻¹. It was shown that mechanisms of plastic relaxation and damage-failure transitions are linked to the multiscale kinetics of defects, leading to the self-similar nature of shock wave fronts in metals and ceramics. The work was supported by the Russian Science Foundation (Project No. 14-19-01173).
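
    For reference, the fourth-power law referred to above is commonly written in the Swegle-Grady form, relating the steady-wave strain rate to the shock stress amplitude with a material-dependent prefactor:

    ```latex
    \dot{\varepsilon} \propto \sigma^{4}
    ```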

  10. InGaAs tunnel diodes for the calibration of semi-classical and quantum mechanical band-to-band tunneling models

    NASA Astrophysics Data System (ADS)

    Smets, Quentin; Verreck, Devin; Verhulst, Anne S.; Rooyackers, Rita; Merckling, Clément; Van De Put, Maarten; Simoen, Eddy; Vandervorst, Wilfried; Collaert, Nadine; Thean, Voon Y.; Sorée, Bart; Groeseneken, Guido; Heyns, Marc M.

    2014-05-01

    Promising predictions have been made for III-V tunnel field-effect transistors (TFETs), but there is still uncertainty about the parameters used in the band-to-band tunneling models. Therefore, two simulators are calibrated in this paper: the first uses a semi-classical tunneling model based on Kane's formalism, and the second is a quantum mechanical simulator implemented with an envelope function formalism. The calibration is done for In0.53Ga0.47As using several p+/intrinsic/n+ diodes with different intrinsic region thicknesses. The dopant profile is determined by SIMS and capacitance-voltage measurements. Error bars are based on statistical and systematic uncertainties in the measurement techniques. The obtained parameters are in close agreement with theoretically predicted values and validate the semi-classical and quantum mechanical models. Finally, the models are applied to predict the input characteristics of In0.53Ga0.47As n- and p-line TFETs, with the n-line TFET showing competitive performance compared to the MOSFET.

  11. Evaluation of phenyl-propanedione on yellowing and chemical-mechanical properties of experimental dental resin-based materials.

    PubMed

    Oliveira, Dayane Carvalho Ramos Salles de; Souza-Junior, Eduardo José; Dobson, Adam; Correr, Ana Rosa Costa; Brandt, William Cunha; Sinhoreti, Mário Alexandre Coelho

    2016-01-01

    To evaluate the influence of phenyl-propanedione on yellowing and chemical-mechanical properties of experimental resin-based materials photoactivated using different light curing units (LCUs). Experimental resin-based materials with the same organic matrix (60:40 wt% BisGMA:TEGDMA) were mechanically blended using a centrifugal mixing device. To this blend, different photoinitiator systems were added in equimolar concentrations, with aliphatic amine doubled by wt%: 0.4 wt% CQ; 0.38 wt% PPD; or 0.2 wt% CQ and 0.19 wt% PPD. The degree of conversion (DC), flexural strength (FS), Young's modulus (YM), Knoop hardness (KNH), crosslinking density (CLD), and yellowing (Y) were evaluated (n=10). All samples were light cured with one of the following LCUs: a halogen lamp (XL 2500), a monowave LED (Radii), or a polywave LED (Valo), at 16 J/cm2. The results were analysed by two-way ANOVA and Tukey's test (α=0.05). No statistical differences were found between the different photoinitiator systems for the KNH, CLD, FS, and YM properties (p≥0.05). The PPD/CQ association showed higher DC values than the isolated CQ and PPD systems when photoactivated by a polywave LED (p≤0.05). Y values were highest for the CQ system compared with the PPD systems (p≤0.05). The isolated PPD system promoted similar chemical and mechanical properties and less yellowing compared with the isolated CQ system, regardless of the LCU used.

  12. APPLICATION OF STATISTICAL ENERGY ANALYSIS TO VIBRATIONS OF MULTI-PANEL STRUCTURES.

    DTIC Science & Technology

    cylindrical shell are compared with predictions obtained from statistical energy analysis. Generally good agreement is observed. The flow of mechanical...the coefficients of proportionality between power flow and average modal energy difference, which one must know in order to apply statistical energy analysis. No

  13. Modelling multiple sources of dissemination bias in meta-analysis.

    PubMed

    Bowden, Jack; Jackson, Dan; Thompson, Simon G

    2010-03-30

    Asymmetry in the funnel plot for a meta-analysis suggests the presence of dissemination bias. This may be caused by publication bias through the decisions of journal editors, by selective reporting of research results by authors or by a combination of both. Typically, study results that are statistically significant or have larger estimated effect sizes are more likely to appear in the published literature, hence giving a biased picture of the evidence-base. Previous statistical approaches for addressing dissemination bias have assumed only a single selection mechanism. Here we consider a more realistic scenario in which multiple dissemination processes, involving both the publishing authors and journals, are operating. In practical applications, the methods can be used to provide sensitivity analyses for the potential effects of multiple dissemination biases operating in meta-analysis.
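
    For orientation, the classic single-mechanism diagnostic that this work generalizes is Egger's regression: regress the standardized effects on precision, and a nonzero intercept suggests funnel asymmetry (the numbers below are illustrative):

    ```python
    import numpy as np
    import statsmodels.api as sm

    effects = np.array([0.30, 0.45, 0.12, 0.60, 0.25, 0.55])  # illustrative estimates
    se = np.array([0.10, 0.20, 0.08, 0.30, 0.12, 0.25])       # their standard errors
    z, precision = effects / se, 1.0 / se
    fit = sm.OLS(z, sm.add_constant(precision)).fit()
    print("intercept:", fit.params[0], "p-value:", fit.pvalues[0])
    ```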

  14. Quantum formalism for classical statistics

    NASA Astrophysics Data System (ADS)

    Wetterich, C.

    2018-06-01

    In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.
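
    A minimal concrete instance of this picture, assuming a 1D Ising chain: the boundary information is carried into the bulk by a wave-function-like vector that the transfer matrix updates from one site (hypersurface) to the next:

    ```python
    import numpy as np

    beta_J = 0.5
    T = np.array([[np.exp(beta_J), np.exp(-beta_J)],
                  [np.exp(-beta_J), np.exp(beta_J)]])   # 1D Ising transfer matrix

    q = np.array([1.0, 0.0])        # boundary "wave function": spin fixed up
    for _ in range(20):             # carry the information 20 sites into the bulk
        q = T @ q
        q /= np.linalg.norm(q)      # keep only the local probabilistic information
    print(q)                        # relaxes toward the dominant eigenvector
    ```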

  15. Lagrangian statistics across the turbulent-nonturbulent interface in a turbulent plane jet.

    PubMed

    Taveira, Rodrigo R; Diogo, José S; Lopes, Diogo C; da Silva, Carlos B

    2013-10-01

    Lagrangian statistics from millions of particles are used to study the turbulent entrainment mechanism in a direct numerical simulation of a turbulent plane jet at Re(λ) ≈ 110. The particles (tracers) are initially seeded at the irrotational region of the jet near the turbulent shear layer and are followed as they are drawn into the turbulent region across the turbulent-nonturbulent interface (TNTI), allowing the study of the enstrophy buildup and thereby characterizing the turbulent entrainment mechanism in the jet. The use of Lagrangian statistics following fluid particles gives a more correct description of the entrainment mechanism than in previous works since the statistics in relation to the TNTI position involve data from the trajectories of the entraining fluid particles. The Lagrangian statistics for the particles show the existence of a velocity jump and a characteristic vorticity jump (with a thickness which is one order of magnitude greater than the Kolmogorov microscale), in agreement with previous results using Eulerian statistics. The particles initially acquire enstrophy by viscous diffusion and later by enstrophy production, which becomes "active" only deep inside the turbulent region. Both enstrophy diffusion and production near the TNTI differ substantially from inside the turbulent region. Only about 1% of all particles find their way into pockets of irrotational flow engulfed into the turbulent shear layer region, indicating that "engulfment" is not significant for the present flow, indirectly suggesting that the entrainment is largely due to "nibbling" small-scale mechanisms acting along the entire TNTI surface. Probability density functions of particle positions suggests that the particles spend more time crossing the region near the TNTI than traveling inside the turbulent region, consistent with the particles moving tangent to the interface around the time they cross it.

  16. Three-Dimensional Color Code Thresholds via Statistical-Mechanical Mapping

    NASA Astrophysics Data System (ADS)

    Kubica, Aleksander; Beverland, Michael E.; Brandão, Fernando; Preskill, John; Svore, Krysta M.

    2018-05-01

    Three-dimensional (3D) color codes have advantages for fault-tolerant quantum computing, such as protected quantum gates with relatively low overhead and robustness against imperfect measurement of error syndromes. Here we investigate the storage threshold error rates for bit-flip and phase-flip noise in the 3D color code (3DCC) on the body-centered cubic lattice, assuming perfect syndrome measurements. In particular, by exploiting a connection between error correction and statistical mechanics, we estimate the thresholds for 1D stringlike and 2D sheetlike logical operators to be p_3DCC^(1) ≃ 1.9% and p_3DCC^(2) ≃ 27.6%. We obtain these results by using parallel tempering Monte Carlo simulations to study the disorder-temperature phase diagrams of two new 3D statistical-mechanical models: the four- and six-body random coupling Ising models.

  17. A new paradigm for the molecular basis of rubber elasticity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanson, David E.; Barber, John L.

    The molecular basis for rubber elasticity is arguably the oldest and one of the most important questions in the field of polymer physics. The theoretical investigation of rubber elasticity began in earnest almost a century ago with the development of analytic thermodynamic models, based on simple, highly-symmetric configurations of so-called Gaussian chains, i.e. polymer chains that obey Markov statistics. Numerous theories have been proposed over the past 90 years based on the ansatz that the elastic force for individual network chains arises from the entropy change associated with the distribution of end-to-end distances of a free polymer chain. There are serious philosophical objections to this assumption and others, such as the assumption that all network nodes undergo affine motion and that all of the network chains have the same length. Recently, a new paradigm for elasticity in rubber networks has been proposed that is based on mechanisms that originate at the molecular level. Using conventional statistical mechanics analyses, quantum chemistry, and molecular dynamics simulations, the fundamental entropic and enthalpic chain extension forces for polyisoprene (natural rubber) have been determined, along with estimates for the basic force constants. Concurrently, the complex morphology of natural rubber networks (the joint probability density distributions that relate the chain end-to-end distance to its contour length) has also been captured in a numerical model. When molecular chain forces are merged with the network structure in this model, it is possible to study the mechanical response to tensile and compressive strains of a representative volume element of a polymer network. As strain is imposed on a network, pathways of connected taut chains that completely span the network along the strain axis emerge. Although these chains represent only a few percent of the total, they account for nearly all of the elastic stress at high strain. Here we provide a brief review of previous elasticity theories and their deficiencies, and present a new paradigm with an emphasis on experimental comparisons.

  18. A new paradigm for the molecular basis of rubber elasticity

    DOE PAGES

    Hanson, David E.; Barber, John L.

    2015-02-19

    The molecular basis for rubber elasticity is arguably the oldest and one of the most important questions in the field of polymer physics. The theoretical investigation of rubber elasticity began in earnest almost a century ago with the development of analytic thermodynamic models, based on simple, highly-symmetric configurations of so-called Gaussian chains, i.e. polymer chains that obey Markov statistics. Numerous theories have been proposed over the past 90 years based on the ansatz that the elastic force for individual network chains arises from the entropy change associated with the distribution of end-to-end distances of a free polymer chain. There are serious philosophical objections to this assumption and others, such as the assumption that all network nodes undergo affine motion and that all of the network chains have the same length. Recently, a new paradigm for elasticity in rubber networks has been proposed that is based on mechanisms that originate at the molecular level. Using conventional statistical mechanics analyses, quantum chemistry, and molecular dynamics simulations, the fundamental entropic and enthalpic chain extension forces for polyisoprene (natural rubber) have been determined, along with estimates for the basic force constants. Concurrently, the complex morphology of natural rubber networks (the joint probability density distributions that relate the chain end-to-end distance to its contour length) has also been captured in a numerical model. When molecular chain forces are merged with the network structure in this model, it is possible to study the mechanical response to tensile and compressive strains of a representative volume element of a polymer network. As strain is imposed on a network, pathways of connected taut chains that completely span the network along the strain axis emerge. Although these chains represent only a few percent of the total, they account for nearly all of the elastic stress at high strain. Here we provide a brief review of previous elasticity theories and their deficiencies, and present a new paradigm with an emphasis on experimental comparisons.

  19. To t-Test or Not to t-Test? A p-Values-Based Point of View in the Receiver Operating Characteristic Curve Framework.

    PubMed

    Vexler, Albert; Yu, Jihnhee

    2018-04-13

    A common statistical doctrine, supported by many introductory courses and textbooks, is that t-test-type procedures based on normally distributed data points provide a standard in decision-making. In order to motivate scholars to examine this convention, we introduce a simple approach based on graphical tools of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. In this context, we propose employing a p-values-based method, taking into account the stochastic nature of p-values. We focus on the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we extend the EPV concept to the ROC curve technique. This provides expressive evaluations and visualizations of a wide spectrum of testing mechanisms' properties. We show that the conventional power characterization of tests is a partial aspect of the presented EPV/ROC technique. We hope that this explanation convinces researchers of the usefulness of the EPV/ROC approach for depicting different characteristics of decision-making procedures, in light of the growing interest in correct p-value-based applications.
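
    A small simulation showing what the EPV measures for a one-sample t-test under a fixed alternative (all settings illustrative; a smaller EPV indicates a better-performing test):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    n, effect, reps = 30, 0.5, 5000
    pvals = np.empty(reps)
    for i in range(reps):
        x = rng.normal(effect, 1.0, n)           # data under the alternative
        pvals[i] = stats.ttest_1samp(x, 0.0).pvalue
    print("estimated EPV:", pvals.mean())
    ```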

  20. Essays on the statistical mechanics of the labor market and implications for the distribution of earned income

    NASA Astrophysics Data System (ADS)

    Schneider, Markus P. A.

    This dissertation contributes to two areas in economics: the understanding of the distribution of earned income and Bayesian analysis of distributional data. Recently, physicists claimed that the distribution of earned income is exponential (see Yakovenko, 2009). The first chapter explores the perspective that the economy is a statistical mechanical system, and the implication for labor market outcomes is considered critically. The robustness of the empirical results that led to the physicists' claims, the significance of the exponential distribution in statistical mechanics, and the case for a conservation law in economics are discussed. The conclusion reached is that the physicists' conception of the economy is too narrow even within their chosen framework, but that their overall approach is insightful. The dual labor market theory of segmented labor markets is invoked to understand why the observed distribution may be a mixture of distributional components, corresponding to the different generating mechanisms described in Reich et al. (1973). The application of informational entropy in chapter II connects this work to Bayesian analysis and maximum entropy econometrics. The analysis follows E. T. Jaynes's treatment of Wolf's dice data, but is applied to the distribution of earned income based on CPS data. The results are calibrated to account for rounded survey responses using a simple simulation, and answer the graphical analyses by the physicists. The results indicate that neither the income distribution of all respondents nor that of the subpopulation used by the physicists appears to be exponential. The empirics do support the claim that a mixture with exponential and log-normal distributional components fits the data. In the final chapter, a log-linear model is used to fit the exponential to the earned income distribution. Separating the CPS data by gender and marital status reveals that the exponential is only an appropriate model for a limited number of subpopulations, namely the never married and women. The estimated parameter for never-married men's incomes is significantly different from the parameter estimated for never-married women, implying either that the combined distribution is not exponential or that the individual distributions are not exponential. However, it substantiates the existence of a persistent gender income gap among the never-married. References: Reich, M., D. M. Gordon, and R. C. Edwards (1973). A Theory of Labor Market Segmentation. Quarterly Journal of Economics 63, 359-365. Yakovenko, V. M. (2009). Econophysics, Statistical Mechanics Approach to. In R. A. Meyers (Ed.), Encyclopedia of Complexity and System Science. Springer.

  1. Chemomics-based marker compounds mining and mimetic processing for exploring chemical mechanisms in traditional processing of herbal medicines, a continuous study on Rehmanniae Radix.

    PubMed

    Zhou, Li; Xu, Jin-Di; Zhou, Shan-Shan; Shen, Hong; Mao, Qian; Kong, Ming; Zou, Ye-Ting; Xu, Ya-Yun; Xu, Jun; Li, Song-Lin

    2017-12-29

    Exploring processing chemistry, in particular the chemical transformation mechanisms involved, is a key step to elucidate the scientific basis in traditional processing of herbal medicines. Previously, taking Rehmanniae Radix (RR) as a case study, the holistic chemome (secondary metabolome and glycome) difference between raw and processed RR was revealed by integrating hyphenated chromatographic techniques-based targeted glycomics and untargeted metabolomics. Nevertheless, the complex chemical transformation mechanisms underpinning the holistic chemome variation in RR processing remain to be extensively clarified. As a continuous study, here a novel strategy by combining chemomics-based marker compounds mining and mimetic processing is proposed for further exploring the chemical mechanisms involved in herbal processing. First, the differential marker compounds between raw and processed herbs were rapidly discovered by untargeted chemomics-based mining approach through multivariate statistical analysis of the chemome data obtained by integrated metabolomics and glycomics analysis. Second, the marker compounds were mimetically processed under the simulated physicochemical conditions as in the herb processing, and the final reaction products were chemically characterized by targeted chemomics-based mining approach. Third, the main chemical transformation mechanisms involved were clarified by linking up the original marker compounds and their mimetic processing products. Using this strategy, a set of differential marker compounds including saccharides, glycosides and furfurals in raw and processed RR was rapidly found, and the major chemical mechanisms involved in RR processing were elucidated as stepwise transformations of saccharides (polysaccharides, oligosaccharides and monosaccharides) and glycosides (iridoid glycosides and phenethylalcohol glycosides) into furfurals (glycosylated/non-glycosylated hydroxymethylfurfurals) by deglycosylation and/or dehydration. The research deliverables indicated that the proposed strategy could advance the understanding of RR processing chemistry, and therefore may be considered a promising approach for delving into the scientific basis in traditional processing of herbal medicines. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Nonparametric identification of nonlinear dynamic systems using a synchronisation-based method

    NASA Astrophysics Data System (ADS)

    Kenderi, Gábor; Fidlin, Alexander

    2014-12-01

    The present study proposes an identification method for highly nonlinear mechanical systems that does not require a priori knowledge of the underlying nonlinearities to reconstruct arbitrary restoring force surfaces between degrees of freedom. This approach is based on the master-slave synchronisation between a dynamic model of the system as the slave and the real system as the master using measurements of the latter. As the model synchronises to the measurements, it becomes an observer of the real system. The optimal observer algorithm in a least-squares sense is given by the Kalman filter. Using the well-known state augmentation technique, the Kalman filter can be turned into a dual state and parameter estimator to identify parameters of a priori characterised nonlinearities. The paper proposes an extension of this technique towards nonparametric identification. A general system model is introduced by describing the restoring forces as bilateral spring-dampers with time-variant coefficients, which are estimated as augmented states. The estimation procedure is followed by an a posteriori statistical analysis to reconstruct noise-free restoring force characteristics using the estimated states and their estimated variances. Observability is provided using only one measured mechanical quantity per degree of freedom, which makes this approach less demanding in the number of necessary measurement signals compared with truly nonparametric solutions, which typically require displacement, velocity and acceleration signals. Additionally, due to the statistical rigour of the procedure, it successfully addresses signals corrupted by significant measurement noise. In the present paper, the method is described in detail, which is followed by numerical examples of one degree of freedom (1DoF) and 2DoF mechanical systems with strong nonlinearities of vibro-impact type to demonstrate the effectiveness of the proposed technique.
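
    A compact sketch of the state-augmentation idea on which this method builds: an extended Kalman filter estimating position, velocity, and an unknown stiffness of a 1DoF oscillator from noisy displacement measurements alone (system values and noise levels are illustrative, and this omits the paper's nonparametric extension and a posteriori statistical analysis):

    ```python
    import numpy as np

    m, c, k_true, dt = 1.0, 0.2, 4.0, 0.01
    rng = np.random.default_rng(7)

    def step(s):
        """One Euler step of the augmented dynamics [x, v, k]."""
        x, v, k = s
        return np.array([x + dt * v, v + dt * (-(k * x + c * v) / m), k])

    # Simulate the "master" (real) system and noisy displacement measurements.
    s = np.array([1.0, 0.0, k_true])
    meas = []
    for _ in range(4000):
        s = step(s)
        meas.append(s[0] + 0.01 * rng.standard_normal())

    # EKF on the augmented state; stiffness k starts deliberately wrong.
    est = np.array([1.0, 0.0, 1.0])
    P = np.diag([0.1, 0.1, 4.0])
    Q = np.diag([1e-8, 1e-8, 1e-6])          # small drift allowed for k
    R = np.array([[1e-4]])
    H = np.array([[1.0, 0.0, 0.0]])          # only displacement is measured
    for z in meas:
        x, v, k = est
        F = np.array([[1.0, dt, 0.0],
                      [-dt * k / m, 1.0 - dt * c / m, -dt * x / m],
                      [0.0, 0.0, 1.0]])      # Jacobian of `step` at the estimate
        est, P = step(est), F @ P @ F.T + Q
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        est = est + (K @ (z - H @ est)).ravel()
        P = (np.eye(3) - K @ H) @ P
    print("estimated stiffness:", est[2], "true:", k_true)
    ```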

  3. Ethnicity of severe trauma patients: results of a population-based study, Auckland, New Zealand 2004.

    PubMed

    Creamer, Gowan; Civil, Ian; Ng, Alex; Adams, David; Cacala, Shas; Koelmeyer, Timothy; Thompson, John

    2010-06-11

    To investigate the role of Māori and Pacific ethnicity within the severe trauma and population demographics of Auckland, New Zealand. A population-based study utilising prospectively gathered trauma databases and coronial autopsy information. Population data were derived from Statistics New Zealand resident population projections for the year 2004. The geographic boundaries of the Auckland district health boards (Waitemata DHB, Auckland DHB and Counties-Manukau DHB). Severe injury was defined as death or an injury severity score of more than 15. Combining data from coronial autopsy and four hospital trauma databases provided age, gender, ethnicity, mechanism, mortality and hospitalisation information for severely injured Aucklanders. Māori and Pacific people had increased risk of severe injury and injury-related mortality. A major gender difference is apparent: Māori females are at increased risk and Pacific females at decreased risk compared with the remaining female population; both Māori and Pacific males have higher severe injury rates than the remaining population. The relative risk for severe injury (and mortality) for Māori, RR=2.38 (RR=2.80), and Pacific, RR=1.49 (RR=1.59), is higher than for the remaining population; the highest and most statistically significant risk is seen in the 15-29 age group (Māori RR=2.87, Pacific RR=2.57). Road traffic crashes account for the greatest proportion of injuries in all groups. Māori have relatively higher rates of hanging and assault-related injury and death; Pacific people have relatively higher rates of falls and assault. Ethnicity is a factor in severe injury and mortality rates in Auckland. Age is an important influence on these rates. Although mechanism of injury varies between ethnic groups, no particular mechanism of injury accounts for the overall differences between groups.

  4. Fracture behavior of metal-ceramic fixed dental prostheses with frameworks from cast or a newly developed sintered cobalt-chromium alloy.

    PubMed

    Krug, Klaus-Peter; Knauber, Andreas W; Nothdurft, Frank P

    2015-03-01

    The aim of this study was to investigate the fracture behavior of metal-ceramic bridges with frameworks made from cobalt-chromium-molybdenum (CoCrMo) and manufactured using conventional casting or a new computer-aided design/computer-aided manufacturing (CAD/CAM) milling and sintering technique. A total of 32 metal-ceramic fixed dental prostheses (FDPs) based on a nonprecious metal framework were produced using a conventional casting process (n = 16) or a new CAD/CAM milling and sintering process (n = 16). Eight unveneered frameworks were manufactured using each of the techniques. After thermal and mechanical aging of half of the restorations, all samples were subjected to a static loading test in a universal testing machine, during which acoustic emission monitoring was performed. Three critical forces were recorded: the fracture force (F_max), the force at the first reduction in force (F_decr1), and the force at the first critical acoustic event (F_acoust1). With the exception of the veneered restorations with cast or sintered metal frameworks without artificial aging, which showed a statistically significant but small difference in F_max, no statistically significant differences between cast and CAD/CAM sintered and milled FDPs were detected. Thermal and mechanical loading did not significantly affect the resulting forces. Cast and CAD/CAM milled and sintered metal-ceramic bridges were found to be comparable with respect to fracture behavior. FDPs based on CAD/CAM milled and sintered frameworks may be an applicable and less technique-sensitive alternative to conventionally cast frameworks.

  5. Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.

    PubMed

    Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory

    2017-01-01

    Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes their use to examine the performance of given testing procedures or associations between investigated factors to be difficult. We turn our focus on the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g. partial EPVs; (2) developing optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures, their constructions and properties with an eye towards practical applications.

  6. DNA viewed as an out-of-equilibrium structure

    NASA Astrophysics Data System (ADS)

    Provata, A.; Nicolis, C.; Nicolis, G.

    2014-05-01

    The complexity of the primary structure of human DNA is explored using methods from nonequilibrium statistical mechanics, dynamical systems theory, and information theory. A collection of statistical analyses is performed on the DNA data and the results are compared with sequences derived from different stochastic processes. The use of χ² tests shows that DNA cannot be described as a low-order Markov chain of order up to r = 6. Although detailed balance seems to hold at the level of a binary alphabet, it fails when all four base pairs are considered, suggesting spatial asymmetry and irreversibility. Furthermore, the block entropy does not increase linearly with the block size, reflecting the long-range nature of the correlations in the human genomic sequences. To probe locally the spatial structure of the chain, we study the exit distances from a specific symbol, the distribution of recurrence distances, and the Hurst exponent, all of which show power law tails and long-range characteristics. These results suggest that human DNA can be viewed as a nonequilibrium structure maintained in its state through interactions with a constantly changing environment. Based solely on the exit distance distribution accounting for the nonequilibrium statistics and using the Monte Carlo rejection sampling method, we construct a model DNA sequence. This method allows us to keep both long- and short-range statistical characteristics of the native DNA data. The model sequence presents the same characteristic exponents as the natural DNA but fails to capture spatial correlations and point-to-point details.
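
    One of the diagnostics above made concrete: the block entropy H(n) of a symbolic sequence, which grows linearly in n (about 2n bits here) for an uncorrelated sequence and sublinearly when long-range correlations are present (the random string below merely stands in for a genomic sequence):

    ```python
    from collections import Counter
    import math, random

    random.seed(8)
    seq = "".join(random.choice("ACGT") for _ in range(100000))  # stand-in sequence

    def block_entropy(s, n):
        counts = Counter(s[i:i + n] for i in range(len(s) - n + 1))
        total = sum(counts.values())
        return -sum(c / total * math.log2(c / total) for c in counts.values())

    for n in range(1, 7):
        print(n, round(block_entropy(seq, n), 3))  # grows ~2n bits for i.i.d. symbols
    ```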

  7. Statistical and Detailed Analysis on Fiber Reinforced Self-Compacting Concrete Containing Admixtures- A State of Art of Review

    NASA Astrophysics Data System (ADS)

    Athiyamaan, V.; Mohan Ganesh, G.

    2017-11-01

    Self-Compacting Concrete is one of the special concretes that have the ability to flow and consolidate under their own weight, completely filling the formwork even in the presence of dense reinforcement, whilst maintaining homogeneity throughout the formwork without any requirement for vibration. Researchers all over the world are developing high performance concrete by adding various fibers and admixtures in different proportions. Various kinds of fibers, such as glass, steel, carbon, polypropylene and aramid, improve concrete properties such as tensile strength, fatigue characteristics, durability, shrinkage, impact and erosion resistance, and serviceability [6]. This paper includes a fundamental study on fiber-reinforced self-compacting concrete with admixtures: its rheological and mechanical properties, together with an overview of design methodology and statistical approaches to optimizing concrete performance. The study is organized into seven basic chapters: introduction; phenomenal study on material properties; review of self-compacting concrete; overview of fiber-reinforced self-compacting concrete containing admixtures; review of design and analysis of experiments as a statistical approach; summary of existing works on FRSCC and statistical modeling, with literature review; and conclusion. Knowing the recent studies that have been done on polymer-based binder materials (fly ash, metakaolin, GGBS, etc.), fiber-reinforced concrete and SCC is essential for effective research on fiber-reinforced self-compacting concrete containing admixtures. The key aim of the study is to sort out the research gap and to gain complete knowledge of polymer-based self-compacting fiber-reinforced concrete.

  8. DNA viewed as an out-of-equilibrium structure.

    PubMed

    Provata, A; Nicolis, C; Nicolis, G

    2014-05-01

    The complexity of the primary structure of human DNA is explored using methods from nonequilibrium statistical mechanics, dynamical systems theory, and information theory. A collection of statistical analyses is performed on the DNA data and the results are compared with sequences derived from different stochastic processes. The use of χ² tests shows that DNA cannot be described as a low-order Markov chain of order up to r = 6. Although detailed balance seems to hold at the level of a binary alphabet, it fails when all four base pairs are considered, suggesting spatial asymmetry and irreversibility. Furthermore, the block entropy does not increase linearly with the block size, reflecting the long-range nature of the correlations in the human genomic sequences. To probe locally the spatial structure of the chain, we study the exit distances from a specific symbol, the distribution of recurrence distances, and the Hurst exponent, all of which show power-law tails and long-range characteristics. These results suggest that human DNA can be viewed as a nonequilibrium structure maintained in its state through interactions with a constantly changing environment. Based solely on the exit distance distribution accounting for the nonequilibrium statistics and using the Monte Carlo rejection sampling method, we construct a model DNA sequence. This method allows us to keep both long- and short-range statistical characteristics of the native DNA data. The model sequence presents the same characteristic exponents as the natural DNA but fails to capture spatial correlations and point-to-point details.
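
    The sequence-construction step lends itself to a schematic example. The Python sketch below draws exit distances by Monte Carlo rejection sampling from a power-law-tailed target distribution standing in for the empirical one; the exponent, the cutoff, and the uniform gap-filling rule are illustrative assumptions, not the paper's fitted quantities.

        import random

        # Illustrative target: a power-law-tailed exit-distance pmf
        # p(d) ~ d**(-ALPHA), truncated at DMAX (made-up values).
        ALPHA, DMAX = 2.5, 1000
        weights = [d ** -ALPHA for d in range(1, DMAX + 1)]
        Z = sum(weights)
        target = [w / Z for w in weights]

        def sample_exit_distance():
            # Rejection sampling with a uniform proposal q(d) = 1/DMAX and an
            # envelope constant c chosen so that p(d) <= c * q(d) for all d.
            c = max(target) * DMAX
            while True:
                d = random.randint(1, DMAX)
                if random.random() < target[d - 1] * DMAX / c:
                    return d

        # Toy construction: occurrences of the tracked symbol 'A' are separated
        # by sampled exit distances; the gaps are filled with the other symbols.
        seq = []
        while len(seq) < 10_000:
            gap = sample_exit_distance()
            seq.extend(random.choice("CGT") for _ in range(gap - 1))
            seq.append("A")
        print("".join(seq[:80]))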

  9. A κ-generalized statistical mechanics approach to income analysis

    NASA Astrophysics Data System (ADS)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2009-02-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high-income Pareto power-law regime. Analytical expressions for the shape, moments, and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, all of which exist in closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit the data on personal income for the United States remarkably well, and the analysis of inequality in terms of its parameters proves very powerful.
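
    For concreteness, the sketch below implements the κ-exponential, exp_κ(x) = (sqrt(1 + κ²x²) + κx)^(1/κ), and the complementary CDF P(X > x) = exp_κ(-βx^α) on which the distribution is built; the density follows by differentiation, and the tail decays as the Pareto power law x^(-α/κ). The parameter values are illustrative, not the values fitted to the United States income data.

        import numpy as np

        def exp_kappa(x, kappa):
            # kappa-exponential; reduces to np.exp(x) in the limit kappa -> 0.
            return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

        def ccdf(x, alpha, beta, kappa):
            # Complementary CDF: P(X > x) = exp_kappa(-beta * x**alpha).
            return exp_kappa(-beta * x**alpha, kappa)

        def pdf(x, alpha, beta, kappa):
            # Density obtained by differentiating the CCDF with respect to x.
            u = beta * x**alpha
            return (alpha * beta * x**(alpha - 1)
                    * exp_kappa(-u, kappa) / np.sqrt(1.0 + (kappa * u)**2))

        x = np.linspace(0.1, 10.0, 5)
        print(ccdf(x, alpha=2.0, beta=0.5, kappa=0.7))  # tail ~ x**(-alpha/kappa)
        print(pdf(x, alpha=2.0, beta=0.5, kappa=0.7))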

  10. Infants Segment Continuous Events Using Transitional Probabilities

    ERIC Educational Resources Information Center

    Stahl, Aimee E.; Romberg, Alexa R.; Roseberry, Sarah; Golinkoff, Roberta Michnick; Hirsh-Pasek, Kathryn

    2014-01-01

    Throughout their 1st year, infants adeptly detect statistical structure in their environment. However, little is known about whether statistical learning is a primary mechanism for event segmentation. This study directly tests whether statistical learning alone is sufficient to segment continuous events. Twenty-eight 7- to 9-month-old infants…

  11. Statistical Learning as a Key to Cracking Chinese Orthographic Codes

    ERIC Educational Resources Information Center

    He, Xinjie; Tong, Xiuli

    2017-01-01

    This study examines statistical learning as a mechanism for Chinese orthographic learning among children in Grades 3-5. Using an artificial orthography, children were repeatedly exposed to positional, phonetic, and semantic regularities of radicals. Children showed statistical learning of all three regularities. Regularities' levels of consistency…

  12. Behavioral pattern identification for structural health monitoring in complex systems

    NASA Astrophysics Data System (ADS)

    Gupta, Shalabh

    Estimation of structural damage and quantification of structural integrity are critical for the safe and reliable operation of human-engineered complex systems, such as electromechanical, thermofluid, and petrochemical systems. Fatigue cracking is one of the most commonly encountered sources of structural degradation in mechanical systems. Early detection of fatigue damage is essential because the resulting structural degradation can cause catastrophic failures, leading to loss of expensive equipment and human life. Therefore, for reliable operation and enhanced availability, it is necessary to develop capabilities for prognosis and estimation of impending failures, such as the onset of widespread fatigue crack damage in mechanical structures. This dissertation presents information-based online sensing of fatigue damage using the analytical tools of symbolic time series analysis (STSA). Anomaly detection using STSA is a recently developed pattern recognition method based on a fixed-structure, fixed-order Markov chain, built upon the principles of symbolic dynamics, information theory, and statistical pattern recognition. The dissertation demonstrates real-time fatigue damage monitoring based on time series data of ultrasonic signals: statistical pattern changes are measured using STSA to monitor the evolution of fatigue damage. Real-time anomaly detection is presented as a solution to both the forward (analysis) problem and the inverse (synthesis) problem. (1) The forward problem: identification of the statistical changes in the time series data of ultrasonic signals due to the gradual evolution of fatigue damage. (2) The inverse problem: inference of anomalies from the observed time series data in real time, based on the statistical information generated during the forward problem. A computer-controlled special-purpose fatigue test apparatus, equipped with multiple sensing devices (e.g., ultrasonics and an optical microscope) for damage analysis, has been used to experimentally validate the STSA method for early detection of anomalous behavior. The sensor information is integrated with a software module implementing the STSA algorithm for real-time monitoring of fatigue damage. Experiments have been conducted under different loading conditions on specimens constructed from the ductile aluminium alloy 7075-T6. The dissertation also investigates the application of the STSA method for early detection of anomalies in other engineering disciplines; two primary applications are combustion instability in a generic thermal pulse combustor model and the whirling phenomenon in a typical misaligned shaft.
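
    A highly simplified version of the STSA pipeline can be sketched as follows: fix a maximum-entropy (equal-frequency) partition on nominal data, symbolize incoming signals with it, estimate the symbol-state visit probabilities, and take the deviation of the current probability vector from the nominal one as the anomaly measure. The partition size, the signals, and the plain state vector (rather than a fixed-order Markov chain over symbol states) are simplifying assumptions of this sketch.

        import numpy as np

        def partition_edges(signal, n_symbols=8):
            # Maximum-entropy partition: equal-frequency bin edges from nominal data.
            return np.quantile(signal, np.linspace(0, 1, n_symbols + 1)[1:-1])

        def state_probabilities(signal, edges, n_symbols=8):
            # Symbolize with the fixed edges, then estimate visit probabilities.
            symbols = np.digitize(signal, edges)
            p = np.bincount(symbols, minlength=n_symbols).astype(float)
            return p / p.sum()

        rng = np.random.default_rng(0)
        nominal = rng.normal(size=5000)               # reference-condition signal
        degraded = 1.3 * rng.normal(size=5000) + 0.2  # statistics evolved by damage
        edges = partition_edges(nominal)
        p_ref = state_probabilities(nominal, edges)
        p_now = state_probabilities(degraded, edges)
        print("anomaly measure:", np.linalg.norm(p_now - p_ref))  # grows with damage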

  13. Strong correlations between the exponent α and the particle number for a Renyi monoatomic gas in Gibbs' statistical mechanics.

    PubMed

    Plastino, A; Rocca, M C

    2017-06-01

    Appealing to the 1902 Gibbs formalism for classical statistical mechanics (SM), the first axiomatic SM theory that successfully explained equilibrium thermodynamics, we show that already at the classical level there is a strong correlation between Renyi's exponent α and the number of particles for very simple systems. No reference to heat baths is needed for such a purpose.

  14. Analysis of Immune Complex Structure by Statistical Mechanics and Light Scattering Techniques.

    NASA Astrophysics Data System (ADS)

    Busch, Nathan Adams

    1995-01-01

    The size and structure of immune complexes determine their behavior in the immune system. The chemical physics of complex formation is not well understood; this is due in part to inadequate characterization of the proteins involved, and in part to the lack of sufficiently well-developed theoretical techniques. Understanding complex formation will permit rational design of strategies for inhibiting tissue deposition of the complexes. A statistical mechanical model of the proteins, based upon the theory of associating fluids, was developed. The multipole electrostatic potential of each protein used in this study was characterized in terms of net protein charge, dipole moment magnitude, and dipole moment direction. The binding sites between the model antigen and antibodies were characterized by their net surface area, energy, and position relative to the dipole moment of the protein. The equilibrium binding graphs generated with the protein statistical mechanical model compare favorably with experimental data obtained from radioimmunoassay. The isothermal compressibility predicted by the model agrees with results obtained from dynamic light scattering. The statistical mechanics model was used to investigate association between the model antigen and selected pairs of antibodies. It was found that, in accordance with expectations from thermodynamic arguments, the highest total binding energy yielded complex distributions skewed to larger complex sizes. From examination of the simulated formation of ring structures from linear chain complexes, and from the joint shape probability surfaces, it was found that ring configurations form by the "folding" of linear chains until the ends are within binding distance. By comparing single antigen/two antibody systems that differ only in their respective binding site locations, it was found that binding site location influences complex size and shape distributions only when ring formation occurs. The internal potential energy of a ring complex is considerably lower than that of the non-associating system; the ring complexes are therefore quite stable and show no evidence of breaking up and collapsing into smaller complexes. Ring formation occurs only in systems where the total free energy of each complex can be minimized; thus rings form, even at the cost of entropically unfavorable conformations, if doing so lowers the total free energy.

  15. Dragon-Kings, Black-Swans and Prediction (Invited)

    NASA Astrophysics Data System (ADS)

    Sornette, D.

    2010-12-01

    Extreme fluctuations or events are often associated with power-law statistics. Indeed, it is a popular belief that "wild randomness" is deeply associated with distributions having power-law tails characterized by small exponents; in other words, power-law tails are often seen as the epitome of extreme events (the "Black Swan" story). Here, we document in very different systems that there is life beyond power-law tails: power laws can be superseded by "dragon-kings", monster events that occur beyond (or change) the power-law tail. Dragon-kings reveal hidden mechanisms that are only transiently active and that amplify the normal fluctuations (often described by the power laws of the normal regime). The goal of this lecture is to catalyze the interest of the community of geophysicists across all fields of geosciences, so that the "invisible gorilla" fallacy may be avoided. Our own research illustrates that new statistics or representations of data are often necessary to identify dragon-kings, with strategies guided by the underlying mechanisms. Paradoxically, the monsters may be ignored or hidden by the use of inappropriate analyses or statistical tools that amount to cutting a mammoth into small pieces, leading to the incorrect belief that only mice exist. In order to stimulate further research, we document and discuss the dragon-king phenomenon in the statistics of financial losses, economic geography, hydrodynamic turbulence, mechanical ruptures, avalanches in complex heterogeneous media, earthquakes, and epileptic seizures. The special status of dragon-kings opens a new research program on their predictability, based on the fact that they belong to a class of their own and express specific mechanisms that amplify the normal dynamics via positive feedbacks. We present evidence for these claims in the prediction of material rupture, financial crashes, and epileptic seizures. As a bonus, a few remarks are offered at the end on how the dragon-king phenomenon allows us to understand the present world financial crisis as underpinned by two decades of successive financial and economic bubbles, inflating the mother of all bubbles, with new monster dragon-kings on the horizon. The consequences in terms of a new "normal" are eye-opening. Ref: D. Sornette, Dragon-Kings, Black Swans and the Prediction of Crises, International Journal of Terraspace Science and Engineering 1(3), 1-17 (2009) (http://arXiv.org/abs/0907.4290) and (http://ssrn.com/abstract=1470006)

  16. An injury mortality prediction based on the anatomic injury scale

    PubMed Central

    Wang, Muding; Wu, Dan; Qiu, Wusi; Wang, Weimi; Zeng, Yunji; Shen, Yi

    2017-01-01

    Abstract To determine whether the injury mortality prediction (IMP) statistically outperforms the trauma mortality prediction model (TMPM) as a predictor of mortality. The TMPM, which is based on anatomic injury, is currently the best trauma scoring method; its mortality prediction is superior to the injury severity score (ISS) and the new injury severity score (NISS). However, despite its statistical significance, the predictive power of TMPM needs further improvement. This retrospective cohort study is based on data from 1,148,359 injured patients in the National Trauma Data Bank hospitalized from 2010 to 2011. Sixty percent of the data was used to derive an empirical measure of severity for each Abbreviated Injury Scale predot code by taking the weighted average death probabilities of trauma patients. Twenty percent of the data was used to develop the computing method of the IMP model. The remaining 20% was used to evaluate the statistical performance of IMP against TMPM and the single worst injury, by examining the area under the receiver operating characteristic curve (ROC), the Hosmer-Lemeshow (HL) statistic, and the Akaike information criterion. IMP exhibits significantly better discrimination (ROC-IMP, 0.903 [0.899-0.907] vs ROC-TMPM, 0.890 [0.886-0.895]) and calibration (HL-IMP, 9.9 [4.4-14.7] vs HL-TMPM, 197 [143-248]) than TMPM. All models show slight changes after extension with age, gender, and mechanism of injury, but the extended IMP still dominated TMPM in every performance measure. The IMP offers an improvement in discrimination and calibration over the TMPM and can accurately predict mortality; we therefore consider it a feasible new scoring method for trauma research. PMID:28858124
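
    Discrimination of such mortality models is summarized by the area under the ROC curve, which the sketch below computes from scratch via the rank-sum (Mann-Whitney) identity; the death indicators and risk scores are synthetic, and ties among scores are ignored.

        import numpy as np

        def roc_auc(y_true, y_score):
            # AUC via the rank-sum identity; assumes continuous (untied) scores.
            order = np.argsort(y_score)
            ranks = np.empty(len(y_score))
            ranks[order] = np.arange(1, len(y_score) + 1)
            pos = y_true == 1
            n_pos, n_neg = pos.sum(), (~pos).sum()
            return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

        rng = np.random.default_rng(1)
        died = rng.integers(0, 2, 1000)                 # death indicator (synthetic)
        risk = 0.7 * died + rng.normal(0.0, 0.5, 1000)  # predicted mortality risk
        print("ROC area:", round(roc_auc(died, risk), 3))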

  17. Estimating the deposition of urban atmospheric NO2 to the urban forest in Portland-Vancouver USA

    NASA Astrophysics Data System (ADS)

    Rao, M.; Gonzalez Abraham, R.; George, L. A.

    2016-12-01

    Cities are hotspots of atmospheric emissions of reactive nitrogen oxides, including nitrogen dioxide (NO2), a US EPA criteria pollutant that affects both human and environmental health. A fraction of this anthropogenic, atmospheric NO2 is deposited onto the urban forest, potentially mitigating the impact of NO2 on respiratory health within cities. However, the role of the urban forest in removal of atmospheric NO2 through deposition has not been well studied. Here, using an observationally-based statistical model, we first estimate the reduction of NO2 associated with the urban forest in Portland-Vancouver, USA, and the health benefits accruing from this reduction. In order to assess if this statistically observed reduction in NO2 associated with the urban forest is consistent with deposition, we then compare the amount of NO2 removed through deposition to the urban forest as estimated using a 4-km CMAQ simulation. We further undertake a sensitivity analysis in CMAQ to estimate the range of NO2 removed as a function of bulk stomatal resistance. We find that NO2 deposition estimated by CMAQ accounts for roughly one-third of the reduction in NO2 shown by the observationally-based statistical model (Figure). Our sensitivity analysis shows that a 3-10 fold increase in the bulk stomatal resistance parameter in CMAQ would align CMAQ-estimated deposition with the statistical model. The reduction of NO2 by the urban forest in the Portland-Vancouver area may yield a health benefit of at least $1.5 million USD annually, providing strong motivation to better understand the mechanism through which the urban forest may be removing air pollutants such as NO2 and thus helping create healthier urban atmospheres. Figure: Comparing the amount of NO2 deposition as estimated by CMAQ and the observationally-based statistical model (LURF). Each point corresponds to a single 4 x 4 km CMAQ grid cell.
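
    The sensitivity to stomatal resistance can be illustrated with the textbook resistance-in-series form of the dry deposition velocity, V_d = 1/(R_a + R_b + R_c); this is a simplification, not the CMAQ deposition scheme, and the resistance and concentration values below are made up.

        def deposition_flux(conc_ug_m3, r_a=30.0, r_b=20.0, r_c=200.0):
            # Deposition velocity (m/s) from aerodynamic (r_a), quasi-laminar (r_b)
            # and bulk canopy/stomatal (r_c) resistances in s/m, times concentration.
            v_d = 1.0 / (r_a + r_b + r_c)
            return v_d * conc_ug_m3  # flux in ug m^-2 s^-1

        base = deposition_flux(20.0)
        for scale in (1, 3, 10):  # scaling the bulk stomatal resistance 3-10 fold
            f = deposition_flux(20.0, r_c=200.0 * scale)
            print(f"r_c x{scale}: flux = {f:.4f} ug/m2/s ({f / base:.2f} of base)")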

  18. An evaluation of GTAW-P versus GTA welding of alloy 718

    NASA Technical Reports Server (NTRS)

    Gamwell, W. R.; Kurgan, C.; Malone, T. W.

    1991-01-01

    Mechanical properties were evaluated to determine statistically whether the pulsed current gas tungsten arc welding (GTAW-P) process produces welds in alloy 718 with room temperature structural performance equivalent to current Space Shuttle Main Engine (SSME) welds manufactured by the constant current GTAW process. Evaluations were conducted on two base metal lots, two filler metal lots, two heat input levels, and two welding processes. The material form was 0.125-inch (3.175-mm) alloy 718 sheet. Prior to welding, sheets were treated to either the ST or STA-1 condition. After welding, panels were left as welded or heat treated to the STA-1 condition, and weld beads were left intact or machined flush. Statistical analyses were performed on yield strength, ultimate tensile strength (UTS), and high cycle fatigue (HCF) properties for all the post welded material conditions. Analyses of variance were performed on the data to determine if there were any significant effects on UTS or HCF life due to variations in base metal, filler metal, heat input level, or welding process. Statistical analyses showed that the GTAW-P process does produce welds with room temperature structural performance equivalent to current SSME welds manufactured by the GTAW process, regardless of prior material condition or post welding condition.
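
    The analyses of variance described above can be sketched with statsmodels; the factorial data frame here is synthetic (two processes crossed with two heat-input levels), and equivalence of the processes would appear as non-significant model terms.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.anova import anova_lm

        # Synthetic factorial design: 10 specimens per process/heat-input cell.
        rng = np.random.default_rng(2)
        df = pd.DataFrame({
            "process": np.repeat(["GTAW", "GTAW-P"], 20),
            "heat_input": np.tile(np.repeat(["low", "high"], 10), 2),
            "uts_ksi": rng.normal(180.0, 5.0, 40),  # made-up tensile strengths
        })

        # Two-way ANOVA: effects of process, heat input, and their interaction.
        model = smf.ols("uts_ksi ~ C(process) * C(heat_input)", data=df).fit()
        print(anova_lm(model))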

  19. Probabilistic models for reactive behaviour in heterogeneous condensed phase media

    NASA Astrophysics Data System (ADS)

    Baer, M. R.; Gartling, D. K.; DesJardin, P. E.

    2012-02-01

    This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.

  20. Addressing issues associated with evaluating prediction models for survival endpoints based on the concordance statistic.

    PubMed

    Wang, Ming; Long, Qi

    2016-09-01

    Prediction models for disease risk and prognosis play an important role in biomedical research, and evaluating their predictive accuracy in the presence of censored data is of substantial interest. The standard concordance (c) statistic has been extended to provide a summary measure of predictive accuracy for survival models. Motivated by a prostate cancer study, we address several issues associated with evaluating survival prediction models based on c-statistic with a focus on estimators using the technique of inverse probability of censoring weighting (IPCW). Compared to the existing work, we provide complete results on the asymptotic properties of the IPCW estimators under the assumption of coarsening at random (CAR), and propose a sensitivity analysis under the mechanism of noncoarsening at random (NCAR). In addition, we extend the IPCW approach as well as the sensitivity analysis to high-dimensional settings. The predictive accuracy of prediction models for cancer recurrence after prostatectomy is assessed by applying the proposed approaches. We find that the estimated predictive accuracy for the models in consideration is sensitive to NCAR assumption, and thus identify the best predictive model. Finally, we further evaluate the performance of the proposed methods in both settings of low-dimensional and high-dimensional data under CAR and NCAR through simulations. © 2016, The International Biometric Society.
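
    A from-scratch sketch of an IPCW c-statistic is given below: the censoring survival function G(t) is estimated by reverse Kaplan-Meier, and comparable pairs anchored at an observed event are weighted by 1/G(T_i)^2, in the spirit of this weighting scheme. The data are synthetic, and the sketch omits refinements such as a truncation time and tie handling.

        import numpy as np

        def censoring_survival(times, events):
            # Reverse Kaplan-Meier: survival function of the censoring times,
            # evaluated at each subject's own observed time.
            order = np.argsort(times)
            G = np.empty(len(times))
            g = 1.0
            for rank, idx in enumerate(order):
                if events[idx] == 0:  # censoring plays the role of the event
                    g *= 1.0 - 1.0 / (len(times) - rank)
                G[idx] = g
            return G

        def ipcw_cstat(times, events, risk):
            # Comparable pairs (i has the earlier time and an event) weighted
            # by 1/G(T_i)^2; concordant if the earlier subject has higher risk.
            G = censoring_survival(times, events)
            num = den = 0.0
            for i in range(len(times)):
                if events[i] == 1 and G[i] > 0:
                    w = 1.0 / G[i] ** 2
                    later = times > times[i]
                    num += w * np.sum(risk[i] > risk[later])
                    den += w * np.sum(later)
            return num / den

        rng = np.random.default_rng(3)
        T, C = rng.exponential(1.0, 300), rng.exponential(1.5, 300)
        times, events = np.minimum(T, C), (T <= C).astype(int)
        risk = -T + rng.normal(0.0, 0.5, 300)  # higher risk ~ shorter survival
        print("IPCW c-statistic:", round(ipcw_cstat(times, events, risk), 3))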

  1. Wave packet and statistical quantum calculations for the He + NeH⁺ → HeH⁺ + Ne reaction on the ground electronic state.

    PubMed

    Koner, Debasish; Barrios, Lizandra; González-Lezana, Tomás; Panda, Aditya N

    2014-09-21

    A real wave packet based time-dependent method and a statistical quantum method have been used to study the He + NeH(+) (v, j) reaction, with the reactant in various ro-vibrational states, on a recently calculated ab initio ground state potential energy surface. Both the wave packet and statistical quantum calculations were carried out within the centrifugal sudden approximation as well as using the exact Hamiltonian. Quantum reaction probabilities exhibit a dense oscillatory pattern for smaller total angular momentum values, which is a signature of resonances in a complex-forming mechanism for the title reaction. Significant differences between exact and approximate quantum reaction cross sections highlight the importance of including Coriolis coupling in the calculations. Statistical results are in fairly good agreement with the exact quantum results for ground ro-vibrational states of the reactant. Vibrational excitation greatly enhances the reaction cross sections, whereas rotational excitation has a relatively small effect on the reaction. The nature of the reaction cross section curves depends on the initial vibrational state of the reactant and is typical of a late-barrier-type potential energy profile.

  2. Relationship between urban eco-environment and competitiveness with the background of globalization: statistical explanation based on industry type newly classified with environment demand and environment pressure.

    PubMed

    Kang, Xiao-guang; Ma, Qing-Bin

    2005-01-01

    Within the global urban system, the statistical relationship between the urban eco-environment (UE) and urban competitiveness (UC) is investigated. The data show a statistically inverted-U relationship between UE and UC. An eco-environmental factor is introduced into the classification of industries, yielding six industrial types according to two indexes, namely industries' eco-environmental demand and eco-environmental pressure. The statistical results show a strong relationship, under this new industrial classification, between changes in industrial structure and the evolution of the UE. The driving mechanism of the evolution of the urban eco-environment, involving human demand and the global division of labor, is analyzed. The conclusion is that cities' development strategies, industrial policies, and environmental policies must fit their ranks within the global urban system. In the era of globalization, the rationality of environmental policies cannot be assessed by their level of strictness alone; rather, such policies enhance a city's competitiveness when they fit the city's capability to attract and control particular sections of the industry value chain. Only these kinds of environmental policies are likely to enhance UC.

  3. An agent-based model contrasts opposite effects of dynamic and stable microtubules on cleavage furrow positioning

    PubMed Central

    Odell, Garrett M.; Foe, Victoria E.

    2008-01-01

    From experiments by Foe and von Dassow (Foe, V.E., and G. von Dassow. 2008. J. Cell Biol. 183:457–470) and others, we infer a molecular mechanism for positioning the cleavage furrow during cytokinesis. Computer simulations reveal how this mechanism depends on quantitative motor-behavior details and explore how robustly this mechanism succeeds across a range of cell sizes. The mechanism involves the MKLP1 (kinesin-6) component of centralspindlin binding to and walking along microtubules to stimulate cortical contractility where the centralspindlin complex concentrates. The majority of astral microtubules are dynamically unstable. They bind most MKLP1 and suppress cortical Rho/myosin II activation because the tips of unstable microtubules usually depolymerize before MKLP1s reach the cortex. A subset of astral microtubules stabilizes during anaphase, becoming effective rails along which MKLP1 can actually reach the cortex. Because stabilized microtubules aim statistically at the equatorial spindle midplane, that is where centralspindlin accumulates to stimulate furrow formation. PMID:18955556

  4. An agent-based model contrasts opposite effects of dynamic and stable microtubules on cleavage furrow positioning.

    PubMed

    Odell, Garrett M; Foe, Victoria E

    2008-11-03

    From experiments by Foe and von Dassow (Foe, V.E., and G. von Dassow. 2008. J. Cell Biol. 183:457-470) and others, we infer a molecular mechanism for positioning the cleavage furrow during cytokinesis. Computer simulations reveal how this mechanism depends on quantitative motor-behavior details and explore how robustly this mechanism succeeds across a range of cell sizes. The mechanism involves the MKLP1 (kinesin-6) component of centralspindlin binding to and walking along microtubules to stimulate cortical contractility where the centralspindlin complex concentrates. The majority of astral microtubules are dynamically unstable. They bind most MKLP1 and suppress cortical Rho/myosin II activation because the tips of unstable microtubules usually depolymerize before MKLP1s reach the cortex. A subset of astral microtubules stabilizes during anaphase, becoming effective rails along which MKLP1 can actually reach the cortex. Because stabilized microtubules aim statistically at the equatorial spindle midplane, that is where centralspindlin accumulates to stimulate furrow formation.

  5. Cartilaginous Metabolomic Study Reveals Potential Mechanisms of Osteophyte Formation in Osteoarthritis.

    PubMed

    Xu, Zhongwei; Chen, Tingmei; Luo, Jiao; Ding, Shijia; Gao, Sichuan; Zhang, Jian

    2017-04-07

    Osteophyte formation is one of the inevitable consequences of progressive osteoarthritis, whose main characteristics are cartilage degeneration and endochondral ossification. The pathogenesis of osteophyte formation is not fully understood to date. In this work, metabolomic approaches were employed to explore potential mechanisms of osteophyte formation by detecting metabolic variations between extracts of osteophyte cartilage tissues (n = 32) and uninvolved control cartilage tissues (n = 34), based on the platform of ultraperformance liquid chromatography tandem quadrupole time-of-flight mass spectrometry, together with multivariate and univariate statistical analysis. The osteophyte group was significantly separated from the control group by orthogonal partial least-squares discriminant analysis models, indicating that the metabolic state of osteophyte cartilage had changed. In total, 28 metabolic variations, further validated by mass spectrum (MS) match, tandem mass spectrum (MS/MS) match, and standards match, mainly included amino acids, sulfonic acids, glycerophospholipids, and fatty acyls. These metabolites relate to specific physiological or pathological processes (collagen dissolution, destruction of boundary layers, triggering of self-restoration, etc.) that may be associated with osteophyte formation. Pathway analysis showed that phenylalanine metabolism (PI = 0.168, p = 0.004) was highly correlated with this degenerative process. Our findings provide a direction for targeted metabolomic studies and insight for further revealing the molecular mechanisms of osteophyte formation.

  6. Statistical Physics on the Eve of the 21st Century: in Honour of J B McGuire on the Occasion of His 65th Birthday

    NASA Astrophysics Data System (ADS)

    Batchelor, Murray T.; Wille, Luc T.

    The Table of Contents for the book is as follows: * Preface * Modelling the Immune System - An Example of the Simulation of Complex Biological Systems * Brief Overview of Quantum Computation * Quantal Information in Statistical Physics * Modeling Economic Randomness: Statistical Mechanics of Market Phenomena * Essentially Singular Solutions of Feigenbaum-Type Functional Equations * Spatiotemporal Chaotic Dynamics in Coupled Map Lattices * Approach to Equilibrium of Chaotic Systems * From Level to Level in Brain and Behavior * Linear and Entropic Transformations of the Hydrophobic Free Energy Sequence Help Characterize a Novel Brain Polyprotein: CART's Protein * Dynamical Systems Response to Pulsed High-Frequency Fields * Bose-Einstein Condensates in the Light of Nonlinear Physics * Markov Superposition Expansion for the Entropy and Correlation Functions in Two and Three Dimensions * Calculation of Wave Center Deflection and Multifractal Analysis of Directed Waves Through the Study of su(1,1) Ferromagnets * Spectral Properties and Phases in Hierarchical Master Equations * Universality of the Distribution Functions of Random Matrix Theory * The Universal Chiral Partition Function for Exclusion Statistics * Continuous Space-Time Symmetries in a Lattice Field Theory * Some Limiting Cases of the One-Dimensional N-Body Problem * Integrable Models of Correlated Electrons * On the Riemann Surface of the Three-State Chiral Potts Model * Two Exactly Soluble Lattice Models in Three Dimensions * Competition of Ferromagnetic and Antiferromagnetic Order in the Spin-1/2 XXZ Chain at Finite Temperature * Extended Vertex Operator Algebras and Monomial Bases * Parity and Charge Conjugation Symmetries and S Matrix of the XXZ Chain * An Exactly Solvable Constrained XXZ Chain * Integrable Mixed Vertex Models From the Braid-Monoid Algebra * From Yang-Baxter Equations to Dynamical Zeta Functions for Birational Transformations * Hexagonal Lattice Directed Site Animals * Direction in the Star-Triangle Relations * A Self-Avoiding Walk Through Exactly Solved Lattice Models in Statistical Mechanics

  7. Statistical Mechanics and Dynamics of the Outer Solar System. I. The Jupiter/Saturn Zone

    NASA Technical Reports Server (NTRS)

    Grazier, K. R.; Newman, W. I.; Kaula, W. M.; Hyman, J. M.

    1996-01-01

    We report on numerical simulations designed to understand how the solar system evolved through a winnowing of planetesimals accreted from the early solar nebula. This sorting process is driven by exchanges of energy and angular momentum and continues to the present day. We reconsider the existence and importance of stable niches in the Jupiter/Saturn zone using greatly improved numerical techniques based on high-order optimized multi-step integration schemes coupled with roundoff-error-minimizing methods.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yong, E-mail: 83229994@qq.com; Ge, Hao, E-mail: haoge@pku.edu.cn; Xiong, Jie, E-mail: jiexiong@umac.mo

    The fluctuation theorem is one of the major achievements in the field of nonequilibrium statistical mechanics during the past two decades. Owing to technical difficulties, very few results exist on the steady-state fluctuation theorem for the sample entropy production rate, in the sense of a large deviation principle, for diffusion processes. Here we give a proof of the steady-state fluctuation theorem for a diffusion process in magnetic fields, with explicit expressions for the free energy function and the rate function. The proof is based on the Karhunen-Loève expansion of the complex-valued Ornstein-Uhlenbeck process.

  9. Thermodynamics and statistical mechanics. [thermodynamic properties of gases

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The basic thermodynamic properties of gases are reviewed and the relations between them are derived from the first and second laws. The elements of statistical mechanics are then formulated and the partition function is derived. The classical form of the partition function is used to obtain the Maxwell-Boltzmann distribution of kinetic energies in the gas phase and the equipartition of energy theorem is given in its most general form. The thermodynamic properties are all derived as functions of the partition function. Quantum statistics are reviewed briefly and the differences between the Boltzmann distribution function for classical particles and the Fermi-Dirac and Bose-Einstein distributions for quantum particles are discussed.

  10. Précis of statistical significance: rationale, validity, and utility.

    PubMed

    Chow, S L

    1998-04-01

    The null-hypothesis significance-test procedure (NHSTP) is defended in the context of the theory-corroboration experiment, as well as the following contrasts: (a) substantive hypotheses versus statistical hypotheses, (b) theory corroboration versus statistical hypothesis testing, (c) theoretical inference versus statistical decision, (d) experiments versus nonexperimental studies, and (e) theory corroboration versus treatment assessment. The null hypothesis can be true because it is the hypothesis that errors are randomly distributed in data. Moreover, the null hypothesis is never used as a categorical proposition. Statistical significance means only that chance influences can be excluded as an explanation of data; it does not identify the nonchance factor responsible. The experimental conclusion is drawn with the inductive principle underlying the experimental design. A chain of deductive arguments gives rise to the theoretical conclusion via the experimental conclusion. The anomalous relationship between statistical significance and the effect size often used to criticize NHSTP is more apparent than real. The absolute size of the effect is not an index of evidential support for the substantive hypothesis. Nor is the effect size, by itself, informative as to the practical importance of the research result. Being a conditional probability, statistical power cannot be the a priori probability of statistical significance. The validity of statistical power is debatable because statistical significance is determined with a single sampling distribution of the test statistic based on H0, whereas it takes two distributions to represent statistical power or effect size. Sample size should not be determined in the mechanical manner envisaged in power analysis. It is inappropriate to criticize NHSTP for nonstatistical reasons. At the same time, neither effect size, nor confidence interval estimate, nor posterior probability can be used to exclude chance as an explanation of data. Neither can any of them fulfill the nonstatistical functions expected of them by critics.

  11. Mixed-order phase transition in a minimal, diffusion-based spin model.

    PubMed

    Fronczak, Agata; Fronczak, Piotr

    2016-07-01

    In this paper we exactly solve, within the grand canonical ensemble, a minimal spin model with a hybrid phase transition. We call the model diffusion-based because its Hamiltonian can be recovered from a simple dynamic procedure, which can be seen as an equilibrium statistical mechanics representation of a biased random walk. We outline the derivation of the phase diagram of the model, in which the triple point has the hallmarks of the hybrid transition: a discontinuity in the average magnetization and algebraically diverging susceptibilities. At this point, two second-order transition curves meet in equilibrium with the first-order curve, resulting in a prototypical mixed-order behavior.

  12. Addressing the statistical mechanics of planet orbits in the solar system

    NASA Astrophysics Data System (ADS)

    Mogavero, Federico

    2017-10-01

    The chaotic nature of planet dynamics in the solar system suggests the relevance of a statistical approach to planetary orbits. In such a statistical description, the time-dependent position and velocity of the planets are replaced by the probability density function (PDF) of their orbital elements. It is natural to set up this kind of approach in the framework of statistical mechanics. In the present paper, I focus on the collisionless excitation of eccentricities and inclinations via gravitational interactions in a planetary system. The future planet trajectories in the solar system constitute the prototype of this kind of dynamics. I thus address the statistical mechanics of the solar system planet orbits and try to reproduce the PDFs numerically constructed by Laskar (2008, Icarus, 196, 1). I show that the microcanonical ensemble of the Laplace-Lagrange theory accurately reproduces the statistics of the giant planet orbits. To model the inner planets I then investigate the ansatz of equiprobability in the phase space constrained by the secular integrals of motion. The eccentricity and inclination PDFs of Earth and Venus are reproduced with no free parameters. Within the limitations of a stationary model, the predictions also show a reasonable agreement with Mars PDFs and that of Mercury inclination. The eccentricity of Mercury demands in contrast a deeper analysis. I finally revisit the random walk approach of Laskar to the time dependence of the inner planet PDFs. Such a statistical theory could be combined with direct numerical simulations of planet trajectories in the context of planet formation, which is likely to be a chaotic process.

  13. Progressive statistics for studies in sports medicine and exercise science.

    PubMed

    Hopkins, William G; Marshall, Stephen W; Batterham, Alan M; Hanin, Juri

    2009-01-01

    Statistical guidelines and expert statements are now available to assist in the analysis and reporting of studies in some biomedical disciplines. We present here a more progressive resource for sample-based studies, meta-analyses, and case studies in sports medicine and exercise science. We offer forthright advice on the following controversial or novel issues: using precision of estimation for inferences about population effects in preference to null-hypothesis testing, which is inadequate for assessing clinical or practical importance; justifying sample size via acceptable precision or confidence for clinical decisions rather than via adequate power for statistical significance; showing SD rather than SEM, to better communicate the magnitude of differences in means and nonuniformity of error; avoiding purely nonparametric analyses, which cannot provide inferences about magnitude and are unnecessary; using regression statistics in validity studies, in preference to the impractical and biased limits of agreement; making greater use of qualitative methods to enrich sample-based quantitative projects; and seeking ethics approval for public access to the depersonalized raw data of a study, to address the need for more scrutiny of research and better meta-analyses. Advice on less contentious issues includes the following: using covariates in linear models to adjust for confounders, to account for individual differences, and to identify potential mechanisms of an effect; using log transformation to deal with nonuniformity of effects and error; identifying and deleting outliers; presenting descriptive, effect, and inferential statistics in appropriate formats; and contending with bias arising from problems with sampling, assignment, blinding, measurement error, and researchers' prejudices. This article should advance the field by stimulating debate, promoting innovative approaches, and serving as a useful checklist for authors, reviewers, and editors.

  14. Stretchable carbon nanotube charge-trap floating-gate memory and logic devices for wearable electronics.

    PubMed

    Son, Donghee; Koo, Ja Hoon; Song, Jun-Kyul; Kim, Jaemin; Lee, Mincheol; Shim, Hyung Joon; Park, Minjoon; Lee, Minbaek; Kim, Ji Hoon; Kim, Dae-Hyeong

    2015-05-26

    Electronics for wearable applications require soft, flexible, and stretchable materials and designs to overcome the mechanical mismatch between the human body and devices. A key requirement for such wearable electronics is reliable operation with high performance and robustness during various deformations induced by motions. Here, we present materials and device design strategies for the core elements of wearable electronics, such as transistors, charge-trap floating-gate memory units, and various logic gates, with stretchable form factors. The use of semiconducting carbon nanotube networks designed for integration with charge traps and ultrathin dielectric layers meets the performance requirements as well as reliability, proven by detailed material and electrical characterizations using statistics. Serpentine interconnections and neutral mechanical plane layouts further enhance the deformability required for skin-based systems. Repetitive stretching tests and studies in mechanics corroborate the validity of the current approaches.

  15. Double-Layer Mediated Electromechanical Response of Amyloid Fibrils in Liquid Environment

    PubMed Central

    Nikiforov, M.P.; Thompson, G.L.; Reukov, V.V.; Jesse, S.; Guo, S.; Rodriguez, B.J.; Seal, K.; Vertegel, A.A.; Kalinin, S.V.

    2010-01-01

    Harnessing electrical bias-induced mechanical motion on the nanometer and molecular scale is a critical step towards understanding the fundamental mechanisms of redox processes and implementation of molecular electromechanical machines. Probing these phenomena in biomolecular systems requires electromechanical measurements be performed in liquid environments. Here we demonstrate the use of band excitation piezoresponse force microscopy for probing electromechanical coupling in amyloid fibrils. The approaches for separating the elastic and electromechanical contributions based on functional fits and multivariate statistical analysis are presented. We demonstrate that in the bulk of the fibril the electromechanical response is dominated by double-layer effects (consistent with shear piezoelectricity of biomolecules), while a number of electromechanically active hot spots possibly related to structural defects are observed. PMID:20088597

  16. Generalized Models for Rock Joint Surface Shapes

    PubMed Central

    Du, Shigui; Hu, Yunjin; Hu, Xiaofei

    2014-01-01

    Generalized models of joint surface shapes are the foundation for mechanism studies on the mechanical effects of rock joint surface shapes. Based on extensive field investigations of rock joint surface shapes, generalized models for shapes at three levels, named macroscopic outline, surface undulating shape, and microcosmic roughness, were established through statistical analyses of 20,078 rock joint surface profiles. The relative amplitude of the profile curves was used as the borderline for the division between the different level shapes. The results show that the macroscopic outline has three basic forms (planar, arc-shaped, and stepped); the surface undulating shape has three basic forms (planar, undulating, and stepped); and the microcosmic roughness has two basic forms (smooth and rough). PMID:25152901

  17. Effects of different mechanized soil fertilization methods on corn soil fertility under continuous cropping

    NASA Astrophysics Data System (ADS)

    Shi, Qingwen; Wang, Huixin; Bai, Chunming; Wu, Di; Song, Qiaobo; Gao, Depeng; Dong, Zengqi; Cheng, Xin; Dong, Qiping; Zhang, Yahao; Mu, Jiahui; Chen, Qinghong; Liao, Wenqing; Qu, Tianru; Zhang, Chunling; Zhang, Xinyu; Liu, Yifei; Han, Xiaori

    2017-05-01

    Experiments on mechanized soil fertilization for corn were conducted in the Faku demonstration zone. On this basis, we studied the effects of different mechanized soil fertilization methods on corn soil fertility under continuous cropping. Our study serves as a theoretical basis for further improvement of mechanized soil fertilization and of soil quality in the brown soil area. Based on surveys of soil physical characteristics during different corn growth periods, we collected soil samples from the different growth periods for determination and statistical analysis. Stalk returning to the field with deep tillage proved the most effective at improving available nutrients in arable soil in the demonstration zone. The different mechanized soil fertilization methods were markedly effective at improving total phosphorus in arable soil, less effective for total nitrogen and total potassium, and had little effect on the soil C/N ratio. Stalk returning with deep tillage was more favorable for increasing the content of organic matter in soil than surface application, and organic granular fertilizer was more favorable than decomposed cow dung for this purpose.

  18. Coupling functions: Universal insights into dynamical interaction mechanisms

    NASA Astrophysics Data System (ADS)

    Stankovski, Tomislav; Pereira, Tiago; McClintock, Peter V. E.; Stefanovska, Aneta

    2017-10-01

    The dynamical systems found in nature are rarely isolated. Instead they interact and influence each other. The coupling functions that connect them contain detailed information about the functional mechanisms underlying the interactions and prescribe the physical rule specifying how an interaction occurs. A coherent and comprehensive review is presented encompassing the rapid progress made recently in the analysis, understanding, and applications of coupling functions. The basic concepts and characteristics of coupling functions are presented through demonstrative examples of different domains, revealing the mechanisms and emphasizing their multivariate nature. The theory of coupling functions is discussed through gradually increasing complexity from strong and weak interactions to globally coupled systems and networks. A variety of methods that have been developed for the detection and reconstruction of coupling functions from measured data is described. These methods are based on different statistical techniques for dynamical inference. Stemming from physics, such methods are being applied in diverse areas of science and technology, including chemistry, biology, physiology, neuroscience, social sciences, mechanics, and secure communications. This breadth of application illustrates the universality of coupling functions for studying the interaction mechanisms of coupled dynamical systems.

  19. Capturing farm diversity with hypothesis-based typologies: An innovative methodological framework for farming system typology development

    PubMed Central

    Alvarez, Stéphanie; Timler, Carl J.; Michalscheck, Mirja; Paas, Wim; Descheemaeker, Katrien; Tittonell, Pablo; Andersson, Jens A.; Groot, Jeroen C. J.

    2018-01-01

    Creating typologies is a way to summarize the large heterogeneity of smallholder farming systems into a few farm types. Various methods exist, commonly using statistical analysis, to create these typologies. We demonstrate that the methodological decisions on data collection, variable selection, data-reduction and clustering techniques can bear a large impact on the typology results. We illustrate the effects of analysing the diversity from different angles, using different typology objectives and different hypotheses, on typology creation by using an example from Zambia’s Eastern Province. Five separate typologies were created with principal component analysis (PCA) and hierarchical clustering analysis (HCA), based on three different expert-informed hypotheses. The greatest overlap between typologies was observed for the larger, wealthier farm types but for the remainder of the farms there were no clear overlaps between typologies. Based on these results, we argue that the typology development should be guided by a hypothesis on the local agriculture features and the drivers and mechanisms of differentiation among farming systems, such as biophysical and socio-economic conditions. That hypothesis is based both on the typology objective and on prior expert knowledge and theories of the farm diversity in the study area. We present a methodological framework that aims to integrate participatory and statistical methods for hypothesis-based typology construction. This is an iterative process whereby the results of the statistical analysis are compared with the reality of the target population as hypothesized by the local experts. Using a well-defined hypothesis and the presented methodological framework, which consolidates the hypothesis through local expert knowledge for the creation of typologies, warrants development of less subjective and more contextualized quantitative farm typologies. PMID:29763422

  20. Capturing farm diversity with hypothesis-based typologies: An innovative methodological framework for farming system typology development.

    PubMed

    Alvarez, Stéphanie; Timler, Carl J; Michalscheck, Mirja; Paas, Wim; Descheemaeker, Katrien; Tittonell, Pablo; Andersson, Jens A; Groot, Jeroen C J

    2018-01-01

    Creating typologies is a way to summarize the large heterogeneity of smallholder farming systems into a few farm types. Various methods exist, commonly using statistical analysis, to create these typologies. We demonstrate that the methodological decisions on data collection, variable selection, data-reduction and clustering techniques can bear a large impact on the typology results. We illustrate the effects of analysing the diversity from different angles, using different typology objectives and different hypotheses, on typology creation by using an example from Zambia's Eastern Province. Five separate typologies were created with principal component analysis (PCA) and hierarchical clustering analysis (HCA), based on three different expert-informed hypotheses. The greatest overlap between typologies was observed for the larger, wealthier farm types but for the remainder of the farms there were no clear overlaps between typologies. Based on these results, we argue that the typology development should be guided by a hypothesis on the local agriculture features and the drivers and mechanisms of differentiation among farming systems, such as biophysical and socio-economic conditions. That hypothesis is based both on the typology objective and on prior expert knowledge and theories of the farm diversity in the study area. We present a methodological framework that aims to integrate participatory and statistical methods for hypothesis-based typology construction. This is an iterative process whereby the results of the statistical analysis are compared with the reality of the target population as hypothesized by the local experts. Using a well-defined hypothesis and the presented methodological framework, which consolidates the hypothesis through local expert knowledge for the creation of typologies, warrants development of less subjective and more contextualized quantitative farm typologies.
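
    The statistical core of this typology construction, PCA followed by hierarchical clustering, can be sketched in a few lines; the survey matrix, the number of components, and the cut into four farm types are illustrative assumptions, not the paper's settings.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler
        from scipy.cluster.hierarchy import fcluster, linkage

        # Illustrative farm survey: rows = farms, columns = structural variables
        # selected under one hypothesis of differentiation (land, labour, etc.).
        rng = np.random.default_rng(4)
        X = rng.normal(size=(200, 6))

        # Standardize, reduce with PCA, then cluster the scores (Ward linkage).
        scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(X))
        tree = linkage(scores, method="ward")
        farm_type = fcluster(tree, t=4, criterion="maxclust")  # four farm types
        print(np.bincount(farm_type)[1:])  # farms per type

        # Re-running the pipeline with a different variable set (a different
        # hypothesis) generally yields a different typology, as argued above.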

  1. Pharmaceutical counselling about different types of tablet-splitting methods based on the results of weighing tests and mechanical development of splitting devices.

    PubMed

    Somogyi, O; Meskó, A; Csorba, L; Szabó, P; Zelkó, R

    2017-08-30

    The division of tablets and adequate methods of splitting them pose a complex problem in all sectors of health care. Although tablet-splitting is often required, the procedure can be difficult for patients. Four tablet products with different external features (shape, score-line, film coat, and size) were investigated. The influence of these features and of the splitting methods was assessed in terms of the precision and "weight loss" of the splitting techniques. All four types of tablets were halved by four methods: by hand, with a kitchen knife, with an original manufactured splitting device, and with a modified tablet splitter based on a self-developed mechanical model. The mechanical parameters (hardness and friability) of the products were measured during the study. The "weight loss" and precision of the splitting methods were determined and compared by statistical analysis. On the basis of the results, the external features (geometry) and mechanical parameters of the tablets, together with the mechanical structure of the splitting devices, can influence the "weight loss" and precision of tablet-splitting. Accordingly, a new decision-making scheme was developed for the selection of splitting methods. In addition, the skills of patients and the specifics of the therapy should be considered so that pharmaceutical counselling about tablet-splitting can be more effective. Copyright © 2017 Elsevier B.V. All rights reserved.
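
    The weighing-test comparison lends itself to a simple sketch: for each splitting method, compute the mean "weight loss" relative to the nominal half mass and the precision as a relative standard deviation, then test whether the methods differ in variability. The half-tablet masses below are simulated, not the study's measurements.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        nominal_half = 100.0                   # nominal half mass in mg
        hand = rng.normal(95.0, 8.0, 60)       # halves split by hand
        device = rng.normal(99.0, 3.0, 60)     # halves from a splitting device

        for name, halves in [("hand", hand), ("device", device)]:
            loss = 100 * (1 - halves.mean() / nominal_half)  # mean weight loss (%)
            rsd = 100 * halves.std(ddof=1) / halves.mean()   # precision as %RSD
            print(f"{name}: weight loss {loss:.1f}%, RSD {rsd:.1f}%")

        # Do the two methods differ in variability (i.e., splitting precision)?
        print(stats.levene(hand, device))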

  2. Granular statistical mechanics - a personal perspective

    NASA Astrophysics Data System (ADS)

    Blumenfeld, R.; Edwards, S. F.

    2014-10-01

    The science of granular matter has expanded from an activity for specialised engineering applications to a fundamental field in its own right. This has been accompanied by an explosion of research and literature, which cannot be reviewed in one paper. A key to progress in this field is the formulation of a statistical mechanical formalism that could help develop equations of state and constitutive relations. This paper aims at reviewing some milestones in this direction. An essential basic step toward the development of any static and quasi-static theory of granular matter is a systematic and useful method to quantify the grain-scale structure and we start with a review of such a method. We then review and discuss the ongoing attempt to construct a statistical mechanical theory of granular systems. Along the way, we will clarify a number of misconceptions in the field, as well as highlight several outstanding problems.

  3. Statistical mechanics of neocortical interactions: Path-integral evolution of short-term memory

    NASA Astrophysics Data System (ADS)

    Ingber, Lester

    1994-05-01

    Previous papers in this series on the statistical mechanics of neocortical interactions (SMNI) have detailed a development from the relatively microscopic scales of neurons up to the macroscopic scales recorded by electroencephalography (EEG), requiring an intermediate mesocolumnar scale developed at the scale of minicolumns (~10^2 neurons) and macrocolumns (~10^5 neurons). Opportunity was taken to view SMNI as sets of statistical constraints, not necessarily describing specific synaptic or neuronal mechanisms, on neuronal interactions and on some aspects of short-term memory (STM), e.g., its capacity, stability, and duration. A recently developed C-language code, pathint, provides a non-Monte Carlo technique for calculating the dynamic evolution of arbitrary-dimension (subject to computer resources) nonlinear Lagrangians, such as the one derived for the two-variable SMNI problem. Here, pathint is used to explicitly detail the evolution of the SMNI constraints on STM.

  4. Collective behaviours: from biochemical kinetics to electronic circuits.

    PubMed

    Agliari, Elena; Barra, Adriano; Burioni, Raffaella; Di Biasio, Aldo; Uguzzoni, Guido

    2013-12-10

    In this work we aim to highlight a close analogy between cooperative behaviors in chemical kinetics and cybernetics; this is realized by using a common language for their description, that is mean-field statistical mechanics. First, we perform a one-to-one mapping between paradigmatic behaviors in chemical kinetics (i.e., non-cooperative, cooperative, ultra-sensitive, anti-cooperative) and in mean-field statistical mechanics (i.e., paramagnetic, high and low temperature ferromagnetic, anti-ferromagnetic). Interestingly, the statistical mechanics approach allows a unified, broad theory for all scenarios and, in particular, Michaelis-Menten, Hill and Adair equations are consistently recovered. This framework is then tested against experimental biological data with an overall excellent agreement. One step forward, we consistently read the whole mapping from a cybernetic perspective, highlighting deep structural analogies between the above-mentioned kinetics and fundamental bricks in electronics (i.e. operational amplifiers, flashes, flip-flops), so to build a clear bridge linking biochemical kinetics and cybernetics.

  5. Collective behaviours: from biochemical kinetics to electronic circuits

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Barra, Adriano; Burioni, Raffaella; di Biasio, Aldo; Uguzzoni, Guido

    2013-12-01

    In this work we aim to highlight a close analogy between cooperative behaviors in chemical kinetics and cybernetics; this is realized by using a common language for their description, that is, mean-field statistical mechanics. First, we perform a one-to-one mapping between paradigmatic behaviors in chemical kinetics (i.e., non-cooperative, cooperative, ultra-sensitive, anti-cooperative) and in mean-field statistical mechanics (i.e., paramagnetic, high- and low-temperature ferromagnetic, anti-ferromagnetic). Interestingly, the statistical mechanics approach allows a unified, broad theory for all scenarios and, in particular, the Michaelis-Menten, Hill and Adair equations are consistently recovered. This framework is then tested against experimental biological data with overall excellent agreement. Going one step further, we consistently read the whole mapping from a cybernetic perspective, highlighting deep structural analogies between the above-mentioned kinetics and fundamental building blocks in electronics (i.e., operational amplifiers, flashes, flip-flops), so as to build a clear bridge linking biochemical kinetics and cybernetics.

  6. Generalising the logistic map through the q-product

    NASA Astrophysics Data System (ADS)

    Pessoa, R. W. S.; Borges, E. P.

    2011-03-01

    We investigate a generalisation of the logistic map, x_{n+1} = 1 - a (x_n ⊗_{q_map} x_n), with -1 <= x_n <= 1 and 0 < a <= 2, where ⊗_q stands for a generalisation of the ordinary product known as the q-product [Borges, E.P., Physica A 340, 95 (2004)]. The usual product, and consequently the usual logistic map, is recovered in the limit q → 1. The tent map is also a particular case, for q_map → ∞. The generalisation of this (and other) algebraic operators has been widely used within the nonextensive statistical mechanics context (see C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, NY, 2009). We focus the analysis on q_map > 1 at the edge of chaos, particularly at the first critical point a_c, which depends on the value of q_map. Bifurcation diagrams, sensitivity to initial conditions, fractal dimension and the rate of entropy growth are evaluated at a_c(q_map), and connections with nonextensive statistical mechanics are explored.
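
    A minimal numerical sketch of the generalised map, using the standard definition of the Borges q-product with the [.]_+ cutoff; applying the product to |x_n| is our assumption, since the abstract does not spell out the treatment of negative x_n:

    ```python
    def q_product(x, y, q):
        """Borges (2004) q-product: x ⊗_q y = [x^(1-q) + y^(1-q) - 1]_+^(1/(1-q)).
        Defined for x, y > 0; it reduces to the ordinary product as q -> 1."""
        if abs(q - 1.0) < 1e-12:
            return x * y
        base = x ** (1.0 - q) + y ** (1.0 - q) - 1.0
        return max(base, 0.0) ** (1.0 / (1.0 - q))  # the [.]_+ cutoff convention

    def q_logistic_step(x, a, q):
        """One iterate of x_{n+1} = 1 - a (x_n ⊗_q x_n). Assumption (ours, not
        stated in the abstract): the product acts on |x_n|, kept away from 0."""
        xa = max(abs(x), 1e-12)
        return 1.0 - a * q_product(xa, xa, q)

    # Illustrative orbit for some q_map > 1; the critical a_c depends on q_map.
    x, a, q = 0.1, 1.4, 1.1
    orbit = []
    for _ in range(2000):
        x = q_logistic_step(x, a, q)
        orbit.append(x)
    print(min(orbit), max(orbit))  # stays inside [1 - a, 1) ⊂ [-1, 1]
    ```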

  7. Three-Dimensional Color Code Thresholds via Statistical-Mechanical Mapping.

    PubMed

    Kubica, Aleksander; Beverland, Michael E; Brandão, Fernando; Preskill, John; Svore, Krysta M

    2018-05-04

    Three-dimensional (3D) color codes have advantages for fault-tolerant quantum computing, such as protected quantum gates with relatively low overhead and robustness against imperfect measurement of error syndromes. Here we investigate the storage threshold error rates for bit-flip and phase-flip noise in the 3D color code (3DCC) on the body-centered cubic lattice, assuming perfect syndrome measurements. In particular, by exploiting a connection between error correction and statistical mechanics, we estimate the threshold for 1D stringlike and 2D sheetlike logical operators to be p_{3DCC}^{(1)}≃1.9% and p_{3DCC}^{(2)}≃27.6%. We obtain these results by using parallel tempering Monte Carlo simulations to study the disorder-temperature phase diagrams of two new 3D statistical-mechanical models: the four- and six-body random coupling Ising models.

  8. The Quantum and Fluid Mechanics of Global Warming

    NASA Astrophysics Data System (ADS)

    Marston, Brad

    2008-03-01

    Quantum physics and fluid mechanics are the foundation of any understanding of the Earth's climate. In this talk I invoke three well-known aspects of quantum mechanics to explore what will happen as the concentrations of greenhouse gases such as carbon dioxide continue to increase. Fluid dynamical models of the Earth's atmosphere, demonstrated here in live simulations, yield further insight into past, present, and future climates. Statistics of geophysical flows can, however, be ascertained directly without recourse to numerical simulation, using concepts borrowed from nonequilibrium statistical mechanics [J. B. Marston, E. Conover, and Tapio Schneider, "Statistics of an Unstable Barotropic Jet from a Cumulant Expansion," arXiv:0705.0011, J. Atmos. Sci. (in press)]. I discuss several other ways that theoretical physics may be able to contribute to a deeper understanding of climate change [J. Carlson, J. Harte, G. Falkovich, J. B. Marston, and R. Pierrehumbert, "Physics of Climate Change," 2008 Program of the Kavli Institute for Theoretical Physics].

  9. Thermodynamics-based models of transcriptional regulation with gene sequence.

    PubMed

    Wang, Shuqiang; Shen, Yanyan; Hu, Jinxing

    2015-12-01

    Quantitative models of gene regulatory activity have the potential to improve our mechanistic understanding of transcriptional regulation. However, the few models available today have been based on simplistic assumptions about the sequences being modeled or on heuristic approximations of the underlying regulatory mechanisms. In this work, we have developed a thermodynamics-based model to predict gene expression driven by any DNA sequence. The proposed model relies on a continuous-time, differential equation description of transcriptional dynamics. The sequence features of the promoter are used to derive the binding affinity, based on statistical molecular thermodynamics. Experimental results show that the proposed model can effectively identify the activity levels of transcription factors and the regulatory parameters. Compared with previous models, the proposed model reveals more biological insight.
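
    One generic instance of this modelling style (our notation; the paper's parameterisation is richer): a thermodynamic promoter-occupancy term driving a linear ODE for the transcript level m,

        P_bound = ([TF]/K) / (1 + [TF]/K),        dm/dt = alpha P_bound - lambda m,

    where the binding constant K is the sequence-dependent quantity derived from statistical thermodynamics, alpha is the maximal transcription rate and lambda the degradation rate; at steady state m* = (alpha/lambda) P_bound.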

  10. Is There a Common Summary Statistical Process for Representing the Mean and Variance? A Study Using Illustrations of Familiar Items.

    PubMed

    Yang, Yi; Tokita, Midori; Ishiguchi, Akira

    2018-01-01

    A number of studies have revealed that our visual system can extract different types of summary statistics, such as the mean and variance, from sets of items. Although the extraction of such summary statistics has been studied well in isolation, the relationship between these statistics remains unclear. In this study, we explored this issue using an individual-differences approach. Observers viewed illustrations of strawberries and lollipops varying in size or orientation and performed four tasks in a within-subject design, namely mean and variance discrimination tasks in the size and orientation domains. We found that performances in the mean and variance discrimination tasks were not correlated with each other, demonstrating that extraction of the mean and of the variance is mediated by different representation mechanisms. In addition, we tested the relationship between performances in the size and orientation domains for each summary statistic (i.e., mean and variance) and examined whether each summary statistic has distinct processes across perceptual domains. The results illustrated that statistical summary representations of size and orientation may share a common mechanism for representing the mean, and possibly for representing variance. Introspective reports from each observer performing the tasks were also examined and discussed.

  11. Higher Order Cumulant Studies of Ocean Surface Random Fields from Satellite Altimeter Data

    NASA Technical Reports Server (NTRS)

    Cheng, B.

    1996-01-01

    Higher order statistics, especially 2nd order statistics, have been used to study ocean processes for many years in the past, and occupy an appreciable part of the research literature on physical oceanography. They in turn form part of a much larger field of study in statistical fluid mechanics.

  12. On the correlation between microscopic structural heterogeneity and embrittlement behavior in metallic glasses

    PubMed Central

    Li, Weidong; Gao, Yanfei; Bei, Hongbin

    2015-01-01

    In order to establish a relationship between microstructure and mechanical properties, we systematically annealed a Zr-based bulk metallic glass (BMG) at 100 ~ 300 °C and measured its mechanical and thermal properties. The as-cast BMG exhibits some ductility, while increasing annealing temperature and time leads to a transition to brittle behavior that can reach nearly zero fracture energy. Differential scanning calorimetry did not find any significant changes in crystallization temperature and enthalpy, indicating that the materials remained fully amorphous. Elastic constants measured by an ultrasonic technique vary only slightly with respect to annealing temperature and time, which does not obey the empirical relationship between Poisson's ratio and fracture behavior. Nanoindentation pop-in tests were conducted, from which the pop-in strength mapping provides a "mechanical probe" of the microscopic structural heterogeneities in these metallic glasses. Based on a stochastic statistical defect model, we found that the defect density decreases with increasing annealing temperature and annealing time and is exponentially related to the fracture energy. A ductile-versus-brittle behavior (DBB) model based on the structural heterogeneity is developed to identify the physical origins of the embrittlement through the interactions between these defects and the crack tip. PMID:26435318

  13. The comparison of manual and LabVIEW-based fuzzy control on mechanical ventilation.

    PubMed

    Guler, Hasan; Ata, Fikret

    2014-09-01

    The aim of this article is to develop a knowledge-based therapy for the management of rats with respiratory distress. A mechanical ventilator was designed to achieve this aim. The designed ventilator is called an intelligent mechanical ventilator, since fuzzy logic was used to control the pneumatic equipment according to the rat's status. LabVIEW software was used to control all equipment in the ventilator prototype and to monitor respiratory variables in the experiment. The designed ventilator can be controlled both manually and by fuzzy logic. Eight female Wistar-Albino rats were used to test the designed ventilator and to show the effectiveness of fuzzy control over manual control in pressure-controlled ventilation mode. The anesthetized rats were first ventilated manually for 20 min. After that, they were ventilated for 20 min under fuzzy-logic control. Student's t-test at p < 0.05 was applied to the measured minimum, maximum and mean peak inspiration pressures. The results show that there is no statistical difference in the rats' lung parameters before and after the experiments, indicating that the designed ventilator and the developed knowledge-based therapy successfully support artificial respiration. © IMechE 2014.

  14. On the correlation between microscopic structural heterogeneity and embrittlement behavior in metallic glasses

    DOE PAGES

    Li, Weidong; Gao, Yanfei; Bei, Hongbin

    2015-10-05

    To establish a relationship between microstructure and mechanical properties, we systematically annealed a Zr-based bulk metallic glass (BMG) at 100 ~ 300°C and measured its mechanical and thermal properties. The as-cast BMG exhibits some ductility, while increasing annealing temperature and time leads to a transition to brittle behavior that can reach nearly zero fracture energy. Differential scanning calorimetry did not find any significant changes in crystallization temperature and enthalpy, indicating that the materials remained fully amorphous. Elastic constants measured by an ultrasonic technique vary only slightly with respect to annealing temperature and time, which does not obey the empirical relationship between Poisson's ratio and fracture behavior. Nanoindentation pop-in tests were conducted, from which the pop-in strength mapping provides a "mechanical probe" of the microscopic structural heterogeneities in these metallic glasses. Based on a stochastic statistical defect model, we found that the defect density decreases with increasing annealing temperature and annealing time and is exponentially related to the fracture energy. A ductile-versus-brittle behavior (DBB) model based on the structural heterogeneity is developed to identify the physical origins of the embrittlement through the interactions between these defects and the crack tip.

  15. A statistical framework for biomedical literature mining.

    PubMed

    Chung, Dongjun; Lawson, Andrew; Zheng, W Jim

    2017-09-30

    In systems biology, it is of great interest to identify new genes that were not previously reported to be associated with biological pathways related to various functions and diseases. Identification of these new pathway-modulating genes not only promotes understanding of pathway regulation mechanisms but also allows identification of novel targets for therapeutics. Recently, biomedical literature has been considered a valuable resource for investigating pathway-modulating genes. While the majority of currently available approaches are based on the co-occurrence of genes within an abstract, it has been reported that these approaches show only suboptimal performance because 70% of abstracts contain information on only a single gene. To overcome this limitation, we propose a novel statistical framework based on the concept of the ontology fingerprint, which uses the gene ontology to extract information from large biomedical literature data. The proposed framework simultaneously identifies pathway-modulating genes and facilitates interpreting the functions of these new genes. We also propose a computationally efficient posterior inference procedure based on Metropolis-Hastings within Gibbs sampling for parameter updates and the poor man's reversible jump Markov chain Monte Carlo approach for model selection. We evaluate the proposed statistical framework with simulation studies, experimental validation, and an application to studies of pathway-modulating genes in yeast. The R implementation of the proposed model is currently available at https://dongjunchung.github.io/bayesGO/. Copyright © 2017 John Wiley & Sons, Ltd.
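
    The "Metropolis-Hastings within Gibbs" pattern mentioned above is generic enough to sketch; the target below is a toy two-parameter log-posterior standing in for the paper's ontology-fingerprint model (all names here are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def log_post(theta1, theta2):
        # Toy Gaussian log-posterior (stand-in for the paper's actual model).
        return -0.5 * (theta1**2 + theta2**2 + theta1 * theta2)

    def mh_step(theta, other, which, scale=0.5):
        """One Metropolis-Hastings update of a single coordinate,
        holding the other coordinate fixed (the 'within Gibbs' part)."""
        prop = theta + scale * rng.normal()
        cur = log_post(theta, other) if which == 0 else log_post(other, theta)
        new = log_post(prop, other) if which == 0 else log_post(other, prop)
        return prop if np.log(rng.uniform()) < new - cur else theta

    t1, t2, samples = 0.0, 0.0, []
    for _ in range(5000):
        t1 = mh_step(t1, t2, which=0)   # update theta1 | theta2
        t2 = mh_step(t2, t1, which=1)   # update theta2 | theta1
        samples.append((t1, t2))
    print(np.mean(samples, axis=0))     # posterior means near (0, 0)
    ```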

  16. Substorm associated radar auroral surges: a statistical study and possible generation model

    NASA Astrophysics Data System (ADS)

    Shand, B. A.; Lester, M.; Yeoman, T. K.

    1998-04-01

    Substorm-associated radar auroral surges (SARAS) are a short-lived (15-90 minutes) and spatially localised (~5° of latitude) perturbation of the plasma convection pattern observed within the auroral E-region. The understanding of such phenomena has important ramifications for the investigation of the larger-scale plasma convection and ultimately of the coupling of the solar wind, magnetosphere and ionosphere system. A statistical investigation of SARAS, observed by the Sweden And Britain Radar Experiment (SABRE), is undertaken in order to provide a more extensive examination of the local-time occurrence and propagation characteristics of the events. The statistical analysis determined a local-time occurrence of observations between 1420 MLT and 2200 MLT, with a maximum occurrence centred around 1700 MLT. The propagation velocity of the SARAS feature through the SABRE field of view was found to be predominantly L-shell aligned, with a velocity centred around 1750 m s^-1 and within the range 500 m s^-1 to 3500 m s^-1. This comprehensive examination of SARAS provides the opportunity to discuss, qualitatively, a possible generation mechanism based on a proposed model for the production of a similar phenomenon referred to as sub-auroral ion drifts (SAIDs). The comparison suggests that SARAS may result from a geophysical mechanism similar to that which produces SAID events, but probably occurring at a different time in the evolution of the event.

  17. Dynamic and thermodynamic processes driving the January 2014 precipitation record in southern UK

    NASA Astrophysics Data System (ADS)

    Oueslati, B.; Yiou, P.; Jezequel, A.

    2017-12-01

    Regional extreme precipitation is projected to intensify in response to planetary climate change, with important impacts on societies. Understanding and anticipating such events remains a major challenge. In this study, we revisit the mechanisms of the winter precipitation record that occurred in the southern United Kingdom in January 2014. The physical drivers of this event are analyzed using the water vapor budget. Precipitation changes are decomposed into dynamic contributions, related to changes in atmospheric circulation, and thermodynamic contributions, related to changes in water vapor. We attempt to quantify the relative importance of the two contributions during this event and examine the applicability of Clausius-Clapeyron scaling. This work provides a physical interpretation of the mechanisms associated with southern UK's wettest event, complementary to other studies based on statistical approaches (Schaller et al., 2016; Yiou et al., 2017). The analysis is carried out using the ERA-Interim reanalysis, a choice motivated by the horizontal resolution of this dataset. It is then applied to present-day simulations and future projections of CMIP5 models for selected extreme precipitation events in the southern UK that are comparable to January 2014 in terms of atmospheric circulation. References: Schaller, N. et al., Human influence on climate in the 2014 southern England winter floods and their impacts, Nature Clim. Change, 2016, 6, 627-634. Yiou, P. et al., A statistical framework for conditional extreme event attribution, Advances in Statistical Climatology, Meteorology and Oceanography, 2017, 3, 17-31.

  18. Evolutionary model selection and parameter estimation for protein-protein interaction network based on differential evolution algorithm

    PubMed Central

    Huang, Lei; Liao, Li; Wu, Cathy H.

    2016-01-01

    Revealing the underlying evolutionary mechanism plays an important role in understanding protein interaction networks in the cell. While many evolutionary models have been proposed, applying these models to real network data, and especially determining which model better describes the evolutionary process behind an observed network, remains a challenge. The traditional way is to use a model with presumed parameters to generate a network and then evaluate the fitness by summary statistics, which however cannot capture the complete network structure information or estimate parameter distributions. In this work we developed a novel method based on Approximate Bayesian Computation and modified Differential Evolution (ABC-DEP) that is capable of conducting model selection and parameter estimation simultaneously and of detecting the underlying evolutionary mechanisms more accurately. We tested our method for its power in differentiating models and estimating parameters on simulated data and found significant improvement in performance benchmarks, as compared with a previous method. We further applied our method to real data of protein interaction networks in human and yeast. Our results show the Duplication Attachment model as the predominant evolutionary mechanism for human PPI networks and the Scale-Free model as the predominant mechanism for yeast PPI networks. PMID:26357273
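
    ABC-DEP itself couples ABC with differential evolution; the simplest member of that family, rejection ABC with summary statistics, can be sketched as follows (the growth model, statistics and tolerance below are toy stand-ins, not the paper's):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_network_stats(p_duplicate, n_nodes=200):
        """Toy duplication-vs-attachment growth model; returns two summary
        statistics (mean degree, degree second-moment ratio)."""
        deg = [1, 1]
        for _ in range(n_nodes - 2):
            if rng.uniform() < p_duplicate:
                i = rng.integers(len(deg))   # 'duplicate' an existing node
                deg.append(deg[i])
                deg[i] += 1
            else:
                deg.append(1)                # attach with a single new link
        deg = np.asarray(deg, float)
        return np.array([deg.mean(), (deg**2).mean() / deg.mean()])

    def abc_rejection(observed, n_draws=3000, eps=1.0):
        """Keep parameters whose simulated summaries land within eps of the
        observed summaries (eps is deliberately loose, for illustration)."""
        kept = [p for p in rng.uniform(size=n_draws)
                if np.linalg.norm(simulate_network_stats(p) - observed) < eps]
        return np.asarray(kept)

    obs = simulate_network_stats(0.7)        # pretend this is the real network
    post = abc_rejection(obs)
    print(post.mean(), post.std())           # crude posterior for p_duplicate
    ```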

  19. Robust PRNG based on homogeneously distributed chaotic dynamics

    NASA Astrophysics Data System (ADS)

    Garasym, Oleg; Lozi, René; Taralova, Ina

    2016-02-01

    This paper is devoted to the design of a new chaotic Pseudo Random Number Generator (CPRNG). Exploring several network topologies of coupled 1-D chaotic maps, we focus first on two-dimensional networks. Two topologically coupled maps are studied: TTL_RC non-alternate and TTL_SC alternate. The primary idea of the novel maps is an original coupling of the tent and logistic maps, chosen to achieve excellent random properties and a homogeneous (uniform) density in the phase plane, thus guaranteeing maximum security when used for chaos-based cryptography. To this aim, two new nonlinear CPRNGs, MTTL2sc and NTTL2, are proposed. The maps successfully passed numerous statistical, graphical and numerical tests, thanks to the proposed ring-coupling and injection mechanisms.
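
    The exact TTL_RC/TTL_SC topologies and the MTTL2sc/NTTL2 maps are defined in the paper; the sketch below only illustrates the underlying idea of ring-coupling a tent and a logistic map on [-1, 1], with coupling strength and seeds chosen arbitrarily:

    ```python
    import numpy as np

    def tent(x):       # tent map on [-1, 1]
        return 1.0 - 2.0 * abs(x)

    def logistic(x):   # logistic map rescaled to [-1, 1]
        return 1.0 - 2.0 * x * x

    def cprng(n, x=0.1234, y=-0.4321, eps=0.35):
        """Generic two-cell ring coupling of tent and logistic maps (a sketch
        of the tent-logistic coupling idea, not the paper's exact maps).
        Convex combinations keep the state inside [-1, 1]."""
        out = np.empty(n)
        for i in range(n):
            x, y = ((1 - eps) * tent(x) + eps * logistic(y),
                    (1 - eps) * logistic(y) + eps * tent(x))
            out[i] = y
        return out

    u = cprng(100_000)
    # Inspect the spread of counts across bins; the paper's maps are designed
    # so that this density is uniform, which this toy coupling need not achieve.
    hist, _ = np.histogram(u, bins=20, range=(-1, 1))
    print(hist.min(), hist.max())
    ```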

  20. Statistical mechanics of the Huxley-Simmons model

    NASA Astrophysics Data System (ADS)

    Caruel, M.; Truskinovsky, L.

    2016-06-01

    The chemomechanical model of Huxley and Simmons (HS) [A. F. Huxley and R. M. Simmons, Nature 233, 533 (1971), 10.1038/233533a0] provides a paradigmatic description of mechanically induced collective conformational changes relevant in a variety of biological contexts, from the muscle power stroke and hair-cell gating to integrin binding and hairpin unzipping. We develop a statistical mechanical perspective on the HS model by exploiting a formal analogy with a paramagnetic Ising model. We first study the equilibrium HS model with a finite number of elements and compute explicitly its mechanical and thermal properties. To model kinetics, we derive a master equation and solve it for several loading protocols. The developed formalism is applicable to a broad range of allosteric systems with mean-field interactions.
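
    In generic notation (not the paper's), the paramagnetic analogy can be stated in one line: for N independent elements that choose between pre- and post-power-stroke conformations with elastic energies v_0(y) and v_1(y) at elongation y,

        p_post(y) = e^{-beta v_1(y)} / (e^{-beta v_0(y)} + e^{-beta v_1(y)}) = 1 / (1 + e^{beta Delta v(y)}),
        f(y) = -(N/beta) ln(e^{-beta v_0(y)} + e^{-beta v_1(y)}),

    so the mean tension follows from T(y) = df/dy, just as the magnetization of a paramagnet follows from the field derivative of its free energy.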

  1. Test particle propagation in magnetostatic turbulence. 2: The local approximation method

    NASA Technical Reports Server (NTRS)

    Klimas, A. J.; Sandri, G.; Scudder, J. D.; Howell, D. R.

    1976-01-01

    An approximation method for statistical mechanics is presented and applied to a class of problems which contains a test particle propagation problem. All of the available basic equations used in statistical mechanics are cast in the form of a single equation which is integrodifferential in time and which is then used as the starting point for the construction of the local approximation method. Simplification of the integrodifferential equation is achieved through approximation to the Laplace transform of its kernel. The approximation is valid near the origin in the Laplace space and is based on the assumption of small Laplace variable. No other small parameter is necessary for the construction of this approximation method. The n'th level of approximation is constructed formally, and the first five levels of approximation are calculated explicitly. It is shown that each level of approximation is governed by an inhomogeneous partial differential equation in time with time independent operator coefficients. The order in time of these partial differential equations is found to increase as n does. At n = 0 the most local first order partial differential equation which governs the Markovian limit is regained.
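
    Schematically (our notation), the construction described above reads: Laplace transforming an evolution equation of the form

        d/dt F(t) = int_0^t K(t - tau) F(tau) dtau    gives    s F~(s) - F(0) = K~(s) F~(s),

    and approximating K~(s) ≈ k_0 + k_1 s + ... + k_n s^n near s = 0, then inverting term by term (each power s^m becoming d^m/dt^m), yields at level n an inhomogeneous partial differential equation in time with time-independent operator coefficients k_m, whose order grows with n; truncating at n = 0 leaves the first-order equation dF/dt = k_0 F of the Markovian limit.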

  2. Empirical study of alginate impression materials by customized proportioning system

    PubMed Central

    2016-01-01

    PURPOSE Alginate mixers available on the market do not have an automatic proportioning unit. In this study, an automatic proportioning unit for the alginate mixer and controller software were designed and produced. With this device, the proportioning operation could dispense alginate impression materials by weight. MATERIALS AND METHODS The coefficient of variation in the tested groups was compared with that of manual proportioning. Compression, tension and tear tests were conducted to determine the mechanical properties of the alginate impression materials. The experimental data were statistically analyzed using one-way ANOVA and the Tukey test at the 0.05 level of significance. RESULTS No statistically significant differences in modulus of elasticity (P>0.3), tensile/compressive strength (P>0.3), resilience (P>0.2), strain at failure (P>0.4), or tear energy (P>0.7) of the alginate impression materials were seen. However, a decrease in the standard deviation of the tested groups was observed when the customized machine was used. To verify the efficiency of the system, the powder and powder/water mixes were weighed, and a significant decrease in variability was observed. CONCLUSION It was possible to obtain more mechanically stable alginate impression materials by using the custom-made proportioning unit. PMID:27826387

  3. Perception of Sentence Stress in Speech Correlates With the Temporal Unpredictability of Prosodic Features.

    PubMed

    Kakouros, Sofoklis; Räsänen, Okko

    2016-09-01

    Numerous studies have examined the acoustic correlates of sentential stress and its underlying linguistic functionality. However, the mechanism that connects stress cues to the listener's attentional processing has remained unclear. Also, the learnability versus innateness of stress perception has not been widely discussed. In this work, we introduce a novel perspective to the study of sentential stress and put forward the hypothesis that perceived sentence stress in speech is related to the unpredictability of prosodic features, thereby capturing the attention of the listener. As predictability is based on the statistical structure of the speech input, the hypothesis also suggests that stress perception is a result of general statistical learning mechanisms. To study this idea, computational simulations are performed where temporal prosodic trajectories are modeled with an n-gram model. Probabilities of the feature trajectories are subsequently evaluated on a set of novel utterances and compared to human perception of stress. The results show that the low-probability regions of F0 and energy trajectories are strongly correlated with stress perception, giving support to the idea that attention and unpredictability of sensory stimulus are mutually connected. Copyright © 2015 Cognitive Science Society, Inc.
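
    A minimal sketch of the modelling idea, assuming (as the abstract states) an n-gram model over quantised prosodic trajectories; the bin counts, smoothing and synthetic contours are our own illustrative choices:

    ```python
    import numpy as np

    def quantize(f0, n_bins=8):
        """Discretise a prosodic trajectory (e.g. an F0 track) into symbols."""
        edges = np.linspace(np.nanmin(f0), np.nanmax(f0), n_bins + 1)[1:-1]
        return np.digitize(f0, edges)

    def train_bigram(symbol_seqs, n_symbols=8, alpha=1.0):
        """Bigram (n = 2) model with add-alpha smoothing."""
        counts = np.full((n_symbols, n_symbols), alpha)
        for seq in symbol_seqs:
            for a, b in zip(seq[:-1], seq[1:]):
                counts[a, b] += 1
        return counts / counts.sum(axis=1, keepdims=True)

    def surprisal(seq, P):
        """Per-step negative log-probability; high values flag the
        low-probability stretches the paper correlates with stress."""
        return np.array([-np.log(P[a, b]) for a, b in zip(seq[:-1], seq[1:])])

    # Toy usage with synthetic random-walk contours in place of real F0 tracks.
    rng = np.random.default_rng(0)
    train = [quantize(np.cumsum(rng.normal(size=200))) for _ in range(50)]
    P = train_bigram(train)
    test = quantize(np.cumsum(rng.normal(size=200)))
    print(surprisal(test, P).max())
    ```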

  4. Dynamic triggering of low magnitude earthquakes in the Middle American Subduction Zone

    NASA Astrophysics Data System (ADS)

    Escudero, C. R.; Velasco, A. A.

    2010-12-01

    We analyze global and Middle American Subduction Zone (MASZ) seismicity from 1998 to 2008 to quantify the effects of transient stresses at teleseismic distances. We use the Bulletin of the International Seismological Centre Catalog (ISCCD) published by the Incorporated Research Institutions for Seismology (IRIS). To identify MASZ seismicity changes due to distant, large (Mw > 7) earthquakes, we first identify local earthquakes that occurred before and after the mainshocks. We then group the local earthquakes within a cluster radius of 75 to 200 km. We obtain statistics based on characteristics of both the mainshocks and the local earthquake clusters, such as local cluster-mainshock azimuth, mainshock focal mechanism, and the position of local earthquake clusters within the MASZ. Due to lateral variations of dip along the subducted oceanic plate, we divide the Mexican subduction zone into four segments. We then apply the Paired Samples Statistical Test (PSST) to the sorted data to identify increases, decreases, or neither in the local seismicity associated with distant large earthquakes. We identify dynamic triggering in all MASZ segments produced by large earthquakes arriving from specific azimuths, as well as a decrease in some cases. We find no dependence of seismicity changes on the mainshock focal mechanism.
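
    The abstract does not spell out the PSST computation; if it is read as a paired comparison of per-cluster seismicity rates before and after each mainshock, the corresponding test is a one-liner in scipy (numbers below are illustrative, not the study's catalog):

    ```python
    import numpy as np
    from scipy import stats

    # Counts of local earthquakes in each cluster during equal windows
    # before and after a distant mainshock (illustrative numbers only).
    before = np.array([12, 7, 30, 5, 18, 9, 22, 14])
    after = np.array([15, 9, 28, 11, 25, 9, 31, 13])

    t, p = stats.ttest_rel(after, before)   # paired-samples t-test
    print(f"t = {t:.2f}, p = {p:.3f}")      # small p -> seismicity-rate change
    ```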

  5. Statistical model selection for better prediction and discovering science mechanisms that affect reliability

    DOE PAGES

    Anderson-Cook, Christine M.; Morzinski, Jerome; Blecker, Kenneth D.

    2015-08-19

    Understanding the impact of production, environmental exposure and age characteristics on the reliability of a population is frequently based on underlying science and empirical assessment. When there is incomplete science to prescribe which inputs should be included in a model of reliability to predict future trends, statistical model/variable selection techniques can be leveraged on a stockpile or population of units to improve reliability predictions as well as suggest new mechanisms affecting reliability to explore. We describe a five-step process for exploring relationships between available summaries of age, usage and environmental exposure and reliability. The process involves first identifying potential candidate inputs, then second organizing data for the analysis. Third, a variety of models with different combinations of the inputs are estimated, and fourth, flexible metrics are used to compare them. As a result, plots of the predicted relationships are examined to distill leading model contenders into a prioritized list for subject matter experts to understand and compare. The complexity of the model, quality of prediction and cost of future data collection are all factors to be considered by the subject matter experts when selecting a final model.

  6. A probabilistic mechanical model for prediction of aggregates’ size distribution effect on concrete compressive strength

    NASA Astrophysics Data System (ADS)

    Miled, Karim; Limam, Oualid; Sab, Karam

    2012-06-01

    To predict the effect of aggregates' size distribution on the concrete compressive strength, a probabilistic mechanical model is proposed. Within this model, a Voronoi tessellation of a set of non-overlapping, rigid spherical aggregates is used to describe the concrete microstructure. Moreover, the aggregates' diameters are treated as statistical variables whose size distribution function is identified with the experimental sieve curve. An inter-aggregate failure criterion is then proposed to describe the compressive-shear crushing of the hardened cement paste when concrete is subjected to uniaxial compression. Using a homogenization approach based on statistical homogenization and geometrical simplifications, an analytical formula predicting the concrete compressive strength is obtained. This formula highlights the effects of cement paste strength and of the aggregates' size distribution and volume fraction on the concrete compressive strength. According to the proposed model, increasing the concrete strength for the same cement paste and the same aggregates' volume fraction is achieved by decreasing both the aggregates' maximum size and the percentage of coarse aggregates. Finally, the validity of the model is discussed through a comparison with experimental results (15 concrete compressive strengths ranging between 46 and 106 MPa) taken from the literature, showing good agreement with the model predictions.

  7. Exposure time independent summary statistics for assessment of drug dependent cell line growth inhibition.

    PubMed

    Falgreen, Steffen; Laursen, Maria Bach; Bødker, Julie Støve; Kjeldsen, Malene Krag; Schmitz, Alexander; Nyegaard, Mette; Johnsen, Hans Erik; Dybkær, Karen; Bøgsted, Martin

    2014-06-05

    In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves' dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) resampling based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significant diverse range of responses ensuring it is useful for biological interpretations. Time independent summary statistics may aid the understanding of drugs' action mechanism on tumour cells and potentially renew previous drug sensitivity evaluation studies.
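
    A compact sketch of the three-step workflow with standard tools (the growth model, doses and noise levels are illustrative; the paper's actual estimators may differ):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from sklearn.isotonic import IsotonicRegression

    rng = np.random.default_rng(0)

    # 1) Nonlinear regression per condition -> cell counts and doubling times.
    def growth(t, n0, rate):
        return n0 * np.exp(rate * t)

    t = np.array([0.0, 24.0, 48.0, 72.0])                    # hours
    counts = growth(t, 1000.0, 0.02) * rng.normal(1, 0.05, t.size)
    (n0, rate), _ = curve_fit(growth, t, counts, p0=(500.0, 0.01))
    doubling_time = np.log(2) / rate

    # 2) Isotonic regression -> monotone non-increasing dose-response curve.
    dose = np.linspace(0, 10, 15)
    viability = np.clip(1 / (1 + np.exp(dose - 5))
                        + rng.normal(0, 0.04, dose.size), 0, 1)
    iso = IsotonicRegression(increasing=False)
    curve = iso.fit_transform(dose, viability)

    # 3) Resampling -> variation of a summary statistic (area under the curve).
    boot = []
    for _ in range(200):
        idx = np.sort(rng.integers(0, dose.size, dose.size))
        yb = iso.fit_transform(dose[idx], viability[idx])
        boot.append(np.trapz(yb, dose[idx]))
    print(doubling_time, np.mean(boot), np.std(boot))
    ```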

  8. Exposure time independent summary statistics for assessment of drug dependent cell line growth inhibition

    PubMed Central

    2014-01-01

    Background In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves’ dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. Results First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) resampling based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significant diverse range of responses ensuring it is useful for biological interpretations. Conclusion Time independent summary statistics may aid the understanding of drugs’ action mechanism on tumour cells and potentially renew previous drug sensitivity evaluation studies. PMID:24902483

  9. An analysis of a large dataset on immigrant integration in Spain. The Statistical Mechanics perspective on Social Action

    NASA Astrophysics Data System (ADS)

    Barra, Adriano; Contucci, Pierluigi; Sandell, Rickard; Vernia, Cecilia

    2014-02-01

    How does immigrant integration in a country change with immigration density? Guided by a statistical mechanics perspective, we propose a novel approach to this problem. The analysis focuses on classical integration quantifiers such as the percentage of jobs (temporary and permanent) given to immigrants, mixed marriages, and newborns with parents of mixed origin. We find that the average values of different quantifiers may exhibit either linear or non-linear growth with immigrant density, and we suggest that social action, a concept identified by Max Weber, causes the observed non-linearity. Using the statistical mechanics notion of interaction to quantitatively emulate social action, a unified mathematical model for integration is proposed and shown to explain both growth behaviors. The linear theory, by contrast, ignoring the possibility of interaction effects, would underestimate the quantifiers by up to 30% when immigrant densities are low, and overestimate them by as much when densities are high. The capacity to quantitatively isolate different types of integration mechanisms makes our framework a suitable tool in the quest for more efficient integration policies.

  10. Spectroscopy of Single AlInAs Quantum Dots

    NASA Astrophysics Data System (ADS)

    Derebezov, I. A.; Gaisler, A. V.; Gaisler, V. A.; Dmitriev, D. V.; Toropov, A. I.; Kozhukhov, A. S.; Shcheglov, D. V.; Latyshev, A. V.; Aseev, A. L.

    2018-03-01

    A system of quantum dots based on Al_xIn_{1-x}As/Al_yGa_{1-y}As solid solutions is investigated. The use of Al_xIn_{1-x}As wide-gap solid solutions as the basis of quantum dots substantially extends the spectral emission range to the short-wavelength region, including the wavelength region near 770 nm, which is of interest for the development of aerospace systems of quantum cryptography. The optical characteristics of Al_xIn_{1-x}As single quantum dots grown by the Stranski-Krastanov mechanism were studied by cryogenic microphotoluminescence. The statistics of the emission of single quantum dot excitons was studied using a Hanbury Brown-Twiss interferometer. The pair photon correlation function indicates the sub-Poissonian nature of the emission statistics, which directly confirms the possibility of developing single-photon emitters based on Al_xIn_{1-x}As quantum dots. The fine structure of quantum dot exciton states was investigated at wavelengths near 770 nm. The splitting of the exciton states is found to be comparable to the natural width of the exciton lines, which is of great interest for the development of entangled photon pair emitters based on Al_xIn_{1-x}As quantum dots.

  11. graph-GPA: A graphical model for prioritizing GWAS results and investigating pleiotropic architecture.

    PubMed

    Chung, Dongjun; Kim, Hang J; Zhao, Hongyu

    2017-02-01

    Genome-wide association studies (GWAS) have identified tens of thousands of genetic variants associated with hundreds of phenotypes and diseases, which have provided clinical and medical benefits to patients with novel biomarkers and therapeutic targets. However, identification of risk variants associated with complex diseases remains challenging as they are often affected by many genetic variants with small or moderate effects. There has been accumulating evidence suggesting that different complex traits share common risk basis, namely pleiotropy. Recently, several statistical methods have been developed to improve statistical power to identify risk variants for complex traits through a joint analysis of multiple GWAS datasets by leveraging pleiotropy. While these methods were shown to improve statistical power for association mapping compared to separate analyses, they are still limited in the number of phenotypes that can be integrated. In order to address this challenge, in this paper, we propose a novel statistical framework, graph-GPA, to integrate a large number of GWAS datasets for multiple phenotypes using a hidden Markov random field approach. Application of graph-GPA to a joint analysis of GWAS datasets for 12 phenotypes shows that graph-GPA improves statistical power to identify risk variants compared to statistical methods based on smaller number of GWAS datasets. In addition, graph-GPA also promotes better understanding of genetic mechanisms shared among phenotypes, which can potentially be useful for the development of improved diagnosis and therapeutics. The R implementation of graph-GPA is currently available at https://dongjunchung.github.io/GGPA/.

  12. Impact of implementation choices on quantitative predictions of cell-based computational models

    NASA Astrophysics Data System (ADS)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
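
    A caricature of the sensitivity being probed, assuming the common overdamped forward-Euler update for vertex positions; the toy force law below is ours, not a real vertex-model energy:

    ```python
    import numpy as np

    def forces(v):
        """Toy force on a ring of vertices: weak relaxation toward the centroid
        plus unit-rest-length springs to ring neighbours."""
        f = -(v - v.mean(axis=0)) * 0.1
        for s in (-1, 1):
            d = np.roll(v, s, axis=0) - v
            r = np.linalg.norm(d, axis=1, keepdims=True)
            f += 0.5 * (r - 1.0) * d / r
        return f

    def integrate(v, dt, steps):
        """Overdamped forward Euler: x <- x + dt * F / eta, with eta = 1."""
        for _ in range(steps):
            v = v + dt * forces(v)
        return v

    theta = np.linspace(0, 2 * np.pi, 12, endpoint=False)
    poly = np.c_[np.cos(theta), np.sin(theta)] * 2.0

    # Same physical time horizon, two step sizes -> compare final positions.
    a = integrate(poly.copy(), dt=0.01, steps=1000)
    b = integrate(poly.copy(), dt=0.10, steps=100)
    print(np.abs(a - b).max())   # nonzero gap = discretisation sensitivity
    ```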

  13. Carbon Ion-Irradiated Hepatoma Cells Exhibit Coupling Interplay between Apoptotic Signaling and Morphological and Mechanical Remodeling

    PubMed Central

    Zhang, Baoping; Li, Long; Li, Zhiqiang; Liu, Yang; Zhang, Hong; Wang, Jizeng

    2016-01-01

    An apoptotic model was established based on the results of five hepatocellular carcinoma (HCC) cell lines irradiated with carbon ions to investigate the coupling interplay between apoptotic signaling and morphological and mechanical cellular remodeling. The expression levels of key apoptotic proteins and the changes in morphological characteristics and mechanical properties were systematically examined in the irradiated HCC lines. We observed that caspase-3 was activated and that the Bax/Bcl-2 ratio was significantly increased over time. Cellular morphology and mechanics analyses indicated monotonic decreases in spatial sizes, an increase in surface roughness, a considerable reduction in stiffness, and disassembly of the cytoskeletal architecture. A theoretical model of apoptosis revealed that mechanical changes in cells induce the characteristic cellular budding of apoptotic bodies. Statistical analysis indicated that the projected area, stiffness, and cytoskeletal density of the irradiated cells were positively correlated, whereas stiffness and caspase-3 expression were negatively correlated, suggesting a tight coupling interplay between the cellular structures, mechanical properties, and apoptotic protein levels. These results help to clarify a novel arbitration mechanism of cellular demise induced by carbon ions. This biomechanics strategy for evaluating apoptosis contributes to our understanding of cancer-killing mechanisms in the context of carbon ion radiotherapy. PMID:27731354

  14. Statistical Mechanical Foundation for the Two-State Transition in Protein Folding of Small Globular Proteins

    NASA Astrophysics Data System (ADS)

    Iguchi, Kazumoto

    We discuss the statistical mechanical foundation for the two-state transition in the protein folding of small globular proteins. In the standard arguments of protein folding, the statistical search for the ground state is carried out over astronomically many conformations in configuration space. This leads to the famous Levinthal paradox. To resolve the paradox, Gō first postulated that the two-state transition - an all-or-none type transition - is crucial for the protein folding of small globular proteins and used the Gō lattice model to demonstrate its two-state nature. Recently, many experimental results have accumulated that support the two-state transition for small globular proteins. Stimulated by such experiments, Zwanzig introduced a minimal statistical mechanical model that exhibits the two-state transition. Also, Finkelstein and coworkers discussed a solution of the paradox by considering the sequential folding of a small globular protein. On the other hand, Iguchi recently introduced a toy model of protein folding using the Rubik's magic snake, in which all folded structures are exactly known and mathematically represented in terms of four types of conformations: cis-, trans-, left- and right-gauche configurations between the unit polyhedrons. In this paper, we study the relationship between Gō's two-state transition, Zwanzig's statistical mechanics model and Finkelstein's sequential folding model by applying them to the Rubik's magic snake models. We show that the foundation of Gō's two-state transition model relies on the search within an equienergy surface labeled by the contact order of the hydrophobic condensation. This idea reproduces Zwanzig's statistical model as a special case, realizes Finkelstein's sequential folding model, and helps in understanding the nature of the two-state transition of a small globular protein by calculating physical quantities such as the free energy, the contact order and the specific heat. We point out the similarity between the liquid-gas transition in statistical mechanics and the two-state transition of protein folding. We also study the morphology of the Rubik's magic snake models to give a prototype model for understanding the differences between α-helix and β-sheet proteins.

  15. Flow and clogging of a sheep herd passing through a bottleneck.

    PubMed

    Garcimartín, A; Pastor, J M; Ferrer, L M; Ramos, J J; Martín-Gómez, C; Zuriguel, I

    2015-02-01

    We present an experimental study of a flock passing through a narrow door. Video monitoring of daily routines in a farm has enabled us to collect a sizable amount of data. By measuring the time lapse between the passage of consecutive animals, some features of the flow regime can be assessed. A quantitative definition of clogging is demonstrated based on the passage time statistics. These display broad tails, which can be fitted by power laws with a relatively large exponent. On the other hand, the distribution of burst sizes robustly evidences exponential behavior. Finally, borrowing concepts from granular physics and statistical mechanics, we evaluate the effect of increasing the door size and the performance of an obstacle placed in front of it. The success of these techniques opens new possibilities regarding their eventual extension to the management of human crowds.
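
    The two fits described above are standard; a minimal sketch with maximum-likelihood estimators (synthetic data in place of the farm measurements):

    ```python
    import numpy as np

    def tail_exponent(times, t_min):
        """Hill/Clauset MLE of the exponent alpha of a power-law tail
        P(t) ~ t^(-alpha), using passage times above t_min."""
        tail = times[times >= t_min]
        return 1.0 + tail.size / np.sum(np.log(tail / t_min))

    def burst_rate(burst_sizes):
        """MLE for an exponential (geometric-like) burst-size distribution."""
        return 1.0 / np.mean(burst_sizes)

    # Illustrative synthetic data standing in for the measured time lapses.
    rng = np.random.default_rng(0)
    dt = rng.pareto(3.5, 5000) + 1.0       # heavy-tailed passage times
    bursts = rng.geometric(0.3, 2000)      # exponentially distributed bursts
    print(tail_exponent(dt, t_min=2.0), burst_rate(bursts))
    ```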

  16. Scenarios for Evolving Seismic Crises: Possible Communication Strategies

    NASA Astrophysics Data System (ADS)

    Steacy, S.

    2015-12-01

    Recent advances in operational earthquake forecasting mean that we are very close to being able to confidently compute changes in earthquake probability as seismic crises develop. For instance, we now have statistical models such as ETAS and STEP which demonstrate considerable skill in forecasting earthquake rates and recent advances in Coulomb based models are also showing much promise. Communicating changes in earthquake probability is likely be very difficult, however, as the absolute probability of a damaging event is likely to remain quite small despite a significant increase in the relative value. Here, we use a hybrid Coulomb/statistical model to compute probability changes for a series of earthquake scenarios in New Zealand. We discuss the strengths and limitations of the forecasts and suggest a number of possible mechanisms that might be used to communicate results in an actual developing seismic crisis.

  17. Estimation of macroscopic elastic characteristics for hierarchical anisotropic solids based on probabilistic approach

    NASA Astrophysics Data System (ADS)

    Smolina, Irina Yu.

    2015-10-01

    Mechanical properties are of great importance in the design and strength calculation of flexible cables. The problem of determining the elastic properties and rigidity characteristics of a cable modeled as an anisotropic helical elastic rod is considered. These characteristics are calculated indirectly by means of parameters obtained from statistical processing of experimental data and treated as random quantities. Taking into account the probabilistic nature of these parameters, formulas for estimating the macroscopic elastic moduli of a cable are obtained. Expressions for calculating the macroscopic flexural, shear and torsional rigidities from the macroscopic elastic characteristics obtained before are presented. Statistical estimates of the rigidity characteristics of some cable grades are given, and a comparison with the characteristics obtained on the basis of a deterministic approach is provided.

  18. Broken Ergodicity in Ideal, Homogeneous, Incompressible Turbulence

    NASA Technical Reports Server (NTRS)

    Morin, Lee; Shebalin, John; Fu, Terry; Nguyen, Phu; Shum, Victor

    2010-01-01

    We discuss the statistical mechanics of numerical models of ideal homogeneous, incompressible turbulence and their relevance for dissipative fluids and magnetofluids. These numerical models are based on Fourier series, and the relevant statistical theory predicts that Fourier coefficients of fluid velocity and magnetic fields (if present) are zero-mean random variables. However, numerical simulations clearly show that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation. We explain this phenomenon in terms of 'broken ergodicity', which is defined to occur when dynamical behavior does not match ensemble predictions on very long time-scales. We review the theoretical basis of broken ergodicity, apply it to 2-D and 3-D fluid and magnetohydrodynamic simulations of homogeneous turbulence, and show new results from simulations using GPU (graphical processing unit) computers.

  19. Flow and clogging of a sheep herd passing through a bottleneck

    NASA Astrophysics Data System (ADS)

    Garcimartín, A.; Pastor, J. M.; Ferrer, L. M.; Ramos, J. J.; Martín-Gómez, C.; Zuriguel, I.

    2015-02-01

    We present an experimental study of a flock passing through a narrow door. Video monitoring of daily routines in a farm has enabled us to collect a sizable amount of data. By measuring the time lapse between the passage of consecutive animals, some features of the flow regime can be assessed. A quantitative definition of clogging is demonstrated based on the passage time statistics. These display broad tails, which can be fitted by power laws with a relatively large exponent. On the other hand, the distribution of burst sizes robustly evidences exponential behavior. Finally, borrowing concepts from granular physics and statistical mechanics, we evaluate the effect of increasing the door size and the performance of an obstacle placed in front of it. The success of these techniques opens new possibilities regarding their eventual extension to the management of human crowds.

  20. How might health services capture patient-reported safety concerns in a hospital setting? An exploratory pilot study of three mechanisms.

    PubMed

    O'Hara, Jane Kathryn; Armitage, Gerry; Reynolds, Caroline; Coulson, Claire; Thorp, Liz; Din, Ikhlaq; Watt, Ian; Wright, John

    2017-01-01

    Emergent evidence suggests that patients can identify and report safety issues while in hospital. However, little is known about the best method for collecting information from patients about safety concerns. This study presents an exploratory pilot of three mechanisms for collecting data on safety concerns from patients during their hospital stay. Three mechanisms for capturing safety concerns were coproduced with healthcare professionals and patients, before being tested in an exploratory trial using cluster randomisation at the ward level. Nine wards participated, with each mechanism being tested over a 3-month study period. Patients were asked to feed back safety concerns via the mechanism on their ward (interviewing at their bedside, paper-based form or patient safety 'hotline'). Safety concerns were subjected to a two-stage review process to identify those that would meet the definition of a patient safety incident. Differences between mechanisms on a range of outcomes were analysed using inferential statistics. Safety concerns were thematically analysed to develop reporting categories. 178 patients were recruited. Patients in the face-to-face interviewing condition provided significantly more safety concerns per patient (1.91) compared with the paper-based form (0.92) and the patient safety hotline (0.43). They were also significantly more likely to report one or more concerns, with 64% reporting via the face-to-face mechanism, compared with 41% via the paper-based form and 19% via the patient safety hotline. No mechanism differed significantly in the number of classified patient safety incidents or physician-rated preventability and severity. Interviewing at the patient's bedside is likely to be the most effective means of gathering safety concerns from inpatients, potentially providing an opportunity for health services to gather patient feedback about safety from their perspective. Published by the BMJ Publishing Group Limited.

  1. Hydro-mechanical mechanism and thresholds of rainfall-induced unsaturated landslides

    NASA Astrophysics Data System (ADS)

    Yang, Zongji; Lei, Xiaoqin; Huang, Dong; Qiao, Jianping

    2017-04-01

    The devastating Ms 8.0 Wenchuan earthquake in 2008 created the greatest number of co-seismic mountain hazards ever recorded in China. However, the dynamics of rainfall-induced remobilization and transport of deposits after a giant earthquake are not fully understood. Moreover, rainfall intensity-duration (I-D) methods are the predominant early-warning indicators of rainfall-induced landslides in post-earthquake regions; they are a convenient and straightforward way to predict the hazards, but such rainfall-based criteria and thresholds are generally empirical and based on statistical analysis, and consequently ignore the failure mechanisms of the landslides. This study examines the mechanism, hydro-mechanical behavior and thresholds of these unsaturated deposits under the influence of rainfall. To accomplish this, in situ experiments were performed on an instrumented landslide deposit. The field tests were conducted on a natural co-seismic fractured slope to 1) simulate rainfall-induced shallow failures in the depression channels of a debris-flow catchment in an earthquake-affected region, 2) explore the mechanisms and transient processes associated with hydro-mechanical parameter variations in response to rainfall infiltration, and 3) identify the hydrologic parameter thresholds and critical criteria of gravitational erosion in areas prone to mass remobilization as a source of debris flows. These experiments provided instrumental evidence that post-earthquake rainfall-induced mass remobilization occurs under unsaturated conditions in response to transient rainfall infiltration, and revealed transient processes and the dominance of preferential flow paths during infiltration. A hydro-mechanical method was adopted for transient hydrologic process modelling and unsaturated slope stability analysis, and the slope failures observed during the experiments were reproduced by the model, indicating that the decrease in matric suction and increase in moisture content in response to rainfall infiltration contributed greatly to post-earthquake shallow mass movement. A threshold model for the initiation of mass remobilization is therefore proposed, based on correlations between slope stability, volumetric water content and matric suction. As a complement to rainfall-based early-warning strategies, these water content and suction thresholds, grounded in the infiltration-induced slope failure mechanism, are expected to improve the accuracy of predictions and early warnings of post-earthquake mountain hazards.
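
    As a worked illustration of the threshold logic (a standard unsaturated infinite-slope formulation, not an equation quoted from this abstract): for a slope of angle beta with slip surface at depth z and unit weight gamma,

        FS = [c' + (u_a - u_w) tan(phi^b) + (gamma z cos^2(beta) - u_a) tan(phi')] / (gamma z sin(beta) cos(beta)),

    so a loss of matric suction (u_a - u_w) during infiltration, together with the accompanying rise in volumetric water content, drives FS down toward the failure threshold FS = 1; thresholds on suction and water content of the kind proposed in the paper follow directly from this coupling.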

  2. Physical Activity for Cognitive and Mental Health in Youth: A Systematic Review of Mechanisms.

    PubMed

    Lubans, David; Richards, Justin; Hillman, Charles; Faulkner, Guy; Beauchamp, Mark; Nilsson, Michael; Kelly, Paul; Smith, Jordan; Raine, Lauren; Biddle, Stuart

    2016-09-01

    Physical activity can improve cognitive and mental health, but the underlying mechanisms have not been established. Our objectives were to present a conceptual model explaining the mechanisms for the effect of physical activity on cognitive and mental health in young people and to conduct a systematic review of the evidence. Six electronic databases (PubMed, PsycINFO, SCOPUS, Ovid Medline, SportDiscus, and Embase) were searched. School-, home-, or community-based physical activity interventions and laboratory-based exercise interventions were assessed. Studies were eligible if they reported statistical analyses of changes in the following: (1) cognition or mental health; and (2) neurobiological, psychosocial, and behavioral mechanisms. Data relating to methods, assessment period, participant characteristics, intervention type, setting, and facilitator/delivery were extracted. Twenty-five articles reporting results from 22 studies were included. Mechanisms studied were neurobiological (6 studies), psychosocial (18 studies), and behavioral (2 studies). Significant changes in at least 1 potential neurobiological mechanism were reported in 5 studies, and significant effects for at least 1 cognitive outcome were also found in 5 studies. One of 2 studies reported a significant effect for self-regulation, but neither study reported a significant impact on mental health. Limitations include the small number of studies and high levels of study heterogeneity. The strongest evidence was found for improvements in physical self-perceptions, which accompanied enhanced self-esteem in the majority of studies measuring these outcomes. Few studies examined neurobiological and behavioral mechanisms, and we were unable to draw conclusions regarding their role in enhancing cognitive and mental health. Copyright © 2016 by the American Academy of Pediatrics.

  3. Evaluation of Surface Roughness and Tensile Strength of Base Metal Alloys Used for Crown and Bridge on Recasting (Recycling).

    PubMed

    Agrawal, Amit; Hashmi, Syed W; Rao, Yogesh; Garg, Akanksha

    2015-07-01

    Dental casting alloys play a prominent role in the restoration of the partial dentition. Casting alloys have to survive long term in the mouth and must therefore offer a suitable combination of strength, stiffness, wear resistance and biologic compatibility. According to the ADA classification, casting alloys are divided into three groups by composition (wt%): high noble, noble and predominantly base metal alloys. The aim was to evaluate mechanical properties, namely tensile strength and surface roughness, of new and recast base metal (nickel-chromium) alloys. The base metal alloy derived from sprues and buttons was recast to make it reusable. A total of 200 test specimens were fabricated using a specially fabricated metal jig and divided into two groups: 100 specimens of new alloy and 100 specimens of recast alloy, which were tested for tensile strength on a universal testing machine and for surface roughness on a surface roughness tester. Tensile strength of the new alloy showed no statistically significant difference from the recast alloy (p-value>0.05), whereas surface roughness (maximum and average) differed significantly between new and recast alloy (p-value<0.01). Within the limitations of the study, it is concluded that tensile strength is not affected by recasting of nickel-chromium alloy, whereas surface roughness increases markedly.

  4. Extending (Q)SARs to incorporate proprietary knowledge for regulatory purposes: A case study using aromatic amine mutagenicity.

    PubMed

    Ahlberg, Ernst; Amberg, Alexander; Beilke, Lisa D; Bower, David; Cross, Kevin P; Custer, Laura; Ford, Kevin A; Van Gompel, Jacky; Harvey, James; Honma, Masamitsu; Jolly, Robert; Joossens, Elisabeth; Kemper, Raymond A; Kenyon, Michelle; Kruhlak, Naomi; Kuhnke, Lara; Leavitt, Penny; Naven, Russell; Neilan, Claire; Quigley, Donald P; Shuey, Dana; Spirkl, Hans-Peter; Stavitskaya, Lidiya; Teasdale, Andrew; White, Angela; Wichard, Joerg; Zwickl, Craig; Myatt, Glenn J

    2016-06-01

    Statistical-based and expert rule-based models built using public domain mutagenicity knowledge and data are routinely used for computational (Q)SAR assessments of pharmaceutical impurities in line with the approach recommended in the ICH M7 guideline. Knowledge from proprietary corporate mutagenicity databases could be used to increase the predictive performance for selected chemical classes as well as expand the applicability domain of these (Q)SAR models. This paper outlines a mechanism for sharing knowledge without the release of proprietary data. Primary aromatic amine mutagenicity was selected as a case study because this chemical class is often encountered in pharmaceutical impurity analysis and mutagenicity of aromatic amines is currently difficult to predict. As part of this analysis, a series of aromatic amine substructures were defined and the number of mutagenic and non-mutagenic examples for each chemical substructure calculated across a series of public and proprietary mutagenicity databases. This information was pooled across all sources to identify structural classes that activate or deactivate aromatic amine mutagenicity. This structure activity knowledge, in combination with newly released primary aromatic amine data, was incorporated into Leadscope's expert rule-based and statistical-based (Q)SAR models where increased predictive performance was demonstrated. Copyright © 2016 Elsevier Inc. All rights reserved.
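
    The sharing mechanism outlined above reduces, at its core, to counting mutagenic and non-mutagenic examples per predefined substructure in each database and pooling only those counts. The sketch below illustrates that step; the SMARTS patterns and the toy in-house data are hypothetical placeholders (the paper's curated substructure definitions are not reproduced), and the RDKit package is assumed.

        from collections import Counter
        from rdkit import Chem

        # Hypothetical aromatic-amine substructure classes (SMARTS);
        # illustrative only, not the paper's curated definitions.
        SUBSTRUCTURES = {
            "primary_aromatic_amine": "[NX3;H2][c]",
            "ortho_alkyl_aniline": "[NX3;H2][c]:[c][CX4]",
        }

        def pooled_counts(smiles_label_pairs):
            """Count (non-)mutagenic examples per substructure.

            Only these aggregate counts would be shared, never the
            underlying proprietary structures."""
            counts = Counter()
            for smiles, is_mutagenic in smiles_label_pairs:
                mol = Chem.MolFromSmiles(smiles)
                if mol is None:
                    continue
                for name, smarts in SUBSTRUCTURES.items():
                    if mol.HasSubstructMatch(Chem.MolFromSmarts(smarts)):
                        counts[(name, is_mutagenic)] += 1
            return counts

        # Toy in-house "database": (SMILES, Ames-positive?) pairs.
        private_db = [("Nc1ccccc1", True), ("Nc1ccccc1C", False)]
        print(pooled_counts(private_db))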

  5. 1H NMR-Based Metabolomic Analysis of Sub-Lethal Perfluorooctane Sulfonate Exposure to the Earthworm, Eisenia fetida, in Soil

    PubMed Central

    Lankadurai, Brian P.; Furdui, Vasile I.; Reiner, Eric J.; Simpson, André J.; Simpson, Myrna J.

    2013-01-01

    1H NMR-based metabolomics was used to measure the response of Eisenia fetida earthworms to sub-lethal concentrations of perfluorooctane sulfonate (PFOS) in soil. Earthworms were exposed to a range of PFOS concentrations (5, 10, 25, 50, 100 or 150 mg/kg) for two, seven and fourteen days. Earthworm tissues were extracted and analyzed by 1H NMR. Multivariate statistical analysis of the metabolic response of E. fetida to PFOS exposure identified time-dependent responses comprising two separate modes of action: a non-polar narcosis type mechanism after two days of exposure and increased fatty acid oxidation after seven and fourteen days of exposure. Univariate statistical analysis revealed that 2-hexyl-5-ethyl-3-furansulfonate (HEFS), betaine, leucine, arginine, glutamate, maltose and ATP are potential indicators of PFOS exposure, as the concentrations of these metabolites fluctuated significantly. Overall, the NMR-based metabolomic analysis suggests elevated fatty acid oxidation, disruption of energy metabolism and biological membrane structure, and a possible interruption of ATP synthesis. These conclusions, obtained from analysis of the metabolic profile in response to sub-lethal PFOS exposure, indicate that NMR-based metabolomics is an excellent discovery tool when the mode of action (MOA) of contaminants is not clearly defined. PMID:24958147
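
    The multivariate step referred to above is typically a projection method such as PCA applied to binned 1H NMR spectra; the study's exact pipeline is not specified here. A minimal sketch on synthetic placeholder data, assuming scikit-learn:

        import numpy as np
        from sklearn.decomposition import PCA

        # Synthetic stand-in for binned 1H NMR spectra (not study data).
        rng = np.random.default_rng(0)
        n_worms, n_bins = 24, 200
        X = rng.normal(size=(n_worms, n_bins))      # spectral intensities
        X[:12, :10] += 1.0                          # fake "exposure" effect
        X = (X - X.mean(axis=0)) / X.std(axis=0)    # autoscale each bin

        scores = PCA(n_components=2).fit_transform(X)
        print(scores[:3])   # PC scores; exposed vs control tend to separate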

  6. STEP-TRAMM - A modeling interface for simulating localized rainfall induced shallow landslides and debris flow runout pathways

    NASA Astrophysics Data System (ADS)

    von Ruette, Jonas; Lehmann, Peter; Fan, Linfeng; Bickel, Samuel; Or, Dani

    2017-04-01

    Landslides and subsequent debris flows initiated by rainfall represent a ubiquitous natural hazard in steep mountainous regions. We integrated a landslide hydro-mechanical triggering model and associated debris flow runout pathways with a graphical user interface (GUI) to represent these natural hazards in a wide range of catchments over the globe. The STEP-TRAMM GUI provides process-based locations and sizes of landslide patterns using digital elevation models (DEM) from the SRTM database (30 m resolution), linked with soil maps from the global SoilGrids database (250 m resolution) and satellite-based information on rainfall statistics for the selected region. In a preprocessing step, STEP-TRAMM models the soil depth distribution and complements soil information to jointly capture the key hydrological and mechanical properties relevant to representing local soil failure. In the presentation we will discuss features of this publicly available platform and compare landslide and debris flow patterns for different regions considering representative intense rainfall events. Model outcomes will be compared for different spatial and temporal resolutions to test the applicability of web-based information on elevation and rainfall for hazard assessment.

  7. DNA sequence-dependent mechanics and protein-assisted bending in repressor-mediated loop formation

    PubMed Central

    Boedicker, James Q.; Garcia, Hernan G.; Johnson, Stephanie; Phillips, Rob

    2014-01-01

    As the chief informational molecule of life, DNA is subject to extensive physical manipulations. The energy required to deform double-helical DNA depends on sequence, and this mechanical code of DNA influences gene regulation, such as through nucleosome positioning. Here we examine the sequence-dependent flexibility of DNA in bacterial transcription factor-mediated looping, a context for which the role of sequence remains poorly understood. Using a suite of synthetic constructs repressed by the Lac repressor and two well-known sequences that show large flexibility differences in vitro, we make precise statistical mechanical predictions as to how DNA sequence influences loop formation and test these predictions using in vivo transcription and in vitro single-molecule assays. Surprisingly, sequence-dependent flexibility does not affect in vivo gene regulation. By theoretically and experimentally quantifying the relative contributions of sequence and the DNA-bending protein HU to DNA mechanical properties, we reveal that bending by HU dominates DNA mechanics and masks intrinsic sequence-dependent flexibility. Such a quantitative understanding of how mechanical regulatory information is encoded in the genome will be a key step towards a predictive understanding of gene regulation at single-base pair resolution. PMID:24231252

  8. Does drug price-regulation affect healthcare expenditures?

    PubMed

    Ben-Aharon, Omer; Shavit, Oren; Magnezi, Racheli

    2017-09-01

    Increasing health costs in developed countries are a major concern for decision makers. A variety of cost containment tools are used to control this trend, including maximum price regulation and reimbursement methods for health technologies. Information regarding expenditure-related outcomes of these tools is not available. To evaluate the association between different cost-regulating mechanisms and national health expenditures in selected countries. Price-regulating and reimbursement mechanisms for prescription drugs among OECD countries were reviewed. National health expenditure indices for 2008-2012 were extracted from OECD statistical sources. Possible associations between characteristics of different systems for regulation of drug prices and reimbursement and health expenditures were examined. In most countries, reimbursement mechanisms are part of publicly financed plans. Maximum price regulation is composed of reference-pricing, either of the same drug in other countries, or of therapeutic alternatives within the country, as well as value-based pricing (VBP). No association was found between price regulation or reimbursement mechanisms and healthcare costs. However, VBP may present a more effective mechanism, leading to reduced costs in the long term. Maximum price and reimbursement mechanism regulations were not found to be associated with cost containment of national health expenditures. VBP may have the potential to do so over the long term.

  9. Elemental, microstructural, and mechanical characterization of high gold orthodontic brackets after intraoral aging.

    PubMed

    Hersche, Sepp; Sifakakis, Iosif; Zinelis, Spiros; Eliades, Theodore

    2017-02-01

    The purpose of the present study was to investigate the elemental composition, microstructure, and selected mechanical properties of high gold orthodontic brackets after intraoral aging. Thirty Incognito™ (3M Unitek, Bad Essen, Germany) lingual brackets were studied: 15 brackets as received (control group) and 15 brackets retrieved from different patients after orthodontic treatment. The surface of the wing area was examined by scanning electron microscopy (SEM). Backscattered electron imaging (BEI) was performed, and the elemental composition was determined by X-ray energy-dispersive spectroscopy (EDX). After appropriate metallographic preparation, the mechanical properties tested were Martens hardness (HM), indentation modulus (EIT), elastic index (ηIT), and Vickers hardness (HV). These properties were determined employing instrumented indentation testing (IIT) with a Vickers indenter. The results were statistically analyzed by unpaired t-test (α=0.05). No statistically significant differences in surface morphology or elemental content were found between the control and experimental groups. Moreover, the mean values of HM, EIT, ηIT, and HV did not differ significantly between the groups (p>0.05). Under the limitations of this study, it may be concluded that the surface elemental content and microstructure, as well as the evaluated mechanical properties, of Incognito™ lingual brackets remain unaffected by intraoral aging.
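
    For illustration, the unpaired t-test used for these comparisons (α=0.05) can be run as below; the hardness values are made-up placeholders, not study data, and SciPy is assumed.

        from scipy import stats

        # Hypothetical Vickers hardness values for the two groups.
        hv_as_received = [182, 175, 190, 178, 185, 181, 176, 188]
        hv_retrieved = [179, 184, 173, 187, 180, 177, 183, 186]

        t, p = stats.ttest_ind(hv_as_received, hv_retrieved)
        print(f"t = {t:.2f}, p = {p:.3f}")  # p > 0.05: no significant difference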

  10. Sea spray production by bag breakup mode of fragmentation of the air-water interface at strong and hurricane wind

    NASA Astrophysics Data System (ADS)

    Troitskaya, Yuliya; Kandaurov, Alexander; Ermakova, Olga; Kozlov, Dmitry; Sergeev, Daniil; Zilitinkevich, Sergej

    2016-04-01

    Sea spray is a typical element of the marine atmospheric boundary layer (MABL) of great importance for marine meteorology, atmospheric chemistry and climate studies. It is considered a crucial factor in the development of hurricanes and severe extratropical storms, since it can significantly enhance the exchange of mass, heat and momentum between the ocean and the atmosphere. This exchange is directly provided by spume droplets, with sizes from 10 microns to a few millimeters, mechanically torn off the crests of breaking waves before falling back to the ocean under gravity. The fluxes associated with the spray are determined by the rate of droplet production at the surface, quantified by the sea spray generation function (SSGF), defined as the number of spray particles of radius r produced per unit area of water surface per unit time. However, the mechanism of spume droplet formation is unknown and empirical estimates of the SSGF vary over six orders of magnitude; therefore, the production rate of large sea spray droplets is not adequately described and there are significant uncertainties in estimates of exchange processes in hurricanes. The experimental core of our work comprises laboratory experiments employing high-speed video filming, which have made it possible to disclose what the water surface looks like at extremely strong winds and how exactly droplets are torn off wave crests. We classified the events responsible for spume droplet generation, including the bursting of submerged bubbles, the generation and breakup of "projections" or liquid filaments (Koga, 1981), and "bag breakup", namely the inflation and consequent bursting of short-lived, sail-like pieces of the water-surface film, "bags". The process is similar to the "bag-breakup" mode of fragmentation of liquid droplets and jets in gaseous flows. Based on a statistical analysis of the results of these experiments, we show that the main mechanism of spray generation is the "bag-breakup" mechanism. On the basis of general principles of statistical physics (the model of a canonical ensemble), we developed statistics of the "bag-breakup" events and constructed a sea spray generation function (SSGF) for the "bag-breakup" mechanism. The "bag-breakup" SSGF is shown to be in reasonable agreement in magnitude with the SSGFs considered the most reliable source functions for spume droplets. The new SSGF is employed to estimate the contribution of the "bag-breakup" mechanism to momentum and energy exchange in the marine atmospheric boundary layer under hurricane conditions. This work was supported by the Russian Foundation for Basic Research (14-05-91767, 13-05-12093, 16-05-00839, 16-55-52025); the experiments and equipment were supported by the Russian Science Foundation (Agreements 14-17-00667 and 15-17-20009, respectively). Yu. Troitskaya, A. Kandaurov and D. Sergeev were partially supported by FP7 collaborative Project No. 612610.

  11. Visual Attention Model Based on Statistical Properties of Neuron Responses

    PubMed Central

    Duan, Haibin; Wang, Xiaohua

    2015-01-01

    Visual attention is a mechanism of the visual system that can select relevant objects from a specific scene. Interactions among neurons in multiple cortical areas are considered to be involved in attentional allocation. However, the characteristics of the encoded features and neuron responses in those attention-related cortices remain unclear. The investigations in this study therefore aim to demonstrate that unusual regions, which attract more attention, generally evoke distinctive neuron responses. We suppose that visual saliency is obtained on the basis of neuron responses to contexts in natural scenes. A bottom-up visual attention model based on the self-information of neuron responses is proposed to test and verify this hypothesis. Four different color spaces are adopted and a novel entropy-based combination scheme is designed to make full use of color information. Valuable regions are highlighted while redundant backgrounds are suppressed in the saliency maps obtained by the proposed model. Comparative results reveal that the proposed model outperforms several state-of-the-art models. This study provides insights into neuron-response-based saliency detection and may shed light on the neural mechanism of early visual cortices for bottom-up visual attention. PMID:25747859
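
    A sketch of the central idea, that saliency is the self-information of responses (rare responses are salient): estimate a response distribution with a histogram and map each location to -log p(response). The feature below is a synthetic stand-in; the model's actual neuron-response features, four color spaces and entropy-based combination scheme are not reproduced.

        import numpy as np

        def self_information_map(feature, n_bins=64):
            """Per-pixel -log p(response), estimated from the image itself."""
            hist, edges = np.histogram(feature, bins=n_bins)
            p = hist / hist.sum()
            idx = np.clip(np.digitize(feature, edges[1:-1]), 0, n_bins - 1)
            return -np.log(p[idx] + 1e-12)

        rng = np.random.default_rng(1)
        image = rng.normal(size=(64, 64))
        image[28:36, 28:36] += 4.0          # a rare, "unusual" patch
        saliency = self_information_map(image)
        print(saliency[32, 32] > saliency.mean())   # the odd patch is salient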

  12. Rockslide susceptibility and hazard assessment for mitigation works design along vertical rocky cliffs: workflow proposal based on a real case-study conducted in Sacco (Campania), Italy

    NASA Astrophysics Data System (ADS)

    Pignalosa, Antonio; Di Crescenzo, Giuseppe; Marino, Ermanno; Terracciano, Rosario; Santo, Antonio

    2015-04-01

    The work presented here concerns a case study in which a complete multidisciplinary workflow was applied for an extensive assessment of rockslide susceptibility and hazard in a common scenario: vertical, fractured rocky cliffs. The studied area is located in a high-relief zone in Southern Italy (Sacco, Salerno, Campania), characterized by wide vertical rocky cliffs formed by tectonized thick successions of shallow-water limestones. The study comprised the following phases: a) topographic surveying integrating 3D laser scanning, photogrammetry and GNSS; b) geological surveying, characterization of single instabilities and geomechanical surveying, conducted by rock-climbing geologists; c) processing of 3D data and reconstruction of high-resolution geometrical models; d) structural and geomechanical analyses; e) data filing in a GIS-based spatial database; f) geostatistical and spatial analyses and mapping of the whole data set; g) 3D rockfall analysis. The main goals of the study were a) to set up an investigation method achieving a complete and thorough characterization of the slope stability conditions and b) to provide a detailed basis for an accurate definition of the reinforcement and mitigation systems. For these purposes, the most up-to-date methods of field surveying, remote sensing, 3D modelling and geospatial data analysis were integrated in a systematic workflow, accounting for the economic sustainability of the whole project. A novel integrated approach was applied, fusing deterministic and statistical surveying methods. This approach made it possible to deal with the wide extension of the studied area (nearly 200,000 m2) without compromising the accuracy of the results. The deterministic phase, based on field characterization of single instabilities and their further analysis on 3D models, was applied to delineate the peculiarities of each single feature. The statistical approach, based on geostructural field mapping and on point geomechanical data from scan-line surveying, allowed partitioning of the rock mass into homogeneous geomechanical sectors and data interpolation through bounded geostatistical analyses on 3D models. All data resulting from both approaches were referenced and filed in a single spatial database and considered in global geostatistical analyses to derive a fully modelled and comprehensive evaluation of rockslide susceptibility. The described workflow yielded the following innovative results: a) a detailed census of single potential instabilities, through a spatial database recording the geometrical, geological and mechanical features, along with the expected failure modes; b) a high-resolution characterization of the rockslide susceptibility of the whole slope, based on partitioning the area according to stability and mechanical conditions that can be directly related to specific hazard mitigation systems; c) the exact extension of the area exposed to rockslide hazard, along with the dynamic parameters of the expected phenomena; d) an intervention design for hazard mitigation.

  13. Security of statistical data bases: invasion of privacy through attribute correlational modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palley, M.A.

    This study develops, defines, and applies a statistical technique for the compromise of confidential information in a statistical database. Attribute Correlational Modeling (ACM) recognizes that the information contained in a statistical database represents real-world statistical phenomena. As such, ACM assumes correlational behavior among the database attributes. ACM proceeds to compromise confidential information through the creation of a regression model, where the confidential attribute is treated as the dependent variable. The typical statistical database may preclude the direct application of regression. In this scenario, the research introduces the notion of a synthetic database, created through legitimate queries of the actual database and through proportional random variation of the responses to these queries. The synthetic database is constructed to resemble the actual database as closely as possible in a statistical sense. ACM then applies regression analysis to the synthetic database and utilizes the derived model to estimate confidential information in the actual database.
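
    A toy sketch of the attack as outlined (details such as the query granularity and the noise model are assumptions of mine, not the paper's specification): aggregate a confidential attribute through legitimate-looking queries, perturb the answers proportionally, fit a regression on the resulting synthetic table, and apply it to a target's known attributes. Assumes scikit-learn; all data are synthetic.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(42)
        n = 500
        age = rng.uniform(20, 65, n)
        tenure = age - 20 + rng.normal(0, 3, n)
        salary = 20_000 + 900 * age + 400 * tenure + rng.normal(0, 5_000, n)

        # "Legitimate queries": cell means over coarse age bands, each
        # perturbed proportionally (here within +/-5%).
        bands = np.digitize(age, [30, 40, 50])
        synthetic = []
        for b in range(4):
            m = bands == b
            noise = rng.uniform(0.95, 1.05, 3)
            synthetic.append([age[m].mean() * noise[0],
                              tenure[m].mean() * noise[1],
                              salary[m].mean() * noise[2]])
        synthetic = np.array(synthetic)

        # Regression with the confidential attribute as dependent variable,
        # then inference about an individual from public attributes alone.
        model = LinearRegression().fit(synthetic[:, :2], synthetic[:, 2])
        print("inferred salary, age 55 / tenure 30:",
              round(model.predict([[55.0, 30.0]])[0]))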

  14. Fatigue Life Prediction of Fiber-Reinforced Ceramic-Matrix Composites with Different Fiber Preforms at Room and Elevated Temperatures

    PubMed Central

    Li, Longbiao

    2016-01-01

    In this paper, the fatigue life of fiber-reinforced ceramic-matrix composites (CMCs) with different fiber preforms, i.e., unidirectional, cross-ply, 2D (two-dimensional), 2.5D and 3D CMCs, at room and elevated temperatures in air and oxidative environments has been predicted using a micromechanics approach. An effective coefficient of the fiber volume fraction along the loading direction (ECFL) was introduced to describe the fiber architecture of the preforms. The statistical matrix multicracking model and a fracture mechanics interface debonding criterion were used to determine the matrix crack spacing and interface debonded length. Under cyclic fatigue loading, the broken fiber fraction was determined by combining the interface wear model and the fiber statistical failure model at room temperature, and the interface/fiber oxidation model, interface wear model and fiber statistical failure model at elevated temperatures, based on the assumptions that fiber strength follows a two-parameter Weibull distribution and that the load carried by broken and intact fibers satisfies the Global Load Sharing (GLS) criterion. When the broken fiber fraction approaches a critical value, the composite fails by fatigue fracture. PMID:28773332
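
    For orientation, a standard form of the two ingredients named above (a sketch in common notation; the paper's exact expressions may differ) is the two-parameter Weibull failure probability of a fiber carrying stress T, together with the GLS balance between the applied stress and the loads on intact and broken fibers:

        \[ P_f(T) = 1 - \exp\!\left[-\left(T/\sigma_0\right)^{m}\right], \qquad
           \frac{\sigma}{V_f} = T\,\bigl[1 - P_f(T)\bigr] + \langle T_b \rangle\, P_f(T) \]

    where σ is the applied stress, V_f the fiber volume fraction along the loading direction, and ⟨T_b⟩ the average stress carried by broken fibers; fatigue fracture corresponds to the broken fiber fraction reaching its critical value.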

  15. A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components

    NASA Technical Reports Server (NTRS)

    Abernethy, K.

    1986-01-01

    The incorporation of a number of additional capabilities into an existing Weibull analysis computer program is discussed, along with the results of a Monte Carlo simulation study evaluating the usefulness of Weibull methods for samples with a very small number of failures and extensive censoring. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, a random censoring model was used, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs used in the SSME Weibull analysis are described. The previously documented methods were supplemented by adding computer calculations of approximate confidence intervals (using iterative methods) for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described, as are the simulation program and the techniques used in it. Simulation results are tabulated for various combinations of Weibull shape parameters and numbers of failures in the samples.
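
    A minimal sketch of the random-censoring setup described above: Weibull failure times censored by independent uniform times, with the smaller of the two observed. Shape, scale and censoring-window values are illustrative assumptions, and NumPy is assumed.

        import numpy as np

        rng = np.random.default_rng(7)
        shape, scale, c, n = 1.5, 1000.0, 1500.0, 20

        failure = scale * rng.weibull(shape, size=n)    # true failure times
        censor = rng.uniform(0.0, c, size=n)            # random censoring times

        time = np.minimum(failure, censor)              # what is observed
        observed = failure <= censor                    # True = real failure
        print(f"{observed.sum()} failures, {n - observed.sum()} censored")

    Repeating this draw many times, fitting the Weibull parameters to each censored sample and comparing the estimates (and likelihood-ratio confidence intervals) against the true values is the kind of Monte Carlo evaluation the abstract reports.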

  16. Some intriguing aspects of multiparticle production processes

    NASA Astrophysics Data System (ADS)

    Wilk, Grzegorz; Włodarczyk, Zbigniew

    2018-04-01

    Multiparticle production processes provide valuable information about the mechanism of the conversion of the initial energy of projectiles into a number of secondaries by measuring their multiplicity distributions and their distributions in phase space. They therefore serve as a reference point for more involved measurements. Distributions in phase space are usually investigated using the statistical approach, very successful in general but failing in cases of small colliding systems, small multiplicities, and at the edges of the allowed phase space, in which cases the underlying dynamical effects competing with the statistical distributions take over. We discuss an alternative approach, which applies to the whole phase space without detailed knowledge of dynamics. It is based on a modification of the usual statistics by generalizing it to a superstatistical form. We stress particularly the scaling and self-similar properties of such an approach manifesting themselves as the phenomena of the log-periodic oscillations and oscillations of temperature caused by sound waves in hadronic matter. Concerning the multiplicity distributions we discuss in detail the phenomenon of the oscillatory behavior of the modified combinants apparently observed in experimental data.
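
    For reference, the modified combinants C_j discussed here are defined, in the convention used in the authors' related work, through the recurrence linking them to the multiplicity distribution P(N):

        \[ (N+1)\,P(N+1) = \langle N \rangle \sum_{j=0}^{N} C_j\, P(N-j) \]

    so that oscillations of C_j with rank j expose structure in P(N) that its low-order moments do not.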

  17. Mechanics: Statics; A Syllabus.

    ERIC Educational Resources Information Center

    Compo, Louis

    The instructor's guide presents material for structuring an engineering fundamentals course covering the basic laws of statics as part of a mechanical technology program. Detailed behavioral objectives are described for the following five areas of course content: principles of mechanics, two-dimensional equilibrium, equilibrium of internal…

  18. An entropy-based statistic for genomewide association studies.

    PubMed

    Zhao, Jinying; Boerwinkle, Eric; Xiong, Momiao

    2005-07-01

    Efficient genotyping methods and the availability of a large collection of single-nucleotide polymorphisms provide valuable tools for genetic studies of human disease. The standard χ2 statistic for case-control studies, which uses a linear function of allele frequencies, has limited power when the number of marker loci is large. We introduce a novel test statistic for genetic association studies that uses Shannon entropy and a nonlinear function of allele frequencies to amplify the differences in allele and haplotype frequencies and to maintain statistical power with large numbers of marker loci. We investigate the relationship between the entropy-based test statistic and the standard χ2 statistic and show that, in most cases, the power of the entropy-based statistic is greater than that of the standard χ2 statistic. The distribution of the entropy-based statistic and the type I error rates are validated using simulation studies. Finally, we apply the new entropy-based test statistic to two real data sets, one for the COMT gene and schizophrenia and one for the MMP-2 gene and esophageal carcinoma, to evaluate the performance of the new method for genetic association studies. The results show that the entropy-based statistic obtained smaller P values than did the standard χ2 statistic.
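
    Schematically (the paper's statistic also involves a normalization not reproduced here), the construction replaces the linear functions of allele frequencies used by the standard χ2 test with Shannon entropies of the case and control allele or haplotype frequencies p_i:

        \[ H = -\sum_i p_i \log p_i, \qquad T_H \propto \left(H_{\mathrm{case}} - H_{\mathrm{control}}\right)^2 \]

    the nonlinearity of the entropy amplifies small frequency differences, which is the stated source of the power gain over the χ2 statistic.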

  19. A direct correlation of x-ray diffraction orientation distributions to the in-plane stiffness of semi-crystalline organic semiconducting films

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Bingxiao; Awartani, Omar; O'Connor, Brendan

    2016-05-02

    Large charge mobilities in semi-crystalline organic semiconducting films can be obtained by mechanically aligning the material phases of the film with the loading axis. A key element is to utilize the inherent stiffness of the material for optimal or desired alignment. However, experimentally determining the moduli of semi-crystalline organic thin films for different loading directions is difficult, if not impossible, due to film thickness and material anisotropy. In this paper, we address these challenges by presenting an approach that combines a composite mechanics stiffness orientation formulation with a Gaussian statistical distribution to directly estimate the in-plane stiffness (transverse isotropy) of aligned semi-crystalline polymer films, based on crystalline orientation distributions obtained experimentally by X-ray diffraction at different applied strains. Our predicted results indicate that the in-plane stiffness of an annealed film was initially isotropic and then evolved to transverse isotropy with increasing mechanical strain. This study underscores the significance of accounting for the crystalline orientation distributions of the film to obtain an accurate understanding and prediction of the elastic anisotropy of semi-crystalline polymer films.
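
    Schematically (a sketch of the orientation-averaging idea with symbols of my own choosing, not the paper's formulation), the in-plane stiffness along a loading direction φ follows from weighting the direction-dependent crystalline stiffness E_c by the XRD-derived orientation distribution f(θ), here Gaussian:

        \[ \langle E(\varphi) \rangle = \int_{-\pi/2}^{\pi/2} f(\theta)\, E_c(\varphi - \theta)\, d\theta, \qquad f(\theta) \propto \exp\!\left(-\theta^{2}/2\sigma^{2}\right) \]

    a broad f (large σ) recovers in-plane isotropy, while straining the film narrows f and produces the transverse isotropy reported above.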

  20. Agent-based model with multi-level herding for complex financial systems

    NASA Astrophysics Data System (ADS)

    Chen, Jun-Jie; Tan, Lei; Zheng, Bo

    2015-02-01

    In complex financial systems, the sector structure and volatility clustering are respectively important features of the spatial and temporal correlations. However, the microscopic generation mechanism of the sector structure is not yet understood. Especially, how to produce these two features in one model remains challenging. We introduce a novel interaction mechanism, i.e., the multi-level herding, in constructing an agent-based model to investigate the sector structure combined with volatility clustering. According to the previous market performance, agents trade in groups, and their herding behavior comprises the herding at stock, sector and market levels. Further, we propose methods to determine the key model parameters from historical market data, rather than from statistical fitting of the results. From the simulation, we obtain the sector structure and volatility clustering, as well as the eigenvalue distribution of the cross-correlation matrix, for the New York and Hong Kong stock exchanges. These properties are in agreement with the empirical ones. Our results quantitatively reveal that the multi-level herding is the microscopic generation mechanism of the sector structure, and provide new insight into the spatio-temporal interactions in financial systems at the microscopic level.
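
    A toy sketch of the multi-level herding idea (dynamics and parameters are illustrative choices of mine, far simpler than the paper's calibrated model): each agent copies last period's aggregate action at the market level, its sector's action, or acts idiosyncratically. NumPy is assumed.

        import numpy as np

        rng = np.random.default_rng(3)
        n_agents, n_sectors, n_steps = 300, 3, 1000
        sector = rng.integers(0, n_sectors, n_agents)
        k_market, k_sector = 0.2, 0.4       # herding probabilities

        last = rng.choice([-1.0, 1.0], n_agents)
        per_sector = np.zeros((n_steps, n_sectors))
        for t in range(n_steps):
            msig = np.sign(last.mean()) or 1.0
            ssig = np.array([np.sign(last[sector == s].mean()) or 1.0
                             for s in range(n_sectors)])
            act = rng.choice([-1.0, 1.0], n_agents)         # idiosyncratic
            u = rng.random(n_agents)
            act = np.where(u < k_market, msig, act)         # market herding
            m = (u >= k_market) & (u < k_market + k_sector)
            act[m] = ssig[sector[m]]                        # sector herding
            per_sector[t] = [act[sector == s].mean()
                             for s in range(n_sectors)]
            last = act

        # Sector-aggregate series co-move via the shared market signal
        # (positive off-diagonal correlations), while sector-level herding
        # amplifies each sector's own swings.
        print(np.round(np.corrcoef(per_sector.T), 2))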
