Sample records for statistical mechanical model

  1. Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization

    NASA Astrophysics Data System (ADS)

    Eroglu, Sertac

    2014-10-01

The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, is termed the statistical mechanical Menzerath-Altmann model. It allows the model parameters to be interpreted in terms of physical concepts. We also propose that many organizations exhibiting Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined with the transformed distribution model through a properly defined structure-dependent parameter and the energy-associated states.
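The empirical law itself is commonly written y(x) = a·x^b·exp(−c·x), where x is the size of a construct (e.g. clause length in words) and y the mean size of its constituents. A minimal sketch of recovering the three parameters by log-linear least squares on synthetic, noiseless data (parameter values are illustrative; this is the classical law, not the paper's transformed model):

```python
import numpy as np

# Menzerath-Altmann law: y(x) = a * x**b * exp(-c*x)
def mal(x, a, b, c):
    return a * x**b * np.exp(-c * x)

# Synthetic "construct size vs. mean constituent size" data from known parameters
x = np.arange(1, 11, dtype=float)
a_true, b_true, c_true = 5.0, -0.3, 0.05
y = mal(x, a_true, b_true, c_true)

# Log-linearize: ln y = ln a + b*ln x - c*x  ->  ordinary least squares
A = np.column_stack([np.ones_like(x), np.log(x), -x])
coef, *_ = np.linalg.lstsq(A, np.log(y), rcond=None)
a_fit, b_fit, c_fit = np.exp(coef[0]), coef[1], coef[2]
```

With noiseless data the log-linearized fit recovers the generating parameters exactly; real corpus data would call for nonlinear least squares and error estimates.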

  2. A Conditional Curie-Weiss Model for Stylized Multi-group Binary Choice with Social Interaction

    NASA Astrophysics Data System (ADS)

    Opoku, Alex Akwasi; Edusei, Kwame Owusu; Ansah, Richard Kwame

    2018-04-01

This paper proposes a conditional Curie-Weiss model as a model for decision making in a stylized society of binary decision makers facing a dichotomous choice between two options. Following Brock and Durlauf (Discrete Choice with Social Interactions I: Theory, 1995), we set up both socio-economic and statistical mechanical models for the choice problem. We point out when the socio-economic and statistical mechanical models give rise to the same self-consistent equilibrium mean choice level(s). The phase diagram of the associated statistical mechanical model and its socio-economic implications are discussed.
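In the standard (unconditional) Curie-Weiss mean-field model, the self-consistent equilibrium mean choice level solves m = tanh(β(Jm + h)). A small fixed-point sketch of that equation (the standard model, not necessarily the paper's conditional variant; parameter names are illustrative):

```python
import math

def equilibrium_choice(beta, J, h, m0=0.5, tol=1e-12, max_iter=10000):
    """Solve the Curie-Weiss self-consistency m = tanh(beta*(J*m + h))
    by fixed-point iteration starting from m0."""
    m = m0
    for _ in range(max_iter):
        m_new = math.tanh(beta * (J * m + h))
        if abs(m_new - m) < tol:
            return m_new
        m = m_new
    return m

# Weak interaction (beta*J < 1): the only equilibrium at h = 0 is m = 0.
m_weak = equilibrium_choice(beta=0.5, J=1.0, h=0.0)
# Strong interaction (beta*J > 1): a nonzero mean choice level appears.
m_strong = equilibrium_choice(beta=2.0, J=1.0, h=0.0)
```

Below βJ = 1 the only solution at h = 0 is m = 0; above it, two symmetric nonzero equilibria appear, which is the phase-transition structure the abstract refers to.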

  3. The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes

    ERIC Educational Resources Information Center

    Cartier, Stephen F.

    2011-01-01

    A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…

  4. Non-equilibrium dog-flea model

    NASA Astrophysics Data System (ADS)

    Ackerson, Bruce J.

    2017-11-01

We develop the open dog-flea model to serve as a check on proposed non-equilibrium theories of statistical mechanics. The model is developed in detail and then applied to four recent models for non-equilibrium statistical mechanics. Comparing the dog-flea solution with these different models allows their claims to be checked against a concrete example.
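The closed two-dog version of this model is the classic Ehrenfest urn: at each step a flea chosen uniformly at random jumps to the other dog. A minimal sketch of that closed version (the paper's open variant, with fleas entering and leaving, is not reproduced here):

```python
import random

def dog_flea(n_fleas=100, steps=20000, seed=1):
    """Closed Ehrenfest 'dog-flea' model: at each step one flea, chosen
    uniformly at random, jumps to the other dog."""
    random.seed(seed)
    on_dog_a = n_fleas            # start with all fleas on dog A
    history = []
    for _ in range(steps):
        if random.random() < on_dog_a / n_fleas:
            on_dog_a -= 1         # the chosen flea was on A; it jumps to B
        else:
            on_dog_a += 1         # the chosen flea was on B; it jumps to A
        history.append(on_dog_a)
    return history

hist = dog_flea()
# After relaxation the occupation fluctuates around the equipartition value N/2
late_mean = sum(hist[5000:]) / len(hist[5000:])
```

The initially one-sided distribution relaxes irreversibly toward N/2, which is what makes the model a useful benchmark for non-equilibrium theories.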

  5. Maximum entropy models of ecosystem functioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertram, Jason, E-mail: jason.bertram@anu.edu.au

    2014-12-05

Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense using a savanna plant ecology model as an example.
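In Jaynes' formulation, the distribution that maximizes Shannon entropy subject to a linear constraint is an exponential family, with the Lagrange multiplier fixed by the constraint. A generic sketch with a single mean-value constraint, solved by bisection on the multiplier (the states and target value are illustrative; this is not the paper's savanna model):

```python
import math

def maxent_distribution(values, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over `values` with a fixed mean:
    p_i proportional to exp(-lam * v_i), with lam chosen so <v> = target_mean."""
    def mean_at(lam):
        w = [math.exp(-lam * v) for v in values]
        Z = sum(w)
        return sum(v * wi for v, wi in zip(values, w)) / Z

    # <v>(lam) decreases monotonically in lam, so bisection applies
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * v) for v in values]
    Z = sum(w)
    return [wi / Z for wi in w], lam

p, lam = maxent_distribution([0.0, 1.0, 2.0, 3.0], target_mean=1.0)
```

The result is the Boltzmann-like distribution that is "statistical mechanical in the traditional sense" once the constrained quantity plays the role of an energy.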

  6. Computational Modeling of Statistical Learning: Effects of Transitional Probability versus Frequency and Links to Word Learning

    ERIC Educational Resources Information Center

    Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.

    2010-01-01

    Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…

  7. Teaching Classical Statistical Mechanics: A Simulation Approach.

    ERIC Educational Resources Information Center

    Sauer, G.

    1981-01-01

    Describes a one-dimensional model for an ideal gas to study development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches for a pseudoexperimental foundation of statistical mechanics. (Author/JN)
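One standard way (not necessarily Sauer's procedure) to realize the fixed-total-energy ensemble of a one-dimensional ideal gas by Monte Carlo is to draw Gaussian velocities and rescale them onto the constant-energy shell:

```python
import math
import random

def microcanonical_velocities(n_particles, total_energy, seed=2):
    """Sample 1D velocities of an ideal gas (unit mass) uniformly on the
    constant-energy shell sum(v_i^2)/2 = E: draw Gaussians, then rescale."""
    random.seed(seed)
    v = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    norm = math.sqrt(sum(vi * vi for vi in v))
    scale = math.sqrt(2.0 * total_energy) / norm
    return [vi * scale for vi in v]

v = microcanonical_velocities(1000, total_energy=500.0)
kinetic = sum(vi * vi for vi in v) / 2.0   # equals the fixed total energy
```

For large N, the marginal distribution of any single particle's velocity sampled this way approaches the Maxwell-Boltzmann form, which is the statistical foundation the simulation approach aims to exhibit.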

  8. Landau's statistical mechanics for quasi-particle models

    NASA Astrophysics Data System (ADS)

    Bannur, Vishnu M.

    2014-04-01

Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for pressure and develops all of thermodynamics. It is a general formalism and is consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)], in which one starts from the expression for energy density and develops thermodynamics. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, which are incorrectly called the thermodynamic consistency relation, we recover other formalisms of quasi-particle systems, such as that of M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied in quark-gluon plasma.

  9. Statistical mechanics of the Huxley-Simmons model

    NASA Astrophysics Data System (ADS)

    Caruel, M.; Truskinovsky, L.

    2016-06-01

The chemomechanical model of Huxley and Simmons (HS) [A. F. Huxley and R. M. Simmons, Nature 233, 533 (1971), 10.1038/233533a0] provides a paradigmatic description of mechanically induced collective conformational changes relevant in a variety of biological contexts, from the muscle power stroke and hair cell gating to integrin binding and hairpin unzipping. We develop a statistical mechanical perspective on the HS model by exploiting a formal analogy with a paramagnetic Ising model. We first study the equilibrium HS model with a finite number of elements and compute explicitly its mechanical and thermal properties. To model kinetics, we derive a master equation and solve it for several loading protocols. The developed formalism is applicable to a broad range of allosteric systems with mean-field interactions.

  10. Statistical Mechanical Foundation for the Two-State Transition in Protein Folding of Small Globular Proteins

    NASA Astrophysics Data System (ADS)

    Iguchi, Kazumoto

We discuss the statistical mechanical foundation for the two-state transition in the protein folding of small globular proteins. In the standard arguments of protein folding, the statistical search for the ground state is carried out over astronomically many conformations in the configuration space, which leads to the famous Levinthal paradox. To resolve the paradox, Gō first postulated that the two-state (all-or-none) transition is crucial for the protein folding of small globular proteins and used his lattice model to demonstrate its two-state nature. Recently, many experimental results that support the two-state transition for small globular proteins have accumulated. Stimulated by such experiments, Zwanzig introduced a minimal statistical mechanical model that exhibits the two-state transition. Finkelstein and coworkers have also discussed the solution of the paradox by considering the sequential folding of a small globular protein. More recently, Iguchi introduced a toy model of protein folding using the Rubik's magic snake model, in which all folded structures are exactly known and mathematically represented in terms of four types of conformations between the unit polyhedra: cis-, trans-, and left and right gauche-configurations. In this paper, we study the relationship between Gō's two-state transition, Zwanzig's statistical mechanics model, and Finkelstein's sequential folding model by applying them to the Rubik's magic snake models. We show that the foundation of Gō's two-state transition model relies on the search within the equienergy surface labeled by the contact order of the hydrophobic condensation. This idea reproduces Zwanzig's statistical model as a special case, realizes Finkelstein's sequential folding model, and helps explain the nature of the two-state transition of a small globular protein through calculation of physical quantities such as the free energy, the contact order, and the specific heat. We point out the similarity between the liquid-gas transition in statistical mechanics and the two-state transition of protein folding. We also study the morphology of the Rubik's magic snake models to give a prototype model for understanding the differences between α-helix proteins and β-sheet proteins.
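A hedged toy in the spirit of Zwanzig's minimal model: each of n units is either correct or one of ν incorrect states costing energy ε, and the fully native configuration gains an extra stability gap. The partition function is then a short sum over the number of incorrect units, and the equilibrium native fraction shows the all-or-none character directly (parameters are illustrative, not Zwanzig's):

```python
import math

def native_population(beta, n=10, nu=2, eps=1.0, gap=5.0):
    """Two-state toy model in the spirit of Zwanzig: n units, each correct
    or one of nu incorrect states of energy eps; the fully native state
    (k = 0 incorrect units) gains an extra stability `gap`.
    Returns the equilibrium native-state population."""
    Z = math.exp(beta * gap)           # native state, energy -gap
    for k in range(1, n + 1):          # k incorrect units: C(n,k)*nu**k states
        Z += math.comb(n, k) * nu**k * math.exp(-beta * k * eps)
    return math.exp(beta * gap) / Z

p_cold = native_population(beta=5.0)   # low temperature: folded
p_hot = native_population(beta=0.2)    # high temperature: unfolded
```

The population switches sharply between nearly 0 and nearly 1 as temperature crosses the transition, the hallmark of the two-state (all-or-none) behavior discussed above.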

  11. Statistical models and NMR analysis of polymer microstructure

    USDA-ARS?s Scientific Manuscript database

    Statistical models can be used in conjunction with NMR spectroscopy to study polymer microstructure and polymerization mechanisms. Thus, Bernoullian, Markovian, and enantiomorphic-site models are well known. Many additional models have been formulated over the years for additional situations. Typica...

  12. Statistical Mechanics of Prion Diseases

    NASA Astrophysics Data System (ADS)

    Slepoy, A.; Singh, R. R.; Pázmándi, F.; Kulkarni, R. V.; Cox, D. L.

    2001-07-01

We present a two-dimensional, lattice based, protein-level statistical mechanical model for prion diseases (e.g., mad cow disease) with concomitant prion protein misfolding and aggregation. Our studies lead us to the hypothesis that the observed broad incubation-time distribution in epidemiological data reflects fluctuation-dominated growth seeded by a few nanometer-scale aggregates, while the much narrower incubation-time distributions for inoculated lab animals arise from statistical self-averaging. We model "species barriers" to prion infection and assess a related treatment protocol.

  13. Strongly magnetized classical plasma models

    NASA Technical Reports Server (NTRS)

    Montgomery, D. C.

    1972-01-01

    The class of plasma processes for which the so-called Vlasov approximation is inadequate is investigated. Results from the equilibrium statistical mechanics of two-dimensional plasmas are derived. These results are independent of the presence of an external dc magnetic field. The nonequilibrium statistical mechanics of the electrostatic guiding-center plasma, a two-dimensional plasma model, is discussed. This model is then generalized to three dimensions. The guiding-center model is relaxed to include finite Larmor radius effects for a two-dimensional plasma.

  14. Metamodelling Messages Conveyed in Five Statistical Mechanical Textbooks from 1936 to 2001

    ERIC Educational Resources Information Center

    Niss, Martin

    2009-01-01

    Modelling is a significant aspect of doing physics and it is important how this activity is taught. This paper focuses on the explicit or implicit messages about modelling conveyed to the student in the treatments of phase transitions in statistical mechanics textbooks at beginning graduate level. Five textbooks from the 1930s to the present are…

  15. Statistical mechanics of protein structural transitions: Insights from the island model

    PubMed Central

    Kobayashi, Yukio

    2016-01-01

The so-called island model of protein structural transition holds that hydrophobic interactions are the key to both the folding and function of proteins. Herein, the genesis and statistical mechanical basis of the island model of transitions are reviewed by presenting the results of simulations of such transitions. Elucidating the physicochemical mechanism of protein structural formation is the foundation for understanding the hierarchical structure of life at the microscopic level. Based on the results obtained to date using the island model, remaining problems and future work in the field of protein structures are discussed, referencing Professor Saitô’s views on the hierarchic structure of science. PMID:28409078

  16. A new approach to fracture modelling in reservoirs using deterministic, genetic and statistical models of fracture growth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rawnsley, K.; Swaby, P.

    1996-08-01

It is increasingly acknowledged that in order to understand and forecast the behavior of fracture-influenced reservoirs we must attempt to reproduce the fracture system geometry and use this as a basis for fluid flow calculation. This article presents a recently developed fracture modelling prototype designed specifically for use in hydrocarbon reservoir environments. The prototype "FRAME" (FRActure Modelling Environment) aims to provide a tool that allows the generation of realistic 3D fracture systems within a reservoir model, constrained to the known geology of the reservoir by both mechanical and statistical considerations, and that can be used as a basis for fluid flow calculation. Two newly developed modelling techniques are used. The first is an interactive tool that allows complex fault surfaces and their associated deformations to be reproduced. The second is a "genetic" model which grows fracture patterns from seeds using conceptual models of fracture development. The user defines the mechanical input and can retrieve all the statistics of the growing fractures to allow comparison with assumed statistical distributions for the reservoir fractures. Input parameters include growth rate, fracture interaction characteristics, orientation maps, and density maps. More traditional statistical stochastic fracture models are also incorporated. FRAME is designed to allow the geologist to input hard or soft data, including seismically defined surfaces, well fractures, outcrop models, analogue or numerical mechanical models, or geological "feeling". The geologist is not restricted to "a priori" models of fracture patterns that may not correspond to the data.

  17. Statistical learning and probabilistic prediction in music cognition: mechanisms of stylistic enculturation.

    PubMed

    Pearce, Marcus T

    2018-05-11

Music perception depends on internal psychological models derived through exposure to a musical culture. It is hypothesized that this musical enculturation depends on two cognitive processes: (1) statistical learning, in which listeners acquire internal cognitive models of statistical regularities present in the music to which they are exposed; and (2) probabilistic prediction based on these learned models that enables listeners to organize and process their mental representations of music. To corroborate these hypotheses, I review research that uses a computational model of probabilistic prediction based on statistical learning (the information dynamics of music (IDyOM) model) to simulate data from empirical studies of human listeners. The results show that a broad range of psychological processes involved in music perception (expectation, emotion, memory, similarity, segmentation, and meter) can be understood in terms of a single, underlying process of probabilistic prediction using learned statistical models. Furthermore, IDyOM simulations of listeners from different musical cultures demonstrate that statistical learning can plausibly predict causal effects of differential cultural exposure to musical styles, providing a quantitative model of cultural distance. Understanding the neural basis of musical enculturation will benefit from close coordination between empirical neuroimaging and computational modeling of underlying mechanisms, as outlined here. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.

  18. Statistical Mechanics of Disordered Systems - Series: Cambridge Series in Statistical and Probabilistic Mathematics (No. 18)

    NASA Astrophysics Data System (ADS)

    Bovier, Anton

    2006-06-01

Our mathematical understanding of the statistical mechanics of disordered systems is going through a period of stunning progress. This self-contained book is a graduate-level introduction for mathematicians and for physicists interested in the mathematical foundations of the field, and can be used as a textbook for a two-semester course on mathematical statistical mechanics. It assumes only basic knowledge of classical physics and, on the mathematics side, a good working knowledge of graduate-level probability theory. The book starts with a concise introduction to statistical mechanics, proceeds to disordered lattice spin systems, and concludes with a presentation of the latest developments in the mathematical understanding of mean-field spin glass models. In particular, recent progress towards a rigorous understanding of the replica symmetry-breaking solutions of the Sherrington-Kirkpatrick spin glass models, due to Guerra, Aizenman-Sims-Starr and Talagrand, is reviewed in some detail. The book offers a comprehensive introduction to an active and fascinating area of research, with a clear exposition that builds to the state of the art in the mathematics of spin glasses, written by a well-known and active researcher in the field.

  19. Grid indentation analysis of mechanical properties of composite electrodes in Li-ion batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasconcelos, Luize Scalco de; Xu, Rong; Li, Jianlin

We report that electrodes in commercial rechargeable batteries are microscopically heterogeneous materials. The constituent components, including active materials, polymeric binders, and porous conductive matrix, often have large variation in their mechanical properties, making the mechanical characterization of composite electrodes a challenging task. In a model system of LiNi 0.5Mn 0.3Co 0.2O 2 cathode, we employ the instrumented grid indentation to determine the elastic modulus and hardness of the constituent phases. The approach relies on a large array of nanoindentation experiments and statistical analysis of the resulting data provided that the maximum indentation depth is carefully chosen. The statistically extracted properties of the active particles and the surrounding medium are in good agreement with the tests of targeted indentation at selected sites. Lastly, the combinatory technique of grid indentation and statistical deconvolution represents a fast and reliable route to quantify the mechanical properties of composite electrodes that feed the parametric input for the mechanics models.

  20. Grid indentation analysis of mechanical properties of composite electrodes in Li-ion batteries

    DOE PAGES

    Vasconcelos, Luize Scalco de; Xu, Rong; Li, Jianlin; ...

    2016-03-09

We report that electrodes in commercial rechargeable batteries are microscopically heterogeneous materials. The constituent components, including active materials, polymeric binders, and porous conductive matrix, often have large variation in their mechanical properties, making the mechanical characterization of composite electrodes a challenging task. In a model system of LiNi 0.5Mn 0.3Co 0.2O 2 cathode, we employ the instrumented grid indentation to determine the elastic modulus and hardness of the constituent phases. The approach relies on a large array of nanoindentation experiments and statistical analysis of the resulting data provided that the maximum indentation depth is carefully chosen. The statistically extracted properties of the active particles and the surrounding medium are in good agreement with the tests of targeted indentation at selected sites. Lastly, the combinatory technique of grid indentation and statistical deconvolution represents a fast and reliable route to quantify the mechanical properties of composite electrodes that feed the parametric input for the mechanics models.

  1. Mechanics and statistics of the worm-like chain

    NASA Astrophysics Data System (ADS)

    Marantan, Andrew; Mahadevan, L.

    2018-02-01

    The worm-like chain model is a simple continuum model for the statistical mechanics of a flexible polymer subject to an external force. We offer a tutorial introduction to it using three approaches. First, we use a mesoscopic view, treating a long polymer (in two dimensions) as though it were made of many groups of correlated links or "clinks," allowing us to calculate its average extension as a function of the external force via scaling arguments. We then provide a standard statistical mechanics approach, obtaining the average extension by two different means: the equipartition theorem and the partition function. Finally, we work in a probabilistic framework, taking advantage of the Gaussian properties of the chain in the large-force limit to improve upon the previous calculations of the average extension.
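The small- and large-force limits derived in such tutorials are commonly summarized by the Marko-Siggia interpolation formula, fP/kBT = 1/[4(1 − z/L)²] − 1/4 + z/L, which is standard in the worm-like chain literature though not part of this abstract. A short sketch evaluating it:

```python
def wlc_force(z_over_L, persistence_length, kT=1.0):
    """Marko-Siggia interpolation for the worm-like chain: force required to
    hold a chain of contour length L at relative extension z/L, given the
    persistence length P (in units where kT = 1 by default)."""
    x = z_over_L
    return (kT / persistence_length) * (0.25 / (1.0 - x) ** 2 - 0.25 + x)

# Low extension: Hookean response f ~ (3 kT / 2 P) * (z/L)
f_small = wlc_force(0.01, persistence_length=1.0)
# Near full extension the force diverges as (1 - z/L)^-2
f_large = wlc_force(0.95, persistence_length=1.0)
```

At small extension this reduces to the Hookean response f ≈ (3kBT/2P)(z/L), and the force diverges as z → L, matching the large-force Gaussian analysis mentioned in the abstract.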

  2. Statistical mechanics of simple models of protein folding and design.

    PubMed Central

    Pande, V S; Grosberg, A Y; Tanaka, T

    1997-01-01

It is now believed that the primary equilibrium aspects of simple models of protein folding are understood theoretically. However, current theories often resort to rather heavy mathematics to overcome technical difficulties inherent in the problem, or start from a phenomenological model. We therefore take a new approach in this pedagogical review of the statistical mechanics of protein folding. The benefit of our approach is a drastic mathematical simplification of the theory, without resort to any new approximations or phenomenological prescriptions; indeed, the results we obtain agree precisely with previous calculations. Because of this simplification, we are able to present here a thorough and self-contained treatment of the problem. Topics discussed include the statistical mechanics of the random energy model (REM), tests of the validity of REM as a model for heteropolymer freezing, the freezing transition of random sequences, the phase diagram of designed ("minimally frustrated") sequences, and the degree to which errors in the interactions employed in simulations of either folding or design can still lead to correct folding behavior. PMID:9414231

  3. Linking Mechanics and Statistics in Epidermal Tissues

    NASA Astrophysics Data System (ADS)

    Kim, Sangwoo; Hilgenfeldt, Sascha

    2015-03-01

    Disordered cellular structures, such as foams, polycrystals, or living tissues, can be characterized by quantitative measurements of domain size and topology. In recent work, we showed that correlations between size and topology in 2D systems are sensitive to the shape (eccentricity) of the individual domains: From a local model of neighbor relations, we derived an analytical justification for the famous empirical Lewis law, confirming the theory with experimental data from cucumber epidermal tissue. Here, we go beyond this purely geometrical model and identify mechanical properties of the tissue as the root cause for the domain eccentricity and thus the statistics of tissue structure. The simple model approach is based on the minimization of an interfacial energy functional. Simulations with Surface Evolver show that the domain statistics depend on a single mechanical parameter, while parameter fluctuations from cell to cell play an important role in simultaneously explaining the shape distribution of cells. The simulations are in excellent agreement with experiments and analytical theory, and establish a general link between the mechanical properties of a tissue and its structure. The model is relevant to diagnostic applications in a variety of animal and plant tissues.

  4. Modeling of adsorption isotherms of water vapor on Tunisian olive leaves using statistical mechanical formulation

    NASA Astrophysics Data System (ADS)

    Knani, S.; Aouaini, F.; Bahloul, N.; Khalfaoui, M.; Hachicha, M. A.; Ben Lamine, A.; Kechaou, N.

    2014-04-01

An analytical expression for modeling water adsorption isotherms of food and agricultural products is developed using the statistical mechanics formalism. The model developed in this paper is then used to fit and interpret the isotherms of four varieties of Tunisian olive leaves called “Chemlali, Chemchali, Chetoui and Zarrazi”. The parameters involved in the model, such as the number of adsorbed water molecules per site, n, the receptor-site density, NM, and the energetic parameters, a1 and a2, were determined by fitting the experimental adsorption isotherms at temperatures ranging from 303 to 323 K, and the fitted results are interpreted. The model is further applied to calculate the thermodynamic functions that govern the adsorption mechanism, such as the entropy, the Gibbs free enthalpy, and the internal energy.

  5. Physical concepts in the development of constitutive equations

    NASA Technical Reports Server (NTRS)

    Cassenti, B. N.

    1985-01-01

Proposed viscoplastic material models include in their formulation observed material response but do not generally incorporate principles from thermodynamics, statistical mechanics, and quantum mechanics. Numerous hypotheses about material response have been made on the basis of first principles, and many of these hypotheses have been tested experimentally. The proposed viscoplastic theories should therefore be checked against these hypotheses and their experimental basis. The physics of thermodynamics, statistical mechanics, and quantum mechanics, along with the effects of defects, is reviewed for its application to the development of constitutive laws.

  6. Stochastical modeling for Viral Disease: Statistical Mechanics and Network Theory

    NASA Astrophysics Data System (ADS)

    Zhou, Hao; Deem, Michael

    2007-04-01

Theoretical methods of statistical mechanics are developed and applied to study the immunological response against viral diseases such as dengue. We use this theory to show how the immune response to the four different dengue serotypes may be sculpted. It is the ability of avian influenza to change and to mix that has given rise to the fear of a new human flu pandemic. Here we propose a scale-free-network-based stochastic model to investigate mitigation strategies and analyze the risk.

  7. The development of ensemble theory. A new glimpse at the history of statistical mechanics

    NASA Astrophysics Data System (ADS)

    Inaba, Hajime

    2015-12-01

    This paper investigates the history of statistical mechanics from the viewpoint of the development of the ensemble theory from 1871 to 1902. In 1871, Ludwig Boltzmann introduced a prototype model of an ensemble that represents a polyatomic gas. In 1879, James Clerk Maxwell defined an ensemble as copies of systems of the same energy. Inspired by H.W. Watson, he called his approach "statistical". Boltzmann and Maxwell regarded the ensemble theory as a much more general approach than the kinetic theory. In the 1880s, influenced by Hermann von Helmholtz, Boltzmann made use of ensembles to establish thermodynamic relations. In Elementary Principles in Statistical Mechanics of 1902, Josiah Willard Gibbs tried to get his ensemble theory to mirror thermodynamics, including thermodynamic operations in its scope. Thermodynamics played the role of a "blind guide". His theory of ensembles can be characterized as more mathematically oriented than Einstein's theory proposed in the same year. Mechanical, empirical, and statistical approaches to foundations of statistical mechanics are presented. Although it was formulated in classical terms, the ensemble theory provided an infrastructure still valuable in quantum statistics because of its generality.

  8. A statistical mechanics model for free-for-all airplane passenger boarding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steffen, Jason H.; /Fermilab

    2008-08-01

I discuss a model for free-for-all passenger boarding which is employed by some discount air carriers. The model is based on the principles of statistical mechanics, where each seat in the aircraft has an associated energy which reflects the preferences of travelers. As each passenger enters the airplane, they select their seat using Boltzmann statistics, proceed to that location, load their luggage, and sit down, and the partition function seen by the remaining passengers is modified to reflect this fact. I discuss the various model parameters and make qualitative comparisons of this passenger boarding model with those that involve assigned seats. The model can be used to predict the probability that certain seats will be occupied at different times during the boarding process. These results might provide a useful description of this boarding method. The model is a relatively unusual application of undergraduate-level physics and describes a situation familiar to many students and faculty.
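A sketch of the Boltzmann seat-selection step described above, assuming for illustration that a seat's energy simply grows with its row number, so front rows are preferred (the paper's energies encode richer traveler preferences):

```python
import math
import random

def board(n_rows=20, seats_per_row=6, beta=1.0, seed=3):
    """Free-for-all boarding sketch: each entering passenger picks an
    unoccupied seat with probability proportional to exp(-beta * E(seat)).
    Here E(seat) is simply the row number; removing a seat from `free`
    plays the role of updating the partition function for later passengers."""
    random.seed(seed)
    free = [(r, s) for r in range(n_rows) for s in range(seats_per_row)]
    order = []                     # seats in the order they get occupied
    while free:
        weights = [math.exp(-beta * r) for r, _ in free]
        pick = random.choices(range(len(free)), weights=weights)[0]
        order.append(free.pop(pick))
    return order

order = board()
```

Counting how often each seat appears early in `order` over many random seeds would estimate the occupation probabilities the abstract refers to.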

  9. Effect of Correlated Rotational Noise

    NASA Astrophysics Data System (ADS)

    Hancock, Benjamin; Wagner, Caleb; Baskaran, Aparna

The traditional model of a self-propelled particle (SPP) is one in which the body axis along which the particle travels reorients itself through rotational diffusion. If the reorientation process is instead driven by colored noise, rather than the standard Gaussian white noise, the resulting statistical mechanics cannot be accessed through conventional methods. In this talk we present results comparing three methods of deriving the statistical mechanics of an SPP with a reorientation process driven by colored noise. We illustrate the differences and similarities in the resulting statistical mechanics by their ability to accurately capture the particle's response to external aligning fields.
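A minimal Euler-Maruyama sketch of such a particle, with the orientation driven by an Ornstein-Uhlenbeck (exponentially correlated) angular noise; all parameter names and values are illustrative:

```python
import math
import random

def simulate_spp(steps=5000, dt=0.01, v0=1.0, tau=1.0, D=1.0, seed=4):
    """Self-propelled particle moving at fixed speed v0 along its body axis
    theta, which is driven by Ornstein-Uhlenbeck (colored) noise eta with
    correlation time tau; integrated with the Euler-Maruyama scheme."""
    random.seed(seed)
    x = y = theta = eta = 0.0
    traj = []
    for _ in range(steps):
        # OU noise: d(eta) = -(eta/tau) dt + (sqrt(2*D)/tau) dW
        eta += (-eta / tau) * dt \
             + (math.sqrt(2.0 * D) / tau) * random.gauss(0.0, math.sqrt(dt))
        theta += eta * dt                  # orientation driven by colored noise
        x += v0 * math.cos(theta) * dt     # propulsion along the body axis
        y += v0 * math.sin(theta) * dt
        traj.append((x, y))
    return traj

traj = simulate_spp()
```

Sending tau to zero while keeping D fixed recovers the white-noise (rotational diffusion) limit, which is the baseline the three analytical methods are compared against.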

  10. Three-Dimensional Color Code Thresholds via Statistical-Mechanical Mapping

    NASA Astrophysics Data System (ADS)

    Kubica, Aleksander; Beverland, Michael E.; Brandão, Fernando; Preskill, John; Svore, Krysta M.

    2018-05-01

Three-dimensional (3D) color codes have advantages for fault-tolerant quantum computing, such as protected quantum gates with relatively low overhead and robustness against imperfect measurement of error syndromes. Here we investigate the storage threshold error rates for bit-flip and phase-flip noise in the 3D color code (3DCC) on the body-centered cubic lattice, assuming perfect syndrome measurements. In particular, by exploiting a connection between error correction and statistical mechanics, we estimate the thresholds for 1D stringlike and 2D sheetlike logical operators to be p3DCC(1) ≃ 1.9% and p3DCC(2) ≃ 27.6%. We obtain these results by using parallel tempering Monte Carlo simulations to study the disorder-temperature phase diagrams of two new 3D statistical-mechanical models: the four- and six-body random coupling Ising models.

  11. A Role for Chunk Formation in Statistical Learning of Second Language Syntax

    ERIC Educational Resources Information Center

    Hamrick, Phillip

    2014-01-01

    Humans are remarkably sensitive to the statistical structure of language. However, different mechanisms have been proposed to account for such statistical sensitivities. The present study compared adult learning of syntax and the ability of two models of statistical learning to simulate human performance: Simple Recurrent Networks, which learn by…

  12. Unperturbed Schelling Segregation in Two or Three Dimensions

    NASA Astrophysics Data System (ADS)

    Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew

    2016-09-01

    Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969), are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation, but they form part of a family of models which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis; prior results generally focus on variants in which noise is introduced into the dynamics, making the resulting system amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young, Individual Strategy and Social Structure: An Evolutionary Theory of Institutions, Princeton University Press, Princeton, 1998). A series of recent papers (Brandt et al., in: Proceedings of the 44th Annual ACM Symposium on Theory of Computing (STOC 2012), 2012; Barmpalias et al., in: 55th Annual IEEE Symposium on Foundations of Computer Science, Philadelphia, 2014; J Stat Phys 158:806-852, 2015) has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram and answering a challenge of Brandt et al. (STOC 2012).
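As a rough illustration of the kind of unperturbed (noise-free) dynamics analysed here, and not the authors' exact model, the sketch below moves one unhappy agent at a time to an empty cell where its like-neighbour fraction meets a tolerance threshold `tau`. The grid size, tolerance, and the convention that isolated agents count as happy are assumptions:

```python
import random

def schelling_step(grid, L, tau, rng):
    """One unperturbed update: move a random unhappy agent to a random
    empty cell where it would be happy, if such a cell exists."""
    def like_fraction(x, y, kind):
        nbrs = [((x + dx) % L, (y + dy) % L)
                for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]
        occupied = [grid[n] for n in nbrs if grid[n] is not None]
        # Convention (assumed): an agent with no neighbours counts as happy.
        return 1.0 if not occupied else sum(1 for k in occupied if k == kind) / len(occupied)

    unhappy = [s for s, k in grid.items() if k is not None and like_fraction(*s, k) < tau]
    if not unhappy:
        return False  # configuration is stable
    s = rng.choice(unhappy)
    kind = grid[s]
    targets = [t for t, k in grid.items() if k is None and like_fraction(*t, kind) >= tau]
    if targets:
        grid[s], grid[rng.choice(targets)] = None, kind
    return True

rng = random.Random(3)
L, tau = 10, 0.5
# Two agent types (0 and 1) plus vacancies (None), placed uniformly at random.
grid = {(x, y): rng.choice((0, 1, None)) for x in range(L) for y in range(L)}
for _ in range(500):
    if not schelling_step(grid, L, tau, rng):
        break
```

The rigorous results discussed in the abstract concern precisely such deterministic-rule dynamics, where no annealing noise is available to smooth the analysis.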

  13. Statistical Analysis of Crystallization Database Links Protein Physico-Chemical Features with Crystallization Mechanisms

    PubMed Central

    Fusco, Diana; Barnum, Timothy J.; Bruno, Andrew E.; Luft, Joseph R.; Snell, Edward H.; Mukherjee, Sayan; Charbonneau, Patrick

    2014-01-01

    X-ray crystallography is the predominant method for obtaining atomic-scale information about biological macromolecules. Despite the success of the technique, obtaining well diffracting crystals still critically limits going from protein to structure. In practice, the crystallization process proceeds through knowledge-informed empiricism. Better physico-chemical understanding remains elusive because of the large number of variables involved, hence little guidance is available to systematically identify solution conditions that promote crystallization. To help determine relationships between macromolecular properties and their crystallization propensity, we have trained statistical models on samples for 182 proteins supplied by the Northeast Structural Genomics consortium. Gaussian processes, which capture trends beyond the reach of linear statistical models, distinguish between two main physico-chemical mechanisms driving crystallization. One is characterized by low levels of side chain entropy and has been extensively reported in the literature. The other identifies specific electrostatic interactions not previously described in the crystallization context. Because evidence for two distinct mechanisms can be gleaned both from crystal contacts and from solution conditions leading to successful crystallization, the model offers future avenues for optimizing crystallization screens based on partial structural information. The availability of crystallization data coupled with structural outcomes analyzed through state-of-the-art statistical models may thus guide macromolecular crystallization toward a more rational basis. PMID:24988076

  14. Statistical analysis of crystallization database links protein physico-chemical features with crystallization mechanisms.

    PubMed

    Fusco, Diana; Barnum, Timothy J; Bruno, Andrew E; Luft, Joseph R; Snell, Edward H; Mukherjee, Sayan; Charbonneau, Patrick

    2014-01-01

    X-ray crystallography is the predominant method for obtaining atomic-scale information about biological macromolecules. Despite the success of the technique, obtaining well diffracting crystals still critically limits going from protein to structure. In practice, the crystallization process proceeds through knowledge-informed empiricism. Better physico-chemical understanding remains elusive because of the large number of variables involved, hence little guidance is available to systematically identify solution conditions that promote crystallization. To help determine relationships between macromolecular properties and their crystallization propensity, we have trained statistical models on samples for 182 proteins supplied by the Northeast Structural Genomics consortium. Gaussian processes, which capture trends beyond the reach of linear statistical models, distinguish between two main physico-chemical mechanisms driving crystallization. One is characterized by low levels of side chain entropy and has been extensively reported in the literature. The other identifies specific electrostatic interactions not previously described in the crystallization context. Because evidence for two distinct mechanisms can be gleaned both from crystal contacts and from solution conditions leading to successful crystallization, the model offers future avenues for optimizing crystallization screens based on partial structural information. The availability of crystallization data coupled with structural outcomes analyzed through state-of-the-art statistical models may thus guide macromolecular crystallization toward a more rational basis.

  15. Non-equilibrium statistical mechanics theory for the large scales of geophysical flows

    NASA Astrophysics Data System (ADS)

    Eric, S.; Bouchet, F.

    2010-12-01

    The aim of any theory of turbulence is to understand the statistical properties of the velocity field. As a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject, from the statistical mechanics basis of the theory up to applications to Jupiter's troposphere and ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations will be discussed. We also present recent results for non-equilibrium situations, for which forces and dissipation are in a statistical balance. As an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed. References: F. Bouchet and E. Simonnet, Random changes of flow topology in two-dimensional and geophysical turbulence, Physical Review Letters 102 (2009), no. 9, 094504. F. Bouchet and J. Sommeria, Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures, Journal of Fluid Mechanics 464 (2002), 165-207. A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO. F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, submitted to Physics Reports. Figure: non-equilibrium phase transitions for the 2D Navier-Stokes equations with stochastic forces (time series and probability density functions (PDFs) of the modulus of the largest-scale Fourier component, showing bistability between dipole and unidirectional flows); this bistability is predicted by statistical mechanics.

  16. Fundamentals of poly(lactic acid) microstructure, crystallization behavior, and properties

    NASA Astrophysics Data System (ADS)

    Kang, Shuhui

    Poly(lactic acid) is an environmentally-benign biodegradable and sustainable thermoplastic material, which has found broad applications as food packaging films and as non-woven fibers. The crystallization and deformation mechanisms of the polymer are largely determined by the distribution of conformation and configuration. Knowledge of these mechanisms is needed to understand the mechanical and thermal properties on which processing conditions mainly depend. In conjunction with laser light scattering, Raman spectroscopy and normal coordinate analysis are used in this thesis to elucidate these properties. Vibrational spectroscopic theory, Flory's rotational isomeric state (RIS) theory, Gaussian chain statistics and statistical mechanics are used to relate experimental data to molecular chain structure. A refined RIS model is proposed, chain rigidity recalculated and chain statistics discussed. A Raman spectroscopic characterization method for crystalline and amorphous phase orientation has been developed. A shrinkage model is also proposed to interpret the dimensional stability for fibers and uni- or biaxially stretched films. A study of stereocomplexation formed by poly(l-lactic acid) and poly(d-lactic acid) is also presented.

  17. Unified risk analysis of fatigue failure in ductile alloy components during all three stages of fatigue crack evolution process.

    PubMed

    Patankar, Ravindra

    2003-10-01

    Statistical fatigue life of a ductile alloy specimen is traditionally divided into three stages, namely, crack nucleation, small crack growth, and large crack growth. Crack nucleation and small crack growth show a wide variation and hence a big spread on the cycles-versus-crack-length graph; large crack growth shows relatively less variation. Therefore, different models are fitted to the different stages of the fatigue evolution process, thus treating the stages as different phenomena. With these independent models, it is impossible to predict one phenomenon based on information available about another. Experimentally, it is easier to carry out crack length measurements of large cracks than of nucleating or small cracks. Thus, it is easier to collect statistical data for large crack growth than to undertake the painstaking effort it would take to collect statistical data for crack nucleation and small crack growth. This article presents a fracture mechanics-based stochastic model of fatigue crack growth in ductile alloys that are commonly encountered in mechanical structures and machine components. The model has been validated by Ray (1998) against various statistical fatigue crack propagation data. Based on the model, this article proposes a technique that uses the statistical properties of large crack growth under constant-amplitude stress excitation, which can be obtained via experiments, to predict the statistical properties of fatigue crack nucleation and small crack growth.

  18. Modeling Cross-Situational Word–Referent Learning: Prior Questions

    PubMed Central

    Yu, Chen; Smith, Linda B.

    2013-01-01

    Both adults and young children possess powerful statistical computation capabilities—they can infer the referent of a word from highly ambiguous contexts involving many words and many referents by aggregating cross-situational statistical information across contexts. This ability has been explained by models of hypothesis testing and by models of associative learning. This article describes a series of simulation studies and analyses designed to understand the different learning mechanisms posited by the 2 classes of models and their relation to each other. Variants of a hypothesis-testing model and a simple or dumb associative mechanism were examined under different specifications of information selection, computation, and decision. Critically, these 3 components of the models interact in complex ways. The models illustrate a fundamental tradeoff between amount of data input and powerful computations: With the selection of more information, dumb associative models can mimic the powerful learning that is accomplished by hypothesis-testing models with fewer data. However, because of the interactions among the component parts of the models, the associative model can mimic various hypothesis-testing models, producing the same learning patterns but through different internal components. The simulations argue for the importance of a compositional approach to human statistical learning: the experimental decomposition of the processes that contribute to statistical learning in human learners and models with the internal components that can be evaluated independently and together. PMID:22229490

  19. Correlations Between Bone Mechanical Properties and Bone Composition Parameters in Mouse Models of Dominant and Recessive Osteogenesis Imperfecta and the Response to Anti-TGF-β Treatment.

    PubMed

    Bi, Xiaohong; Grafe, Ingo; Ding, Hao; Flores, Rene; Munivez, Elda; Jiang, Ming Ming; Dawson, Brian; Lee, Brendan; Ambrose, Catherine G

    2017-02-01

    Osteogenesis imperfecta (OI) is a group of genetic disorders characterized by brittle bones that are prone to fracture. Although previous studies in animal models investigated the mechanical properties and material composition of OI bone, little work has been conducted to statistically correlate these parameters to identify key compositional contributors to the impaired bone mechanical behaviors in OI. Further, although increased TGF-β signaling has been demonstrated as a contributing mechanism to the bone pathology in OI models, the relationship between mechanical properties and bone composition after anti-TGF-β treatment in OI has not been studied. Here, we performed follow-up analyses of femurs collected in an earlier study from OI mice with and without anti-TGF-β treatment from both recessive (Crtap-/-) and dominant (Col1a2+/P.G610C) OI mouse models and WT mice. Mechanical properties were determined using three-point bending tests and evaluated for statistical correlation with molecular composition in bone tissue assessed by Raman spectroscopy. Statistical regression analysis was conducted to determine significant compositional determinants of mechanical integrity. Interestingly, we found differences in the relationships between bone composition and mechanical properties and in the response to anti-TGF-β treatment. Femurs of both OI models exhibited increased brittleness, which was associated with reduced collagen content and carbonate substitution. In the Col1a2+/P.G610C femurs, reduced hydroxyapatite crystallinity was also found to be associated with increased brittleness, and increased mineral-to-collagen ratio was correlated with increased ultimate strength, elastic modulus, and bone brittleness. In both models of OI, regression analysis demonstrated that collagen content was an important predictor of the increased brittleness.
In summary, this work provides new insights into the relationships between bone composition and material properties in models of OI, identifies key bone compositional parameters that correlate with the impaired mechanical integrity of OI bone, and explores the effects of anti-TGF-β treatment on bone-quality parameters in these models. © 2016 American Society for Bone and Mineral Research.

  20. A statistical mechanics approach to autopoietic immune networks

    NASA Astrophysics Data System (ADS)

    Barra, Adriano; Agliari, Elena

    2010-07-01

    In this work we aim to bridge theoretical immunology and disordered statistical mechanics. We introduce a model for the behavior of B-cells which naturally merges the clonal selection theory and the autopoietic network theory as a whole. From the analysis of its features we recover several basic phenomena such as low-dose tolerance, dynamical memory of antigens and self/non-self discrimination.

  1. Ballistic and diffusive dynamics in a two-dimensional ideal gas of macroscopic chaotic Faraday waves.

    PubMed

    Welch, Kyle J; Hastings-Hauss, Isaac; Parthasarathy, Raghuveer; Corwin, Eric I

    2014-04-01

    We have constructed a macroscopic driven system of chaotic Faraday waves whose statistical mechanics, we find, are surprisingly simple, mimicking those of a thermal gas. We use real-time tracking of a single floating probe, energy equipartition, and the Stokes-Einstein relation to define and measure a pseudotemperature and diffusion constant and then self-consistently determine a coefficient of viscous friction for a test particle in this pseudothermal gas. Because of its simplicity, this system can serve as a model for direct experimental investigation of nonequilibrium statistical mechanics, much as the ideal gas epitomizes equilibrium statistical mechanics.
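A minimal sketch of the analysis pipeline this abstract describes, assuming a recorded 2D probe trajectory: a pseudotemperature from equipartition ((1/2) m⟨v²⟩ = k_B T in two dimensions), a diffusion constant from the mean squared displacement (MSD = 4Dt), and a friction coefficient from the Einstein relation (γ = k_B T / D). The synthetic random-walk trajectory and its parameters are stand-ins for real tracking data:

```python
import random

kB = 1.380649e-23  # Boltzmann constant, J/K

def analyze(xs, ys, dt, mass):
    n = len(xs)
    # Pseudotemperature from equipartition in 2D: (1/2) m <v^2> = kB * T
    v2 = [((xs[i + 1] - xs[i]) / dt) ** 2 + ((ys[i + 1] - ys[i]) / dt) ** 2
          for i in range(n - 1)]
    T = mass * (sum(v2) / len(v2)) / (2 * kB)
    # Diffusion constant from the mean squared displacement: MSD(t) = 4 D t in 2D
    lag = n // 4
    msd = sum((xs[i + lag] - xs[i]) ** 2 + (ys[i + lag] - ys[i]) ** 2
              for i in range(n - lag)) / (n - lag)
    D = msd / (4 * lag * dt)
    # Self-consistent friction from the Einstein relation: D = kB * T / gamma
    gamma = kB * T / D
    return T, D, gamma

# Synthetic random-walk "trajectory" standing in for tracked probe data.
rng = random.Random(42)
dt, mass, step = 0.01, 1e-6, 1e-3
xs, ys = [0.0], [0.0]
for _ in range(4000):
    xs.append(xs[-1] + rng.gauss(0.0, step))
    ys.append(ys[-1] + rng.gauss(0.0, step))
T, D, gamma = analyze(xs, ys, dt, mass)
```

Note that for pure random-walk data the "velocity" depends on the sampling interval, so T here is a pseudotemperature tied to dt, which is consistent with the spirit of the pseudothermal description in the abstract.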

  2. Three-Dimensional Color Code Thresholds via Statistical-Mechanical Mapping.

    PubMed

    Kubica, Aleksander; Beverland, Michael E; Brandão, Fernando; Preskill, John; Svore, Krysta M

    2018-05-04

    Three-dimensional (3D) color codes have advantages for fault-tolerant quantum computing, such as protected quantum gates with relatively low overhead and robustness against imperfect measurement of error syndromes. Here we investigate the storage threshold error rates for bit-flip and phase-flip noise in the 3D color code (3DCC) on the body-centered cubic lattice, assuming perfect syndrome measurements. In particular, by exploiting a connection between error correction and statistical mechanics, we estimate the threshold for 1D stringlike and 2D sheetlike logical operators to be p_{3DCC}^{(1)}≃1.9% and p_{3DCC}^{(2)}≃27.6%. We obtain these results by using parallel tempering Monte Carlo simulations to study the disorder-temperature phase diagrams of two new 3D statistical-mechanical models: the four- and six-body random coupling Ising models.

  3. Cation Selectivity in Biological Cation Channels Using Experimental Structural Information and Statistical Mechanical Simulation.

    PubMed

    Finnerty, Justin John; Peyser, Alexander; Carloni, Paolo

    2015-01-01

    Cation selective channels constitute the gate for ion currents through the cell membrane. Here we present an improved statistical mechanical model, based on atomistic structural information and cation hydration state and containing no tuned parameters, that reproduces the selectivity of biological Na+ and Ca2+ ion channels. The importance of the inclusion of step-wise cation hydration in these results confirms the essential role partial dehydration plays in the bacterial Na+ channels. The model, proven reliable against experimental data, could be straightforwardly used for designing Na+ and Ca2+ selective nanopores.

  4. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  5. Statistical-mechanics theory of active mode locking with noise.

    PubMed

    Gordon, Ariel; Fischer, Baruch

    2004-05-01

    Actively mode-locked lasers with noise are studied employing statistical mechanics. A mapping of the system to the spherical model (related to the Ising model) of ferromagnets in one dimension that has an exact solution is established. It gives basic features, such as analytical expressions for the correlation function between modes, and the widths and shapes of the pulses [different from the Kuizenga-Siegman expression; IEEE J. Quantum Electron. QE-6, 803 (1970)] and reveals the susceptibility to noise of mode ordering compared with passive mode locking.

  6. A first principles calculation and statistical mechanics modeling of defects in Al-H system

    NASA Astrophysics Data System (ADS)

    Ji, Min; Wang, Cai-Zhuang; Ho, Kai-Ming

    2007-03-01

    The behavior of defects and hydrogen in Al was investigated by first principles calculations and statistical mechanics modeling. The formation energies of different defects in the Al+H system, such as the Al vacancy, interstitial H, and multiple H atoms in an Al vacancy, were calculated by first principles methods. The defect concentration in thermodynamic equilibrium was studied by total free energy calculations including configurational entropy and defect-defect interactions, from the low-concentration limit to the hydride limit. In our grand canonical ensemble model, the hydrogen chemical potential under different environments plays an important role in determining the defect concentrations and properties of the Al-H system.
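In the dilute limit, the grand canonical equilibrium concentration of a defect binding n_H hydrogen atoms follows from its formation energy and the hydrogen chemical potential. A sketch with hypothetical formation energies (in the abstract's workflow, the real inputs come from the first-principles step):

```python
import math

kB = 8.617333262e-5  # Boltzmann constant in eV/K

def defect_concentration(E_form, n_H, mu_H, T):
    """Dilute-limit equilibrium concentration per lattice site of a defect
    binding n_H hydrogen atoms: c = exp(-(E_form - n_H * mu_H) / (kB * T))."""
    return math.exp(-(E_form - n_H * mu_H) / (kB * T))

# Hypothetical formation energies in eV: (formation energy, H atoms in defect).
defects = {
    "Al vacancy": (0.67, 0),
    "interstitial H": (0.70, 1),
    "vacancy + 2H": (0.40, 2),
}
T, mu_H = 600.0, -0.15  # assumed temperature (K) and H chemical potential (eV)
conc = {name: defect_concentration(E, n, mu_H, T) for name, (E, n) in defects.items()}
```

Raising mu_H (a more H-rich environment) lowers the effective formation energy of H-containing defects, shifting the equilibrium toward hydrogen-decorated vacancies, which is the role the abstract attributes to the chemical potential.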

  7. Evolution of cosmic string networks

    NASA Technical Reports Server (NTRS)

    Albrecht, Andreas; Turok, Neil

    1989-01-01

    A discussion of the evolution and observable consequences of a network of cosmic strings is given. A simple model for the evolution of the string network is presented, and related to the statistical mechanics of string networks. The model predicts the long string density throughout the history of the universe from a single parameter, which researchers calculate in radiation era simulations. The statistical mechanics arguments indicate a particular thermal form for the spectrum of loops chopped off the network. Detailed numerical simulations of string networks in expanding backgrounds are performed to test the model. Consequences for large scale structure, the microwave and gravity wave backgrounds, nucleosynthesis and gravitational lensing are calculated.

  8. Statistical shear lag model - unraveling the size effect in hierarchical composites.

    PubMed

    Wei, Xiaoding; Filleter, Tobin; Espinosa, Horacio D

    2015-05-01

    Numerous experimental and computational studies have established that the hierarchical structures encountered in natural materials, such as the brick-and-mortar structure observed in sea shells, are essential for achieving defect tolerance. Due to this hierarchy, the mechanical properties of natural materials have a different size dependence compared to that of typical engineered materials. This study aimed to explore size effects on the strength of bio-inspired staggered hierarchical composites and to define the influence of the geometry of constituents in their outstanding defect tolerance capability. A statistical shear lag model is derived by extending the classical shear lag model to account for the statistics of the constituents' strength. A general solution emerges from rigorous mathematical derivations, unifying the various empirical formulations for the fundamental link length used in previous statistical models. The model shows that the staggered arrangement of constituents grants composites a unique size effect on mechanical strength in contrast to homogenous continuous materials. The model is applied to hierarchical yarns consisting of double-walled carbon nanotube bundles to assess its predictive capabilities for novel synthetic materials. Interestingly, the model predicts that yarn gauge length does not significantly influence the yarn strength, in close agreement with experimental observations. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
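A common way to encode the "statistics of the constituents' strength" in such models is a Weibull distribution combined with a weakest-link assumption, under which mean strength falls roughly as n^(-1/m) with the number of series elements. The sketch below illustrates that size effect with hypothetical parameters; it is not the authors' full statistical shear lag solution:

```python
import math
import random

def weibull_strength(sigma0, m, rng):
    """Sample one element strength from a Weibull distribution
    with scale sigma0 and shape (Weibull modulus) m."""
    return sigma0 * (-math.log(1.0 - rng.random())) ** (1.0 / m)

def chain_strength(n_links, sigma0, m, rng):
    """Weakest-link assumption: a series chain fails at its weakest element."""
    return min(weibull_strength(sigma0, m, rng) for _ in range(n_links))

rng = random.Random(7)
means = {}
for n in (1, 10, 100):
    means[n] = sum(chain_strength(n, 1.0, 5.0, rng) for _ in range(2000)) / 2000
# Mean strength decreases roughly as n**(-1/m) with chain length n,
# the size effect that staggered (brick-and-mortar) architectures mitigate.
```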

  9. Towards the feasibility of using ultrasound to determine mechanical properties of tissues in a bioreactor.

    PubMed

    Mansour, Joseph M; Gu, Di-Win Marine; Chung, Chen-Yuan; Heebner, Joseph; Althans, Jake; Abdalian, Sarah; Schluchter, Mark D; Liu, Yiying; Welter, Jean F

    2014-10-01

    Our ultimate goal is to non-destructively evaluate mechanical properties of tissue-engineered (TE) cartilage using ultrasound (US). We used agarose gels as surrogates for TE cartilage. Previously, we showed that mechanical properties measured using conventional methods were related to those measured using US, which suggested a way to non-destructively predict mechanical properties of samples with known volume fractions. In this study, we sought to determine whether the mechanical properties of samples with unknown volume fractions could be predicted by US. Aggregate moduli were calculated for hydrogels as a function of speed of sound (SOS), based on concentration and density using a poroelastic model. The data were used to train a statistical model, which we then used to predict volume fractions and mechanical properties of unknown samples. Young's and storage moduli were measured mechanically. The statistical model generally predicted the Young's moduli in compression to within <10% of their mechanically measured values. We defined positive linear correlations between the aggregate modulus predicted from US and both the storage and Young's moduli determined from mechanical tests. Mechanical properties of hydrogels with unknown volume fractions can be predicted successfully from US measurements. This method has the potential to predict mechanical properties of TE cartilage non-destructively in a bioreactor.

  10. Towards the feasibility of using ultrasound to determine mechanical properties of tissues in a bioreactor

    PubMed Central

    Mansour, Joseph M.; Gu, Di-Win Marine; Chung, Chen-Yuan; Heebner, Joseph; Althans, Jake; Abdalian, Sarah; Schluchter, Mark D.; Liu, Yiying; Welter, Jean F.

    2016-01-01

    Introduction Our ultimate goal is to non-destructively evaluate mechanical properties of tissue-engineered (TE) cartilage using ultrasound (US). We used agarose gels as surrogates for TE cartilage. Previously, we showed that mechanical properties measured using conventional methods were related to those measured using US, which suggested a way to non-destructively predict mechanical properties of samples with known volume fractions. In this study, we sought to determine whether the mechanical properties of samples with unknown volume fractions could be predicted by US. Methods Aggregate moduli were calculated for hydrogels as a function of speed of sound (SOS), based on concentration and density using a poroelastic model. The data were used to train a statistical model, which we then used to predict volume fractions and mechanical properties of unknown samples. Young's and storage moduli were measured mechanically. Results The statistical model generally predicted the Young's moduli in compression to within <10% of their mechanically measured values. We defined positive linear correlations between the aggregate modulus predicted from US and both the storage and Young's moduli determined from mechanical tests. Conclusions Mechanical properties of hydrogels with unknown volume fractions can be predicted successfully from US measurements. This method has the potential to predict mechanical properties of TE cartilage non-destructively in a bioreactor. PMID:25092421

  11. Six new mechanics corresponding to further shape theories

    NASA Astrophysics Data System (ADS)

    Anderson, Edward

    2016-02-01

    In this paper, a suite of relational notions of shape is presented at the level of configuration space geometry, with corresponding new theories of shape mechanics and shape statistics. These further generalize two quite well known examples: (i) Kendall's (metric) shape space with his shape statistics and Barbour's mechanics thereupon. (ii) Leibnizian relational space alias metric scale-and-shape space, to which corresponds Barbour-Bertotti mechanics. This paper's new theories include, using the invariant and group namings, (iii) angle alias conformal shape mechanics, (iv) area ratio alias e shape mechanics, and (v) area alias e scale-and-shape mechanics; (iii)-(v) rest respectively on angle space, area-ratio space, and area space configuration spaces. Probability and statistics applications are also pointed to in outline. (vi) Various supersymmetric counterparts of (i)-(v) are considered. Since supergravity differs considerably from GR-based conceptions of background independence, some of the new supersymmetric shape mechanics are compared with both. These reveal compatibility between supersymmetry and GR-based conceptions of background independence, at least within these simpler model arenas.

  12. Statistical Mechanics of Coherent Ising Machine — The Case of Ferromagnetic and Finite-Loading Hopfield Models —

    NASA Astrophysics Data System (ADS)

    Aonishi, Toru; Mimura, Kazushi; Utsunomiya, Shoko; Okada, Masato; Yamamoto, Yoshihisa

    2017-10-01

    The coherent Ising machine (CIM) has attracted attention as one of the most effective Ising computing architectures for solving large scale optimization problems because of its scalability and high-speed computational ability. However, it is difficult to implement the Ising computation in the CIM because the theories and techniques of classical thermodynamic equilibrium Ising spin systems cannot be directly applied to the CIM. This means we have to adapt these theories and techniques to the CIM. Here we focus on a ferromagnetic model and a finite loading Hopfield model, which are canonical models sharing a common mathematical structure with almost all other Ising models. We derive macroscopic equations to capture nonequilibrium phase transitions in these models. The statistical mechanical methods developed here constitute a basis for constructing evaluation methods for other Ising computation models.

  13. Improving Our Ability to Evaluate Underlying Mechanisms of Behavioral Onset and Other Event Occurrence Outcomes: A Discrete-Time Survival Mediation Model

    PubMed Central

    Fairchild, Amanda J.; Abara, Winston E.; Gottschall, Amanda C.; Tein, Jenn-Yun; Prinz, Ronald J.

    2015-01-01

    The purpose of this article is to introduce and describe a statistical model that researchers can use to evaluate underlying mechanisms of behavioral onset and other event occurrence outcomes. Specifically, the article develops a framework for estimating mediation effects with outcomes measured in discrete-time epochs by integrating the statistical mediation model with discrete-time survival analysis. The methodology has the potential to help strengthen health research by targeting prevention and intervention work more effectively as well as by improving our understanding of discretized periods of risk. The model is applied to an existing longitudinal data set to demonstrate its use, and programming code is provided to facilitate its implementation. PMID:24296470

  14. Impact resistance of fiber composites - Energy-absorbing mechanisms and environmental effects

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.

    1985-01-01

    Energy absorbing mechanisms were identified by several approaches. The energy absorbing mechanisms considered are those in unidirectional composite beams subjected to impact. The approaches used include: mechanic models, statistical models, transient finite element analysis, and simple beam theory. Predicted results are correlated with experimental data from Charpy impact tests. The environmental effects on impact resistance are evaluated. Working definitions for energy absorbing and energy releasing mechanisms are proposed and a dynamic fracture progression is outlined. Possible generalizations to angle-plied laminates are described.

  15. Impact resistance of fiber composites: Energy absorbing mechanisms and environmental effects

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.

    1983-01-01

    Energy-absorbing mechanisms in unidirectional composite beams subjected to impact were identified by several approaches, including mechanics models, statistical models, transient finite element analysis, and simple beam theory. Predicted results are correlated with experimental data from Charpy impact tests, and the environmental effects on impact resistance are evaluated. Working definitions for energy-absorbing and energy-releasing mechanisms are proposed, a dynamic fracture progression is outlined, and possible generalizations to angle-plied laminates are described.

  16. Evaluation of high-resolution sea ice models on the basis of statistical and scaling properties of Arctic sea ice drift and deformation

    NASA Astrophysics Data System (ADS)

    Girard, L.; Weiss, J.; Molines, J. M.; Barnier, B.; Bouillon, S.

    2009-08-01

    Sea ice drift and deformation from models are evaluated on the basis of statistical and scaling properties. These properties are derived from two observation data sets: the RADARSAT Geophysical Processor System (RGPS) and buoy trajectories from the International Arctic Buoy Program (IABP). Two simulations obtained with the Louvain-la-Neuve Ice Model (LIM) coupled to a high-resolution ocean model and a simulation obtained with the Los Alamos Sea Ice Model (CICE) were analyzed. Model ice drift compares well with observations in terms of the large-scale velocity field and the distributions of velocity fluctuations, although a significant bias in the mean ice speed is noted. On the other hand, the statistical properties of ice deformation are not well simulated by the models: (1) The distributions of strain rates are incorrect: RGPS distributions of strain rates are power-law tailed, i.e., exhibit "wild randomness," whereas the models' distributions remain in the Gaussian attraction basin, i.e., exhibit "mild randomness." (2) The models are unable to reproduce the spatial and temporal correlations of the deformation fields: In the observations, ice deformation follows spatial and temporal scaling laws that express the heterogeneity and the intermittency of deformation. These relations do not appear in simulated ice deformation; mean deformation in models is almost scale independent. The statistical properties of ice deformation are a signature of the ice mechanical behavior. The present work therefore suggests that the mechanical framework currently used by models is inappropriate. A different modeling framework based on elastic interactions could improve the representation of the statistical and scaling properties of ice deformation.

  17. Complete integrability of information processing by biochemical reactions

    PubMed Central

    Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio

    2016-01-01

    Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such framework far-reaching analogies are established among (anti-) cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling – based on spin systems – has been proved to be accurate for a wide class of systems matching classical (e.g. Michaelis–Menten, Hill, Adair) scenarios in the infinite-size approximation. However, the current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy – based on completely integrable hydrodynamic-type systems of PDEs – which provides explicit finite-size solutions, matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions. PMID:27812018

  18. Complete integrability of information processing by biochemical reactions

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio

    2016-11-01

    Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such framework far-reaching analogies are established among (anti-) cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling - based on spin systems - has been proved to be accurate for a wide class of systems matching classical (e.g. Michaelis-Menten, Hill, Adair) scenarios in the infinite-size approximation. However, the current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy - based on completely integrable hydrodynamic-type systems of PDEs - which provides explicit finite-size solutions, matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions.

  19. Complete integrability of information processing by biochemical reactions.

    PubMed

    Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio

    2016-11-04

    Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such framework far-reaching analogies are established among (anti-) cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling - based on spin systems - has been proved to be accurate for a wide class of systems matching classical (e.g. Michaelis-Menten, Hill, Adair) scenarios in the infinite-size approximation. However, the current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy - based on completely integrable hydrodynamic-type systems of PDEs - which provides explicit finite-size solutions, matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions.

  20. Random walk to a nonergodic equilibrium concept

    NASA Astrophysics Data System (ADS)

    Bel, G.; Barkai, E.

    2006-01-01

    Random walk models, such as the trap model, continuous time random walks, and comb models, exhibit weak ergodicity breaking, when the average waiting time is infinite. The open question is, what statistical mechanical theory replaces the canonical Boltzmann-Gibbs theory for such systems? In this paper a nonergodic equilibrium concept is investigated, for a continuous time random walk model in a potential field. In particular we show that in the nonergodic phase the distribution of the occupation time of the particle in a finite region of space approaches U- or W-shaped distributions related to the arcsine law. We show that when conditions of detailed balance are applied, these distributions depend on the partition function of the problem, thus establishing a relation between the nonergodic dynamics and canonical statistical mechanics. In the ergodic phase the distribution function of the occupation times approaches a δ function centered on the value predicted based on standard Boltzmann-Gibbs statistics. The relation of our work to single-molecule experiments is briefly discussed.
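The U- or W-shaped occupation-time statistics described here reduce, for an ordinary symmetric random walk, to the classical arcsine law. The following sketch illustrates that limiting behavior numerically; it is a generic illustration, not the paper's continuous-time random walk in a potential field.

```python
import random

def occupation_fraction(n_steps, rng):
    """Fraction of time a symmetric random walk spends at x >= 0."""
    x, positive = 0, 0
    for _ in range(n_steps):
        x += rng.choice((-1, 1))
        if x >= 0:
            positive += 1
    return positive / n_steps

rng = random.Random(0)
fracs = [occupation_fraction(1000, rng) for _ in range(2000)]

# The arcsine law predicts a U-shaped density for the occupation
# fraction: extreme values near 0 or 1 are more probable than
# balanced values near 1/2.
extreme = sum(f < 0.1 or f > 0.9 for f in fracs) / len(fracs)
middle = sum(0.45 <= f <= 0.55 for f in fracs) / len(fracs)
```

The arcsine CDF F(x) = (2/pi) arcsin(sqrt(x)) puts roughly 41% of the mass in the two extreme bins above but only about 6% in the central bin, so `extreme` comes out well above `middle`.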

  1. Cation Selectivity in Biological Cation Channels Using Experimental Structural Information and Statistical Mechanical Simulation

    PubMed Central

    Finnerty, Justin John

    2015-01-01

    Cation selective channels constitute the gate for ion currents through the cell membrane. Here we present an improved statistical mechanical model based on atomistic structural information, cation hydration state and without tuned parameters that reproduces the selectivity of biological Na+ and Ca2+ ion channels. The importance of the inclusion of step-wise cation hydration in these results confirms the essential role partial dehydration plays in the bacterial Na+ channels. The model, proven reliable against experimental data, could be straightforwardly used for designing Na+ and Ca2+ selective nanopores. PMID:26460827

  2. Statistical mechanics of soft-boson phase transitions

    NASA Technical Reports Server (NTRS)

    Gupta, Arun K.; Hill, Christopher T.; Holman, Richard; Kolb, Edward W.

    1991-01-01

    The existence of structure on large (100 Mpc) scales, and limits on anisotropies in the cosmic microwave background radiation (CMBR), have imperiled models of structure formation based solely upon the standard cold dark matter scenario. Novel scenarios, which may be compatible with large-scale structure and small CMBR anisotropies, invoke nonlinear fluctuations in the density appearing after recombination, accomplished via late-time phase transitions involving ultralow-mass scalar bosons. Here, the statistical mechanics of such phase transitions is studied in several models involving naturally ultralow-mass pseudo-Nambu-Goldstone bosons (pNGBs). These models can exhibit several interesting effects at high temperature, which are believed to be the most general possibilities for pNGBs.

  3. Quantum Social Science

    NASA Astrophysics Data System (ADS)

    Haven, Emmanuel; Khrennikov, Andrei

    2013-01-01

    Preface; Part I. Physics Concepts in Social Science? A Discussion: 1. Classical, statistical and quantum mechanics: all in one; 2. Econophysics: statistical physics and social science; 3. Quantum social science: a non-mathematical motivation; Part II. Mathematics and Physics Preliminaries: 4. Vector calculus and other mathematical preliminaries; 5. Basic elements of quantum mechanics; 6. Basic elements of Bohmian mechanics; Part III. Quantum Probabilistic Effects in Psychology: Basic Questions and Answers: 7. A brief overview; 8. Interference effects in psychology - an introduction; 9. A quantum-like model of decision making; Part IV. Other Quantum Probabilistic Effects in Economics, Finance and Brain Sciences: 10. Financial/economic theory in crisis; 11. Bohmian mechanics in finance and economics; 12. The Bohm-Vigier Model and path simulation; 13. Other applications to economic/financial theory; 14. The neurophysiological sources of quantum-like processing in the brain; Conclusion; Glossary; Index.

  4. Statistics and classification of the microwave zebra patterns associated with solar flares

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Baolin; Tan, Chengming; Zhang, Yin

    2014-01-10

    The microwave zebra pattern (ZP) is the most interesting, intriguing, and complex spectral structure frequently observed in solar flares. A comprehensive statistical study will certainly help us to understand its formation mechanism, which is not yet clear. This work presents a comprehensive statistical analysis of a large sample of 202 ZP events collected from observations by the Chinese Solar Broadband Radio Spectrometer at Huairou and the Ondřejov Radiospectrograph in the Czech Republic at frequencies of 1.00-7.60 GHz from 2000 to 2013. After investigating the parameter properties of ZPs, such as the occurrence in flare phase, frequency range, polarization degree, duration, etc., we find that the variation of zebra stripe frequency separation with respect to frequency is the best indicator for a physical classification of ZPs. Microwave ZPs can be classified into three types: equidistant ZPs, variable-distant ZPs, and growing-distant ZPs, possibly corresponding to the Bernstein wave model, the whistler wave model, and the double plasma resonance model, respectively. This statistical classification may help to clarify the controversies between the various existing theoretical models and to understand the physical processes in the source regions.

  5. Twenty-five years of maximum-entropy principle

    NASA Astrophysics Data System (ADS)

    Kapur, J. N.

    1983-04-01

    The strengths and weaknesses of the maximum entropy principle (MEP) are examined and some challenging problems that remain outstanding at the end of the first quarter century of the principle are discussed. The original formalism of the MEP is presented and its relationship to statistical mechanics is set forth. The use of MEP for characterizing statistical distributions, in statistical inference, nonlinear spectral analysis, transportation models, population density models, models for brand-switching in marketing and vote-switching in elections is discussed. Its application to finance, insurance, image reconstruction, pattern recognition, operations research and engineering, biology and medicine, and nonparametric density estimation is considered.
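The core of the MEP discussed in this abstract can be made concrete with a small worked example: among all distributions over a finite set of values with a prescribed mean, the entropy maximizer has the exponential (Boltzmann-like) form p_i proportional to exp(lam * x_i), with the multiplier lam fixed by the constraint. A minimal sketch, with the multiplier found by bisection:

```python
import math

def maxent_mean(values, target_mean, lo=-10.0, hi=10.0, iters=200):
    """Maximum-entropy distribution over `values` subject to a fixed
    mean: p_i ~ exp(lam * x_i), with lam chosen by bisection so that
    the resulting mean matches the constraint."""
    def mean_for(lam):
        w = [math.exp(lam * v) for v in values]
        z = sum(w)  # normalizing constant (partition function)
        return sum(v * wi for v, wi in zip(values, w)) / z
    # mean_for is monotonically increasing in lam (its derivative is
    # the variance), so bisection converges.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * v) for v in values]
    z = sum(w)
    return [wi / z for wi in w]

# Classic die example: constrain the mean of a six-sided die to 4.5.
faces = [1, 2, 3, 4, 5, 6]
p = maxent_mean(faces, 4.5)
mean = sum(v * pi for v, pi in zip(faces, p))
```

With the mean pinned above 3.5, the multiplier is positive and the probabilities increase monotonically with the face value, exactly the exponential tilt the MEP prescribes.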

  6. Evaluating pictogram prediction in a location-aware augmentative and alternative communication system.

    PubMed

    Garcia, Luís Filipe; de Oliveira, Luís Caldas; de Matos, David Martins

    2016-01-01

    This study compared the performance of two statistical location-aware pictogram prediction mechanisms, with an all-purpose (All) pictogram prediction mechanism, having no location knowledge. The All approach had a unique language model under all locations. One of the location-aware alternatives, the location-specific (Spec) approach, made use of specific language models for pictogram prediction in each location of interest. The other location-aware approach resulted from combining the Spec and the All approaches, and was designated the mixed approach (Mix). In this approach, the language models acquired knowledge from all locations, but a higher relevance was assigned to the vocabulary from the associated location. Results from simulations showed that the Mix and Spec approaches could only outperform the baseline in a statistically significant way if pictogram users reuse more than 50% and 75% of their sentences, respectively. Under low sentence reuse conditions there were no statistically significant differences between the location-aware approaches and the All approach. Under these conditions, the Mix approach performed better than the Spec approach in a statistically significant way.
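The Mix approach described above amounts to interpolating a location-specific model with an all-purpose one. A minimal sketch of that idea with unigram counts, where the 0.7 weight and the example vocabularies are hypothetical, not the study's values:

```python
from collections import Counter

def mix_prob(symbol, loc_counts, all_counts, weight=0.7):
    """Interpolated unigram probability: a location-specific model
    mixed with an all-purpose model. (Hypothetical weighting scheme
    for illustration, not the paper's exact formulation.)"""
    def prob(counts):
        total = sum(counts.values())
        return counts[symbol] / total if total else 0.0
    return weight * prob(loc_counts) + (1 - weight) * prob(all_counts)

# Hypothetical pictogram counts for one location vs. all locations.
kitchen = Counter({"eat": 6, "drink": 3, "sleep": 1})
overall = Counter({"eat": 10, "drink": 5, "sleep": 20, "play": 15})

p_kitchen = mix_prob("eat", kitchen, overall)    # boosted by location
p_overall = mix_prob("sleep", kitchen, overall)  # backed off to All
```

The interpolation keeps the All model's coverage (symbols unseen in a location still get probability mass) while letting the location vocabulary dominate the ranking, which is the intuition behind the Mix approach outperforming Spec under low sentence reuse.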

  7. Elastic Network Models For Biomolecular Dynamics: Theory and Application to Membrane Proteins and Viruses

    NASA Astrophysics Data System (ADS)

    Lezon, Timothy R.; Shrivastava, Indira H.; Yang, Zheng; Bahar, Ivet

    The following sections are included: * Introduction * Theory and Assumptions * Statistical mechanical foundations * Anisotropic network models * Gaussian network model * Rigid block models * Treatment of perturbations * Langevin dynamics * Applications * Membrane proteins * Viruses * Conclusion * References

  8. Systems Models for Transportation Problems : Volume 1. Introducing a Systems Science for Transportation Planning.

    DOT National Transportation Integrated Search

    1976-03-01

    This introductory portion of a systems science for transportation planning, which is based on the statistical physics of ensembles, lays foundations for how statistical mechanics, equilibrium thermodynamics, and near-equilibrium thermodynamics can be u...

  9. Uterus models for use in virtual reality hysteroscopy simulators.

    PubMed

    Niederer, Peter; Weiss, Stephan; Caduff, Rosmarie; Bajka, Michael; Szekély, Gabor; Harders, Matthias

    2009-05-01

    Virtual reality models of human organs are needed in surgery simulators which are developed for educational and training purposes. A simulation can only be useful, however, if the mechanical performance of the system in terms of force-feedback for the user as well as the visual representation is realistic. We therefore aim at developing a mechanical computer model of the organ in question which yields realistic force-deformation behavior under virtual instrument-tissue interactions and which, in particular, runs in real time. The modeling of the human uterus is described as it is to be implemented in a simulator for minimally invasive gynecological procedures. To this end, anatomical information which was obtained from specially designed computed tomography and magnetic resonance imaging procedures as well as constitutive tissue properties recorded from mechanical testing were used. In order to achieve real-time performance, the combination of mechanically realistic numerical uterus models of various levels of complexity with a statistical deformation approach is suggested. In view of mechanical accuracy of such models, anatomical characteristics including the fiber architecture along with the mechanical deformation properties are outlined. In addition, an approach to make this numerical representation potentially usable in an interactive simulation is discussed. The numerical simulation of hydrometra is shown in this communication. The results were validated experimentally. In order to meet the real-time requirements and to accommodate the large biological variability associated with the uterus, a statistical modeling approach is demonstrated to be useful.

  10. Properties of a memory network in psychology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wedemann, Roseli S.; Donangelo, Raul; Carvalho, Luis A. V. de

    We have previously described neurotic psychopathology and psychoanalytic working-through by an associative memory mechanism, based on a neural network model, where memory was modelled by a Boltzmann machine (BM). Since brain neural topology is selectively structured, we simulated known microscopic mechanisms that control synaptic properties, showing that the network self-organizes to a hierarchical, clustered structure. Here, we show some statistical mechanical properties of the complex networks which result from this self-organization. They indicate that a generalization of the BM may be necessary to model memory.

  11. Properties of a memory network in psychology

    NASA Astrophysics Data System (ADS)

    Wedemann, Roseli S.; Donangelo, Raul; de Carvalho, Luís A. V.

    2007-12-01

    We have previously described neurotic psychopathology and psychoanalytic working-through by an associative memory mechanism, based on a neural network model, where memory was modelled by a Boltzmann machine (BM). Since brain neural topology is selectively structured, we simulated known microscopic mechanisms that control synaptic properties, showing that the network self-organizes to a hierarchical, clustered structure. Here, we show some statistical mechanical properties of the complex networks which result from this self-organization. They indicate that a generalization of the BM may be necessary to model memory.

  12. Using Bayes' theorem for free energy calculations

    NASA Astrophysics Data System (ADS)

    Rogers, David M.

    Statistical mechanics is fundamentally based on calculating the probabilities of molecular-scale events. Although Bayes' theorem has generally been recognized as providing key guiding principles for the setup and analysis of statistical experiments [83], classical frequentist models still predominate in the world of computational experimentation. As a starting point for widespread application of Bayesian methods in statistical mechanics, we investigate the central quantity of free energies from this perspective. This dissertation thus reviews the basics of Bayes' view of probability theory and the maximum entropy formulation of statistical mechanics before providing examples of their application to several advanced research areas. We first apply Bayes' theorem to a multinomial counting problem in order to determine inner-shell and hard-sphere solvation free energy components of Quasi-Chemical Theory [140]. We proceed to consider the general problem of free energy calculations from samples of interaction energy distributions. From there, we turn to spline-based estimation of the potential of mean force [142], and empirical modeling of observed dynamics using integrator matching. The results of this research are expected to advance the state of the art in coarse-graining methods, as they allow a systematic connection from high-resolution (atomic) to low-resolution (coarse) structure and dynamics. In total, our work on these problems constitutes a critical starting point for further application of Bayes' theorem in all areas of statistical mechanics. It is hoped that the understanding so gained will allow for improvements in comparisons between theory and experiment.
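The multinomial counting step can be illustrated with a toy Bayesian calculation: given counts of inner-shell occupancy states, a symmetric Dirichlet prior yields posterior-mean occupancy probabilities, from which a free energy component proportional to ln p0 follows. The counts, the prior, and the units below are hypothetical, not the dissertation's quasi-chemical results:

```python
import math

def posterior_occupancy(counts, alpha=1.0):
    """Posterior-mean occupancy probabilities for a multinomial
    counting problem under a symmetric Dirichlet(alpha) prior.
    (Illustrative of the Bayesian setup only; not the dissertation's
    exact quasi-chemical formulation.)"""
    total = sum(counts)
    k = len(counts)
    return [(n + alpha) / (total + k * alpha) for n in counts]

# Hypothetical counts of snapshots with 0, 1, or 2 solvent molecules
# occupying the inner shell:
counts = [70, 25, 5]
p = posterior_occupancy(counts)

beta = 1.0  # 1/kT in reduced units (illustrative)
# A free energy contribution proportional to ln p0, where p0 is the
# probability of the empty inner shell:
mu_inner = math.log(p[0]) / beta
```

The Dirichlet prior acts as pseudo-counts, so even an unobserved occupancy state gets nonzero posterior probability, which keeps the logarithm finite for small samples.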

  13. Analyzing Single-Molecule Protein Transportation Experiments via Hierarchical Hidden Markov Models

    PubMed Central

    Chen, Yang; Shen, Kuang

    2017-01-01

    To maintain proper cellular functions, over 50% of proteins encoded in the genome need to be transported to cellular membranes. The molecular mechanism behind such a process, often referred to as protein targeting, is not well understood. Single-molecule experiments are designed to unveil the detailed mechanisms and reveal the functions of different molecular machineries involved in the process. The experimental data consist of hundreds of stochastic time traces from the fluorescence recordings of the experimental system. We introduce a Bayesian hierarchical model on top of hidden Markov models (HMMs) to analyze these data and use the statistical results to answer the biological questions. In addition to resolving the biological puzzles and delineating the regulating roles of different molecular complexes, our statistical results enable us to propose a more detailed mechanism for the late stages of the protein targeting process. PMID:28943680

  14. Predicting the stochastic guiding of kinesin-driven microtubules in microfabricated tracks: a statistical-mechanics-based modeling approach.

    PubMed

    Lin, Chih-Tin; Meyhofer, Edgar; Kurabayashi, Katsuo

    2010-01-01

    Directional control of microtubule shuttles via microfabricated tracks is key to the development of controlled nanoscale mass transport by kinesin motor molecules. Here we develop and test a model to quantitatively predict the stochastic behavior of microtubule guiding when they mechanically collide with the sidewalls of lithographically patterned tracks. By taking into account appropriate probability distributions of microscopic states of the microtubule system, the model allows us to theoretically analyze the roles of collision conditions and kinesin surface densities in determining how the motion of microtubule shuttles is controlled. In addition, we experimentally observe the statistics of microtubule collision events and compare our theoretical prediction with experimental data to validate our model. The model will direct the design of future hybrid nanotechnology devices that integrate nanoscale transport systems powered by kinesin-driven molecular shuttles.

  15. Statistical Mechanics Model of Solids with Defects

    NASA Astrophysics Data System (ADS)

    Kaufman, M.; Walters, P. A.; Ferrante, J.

    1997-03-01

    Previously (M. Kaufman, J. Ferrante, NASA Tech. Memor., 1996), we examined the phase diagram for the failure of a solid under isotropic expansion and compression as a function of stress and temperature, with the "springs" modelled by the universal binding energy relation (UBER) (J. H. Rose, J. R. Smith, F. Guinea, J. Ferrante, Phys. Rev. B 29, 2963 (1984)). In the previous calculation we assumed that the "springs" failed independently and that the strain is uniform. In the present work, we have extended this statistical model of mechanical failure by allowing for correlations between "springs" and for thermal fluctuations in strains. The springs are now modelled in the harmonic approximation with a failure threshold energy E0, as an intermediate step in future studies to reinclude the full nonlinear dependence of the UBER for modelling the interactions. We use the Migdal-Kadanoff renormalization-group method to determine the phase diagram of the model and to compute the free energy.

  16. Manifold parametrization of the left ventricle for a statistical modelling of its complete anatomy

    NASA Astrophysics Data System (ADS)

    Gil, D.; Garcia-Barnes, J.; Hernández-Sabate, A.; Marti, E.

    2010-03-01

    Distortion of Left Ventricle (LV) external anatomy is related to some dysfunctions, such as hypertrophy. The architecture of myocardial fibers determines LV electromechanical activation patterns as well as mechanics. Thus, their joint modelling would allow the design of specific interventions (such as pacemaker implantation and LV remodelling) and therapies (such as resynchronization). On one hand, accurate modelling of external anatomy requires either a dense sampling or a continuous infinite-dimensional approach, which requires non-Euclidean statistics. On the other hand, computation of fiber models requires statistics on Riemannian spaces. Most approaches compute separate statistical models for external anatomy and fiber architecture. In this work we propose a general mathematical framework based on differential-geometry concepts for computing a statistical model that includes both external and fiber anatomy. Our framework provides a continuous approach to external anatomy supporting standard statistics. We also provide a straightforward formula for the computation of the Riemannian fiber statistics. We have applied our methodology to the computation of a complete anatomical atlas of canine hearts from diffusion tensor studies. The orientation of fibers over the average external geometry agrees with the segmental description of orientations reported in the literature.

  17. Statistical mechanics explanation for the structure of ocean eddies and currents

    NASA Astrophysics Data System (ADS)

    Venaille, A.; Bouchet, F.

    2010-12-01

    The equilibrium statistical mechanics of two-dimensional and geostrophic flows predicts the outcome for the large scales of the flow resulting from the turbulent mixing. This theory has been successfully applied to describe detailed properties of Jupiter's Great Red Spot. We discuss the range of applicability of this theory to ocean dynamics. It is able to reproduce mesoscale structures like ocean rings. It explains, from statistical mechanics, the westward drift of rings at the speed of non-dispersive baroclinic waves, and the recently observed (Chelton and colleagues) slower northward drift of cyclonic eddies and southward drift of anticyclonic eddies. We also uncover relations between strong eastward mid-basin inertial jets, like the Kuroshio extension and the Gulf Stream, and statistical equilibria, and explain under which conditions such strong mid-basin jets can be understood as statistical equilibria. We claim that these results are complementary to the classical Sverdrup-Munk theory: they explain the inertial part of basin dynamics, and the jet structure and location, using very simple theoretical arguments. Even in such out-of-equilibrium situations, equilibrium statistical mechanics predicts the overall qualitative flow structure remarkably well, and accounts for the observed westward drift of ocean eddies and the slower northward drift of cyclones and southward drift of anticyclones reported by Chelton and colleagues.
    References: A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO; F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, arxiv ...., submitted to Physics Reports; P. Berloff, A. M. Hogg, W. Dewar, The Turbulent Oscillator: A Mechanism of Low-Frequency Variability of the Wind-Driven Ocean Gyres, Journal of Physical Oceanography 37 (2007) 2363; D. B. Chelton, M. G. Schlax, R. M. Samelson, R. A. de Szoeke, Global observations of large oceanic eddies, Geo. Res. Lett. 34 (2007) 15606.
    [Figure caption: (a) streamfunction predicted by statistical mechanics; (b) and (c) snapshots of streamfunction and potential vorticity (red: positive values; blue: negative values) in the upper layer of a three-layer quasi-geostrophic model of a mid-latitude ocean basin (from Berloff and colleagues).]

  18. A multibody knee model with discrete cartilage prediction of tibio-femoral contact mechanics.

    PubMed

    Guess, Trent M; Liu, Hongzeng; Bhashyam, Sampath; Thiagarajan, Ganesh

    2013-01-01

    Combining musculoskeletal simulations with anatomical joint models capable of predicting cartilage contact mechanics would provide a valuable tool for studying the relationships between muscle force and cartilage loading. As a step towards producing multibody musculoskeletal models that include representation of cartilage tissue mechanics, this research developed a subject-specific multibody knee model that represented the tibia plateau cartilage as discrete rigid bodies that interacted with the femur through deformable contacts. Parameters for the compliant contact law were derived using three methods: (1) simplified Hertzian contact theory, (2) simplified elastic foundation contact theory and (3) parameter optimisation from a finite element (FE) solution. The contact parameters and contact friction were evaluated during a simulated walk in a virtual dynamic knee simulator, and the resulting kinematics were compared with measured in vitro kinematics. The effects on predicted contact pressures and cartilage-bone interface shear forces during the simulated walk were also evaluated. The compliant contact stiffness parameters had a statistically significant effect on predicted contact pressures as well as all tibio-femoral motions except flexion-extension. The contact friction was not statistically significant to contact pressures, but was statistically significant to medial-lateral translation and all rotations except flexion-extension. The magnitude of kinematic differences between model formulations was relatively small, but contact pressure predictions were sensitive to model formulation. The developed multibody knee model was computationally efficient and had a computation time 283 times faster than a FE simulation using the same geometries and boundary conditions.
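Of the three parameter-derivation methods listed, the first rests on the classical Hertzian point-contact law F = (4/3) E* sqrt(R) d^(3/2). A minimal sketch with illustrative values (not the subject-specific cartilage parameters):

```python
import math

def hertz_contact_force(delta, e_star, r_eff):
    """Classical Hertzian point-contact law F = (4/3) E* sqrt(R) d^1.5,
    one standard way to derive compliant-contact stiffness parameters
    (method 1 in the abstract). Values are illustrative only.

    delta  -- penetration depth [m]
    e_star -- effective contact modulus [Pa]
    r_eff  -- effective radius of curvature [m]
    """
    if delta <= 0.0:  # bodies not in contact: no force
        return 0.0
    return (4.0 / 3.0) * e_star * math.sqrt(r_eff) * delta ** 1.5

# Hypothetical soft-tissue-scale numbers: 0.1 mm penetration,
# 5 MPa effective modulus, 40 mm effective radius.
f = hertz_contact_force(delta=1e-4, e_star=5e6, r_eff=0.04)
```

The 3/2 exponent is what makes the contact law "compliant" rather than linear: stiffness grows with penetration, which is why the fitted parameters matter more at high loads.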

  19. On modelling the interaction between two rotating bodies with statistically distributed features: an application to dressing of grinding wheels

    NASA Astrophysics Data System (ADS)

    Spampinato, A.; Axinte, D. A.

    2017-12-01

    The mechanisms of interaction between bodies with statistically arranged features present characteristics common to different abrasive processes, such as dressing of abrasive tools. In contrast with the current empirical approach used to estimate the results of operations based on attritive interactions, the method we present in this paper allows us to predict the output forces and the topography of a simulated grinding wheel for a set of specific operational parameters (speed ratio and radial feed-rate), providing a thorough understanding of the complex mechanisms regulating these processes. In modelling the dressing mechanisms, the abrasive characteristics of both bodies (grain size, geometry, inter-space and protrusion) are first simulated; thus, their interaction is simulated in terms of grain collisions. Exploiting a specifically designed contact/impact evaluation algorithm, the model simulates the collisional effects of the dresser abrasives on the grinding wheel topography (grain fracture/break-out). The method has been tested for the case of a diamond rotary dresser, predicting output forces within less than 10% error and obtaining experimentally validated grinding wheel topographies. The study provides a fundamental understanding of the dressing operation, enabling the improvement of its performance in an industrial scenario, while being of general interest in modelling collision-based processes involving statistically distributed elements.

  20. Effective temperature in an interacting vertex system: theory and experiment on artificial spin ice.

    PubMed

    Nisoli, Cristiano; Li, Jie; Ke, Xianglin; Garand, D; Schiffer, Peter; Crespi, Vincent H

    2010-07-23

    Frustrated arrays of interacting single-domain nanomagnets provide important model systems for statistical mechanics, as they map closely onto well-studied vertex models and are amenable to direct imaging and custom engineering. Although these systems are manifestly athermal, we demonstrate that an effective temperature, controlled by an external magnetic drive, describes their microstates and therefore their full statistical properties.

  1. Physics of Electronic Materials

    NASA Astrophysics Data System (ADS)

    Rammer, Jørgen

    2017-03-01

    1. Quantum mechanics; 2. Quantum tunneling; 3. Standard metal model; 4. Standard conductor model; 5. Electric circuit theory; 6. Quantum wells; 7. Particle in a periodic potential; 8. Bloch currents; 9. Crystalline solids; 10. Semiconductor doping; 11. Transistors; 12. Heterostructures; 13. Mesoscopic physics; 14. Arithmetic, logic and machines; Appendix A. Principles of quantum mechanics; Appendix B. Dirac's delta function; Appendix C. Fourier analysis; Appendix D. Classical mechanics; Appendix E. Wave function properties; Appendix F. Transfer matrix properties; Appendix G. Momentum; Appendix H. Confined particles; Appendix I. Spin and quantum statistics; Appendix J. Statistical mechanics; Appendix K. The Fermi-Dirac distribution; Appendix L. Thermal current fluctuations; Appendix M. Gaussian wave packets; Appendix N. Wave packet dynamics; Appendix O. Screening by symmetry method; Appendix P. Commutation and common eigenfunctions; Appendix Q. Interband coupling; Appendix R. Common crystal structures; Appendix S. Effective mass approximation; Appendix T. Integral doubling formula; Bibliography; Index.

  2. Statistical mechanics of competitive resource allocation using agent-based models

    NASA Astrophysics Data System (ADS)

    Chakraborti, Anirban; Challet, Damien; Chatterjee, Arnab; Marsili, Matteo; Zhang, Yi-Cheng; Chakrabarti, Bikas K.

    2015-01-01

    Demand outstrips available resources in most situations, which gives rise to competition, interaction and learning. In this article, we review a broad spectrum of multi-agent models of competition (the El Farol Bar problem, the Minority Game, the Kolkata Paise Restaurant problem, the stable marriage problem, the parking space problem and others) and the methods used to understand them analytically. We emphasize the power of concepts and tools from statistical mechanics to understand and explain collective phenomena such as phase transitions and long memory, and the mapping between agent heterogeneity and physical disorder. As these methods can be applied to any large-scale model of competitive resource allocation made up of heterogeneous adaptive agents with non-linear interactions, they provide a prospective unifying paradigm for many scientific disciplines.
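    As an illustration of the simplest of these models, here is a minimal, self-contained sketch of the Minority Game: an odd number of agents each holds a few fixed random strategies (lookup tables from the recent history of winning sides to an action), plays its currently best-scoring one, and the minority side wins each round. All parameter values below are arbitrary choices for demonstration:

```python
import random

def minority_game(n_agents=101, memory=3, n_strategies=2, steps=500, seed=0):
    """Minimal Minority Game: agents choose side 0 or 1; the minority wins.

    Each strategy is a fixed random lookup table mapping the encoded last
    `memory` winning sides to an action; strategies gain a point whenever
    they would have picked the winning side.
    """
    rng = random.Random(seed)
    n_hist = 2 ** memory
    strategies = [[tuple(rng.randrange(2) for _ in range(n_hist))
                   for _ in range(n_strategies)] for _ in range(n_agents)]
    scores = [[0] * n_strategies for _ in range(n_agents)]
    history = rng.randrange(n_hist)          # encoded recent outcomes
    attendance = []                          # number choosing side 1 each step
    for _ in range(steps):
        actions = []
        for a in range(n_agents):
            best = max(range(n_strategies), key=lambda s: scores[a][s])
            actions.append(strategies[a][best][history])
        n_one = sum(actions)
        winner = 1 if n_one < n_agents / 2 else 0   # minority side wins
        for a in range(n_agents):
            for s in range(n_strategies):
                if strategies[a][s][history] == winner:
                    scores[a][s] += 1
        attendance.append(n_one)
        history = ((history << 1) | winner) % n_hist
    return attendance
```

The fluctuations of the attendance around half the number of agents are the central quantity analyzed in the statistical mechanical treatments reviewed above.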

  3. Study of components and statistical reaction mechanism in simulation of nuclear process for optimized production of {sup 64}Cu and {sup 67}Ga medical radioisotopes using TALYS, EMPIRE and LISE++ nuclear reaction and evaporation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nasrabadi, M. N., E-mail: mnnasrabadi@ast.ui.ac.ir; Sepiani, M.

    2015-03-30

    Production of medical radioisotopes is one of the most important tasks in the field of nuclear technology. These radioactive isotopes are mainly produced through a variety of nuclear processes. In this research, excitation functions and nuclear reaction mechanisms are studied to simulate the production of these radioisotopes in the TALYS, EMPIRE and LISE++ reaction codes; the parameters and different models of nuclear level density, one of the most important components of statistical reaction models, are then adjusted for optimal production of the desired radioactive yields.

  4. Study of components and statistical reaction mechanism in simulation of nuclear process for optimized production of 64Cu and 67Ga medical radioisotopes using TALYS, EMPIRE and LISE++ nuclear reaction and evaporation codes

    NASA Astrophysics Data System (ADS)

    Nasrabadi, M. N.; Sepiani, M.

    2015-03-01

    Production of medical radioisotopes is one of the most important tasks in the field of nuclear technology. These radioactive isotopes are mainly produced through a variety of nuclear processes. In this research, excitation functions and nuclear reaction mechanisms are studied to simulate the production of these radioisotopes in the TALYS, EMPIRE and LISE++ reaction codes; the parameters and different models of nuclear level density, one of the most important components of statistical reaction models, are then adjusted for optimal production of the desired radioactive yields.

  5. Catalytic conversion reactions in nanoporous systems with concentration-dependent selectivity: Statistical mechanical modeling

    DOE PAGES

    Garcia, Andres; Wang, Jing; Windus, Theresa L.; ...

    2016-05-20

    Statistical mechanical modeling is developed to describe a catalytic conversion reaction A → Bc or Bt with concentration-dependent selectivity of the products, Bc or Bt, where reaction occurs inside catalytic particles traversed by narrow linear nanopores. The associated restricted diffusive transport, which in the extreme case is described by single-file diffusion, naturally induces strong concentration gradients. Hence, by comparing kinetic Monte Carlo simulation results with analytic treatments, selectivity is shown to be impacted by strong spatial correlations induced by restricted diffusivity in the presence of reaction and also by a subtle clustering of reactants, A.
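    The single-file transport invoked here can be illustrated with a toy lattice simulation (not the authors' kinetic Monte Carlo model of the reaction itself): particles on a 1D ring attempt nearest-neighbour hops, and a hop into an occupied site is rejected, so particles can never pass one another. All parameters are illustrative:

```python
import random

def single_file_msd(n_sites=200, n_particles=100, steps=1000, seed=1):
    """Monte Carlo sketch of single-file diffusion on a 1D ring.

    Hops into occupied sites are rejected (the single-file constraint),
    which strongly suppresses the mean squared displacement relative to
    free diffusion.
    """
    rng = random.Random(seed)
    pos = list(range(0, 2 * n_particles, 2))      # evenly spaced start
    occupied = set(pos)
    disp = [0] * n_particles                      # unwrapped displacement
    for _ in range(steps):
        i = rng.randrange(n_particles)
        step = rng.choice((-1, 1))
        target = (pos[i] + step) % n_sites
        if target not in occupied:                # single-file constraint
            occupied.remove(pos[i])
            occupied.add(target)
            pos[i] = target
            disp[i] += step
    msd = sum(d * d for d in disp) / n_particles
    return pos, disp, msd
```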

  6. Statistical Mechanics of Node-perturbation Learning with Noisy Baseline

    NASA Astrophysics Data System (ADS)

    Hara, Kazuyuki; Katahira, Kentaro; Okada, Masato

    2017-02-01

    Node-perturbation learning is a type of statistical gradient descent algorithm that can be applied to problems where the objective function is not explicitly formulated, including reinforcement learning. It estimates the gradient of the objective function from the change in the objective function in response to a perturbation. The value of the objective function for an unperturbed output is called a baseline. Cho et al. proposed node-perturbation learning with a noisy baseline. In this paper, we report on building the statistical mechanics of Cho's model and on deriving coupled differential equations of order parameters that depict the learning dynamics. We also show how to derive the generalization error by solving these differential equations. On the basis of the results, we show that Cho's results also apply in general cases, and we characterize some general performance properties of Cho's model.

  7. New statistical potential for quality assessment of protein models and a survey of energy functions

    PubMed Central

    2010-01-01

    Background Scoring functions, such as molecular mechanics force fields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions, such as using information on solvent accessibility and torsion angles, and accounting for secondary structure preferences and side chain orientation. Partially based on the observations made, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. Atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein proposed in this work can be downloaded from http://www.fiserlab.org/potentials. Conclusions Among the most influential terms we observed a critical role of a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanical potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality. PMID:20226048
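    The inverse-Boltzmann construction underlying such knowledge-based potentials can be sketched in a few lines; the residue pairs and counts below are invented for illustration and are not statistics from the paper or from the PDB:

```python
import math

def contact_energies(observed, background):
    """Knowledge-based pair potential in the usual inverse-Boltzmann form,
    E(a, b) = -ln[P_obs(a, b) / P_ref(a, b)], computed from raw contact
    counts. `observed` and `background` map residue-type pairs to counts."""
    n_obs = sum(observed.values())
    n_ref = sum(background.values())
    return {pair: -math.log((observed[pair] / n_obs) /
                            (background[pair] / n_ref))
            for pair in observed}

# Hypothetical counts: hydrophobic-hydrophobic contacts enriched
# relative to the reference state, so their energy comes out favorable.
obs = {("LEU", "ILE"): 300, ("LEU", "LYS"): 100, ("LYS", "GLU"): 100}
ref = {("LEU", "ILE"): 150, ("LEU", "LYS"): 150, ("LYS", "GLU"): 200}
energies = contact_energies(obs, ref)
```

The paper's point about the reference state corresponds to the choice of `background` here: the same observed counts give very different energies under different reference definitions.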

  8. Addressing the statistical mechanics of planet orbits in the solar system

    NASA Astrophysics Data System (ADS)

    Mogavero, Federico

    2017-10-01

    The chaotic nature of planet dynamics in the solar system suggests the relevance of a statistical approach to planetary orbits. In such a statistical description, the time-dependent position and velocity of the planets are replaced by the probability density function (PDF) of their orbital elements. It is natural to set up this kind of approach in the framework of statistical mechanics. In the present paper, I focus on the collisionless excitation of eccentricities and inclinations via gravitational interactions in a planetary system. The future planet trajectories in the solar system constitute the prototype of this kind of dynamics. I thus address the statistical mechanics of the solar system planet orbits and try to reproduce the PDFs numerically constructed by Laskar (2008, Icarus, 196, 1). I show that the microcanonical ensemble of the Laplace-Lagrange theory accurately reproduces the statistics of the giant planet orbits. To model the inner planets I then investigate the ansatz of equiprobability in the phase space constrained by the secular integrals of motion. The eccentricity and inclination PDFs of Earth and Venus are reproduced with no free parameters. Within the limitations of a stationary model, the predictions also show a reasonable agreement with the Mars PDFs and with that of Mercury's inclination. The eccentricity of Mercury, in contrast, demands a deeper analysis. I finally revisit the random walk approach of Laskar to the time dependence of the inner planet PDFs. Such a statistical theory could be combined with direct numerical simulations of planet trajectories in the context of planet formation, which is likely to be a chaotic process.

  9. Quantum theory of multiscale coarse-graining.

    PubMed

    Han, Yining; Jin, Jaehyeok; Wagner, Jacob W; Voth, Gregory A

    2018-03-14

    Coarse-grained (CG) models serve as a powerful tool to simulate molecular systems at much longer temporal and spatial scales. Previously, CG models and methods have been built upon classical statistical mechanics. The present paper develops a theory and numerical methodology for coarse-graining in quantum statistical mechanics, by generalizing the multiscale coarse-graining (MS-CG) method to quantum Boltzmann statistics. A rigorous derivation of the sufficient thermodynamic consistency condition is first presented via imaginary time Feynman path integrals. It identifies the optimal choice of CG action functional and effective quantum CG (qCG) force field to generate a quantum MS-CG (qMS-CG) description of the equilibrium system that is consistent with the quantum fine-grained model projected onto the CG variables. A variational principle then provides a class of algorithms for optimally approximating the qMS-CG force fields. Specifically, a variational method based on force matching, which was also adopted in the classical MS-CG theory, is generalized to quantum Boltzmann statistics. The qMS-CG numerical algorithms and practical issues in implementing this variational minimization procedure are also discussed. Then, two numerical examples are presented to demonstrate the method. Finally, as an alternative strategy, a quasi-classical approximation for the thermal density matrix expressed in the CG variables is derived. This approach provides an interesting physical picture for coarse-graining in quantum Boltzmann statistical mechanics in which the consistency with the quantum particle delocalization is obviously manifest, and it opens up an avenue for using path integral centroid-based effective classical force fields in a coarse-graining methodology.

  10. Statistical physics approach to earthquake occurrence and forecasting

    NASA Astrophysics Data System (ADS)

    de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio

    2016-04-01

    There is striking evidence that the dynamics of the Earth crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimensions, and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow discrimination among different physical mechanisms responsible for earthquake triggering.
In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different levels of prediction. In this review we also briefly discuss how the statistical mechanics approach can be applied to non-tectonic earthquakes and to other natural stochastic processes, such as volcanic eruptions and solar flares.

  11. Statistical analysis of the factors that influenced the mechanical properties improvement of cassava starch films

    NASA Astrophysics Data System (ADS)

    Monteiro, Mayra; Oliveira, Victor; Santos, Francisco; Barros Neto, Eduardo; Silva, Karyn; Silva, Rayane; Henrique, João; Chibério, Abimaelle

    2017-08-01

    To obtain cassava starch films with mechanical properties improved relative to the synthetic polymers used in packaging production, a 2³ full factorial design was carried out to investigate which factors significantly influence the tensile strength of the biofilm. The factors investigated were the cassava starch, glycerol and modified clay contents. Modified bentonite clay was used as the filler of the biofilm, and glycerol was the plasticizer used to thermoplasticize the cassava starch. The factorial analysis suggested a regression model capable of predicting the optimal mechanical property of the cassava starch film from the maximization of the tensile strength. The reliability of the regression model was tested by the correlation established with the experimental data through a Pareto chart. The modified clay was the factor of greatest statistical significance for the observed response variable, contributing most to the improvement of the mechanical property of the starch film. The factorial experiments showed that the interaction of glycerol with both modified clay and cassava starch was significant for the reduction of biofilm ductility. Modified clay and cassava starch contributed to the maximization of biofilm ductility, while glycerol contributed to its minimization.
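    The effect estimates behind such a 2³ factorial analysis reduce to simple contrasts. The sketch below recovers main and interaction effects from coded (-1/+1) factor levels; the response values are generated from a hypothetical linear model (with a dominant "C" term standing in for modified clay), not from the paper's measurements:

```python
import math
from itertools import combinations, product

def factorial_effects(y):
    """Main and interaction effects for a 2^3 full factorial design.

    `y` maps coded levels (a, b, c) in {-1, +1} to the measured response.
    Each effect is contrast / (runs / 2), where the contrast is
    sum(sign * y) with sign the product of the levels in the term.
    """
    runs = list(product((-1, 1), repeat=3))
    effects = {}
    for k in (1, 2, 3):
        for idx in combinations(range(3), k):
            name = "".join("ABC"[i] for i in idx)
            contrast = sum(y[r] * math.prod(r[i] for i in idx) for r in runs)
            effects[name] = contrast / (len(runs) / 2)
    return effects

# Hypothetical responses from a known linear model, so each recovered
# effect is exactly twice the corresponding coefficient.
y = {r: 10 + 1.0 * r[0] - 0.5 * r[1] + 3.0 * r[2] + 0.8 * r[0] * r[1]
     for r in product((-1, 1), repeat=3)}
effects = factorial_effects(y)
```

A Pareto chart, as used in the study, is then just these effects sorted by absolute magnitude.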

  12. Inverse tissue mechanics of cell monolayer expansion.

    PubMed

    Kondo, Yohei; Aoki, Kazuhiro; Ishii, Shin

    2018-03-01

    Living tissues undergo deformation during morphogenesis. In this process, cells generate mechanical forces that drive the coordinated cell motion and shape changes. Recent advances in experimental and theoretical techniques have enabled in situ measurement of the mechanical forces, but the characterization of mechanical properties that determine how these forces quantitatively affect tissue deformation remains challenging, and this represents a major obstacle for the complete understanding of morphogenesis. Here, we proposed a non-invasive reverse-engineering approach for the estimation of the mechanical properties, by combining tissue mechanics modeling and statistical machine learning. Our strategy is to model the tissue as a continuum mechanical system and to use passive observations of spontaneous tissue deformation and force fields to statistically estimate the model parameters. This method was applied to the analysis of the collective migration of Madin-Darby canine kidney cells, and the tissue flow and force were simultaneously observed by the phase contrast imaging and traction force microscopy. We found that our monolayer elastic model, whose elastic moduli were reverse-engineered, enabled a long-term forecast of the traction force fields when given the tissue flow fields, indicating that the elasticity contributes to the evolution of the tissue stress. Furthermore, we investigated the tissues in which myosin was inhibited by blebbistatin treatment, and observed a several-fold reduction in the elastic moduli. The obtained results validate our framework, which paves the way to the estimation of mechanical properties of living tissues during morphogenesis.

  13. Comparisons of non-Gaussian statistical models in DNA methylation analysis.

    PubMed

    Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-06-16

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
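    A common choice for such bounded-support modeling is the beta distribution; a minimal method-of-moments fit (a simplification for illustration, not the paper's actual estimation procedure) looks like this:

```python
import random

def beta_mom(x):
    """Method-of-moments fit of a beta distribution to data in (0, 1),
    e.g. methylation fractions. Returns (alpha, beta) estimated from the
    sample mean and variance."""
    n = len(x)
    m = sum(x) / n
    v = sum((xi - m) ** 2 for xi in x) / (n - 1)
    common = m * (1 - m) / v - 1
    return m * common, (1 - m) * common

# Recover known parameters from synthetic Beta(2, 5) samples.
rng = random.Random(4)
sample = [rng.betavariate(2.0, 5.0) for _ in range(20000)]
a_hat, b_hat = beta_mom(sample)
```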

  14. Comparisons of Non-Gaussian Statistical Models in DNA Methylation Analysis

    PubMed Central

    Ma, Zhanyu; Teschendorff, Andrew E.; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-01-01

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance. PMID:24937687

  15. Modeling Cell Size Regulation: From Single-Cell-Level Statistics to Molecular Mechanisms and Population-Level Effects.

    PubMed

    Ho, Po-Yi; Lin, Jie; Amir, Ariel

    2018-05-20

    Most microorganisms regulate their cell size. In this article, we review some of the mathematical formulations of the problem of cell size regulation. We focus on coarse-grained stochastic models and the statistics that they generate. We review the biologically relevant insights obtained from these models. We then describe cell cycle regulation and its molecular implementations, protein number regulation, and population growth, all in relation to size regulation. Finally, we discuss several future directions for developing understanding beyond phenomenological models of cell size regulation.
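    One of the coarse-grained phenomenological models discussed in this literature is the "adder", in which a cell adds a roughly constant size between birth and division; a minimal sketch with arbitrary parameters shows the resulting self-correction of birth size:

```python
import random

def adder_model(n_generations=2000, delta=1.0, noise=0.1, seed=2):
    """Adder model of cell-size regulation: each cell adds size ~delta
    between birth and division, then divides in half, so birth size obeys
    s' = (s + delta + eta) / 2 and self-corrects toward s = delta.
    Returns the mean birth size over the second half of the lineage."""
    rng = random.Random(seed)
    s = 3.0                                  # start far from steady state
    sizes = []
    for _ in range(n_generations):
        s = (s + delta + rng.gauss(0.0, noise)) / 2.0
        sizes.append(s)
    tail = sizes[len(sizes) // 2:]
    return sum(tail) / len(tail)
```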

  16. Analysis of Longitudinal Outcome Data with Missing Values in Total Knee Arthroplasty.

    PubMed

    Kang, Yeon Gwi; Lee, Jang Taek; Kang, Jong Yeal; Kim, Ga Hye; Kim, Tae Kyun

    2016-01-01

    We sought to determine the influence of missing data on statistical results, and which statistical method is most appropriate for the analysis of longitudinal outcome data of TKA with missing values among repeated-measures ANOVA, generalized estimating equations (GEE) and mixed-effects model repeated measures (MMRM). Data sets with missing values were generated with different proportions of missing data, sample sizes and missing-data generation mechanisms. Each data set was analyzed with the three statistical methods. The influence of missing data was greater with a higher proportion of missing data and a smaller sample size. MMRM tended to show the smallest changes in the statistics. When missing values were generated by a 'missing not at random' mechanism, no statistical method could fully avoid deviations in the results. Copyright © 2016 Elsevier Inc. All rights reserved.
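    The three classical missingness mechanisms referred to above (MCAR, MAR, MNAR) can be sketched as data-generation rules; the covariate dependence and the rates below are arbitrary illustrations, not the study's simulation design:

```python
import random

def make_missing(y, x, mechanism, rate=0.3, seed=3):
    """Illustrative generation of the three classical missingness
    mechanisms on an outcome y with covariate x:
    MCAR - missingness independent of everything;
    MAR  - probability depends only on the observed covariate x;
    MNAR - probability depends on the unobserved value y itself."""
    rng = random.Random(seed)
    out = []
    for yi, xi in zip(y, x):
        if mechanism == "MCAR":
            p = rate
        elif mechanism == "MAR":
            p = rate * (2.0 if xi > 0 else 0.5)   # depends on observed x
        else:                                     # "MNAR"
            p = rate * (2.0 if yi > 0 else 0.5)   # depends on missing y
        out.append(None if rng.random() < min(p, 1.0) else yi)
    return out

demo_rng = random.Random(0)
y = [demo_rng.gauss(0.0, 1.0) for _ in range(2000)]
x = [demo_rng.gauss(0.0, 1.0) for _ in range(2000)]
mcar = make_missing(y, x, "MCAR")
mnar = make_missing(y, x, "MNAR")
```

Under MNAR the observed values are systematically unrepresentative of the full data, which is why, as the study found, no analysis method can fully correct for it.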

  17. Development of failure model for nickel cadmium cells

    NASA Technical Reports Server (NTRS)

    Gupta, A.

    1980-01-01

    The development of a method for the life prediction of nickel cadmium cells is discussed. The approach described involves acquiring an understanding of the mechanisms of degradation and failure and at the same time developing nondestructive evaluation techniques for the nickel cadmium cells. The development of a statistical failure model which will describe the mechanisms of degradation and failure is outlined.

  18. Statistical model for the mechanical behavior of the tissue engineering non-woven fibrous matrices under large deformation.

    PubMed

    Rizvi, Mohd Suhail; Pal, Anupam

    2014-09-01

    The fibrous matrices are widely used as scaffolds for the regeneration of load-bearing tissues due to their structural and mechanical similarities with the fibrous components of the extracellular matrix. These scaffolds not only provide the appropriate microenvironment for the residing cells but also act as medium for the transmission of the mechanical stimuli, essential for the tissue regeneration, from macroscopic scale of the scaffolds to the microscopic scale of cells. The requirement of the mechanical loading for the tissue regeneration requires the fibrous scaffolds to be able to sustain the complex three-dimensional mechanical loading conditions. In order to gain insight into the mechanical behavior of the fibrous matrices under large amount of elongation as well as shear, a statistical model has been formulated to study the macroscopic mechanical behavior of the electrospun fibrous matrix and the transmission of the mechanical stimuli from scaffolds to the cells via the constituting fibers. The study establishes the load-deformation relationships for the fibrous matrices for different structural parameters. It also quantifies the changes in the fiber arrangement and tension generated in the fibers with the deformation of the matrix. The model reveals that the tension generated in the fibers on matrix deformation is not homogeneous and hence the cells located in different regions of the fibrous scaffold might experience different mechanical stimuli. The mechanical response of fibrous matrices was also found to be dependent on the aspect ratio of the matrix. Therefore, the model establishes a structure-mechanics interdependence of the fibrous matrices under large deformation, which can be utilized in identifying the appropriate structure and external mechanical loading conditions for the regeneration of load-bearing tissues. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Stochastic modelling, Bayesian inference, and new in vivo measurements elucidate the debated mtDNA bottleneck mechanism

    PubMed Central

    Johnston, Iain G; Burgstaller, Joerg P; Havlicek, Vitezslav; Kolbe, Thomas; Rülicke, Thomas; Brem, Gottfried; Poulton, Jo; Jones, Nick S

    2015-01-01

    Dangerous damage to mitochondrial DNA (mtDNA) can be ameliorated during mammalian development through a highly debated mechanism called the mtDNA bottleneck. Uncertainty surrounding this process limits our ability to address inherited mtDNA diseases. We produce a new, physically motivated, generalisable theoretical model for mtDNA populations during development, allowing the first statistical comparison of proposed bottleneck mechanisms. Using approximate Bayesian computation and mouse data, we find most statistical support for a combination of binomial partitioning of mtDNAs at cell divisions and random mtDNA turnover, meaning that the debated exact magnitude of mtDNA copy number depletion is flexible. New experimental measurements from a wild-derived mtDNA pairing in mice confirm the theoretical predictions of this model. We analytically solve a mathematical description of this mechanism, computing probabilities of mtDNA disease onset, efficacy of clinical sampling strategies, and effects of potential dynamic interventions, thus developing a quantitative and experimentally-supported stochastic theory of the bottleneck. DOI: http://dx.doi.org/10.7554/eLife.07464.001 PMID:26035426

  20. Using Artificial Neural Networks in Educational Research: Some Comparisons with Linear Statistical Models.

    ERIC Educational Resources Information Center

    Everson, Howard T.; And Others

    This paper explores the feasibility of neural computing methods such as artificial neural networks (ANNs) and abductory induction mechanisms (AIMs) for use in educational measurement. ANN and AIM methods are contrasted with more traditional statistical techniques, such as multiple regression and discriminant function analyses, for making…

  1. Improving Markov Chain Models for Road Profiles Simulation via Definition of States

    DTIC Science & Technology

    2012-04-01

    wavelet transform in pavement profile analysis," Vehicle System Dynamics: International Journal of Vehicle Mechanics and Mobility, vol. 47, no. 4... "Estimating Markov Transition Probabilities from Micro-Unit Data," Journal of the Royal Statistical Society, Series C (Applied Statistics), pp. 355-371
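    Estimating Markov transition probabilities from a single observed sequence, the core step in such road-profile models, reduces to normalized transition counts; a minimal sketch (the two-state sequence below is invented for illustration):

```python
def estimate_transition_matrix(states, n_states):
    """Maximum-likelihood estimate of Markov transition probabilities from
    one observed state sequence (e.g. a road profile discretized into
    elevation bands): P[i][j] = count(i -> j) / count(i -> anything).
    Rows with no observed transitions fall back to a uniform distribution."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    P = []
    for row in counts:
        total = sum(row)
        P.append([c / total if total else 1.0 / n_states for c in row])
    return P

P = estimate_transition_matrix([0, 0, 1, 0, 1, 1, 0], 2)
```

The cited work's point about the definition of states corresponds to the discretization that produces `states` here: coarser or finer elevation bands change both `n_states` and the estimated matrix.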

  2. A multicenter mortality prediction model for patients receiving prolonged mechanical ventilation

    PubMed Central

    Carson, Shannon S.; Kahn, Jeremy M.; Hough, Catherine L.; Seeley, Eric J.; White, Douglas B.; Douglas, Ivor S.; Cox, Christopher E.; Caldwell, Ellen; Bangdiwala, Shrikant I.; Garrett, Joanne M.; Rubenfeld, Gordon D.

    2012-01-01

    Objective Significant deficiencies exist in the communication of prognosis for patients requiring prolonged mechanical ventilation after acute illness, in part because of clinician uncertainty about long-term outcomes. We sought to refine a mortality prediction model for patients requiring prolonged ventilation using a multicentered study design. Design Cohort study. Setting Five geographically diverse tertiary care medical centers in the United States (California, Colorado, North Carolina, Pennsylvania, Washington). Patients Two hundred sixty adult patients who received at least 21 days of mechanical ventilation after acute illness. Interventions None. Measurements and Main Results For the probability model, we included age, platelet count, and requirement for vasopressors and/or hemodialysis, each measured on day 21 of mechanical ventilation, in a logistic regression model with 1-yr mortality as the outcome variable. We subsequently modified a simplified prognostic scoring rule (ProVent score) by categorizing the risk variables (age 18–49, 50–64, and >65 yrs; platelet count 0–150 and >150; vasopressors; hemodialysis) in another logistic regression model and assigning points to variables according to β coefficient values. Overall mortality at 1 yr was 48%. The area under the receiver operating characteristic curve for the primary ProVent probability model was 0.79 (95% confidence interval, 0.75–0.81), and the p value for the Hosmer-Lemeshow goodness-of-fit statistic was .89. The area under the curve for the categorical model was 0.77, and the p value for the goodness-of-fit statistic was .34. The area under the curve for the ProVent score was 0.76, and the p value for the Hosmer-Lemeshow goodness-of-fit statistic was .60. For the 50 patients with a ProVent score >2, only one patient was able to be discharged directly home, and 1-yr mortality was 86%.
Conclusion The ProVent probability model is a simple and reproducible model that can accurately identify patients requiring prolonged mechanical ventilation who are at high risk of 1-yr mortality. PMID:22080643
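    The "assigning points according to β coefficient values" step can be sketched generically: divide each coefficient by the smallest one in magnitude and round to an integer. The coefficients below are hypothetical placeholders, not the published ProVent estimates:

```python
def scoring_rule(betas, base=None):
    """Turn logistic-regression coefficients for categorical risk factors
    into an integer point score, in the spirit of rules like ProVent:
    divide each beta by the smallest beta in magnitude (or a chosen base)
    and round."""
    if base is None:
        base = min(abs(b) for b in betas.values())
    return {k: round(b / base) for k, b in betas.items()}

# Hypothetical coefficients for the categorized ProVent-style risk factors.
betas = {"age_50_64": 0.7, "age_over_65": 1.4, "platelets_le_150": 0.8,
         "vasopressors": 1.5, "hemodialysis": 0.7}
points = scoring_rule(betas)
```

A patient's score is then the sum of the points for the categories that apply, which is what makes such rules usable at the bedside.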

  3. Statistical mechanical analysis of linear programming relaxation for combinatorial optimization problems

    NASA Astrophysics Data System (ADS)

    Takabe, Satoshi; Hukushima, Koji

    2016-05-01

    Typical behavior of the linear programming (LP) problem is studied as a relaxation of the minimum vertex cover (min-VC), a type of integer programming (IP) problem. A lattice-gas model on the Erdös-Rényi random graphs of α-uniform hyperedges is proposed to express both the LP and IP problems of the min-VC in the common statistical mechanical model with a one-parameter family. Statistical mechanical analyses reveal for α=2 that the LP optimal solution is typically equal to that given by the IP below the critical average degree c=e in the thermodynamic limit. The critical threshold for good accuracy of the relaxation extends the mathematical result c=1 and coincides with the replica symmetry-breaking threshold of the IP. The LP relaxation for the minimum hitting sets with α≥3, minimum vertex covers on α-uniform random graphs, is also studied. Analytic and numerical results strongly suggest that the LP relaxation fails to estimate optimal values above the critical average degree c=e/(α-1) where the replica symmetry is broken.

  4. Brownian motion or Lévy walk? Stepping towards an extended statistical mechanics for animal locomotion.

    PubMed

    Gautestad, Arild O

    2012-09-07

    Animals moving under the influence of spatio-temporal scaling and long-term memory generate a kind of space-use pattern that has proved difficult to model within a coherent theoretical framework. An extended kind of statistical mechanics is needed, accounting for both the effects of spatial memory and scale-free space use, and put into a context of ecological conditions. Simulations illustrating the distinction between scale-specific and scale-free locomotion are presented. The results show how observational scale (time lag between relocations of an individual) may critically influence the interpretation of the underlying process. In this respect, a novel protocol is proposed as a method to distinguish between some main movement classes. For example, the 'power law in disguise' paradox (arising from a composite Brownian motion consisting of a superposition of independent movement processes at different scales) may be resolved by shifting the focus from pattern analysis at one particular temporal resolution towards a more process-oriented approach involving several scales of observation. A more explicit consideration of system complexity within a statistical mechanical framework, supplementing the more traditional mechanistic modelling approach, is advocated.

  5. Statistical mechanical analysis of linear programming relaxation for combinatorial optimization problems.

    PubMed

    Takabe, Satoshi; Hukushima, Koji

    2016-05-01

    Typical behavior of the linear programming (LP) problem is studied as a relaxation of the minimum vertex cover (min-VC), a type of integer programming (IP) problem. A lattice-gas model on the Erdös-Rényi random graphs of α-uniform hyperedges is proposed to express both the LP and IP problems of the min-VC in the common statistical mechanical model with a one-parameter family. Statistical mechanical analyses reveal for α=2 that the LP optimal solution is typically equal to that given by the IP below the critical average degree c=e in the thermodynamic limit. The critical threshold for good accuracy of the relaxation extends the mathematical result c=1 and coincides with the replica symmetry-breaking threshold of the IP. The LP relaxation for the minimum hitting sets with α≥3, minimum vertex covers on α-uniform random graphs, is also studied. Analytic and numerical results strongly suggest that the LP relaxation fails to estimate optimal values above the critical average degree c=e/(α-1) where the replica symmetry is broken.
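    An illustrative sketch of the LP-versus-IP comparison (not the authors' statistical mechanical analysis): for minimum vertex cover the LP relaxation always admits a half-integral optimum (a classical Nemhauser-Trotter result), so on a toy graph both optima can be found by brute-force enumeration over {0, 1/2, 1} and {0, 1} vertex labelings.

```python
from itertools import product

def min_cover_value(edges, n, levels):
    # Enumerate all vertex labelings over the given value levels and return
    # the minimum total weight subject to x_u + x_v >= 1 on every edge.
    best = float("inf")
    for x in product(levels, repeat=n):
        if all(x[u] + x[v] >= 1 for u, v in edges):
            best = min(best, sum(x))
    return best

# A triangle: the IP optimum is 2 (any two vertices), while the LP optimum
# is 3/2 (x = 1/2 on every vertex), so the relaxation is not tight here.
triangle = [(0, 1), (1, 2), (0, 2)]
ip = min_cover_value(triangle, 3, (0, 1))        # -> 2
lp = min_cover_value(triangle, 3, (0, 0.5, 1))   # -> 1.5
```

    On a path of two edges, by contrast, both enumerations return 1: the relaxation is tight, echoing the regime below the critical average degree where LP and IP optima coincide.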

  6. A Three Dimensional Kinematic and Kinetic Study of the Golf Swing

    PubMed Central

    Nesbit, Steven M.

    2005-01-01

    This paper discusses the three-dimensional kinematics and kinetics of a golf swing as performed by 84 male and one female amateur subjects of various skill levels. The analysis was performed using a variable full-body computer model of a human coupled with a flexible model of a golf club. Data to drive the model was obtained from subject swings recorded using a multi-camera motion analysis system. Model output included club trajectories, golfer/club interaction forces and torques, work and power, and club deflections. These data formed the basis for a statistical analysis of all subjects, and a detailed analysis and comparison of the swing characteristics of four of the subjects. The analysis generated much new data concerning the mechanics of the golf swing. It revealed that a golf swing is a highly coordinated and individual motion and subject-to-subject variations were significant. The study highlighted the importance of the wrists in generating club head velocity and orienting the club face. The trajectory of the hands and the ability to do work were the factors most closely related to skill level. Key Points: Full-body model of the golf swing. Mechanical description of the golf swing. Statistical analysis of golf swing mechanics. Comparisons of subject swing mechanics. PMID:24627665

  7. A three dimensional kinematic and kinetic study of the golf swing.

    PubMed

    Nesbit, Steven M

    2005-12-01

    This paper discusses the three-dimensional kinematics and kinetics of a golf swing as performed by 84 male and one female amateur subjects of various skill levels. The analysis was performed using a variable full-body computer model of a human coupled with a flexible model of a golf club. Data to drive the model was obtained from subject swings recorded using a multi-camera motion analysis system. Model output included club trajectories, golfer/club interaction forces and torques, work and power, and club deflections. These data formed the basis for a statistical analysis of all subjects, and a detailed analysis and comparison of the swing characteristics of four of the subjects. The analysis generated much new data concerning the mechanics of the golf swing. It revealed that a golf swing is a highly coordinated and individual motion and subject-to-subject variations were significant. The study highlighted the importance of the wrists in generating club head velocity and orienting the club face. The trajectory of the hands and the ability to do work were the factors most closely related to skill level. Key Points: Full-body model of the golf swing. Mechanical description of the golf swing. Statistical analysis of golf swing mechanics. Comparisons of subject swing mechanics.

  8. Controlling reactivity of nanoporous catalyst materials by tuning reaction product-pore interior interactions: Statistical mechanical modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jing; Ackerman, David M.; Lin, Victor S.-Y.

    2013-04-02

    Statistical mechanical modeling is performed of a catalytic conversion reaction within a functionalized nanoporous material to assess the effect of varying the reaction product-pore interior interaction from attractive to repulsive. A strong enhancement in reactivity is observed not just due to the shift in reaction equilibrium towards completion but also due to enhanced transport within the pore resulting from reduced loading. The latter effect is strongest for highly restricted transport (single-file diffusion), and applies even for irreversible reactions. The analysis is performed utilizing a generalized hydrodynamic formulation of the reaction-diffusion equations which can reliably capture the complex interplay between reaction and restricted transport.

  9. Context-Aware Generative Adversarial Privacy

    NASA Astrophysics Data System (ADS)

    Huang, Chong; Kairouz, Peter; Chen, Xiao; Sankar, Lalitha; Rajagopal, Ram

    2017-12-01

    Preserving the utility of published datasets while simultaneously providing provable privacy guarantees is a well-known challenge. On the one hand, context-free privacy solutions, such as differential privacy, provide strong privacy guarantees, but often lead to a significant reduction in utility. On the other hand, context-aware privacy solutions, such as information theoretic privacy, achieve an improved privacy-utility tradeoff, but assume that the data holder has access to dataset statistics. We circumvent these limitations by introducing a novel context-aware privacy framework called generative adversarial privacy (GAP). GAP leverages recent advancements in generative adversarial networks (GANs) to allow the data holder to learn privatization schemes from the dataset itself. Under GAP, learning the privacy mechanism is formulated as a constrained minimax game between two players: a privatizer that sanitizes the dataset in a way that limits the risk of inference attacks on the individuals' private variables, and an adversary that tries to infer the private variables from the sanitized dataset. To evaluate GAP's performance, we investigate two simple (yet canonical) statistical dataset models: (a) the binary data model, and (b) the binary Gaussian mixture model. For both models, we derive game-theoretically optimal minimax privacy mechanisms, and show that the privacy mechanisms learned from data (in a generative adversarial fashion) match the theoretically optimal ones. This demonstrates that our framework can be easily applied in practice, even in the absence of dataset statistics.

  10. JOURNAL SCOPE GUIDELINES: Paper classification scheme

    NASA Astrophysics Data System (ADS)

    2005-06-01

    This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org).
    1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory.
    2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models.
    3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods.
    4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics.
    5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials.
    6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.

  11. A κ-generalized statistical mechanics approach to income analysis

    NASA Astrophysics Data System (ADS)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2009-02-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
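    A minimal sketch of the distribution's building blocks, assuming the standard κ-exponential exp_κ(x) = (√(1+κ²x²) + κx)^(1/κ) and the survival function P(X > x) = exp_κ(-βx^α); parameter values below are illustrative, not fitted to data.

```python
import math

def exp_kappa(x, kappa):
    # kappa-exponential; reduces to the ordinary exponential as kappa -> 0.
    if kappa == 0:
        return math.exp(x)
    return (math.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

def survival(x, alpha, beta, kappa):
    # P(X > x) for the kappa-generalized income distribution:
    # Weibull-like at low incomes, Pareto power law ~ x^(-alpha/kappa)
    # in the high-income regime.
    return exp_kappa(-beta * x**alpha, kappa)
```

    The high-income tail exponent α/κ can be checked numerically: with α=1, κ=0.5, doubling a large income should multiply the survival probability by roughly 2^(-α/κ) = 1/4.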

  12. Prediction of Chemical Function: Model Development and Application

    EPA Science Inventory

    The United States Environmental Protection Agency’s Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (...

  13. Theory-based Bayesian Models of Inductive Inference

    DTIC Science & Technology

    2010-07-19

    Subjective randomness and natural scene statistics. Psychonomic Bulletin & Review. http://cocosci.berkeley.edu/tom/papers/randscenes.pdf (In press). Exemplar models as a mechanism for performing Bayesian inference. Psychonomic Bulletin & Review. http://cocosci.berkeley.edu/tom

  14. Modeling property evolution of container materials used in nuclear waste storage

    NASA Astrophysics Data System (ADS)

    Li, Dongsheng; Garmestani, Hamid; Khaleel, Moe; Sun, Xin

    2010-03-01

    Container materials exposed to irradiation over long periods accumulate high energy in their structure, generating critical structural damage. This study investigated which kinds of mesoscale microstructure are more resistant to radiation damage. The evolution of mechanical properties during irradiation was modeled using statistical continuum mechanics. Preliminary results also showed how to achieve a desired microstructure with higher resistance to radiation.

  15. What You Learn is What You See: Using Eye Movements to Study Infant Cross-Situational Word Learning

    PubMed Central

    Smith, Linda

    2016-01-01

    Recent studies show that both adults and young children possess powerful statistical learning capabilities to solve the word-to-world mapping problem. However, the underlying mechanisms that make statistical learning possible and powerful are not yet known. With the goal of providing new insights into this issue, the research reported in this paper used an eye tracker to record the moment-by-moment eye movement data of 14-month-old babies in statistical learning tasks. Various measures are applied to such fine-grained temporal data, such as looking duration and shift rate (the number of shifts in gaze from one visual object to the other) trial by trial, showing different eye movement patterns between strong and weak statistical learners. Moreover, an information-theoretic measure is developed and applied to gaze data to quantify the degree of learning uncertainty trial by trial. Next, a simple associative statistical learning model is applied to eye movement data and these simulation results are compared with empirical results from young children, showing strong correlations between these two. This suggests that an associative learning mechanism with selective attention can provide a cognitively plausible model of cross-situational statistical learning. The work represents the first steps to use eye movement data to infer underlying real-time processes in statistical word learning. PMID:22213894
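    The information-theoretic uncertainty measure is not specified in the abstract; as an illustrative stand-in, one can take the Shannon entropy of a word's looking-time weights across candidate referents, where lower entropy corresponds to lower learning uncertainty.

```python
import math

def association_entropy(counts):
    # Shannon entropy (bits) of a word's association weights across
    # candidate referents; lower entropy = less learning uncertainty.
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)
```

    Uniform looking across four referents gives the maximum 2 bits; looking concentrated on one referent drives the entropy toward 0, the signature one would expect of a strong statistical learner.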

  16. An accurate behavioral model for single-photon avalanche diode statistical performance simulation

    NASA Astrophysics Data System (ADS)

    Xu, Yue; Zhao, Tingchen; Li, Ding

    2018-01-01

    An accurate behavioral model is presented to simulate important statistical performance of single-photon avalanche diodes (SPADs), such as dark count and after-pulsing noise. The derived simulation model takes into account all important generation mechanisms of the two kinds of noise. For the first time, thermal agitation, trap-assisted tunneling and band-to-band tunneling mechanisms are simultaneously incorporated in the simulation model to evaluate dark count behavior of SPADs fabricated in deep sub-micron CMOS technology. Meanwhile, a complete carrier trapping and de-trapping process is considered in afterpulsing model and a simple analytical expression is derived to estimate after-pulsing probability. In particular, the key model parameters of avalanche triggering probability and electric field dependence of excess bias voltage are extracted from Geiger-mode TCAD simulation and this behavioral simulation model doesn't include any empirical parameters. The developed SPAD model is implemented in Verilog-A behavioral hardware description language and successfully operated on commercial Cadence Spectre simulator, showing good universality and compatibility. The model simulation results are in a good accordance with the test data, validating high simulation accuracy.

  17. Modeling the microstructurally dependent mechanical properties of poly(ester-urethane-urea)s.

    PubMed

    Warren, P Daniel; Sycks, Dalton G; McGrath, Dominic V; Vande Geest, Jonathan P

    2013-12-01

    Poly(ester-urethane-urea) (PEUU) is one of many synthetic biodegradable elastomers under scrutiny for biomedical and soft tissue applications. The goal of this study was to investigate the effect of the experimental parameters on mechanical properties of PEUUs following exposure to different degrading environments, similar to that of the human body, using linear regression, producing one predictive model. The model utilizes two independent variables of poly(caprolactone) (PCL) type and copolymer crystallinity to predict the dependent variable of maximum tangential modulus (MTM). Results indicate that comparisons between PCLs at different degradation states are statistically different (p < 0.0003), while the difference between experimental and predicted average MTM is statistically negligible (p < 0.02). The linear correlation between experimental and predicted MTM values is R(2) = 0.75. Copyright © 2013 Wiley Periodicals, Inc., a Wiley Company.

  18. Current algebra, statistical mechanics and quantum models

    NASA Astrophysics Data System (ADS)

    Vilela Mendes, R.

    2017-11-01

    Results obtained in the past for free boson systems at zero and nonzero temperatures are revisited to clarify the physical meaning of current algebra reducible functionals which are associated to systems with density fluctuations, leading to observable effects on phase transitions. To use current algebra as a tool for the formulation of quantum statistical mechanics amounts to the construction of unitary representations of diffeomorphism groups. Two mathematical equivalent procedures exist for this purpose. One searches for quasi-invariant measures on configuration spaces, the other for a cyclic vector in Hilbert space. Here, one argues that the second approach is closer to the physical intuition when modelling complex systems. An example of application of the current algebra methodology to the pairing phenomenon in two-dimensional fermion systems is discussed.

  19. When mechanism matters: Bayesian forecasting using models of ecological diffusion

    USGS Publications Warehouse

    Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.

    2017-01-01

    Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.

  20. A thermomechanical constitutive model for cemented granular materials with quantifiable internal variables. Part I-Theory

    NASA Astrophysics Data System (ADS)

    Tengattini, Alessandro; Das, Arghya; Nguyen, Giang D.; Viggiani, Gioacchino; Hall, Stephen A.; Einav, Itai

    2014-10-01

    This is the first of two papers introducing a novel thermomechanical continuum constitutive model for cemented granular materials. Here, we establish the theoretical foundations of the model, and highlight its novelties. At the limit of no cement, the model is fully consistent with the original Breakage Mechanics model. An essential ingredient of the model is the use of measurable and micro-mechanics based internal variables, describing the evolution of the dominant inelastic processes. This imposes a link between the macroscopic mechanical behavior and the statistically averaged evolution of the microstructure. As a consequence this model requires only a few physically identifiable parameters, including those of the original breakage model and new ones describing the cement: its volume fraction, its critical damage energy and bulk stiffness, and the cohesion.

  1. Statistical mechanical model of gas adsorption in porous crystals with dynamic moieties

    PubMed Central

    Braun, Efrem; Carraro, Carlo; Smit, Berend

    2017-01-01

    Some nanoporous, crystalline materials possess dynamic constituents, for example, rotatable moieties. These moieties can undergo a conformation change in response to the adsorption of guest molecules, which qualitatively impacts adsorption behavior. We pose and solve a statistical mechanical model of gas adsorption in a porous crystal whose cages share a common ligand that can adopt two distinct rotational conformations. Guest molecules incentivize the ligands to adopt a different rotational configuration than maintained in the empty host. Our model captures inflections, steps, and hysteresis that can arise in the adsorption isotherm as a signature of the rotating ligands. The insights disclosed by our simple model contribute a more intimate understanding of the response and consequence of rotating ligands integrated into porous materials to harness them for gas storage and separations, chemical sensing, drug delivery, catalysis, and nanoscale devices. Particularly, our model reveals design strategies to exploit these moving constituents and engineer improved adsorbents with intrinsic thermal management for pressure-swing adsorption processes. PMID:28049851

  2. Statistical mechanical model of gas adsorption in porous crystals with dynamic moieties.

    PubMed

    Simon, Cory M; Braun, Efrem; Carraro, Carlo; Smit, Berend

    2017-01-17

    Some nanoporous, crystalline materials possess dynamic constituents, for example, rotatable moieties. These moieties can undergo a conformation change in response to the adsorption of guest molecules, which qualitatively impacts adsorption behavior. We pose and solve a statistical mechanical model of gas adsorption in a porous crystal whose cages share a common ligand that can adopt two distinct rotational conformations. Guest molecules incentivize the ligands to adopt a different rotational configuration than maintained in the empty host. Our model captures inflections, steps, and hysteresis that can arise in the adsorption isotherm as a signature of the rotating ligands. The insights disclosed by our simple model contribute a more intimate understanding of the response and consequence of rotating ligands integrated into porous materials to harness them for gas storage and separations, chemical sensing, drug delivery, catalysis, and nanoscale devices. Particularly, our model reveals design strategies to exploit these moving constituents and engineer improved adsorbents with intrinsic thermal management for pressure-swing adsorption processes.
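    A hedged toy version of such a model (not the authors' exact Hamiltonian): consider one cage whose shared ligand is either in the empty-host conformation, exposing one weak adsorption site, or, at an energy penalty ε, in a rotated conformation exposing two strong sites. The grand partition function per cage then yields a step-like isotherm as pressure incentivizes the rotated conformation. All parameter values are illustrative.

```python
import math

def occupancy(p, KA=0.01, KB=1.0, eps_over_kT=6.0):
    # Mean guest occupancy of one cage with a rotatable ligand.
    w = math.exp(-eps_over_kT)       # Boltzmann weight of the rotated conformation
    ZA = 1.0 + KA * p                # empty-host conformation: one weak site
    ZB = (1.0 + KB * p) ** 2         # rotated conformation: two strong sites
    Z = ZA + w * ZB                  # grand partition function per cage
    # <n> = p * dln(Z)/dp
    return (KA * p + w * 2.0 * KB * p * (1.0 + KB * p)) / Z
```

    At low pressure the weak site dominates and occupancy grows slowly; once the two-site term outweighs the energy penalty, occupancy steps up toward 2 guests per cage, mimicking the inflections and steps described in the abstract.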

  3. Modeling of Pedestrian Flows Using Hybrid Models of Euler Equations and Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Bärwolff, Günter; Slawig, Thomas; Schwandt, Hartmut

    2007-09-01

    In recent years, various systems have been developed for controlling, planning and predicting the traffic of persons and vehicles, particularly under security aspects. Going beyond pure counting and statistical models, approaches based on well-known concepts originally developed in very different research areas, namely continuum mechanics and computer science, have been found to be very adequate and accurate. In the present paper, we outline a continuum mechanical approach for the description of pedestrian flow.

  4. The Quantum and Fluid Mechanics of Global Warming

    NASA Astrophysics Data System (ADS)

    Marston, Brad

    2008-03-01

    Quantum physics and fluid mechanics are the foundation of any understanding of the Earth's climate. In this talk I invoke three well-known aspects of quantum mechanics to explore what will happen as the concentrations of greenhouse gases such as carbon dioxide continue to increase. Fluid dynamical models of the Earth's atmosphere, demonstrated here in live simulations, yield further insight into past, present, and future climates. Statistics of geophysical flows can, however, be ascertained directly without recourse to numerical simulation, using concepts borrowed from nonequilibrium statistical mechanics [J. B. Marston, E. Conover, and Tapio Schneider, "Statistics of an Unstable Barotropic Jet from a Cumulant Expansion," arXiv:0705.0011, J. Atmos. Sci. (in press)]. I discuss several other ways that theoretical physics may be able to contribute to a deeper understanding of climate change [J. Carlson, J. Harte, G. Falkovich, J. B. Marston, and R. Pierrehumbert, "Physics of Climate Change," 2008 Program of the Kavli Institute for Theoretical Physics].

  5. Interference in the classical probabilistic model and its representation in complex Hilbert space

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei Yu.

    2005-10-01

    The notion of a context (complex of physical conditions, that is to say: specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are already present in latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: "first context and only then probability". We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus the dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as the Hilbert space projection of a realistic dynamics in a "prespace". The basic condition for representing the prespace dynamics is the law of statistical conservation of energy, i.e., conservation of probabilities. In general, the Hilbert space projection of the prespace dynamics can be nonlinear and even irreversible (but it is always unitary). The methods developed in this paper can be applied not only to quantum mechanics but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems; in particular, it was recently found in some psychological experiments.

  6. Helium abundance and speed difference between helium ions and protons in the solar wind from coronal holes, active regions, and quiet Sun

    NASA Astrophysics Data System (ADS)

    Fu, Hui; Madjarska, M. S.; Li, Bo; Xia, LiDong; Huang, ZhengHua

    2018-05-01

    Two main models have been developed to explain the mechanisms of release, heating and acceleration of the nascent solar wind: the wave-turbulence-driven (WTD) models and the reconnection-loop-opening (RLO) models, in which the plasma release processes are fundamentally different. Given that the statistical observational properties of helium ions produced in magnetically diverse solar regions could provide valuable information for solar wind modelling, we examine the statistical properties of the helium abundance (AHe) and the speed difference between helium ions and protons (vαp) for coronal holes (CHs), active regions (ARs) and the quiet Sun (QS). We find bimodal distributions in the space of AHe and vαp/vA (where vA is the local Alfvén speed) for the solar wind as a whole. The CH wind measurements are concentrated at higher AHe and vαp/vA values with a smaller AHe distribution range, while the AR and QS wind is associated with lower AHe and vαp/vA, and a larger AHe distribution range. The magnetic diversity of the source regions and the physical processes related to it are possibly responsible for the different properties of AHe and vαp/vA. The statistical results suggest that the two solar wind generation mechanisms, WTD and RLO, work in parallel in all solar wind source regions. In CH regions WTD plays a major role, whereas the RLO mechanism is more important in AR and QS.

  7. Nonlinear sigma models with compact hyperbolic target spaces

    NASA Astrophysics Data System (ADS)

    Gubser, Steven; Saleem, Zain H.; Schoenholz, Samuel S.; Stoica, Bogdan; Stokes, James

    2016-06-01

    We explore the phase structure of nonlinear sigma models with target spaces corresponding to compact quotients of hyperbolic space, focusing on the case of a hyperbolic genus-2 Riemann surface. The continuum theory of these models can be approximated by a lattice spin system which we simulate using Monte Carlo methods. The target space possesses interesting geometric and topological properties which are reflected in novel features of the sigma model. In particular, we observe a topological phase transition at a critical temperature, above which vortices proliferate, reminiscent of the Kosterlitz-Thouless phase transition in the O(2) model [1, 2]. Unlike in the O(2) case, there are many different types of vortices, suggesting a possible analogy to the Hagedorn treatment of statistical mechanics of a proliferating number of hadron species. Below the critical temperature the spins cluster around six special points in the target space known as Weierstrass points. The diversity of compact hyperbolic manifolds suggests that our model is only the simplest example of a broad class of statistical mechanical models whose main features can be understood essentially in geometric terms.
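    As a concrete point of reference for the lattice-spin approximation, here is a minimal Metropolis sweep for the O(2) (XY) model that the abstract uses as its comparison case; the actual target space in the paper is a hyperbolic genus-2 surface, which this sketch does not implement.

```python
import math
import random

def xy_metropolis_sweep(theta, beta, rng):
    # One Metropolis sweep of the 2D XY (O(2)) model: spins are angles on an
    # L x L periodic lattice with energy E = -sum_<ij> cos(theta_i - theta_j).
    L = len(theta)
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        old = theta[i][j]
        new = old + rng.uniform(-1.0, 1.0)
        dE = 0.0
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = theta[(i + di) % L][(j + dj) % L]
            dE += math.cos(old - nb) - math.cos(new - nb)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            theta[i][j] = new
    return theta

def nn_alignment(theta):
    # Average cos(theta_i - theta_j) over nearest-neighbor pairs:
    # near 1 in the low-temperature (quasi-ordered) phase.
    L = len(theta)
    total = 0.0
    for i in range(L):
        for j in range(L):
            total += math.cos(theta[i][j] - theta[(i + 1) % L][j])
            total += math.cos(theta[i][j] - theta[i][(j + 1) % L])
    return total / (2 * L * L)
```

    For a hyperbolic target one would replace the angle update and cosine interaction with moves on the quotient manifold and a geodesic-distance-based energy; the Metropolis skeleton is unchanged.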

  8. Nonlinear sigma models with compact hyperbolic target spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gubser, Steven; Saleem, Zain H.; Schoenholz, Samuel S.

    We explore the phase structure of nonlinear sigma models with target spaces corresponding to compact quotients of hyperbolic space, focusing on the case of a hyperbolic genus-2 Riemann surface. The continuum theory of these models can be approximated by a lattice spin system which we simulate using Monte Carlo methods. The target space possesses interesting geometric and topological properties which are reflected in novel features of the sigma model. In particular, we observe a topological phase transition at a critical temperature, above which vortices proliferate, reminiscent of the Kosterlitz-Thouless phase transition in the O(2) model [1, 2]. Unlike in the O(2) case, there are many different types of vortices, suggesting a possible analogy to the Hagedorn treatment of statistical mechanics of a proliferating number of hadron species. Below the critical temperature the spins cluster around six special points in the target space known as Weierstrass points. In conclusion, the diversity of compact hyperbolic manifolds suggests that our model is only the simplest example of a broad class of statistical mechanical models whose main features can be understood essentially in geometric terms.

  9. Nonlinear sigma models with compact hyperbolic target spaces

    DOE PAGES

    Gubser, Steven; Saleem, Zain H.; Schoenholz, Samuel S.; ...

    2016-06-23

    We explore the phase structure of nonlinear sigma models with target spaces corresponding to compact quotients of hyperbolic space, focusing on the case of a hyperbolic genus-2 Riemann surface. The continuum theory of these models can be approximated by a lattice spin system which we simulate using Monte Carlo methods. The target space possesses interesting geometric and topological properties which are reflected in novel features of the sigma model. In particular, we observe a topological phase transition at a critical temperature, above which vortices proliferate, reminiscent of the Kosterlitz-Thouless phase transition in the O(2) model [1, 2]. Unlike in the O(2) case, there are many different types of vortices, suggesting a possible analogy to the Hagedorn treatment of statistical mechanics of a proliferating number of hadron species. Below the critical temperature the spins cluster around six special points in the target space known as Weierstrass points. In conclusion, the diversity of compact hyperbolic manifolds suggests that our model is only the simplest example of a broad class of statistical mechanical models whose main features can be understood essentially in geometric terms.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Kandler; Shi, Ying; Santhanagopalan, Shriram

    Predictive models of Li-ion battery lifetime must consider a multiplicity of electrochemical, thermal, and mechanical degradation modes experienced by batteries in application environments. To complicate matters, Li-ion batteries can experience different degradation trajectories that depend on storage and cycling history of the application environment. Rates of degradation are controlled by factors such as temperature history, electrochemical operating window, and charge/discharge rate. We present a generalized battery life prognostic model framework for battery systems design and control. The model framework consists of trial functions that are statistically regressed to Li-ion cell life datasets wherein the cells have been aged under different levels of stress. Degradation mechanisms and rate laws dependent on temperature, storage, and cycling condition are regressed to the data, with multiple model hypotheses evaluated and the best model down-selected based on statistics. The resulting life prognostic model, implemented in state variable form, is extensible to arbitrary real-world scenarios. The model is applicable in real-time control algorithms to maximize battery life and performance. We discuss efforts to reduce lifetime prediction error and accommodate its inevitable impact in controller design.
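The trial-function regression step can be sketched in miniature: generate synthetic aging data from a two-term fade law (a square-root calendar term plus a linear cycling term) and recover the coefficients by least squares. The functional form, coefficients, and data below are hypothetical, not drawn from the cell life datasets described above.

```python
import math, random

random.seed(1)
# Hypothetical two-term capacity-fade trial function:
#   fade = a * sqrt(days) + b * cycles + noise
a_true, b_true = 0.002, 1.5e-4
data = []
for _ in range(60):
    t = random.uniform(10, 800)   # storage time, days
    n = random.uniform(0, 2000)   # equivalent full cycles
    fade = a_true * math.sqrt(t) + b_true * n + random.gauss(0, 1e-3)
    data.append((t, n, fade))

# Least-squares fit of the two coefficients via the 2x2 normal equations
Sxx = sum(t for t, n, f in data)                      # sum of sqrt(t)^2
Sxy = sum(math.sqrt(t) * n for t, n, f in data)
Syy = sum(n * n for t, n, f in data)
Sxf = sum(math.sqrt(t) * f for t, n, f in data)
Syf = sum(n * f for t, n, f in data)
det = Sxx * Syy - Sxy * Sxy
a_hat = (Syy * Sxf - Sxy * Syf) / det
b_hat = (Sxx * Syf - Sxy * Sxf) / det
print(f"fitted a={a_hat:.4g} (true {a_true}), b={b_hat:.4g} (true {b_true})")
```

In the framework above, several such candidate rate laws would be fitted and the best one down-selected statistically; this sketch shows only the fitting of a single hypothesis.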

  11. Brownian motion or Lévy walk? Stepping towards an extended statistical mechanics for animal locomotion

    PubMed Central

    Gautestad, Arild O.

    2012-01-01

    Animals moving under the influence of spatio-temporal scaling and long-term memory generate a kind of space-use pattern that has proved difficult to model within a coherent theoretical framework. An extended kind of statistical mechanics is needed, accounting for both the effects of spatial memory and scale-free space use, and put into a context of ecological conditions. Simulations illustrating the distinction between scale-specific and scale-free locomotion are presented. The results show how observational scale (time lag between relocations of an individual) may critically influence the interpretation of the underlying process. In this respect, a novel protocol is proposed as a method to distinguish between some main movement classes. For example, the ‘power law in disguise’ paradox—from a composite Brownian motion consisting of a superposition of independent movement processes at different scales—may be resolved by shifting the focus from pattern analysis at one particular temporal resolution towards a more process-oriented approach involving several scales of observation. A more explicit consideration of system complexity within a statistical mechanical framework, supplementing the more traditional mechanistic modelling approach, is advocated. PMID:22456456
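The influence of observational scale can be reproduced with a toy composite Brownian walk: the same path shows heavy-tailed step statistics at a fine sampling lag and near-Gaussian statistics at a coarse one. The mode-switching period, step scales, and lags below are invented for illustration and are not taken from the article.

```python
import math, random, statistics

random.seed(2)
# Composite Brownian walk: two movement modes with different step scales,
# switching every 200 steps (a hypothetical 'power law in disguise' generator)
x, path = 0.0, [0.0]
for k in range(20000):
    sigma = 0.1 if (k // 200) % 2 == 0 else 2.0
    x += random.gauss(0, sigma)
    path.append(x)

def excess_kurtosis(xs):
    # sample excess kurtosis: 0 for a Gaussian, positive for heavy tails
    m = statistics.fmean(xs)
    s2 = statistics.fmean([(v - m) ** 2 for v in xs])
    m4 = statistics.fmean([(v - m) ** 4 for v in xs])
    return m4 / (s2 * s2) - 3.0

for lag in (1, 1000):
    # overlapping displacements at the given observational lag
    disp = [path[i + lag] - path[i] for i in range(len(path) - lag)]
    print(f"lag={lag:5d}  excess kurtosis={excess_kurtosis(disp):+.2f}")
```

At lag 1 the step distribution is a variance mixture and strongly leptokurtic; at a lag spanning several mode cycles it relaxes toward Gaussian, mirroring how the inferred movement class depends on the relocation interval.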

  12. 10 CFR 431.17 - Determination of efficiency.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... method or methods used; the mathematical model, the engineering or statistical analysis, computer... accordance with § 431.16 of this subpart, or by application of an alternative efficiency determination method... must be: (i) Derived from a mathematical model that represents the mechanical and electrical...

  13. A review of failure models for unidirectional ceramic matrix composites under monotonic loads

    NASA Technical Reports Server (NTRS)

    Tripp, David E.; Hemann, John H.; Gyekenyesi, John P.

    1989-01-01

    Ceramic matrix composites offer significant potential for improving the performance of turbine engines. In order to achieve their potential, however, improvements in design methodology are needed. In the past most components using structural ceramic matrix composites were designed by trial and error since the emphasis of feasibility demonstration minimized the development of mathematical models. To understand the key parameters controlling response and the mechanics of failure, the development of structural failure models is required. A review of short term failure models with potential for ceramic matrix composite laminates under monotonic loads is presented. Phenomenological, semi-empirical, shear-lag, fracture mechanics, damage mechanics, and statistical models for the fast fracture analysis of continuous fiber unidirectional ceramic matrix composites under monotonic loads are surveyed.

  14. Statistical mechanics of two-dimensional shuffled foams: prediction of the correlation between geometry and topology.

    PubMed

    Durand, Marc; Käfer, Jos; Quilliet, Catherine; Cox, Simon; Talebi, Shirin Ataei; Graner, François

    2011-10-14

    We propose an analytical model for the statistical mechanics of shuffled two-dimensional foams with moderate bubble size polydispersity. It predicts without any adjustable parameters the correlations between the number of sides n of the bubbles (topology) and their areas A (geometry) observed in experiments and numerical simulations of shuffled foams. Detailed statistics show that in shuffled cellular patterns n correlates better with √A (as claimed by Desch and Feltham) than with A (as claimed by Lewis and widely assumed in the literature). At the level of the whole foam, standard deviations Δn and ΔA are in proportion. Possible applications include correlations of the detailed distributions of n and A, three-dimensional foams, and biological tissues.
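The Desch-Feltham versus Lewis question reduces to comparing two correlation coefficients. A synthetic sketch follows; the bubble statistics below are generated for illustration (with side numbers built to follow the square-root correlation) and are not the paper's data.

```python
import math, random, statistics

random.seed(3)
# Hypothetical shuffled-foam statistics: areas with moderate polydispersity,
# side numbers generated around the Desch-Feltham form n ~ sqrt(A)
areas = [random.lognormvariate(0, 0.3) for _ in range(500)]
sides = [max(3, round(6 * math.sqrt(a) + random.gauss(0, 0.7))) for a in areas]

def pearson(xs, ys):
    # plain Pearson correlation coefficient
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

r_sqrtA = pearson([math.sqrt(a) for a in areas], sides)
r_A = pearson(areas, sides)
print(f"corr(n, sqrt(A)) = {r_sqrtA:.3f}   corr(n, A) = {r_A:.3f}")
```

Running the same comparison on measured foam data is the statistical test that distinguishes the two proposed correlations.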

  15. Detecting temperature fluctuations at equilibrium.

    PubMed

    Dixit, Purushottam D

    2015-05-21

    The Gibbs and the Boltzmann definition of temperature agree only in the macroscopic limit. The ambiguity in identifying the equilibrium temperature of a finite-sized 'small' system exchanging energy with a bath is usually understood as a limitation of conventional statistical mechanics. We interpret this ambiguity as resulting from a stochastically fluctuating temperature coupled with the phase space variables giving rise to a broad temperature distribution. With this ansatz, we develop the equilibrium statistics and dynamics of small systems. Numerical evidence using an analytically tractable model shows that the effects of temperature fluctuations can be detected in the equilibrium and dynamical properties of the phase space of the small system. Our theory generalizes statistical mechanics to small systems relevant in biophysics and nanotechnology.
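The fluctuating-temperature ansatz can be caricatured in a superstatistics-style sketch: drawing the inverse temperature of a few harmonic degrees of freedom from a distribution visibly broadens the energy statistics relative to a fixed-temperature bath. The gamma distribution and all parameter values are our illustrative choices, not the paper's model.

```python
import random, statistics

random.seed(4)
beta0, n_dof, samples = 1.0, 3, 20000

def energy(beta):
    # energy of n_dof independent harmonic modes at inverse temperature beta:
    # each mode contributes an exponentially distributed energy with rate beta
    return sum(random.expovariate(beta) for _ in range(n_dof))

# fixed bath temperature vs. a gamma-fluctuating inverse temperature
# (shape 4, scale beta0/4, so the mean inverse temperature is still beta0)
fixed = [energy(beta0) for _ in range(samples)]
fluct = [energy(random.gammavariate(4.0, beta0 / 4.0)) for _ in range(samples)]

print("energy variance, fixed T     :", round(statistics.variance(fixed), 2))
print("energy variance, fluctuating :", round(statistics.variance(fluct), 2))
```

The broadened variance in the second case is the kind of equilibrium signature the paper argues can reveal temperature fluctuations in small systems.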

  16. Complex patterns of abnormal heartbeats

    NASA Technical Reports Server (NTRS)

    Schulte-Frohlinde, Verena; Ashkenazy, Yosef; Goldberger, Ary L.; Ivanov, Plamen Ch; Costa, Madalena; Morley-Davies, Adrian; Stanley, H. Eugene; Glass, Leon

    2002-01-01

    Individuals having frequent abnormal heartbeats interspersed with normal heartbeats may be at an increased risk of sudden cardiac death. However, mechanistic understanding of such cardiac arrhythmias is limited. We present a visual and qualitative method to display statistical properties of abnormal heartbeats. We introduce dynamical "heartprints" which reveal characteristic patterns in long clinical records encompassing approximately 10(5) heartbeats and may provide information about underlying mechanisms. We test if these dynamics can be reproduced by model simulations in which abnormal heartbeats are generated (i) randomly, (ii) at a fixed time interval following a preceding normal heartbeat, or (iii) by an independent oscillator that may or may not interact with the normal heartbeat. We compare the results of these three models and test their limitations to comprehensively simulate the statistical features of selected clinical records. This work introduces methods that can be used to test mathematical models of arrhythmogenesis and to develop a new understanding of underlying electrophysiologic mechanisms of cardiac arrhythmia.

  17. Statistical Physics on the Eve of the 21st Century: in Honour of J B McGuire on the Occasion of His 65th Birthday

    NASA Astrophysics Data System (ADS)

    Batchelor, Murray T.; Wille, Luc T.

    The Table of Contents for the book is as follows: * Preface * Modelling the Immune System - An Example of the Simulation of Complex Biological Systems * Brief Overview of Quantum Computation * Quantal Information in Statistical Physics * Modeling Economic Randomness: Statistical Mechanics of Market Phenomena * Essentially Singular Solutions of Feigenbaum-Type Functional Equations * Spatiotemporal Chaotic Dynamics in Coupled Map Lattices * Approach to Equilibrium of Chaotic Systems * From Level to Level in Brain and Behavior * Linear and Entropic Transformations of the Hydrophobic Free Energy Sequence Help Characterize a Novel Brain Polyprotein: CART's Protein * Dynamical Systems Response to Pulsed High-Frequency Fields * Bose-Einstein Condensates in the Light of Nonlinear Physics * Markov Superposition Expansion for the Entropy and Correlation Functions in Two and Three Dimensions * Calculation of Wave Center Deflection and Multifractal Analysis of Directed Waves Through the Study of su(1,1) Ferromagnets * Spectral Properties and Phases in Hierarchical Master Equations * Universality of the Distribution Functions of Random Matrix Theory * The Universal Chiral Partition Function for Exclusion Statistics * Continuous Space-Time Symmetries in a Lattice Field Theory * Quelques Cas Limites du Problème à N Corps Unidimensionnel (Some Limiting Cases of the One-Dimensional N-Body Problem) * Integrable Models of Correlated Electrons * On the Riemann Surface of the Three-State Chiral Potts Model * Two Exactly Soluble Lattice Models in Three Dimensions * Competition of Ferromagnetic and Antiferromagnetic Order in the Spin-1/2 XXZ Chain at Finite Temperature * Extended Vertex Operator Algebras and Monomial Bases * Parity and Charge Conjugation Symmetries and S Matrix of the XXZ Chain * An Exactly Solvable Constrained XXZ Chain * Integrable Mixed Vertex Models From the Braid-Monoid Algebra * From Yang-Baxter Equations to Dynamical Zeta Functions for Birational Transformations * Hexagonal Lattice Directed Site Animals * Direction in the Star-Triangle Relations * A Self-Avoiding Walk Through Exactly Solved Lattice Models in Statistical Mechanics

  18. Statistical label fusion with hierarchical performance models

    PubMed Central

    Asman, Andrew J.; Dagley, Alexander S.; Landman, Bennett A.

    2014-01-01

    Label fusion is a critical step in many image segmentation frameworks (e.g., multi-atlas segmentation) as it provides a mechanism for generalizing a collection of labeled examples into a single estimate of the underlying segmentation. In the multi-label case, typical label fusion algorithms treat all labels equally – fully neglecting the known, yet complex, anatomical relationships exhibited in the data. To address this problem, we propose a generalized statistical fusion framework using hierarchical models of rater performance. Building on the seminal work in statistical fusion, we reformulate the traditional rater performance model from a multi-tiered hierarchical perspective. This new approach provides a natural framework for leveraging known anatomical relationships and accurately modeling the types of errors that raters (or atlases) make within a hierarchically consistent formulation. Herein, we describe several contributions. First, we derive a theoretical advancement to the statistical fusion framework that enables the simultaneous estimation of multiple (hierarchical) performance models within the statistical fusion context. Second, we demonstrate that the proposed hierarchical formulation is highly amenable to the state-of-the-art advancements that have been made to the statistical fusion framework. Lastly, in an empirical whole-brain segmentation task we demonstrate substantial qualitative and significant quantitative improvement in overall segmentation accuracy. PMID:24817809
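For background, the simplest non-hierarchical fusion rule, majority voting, can be sketched in a few lines; it is the kind of flat baseline that statistical fusion methods with rater performance models improve upon. The rater reliabilities and labels below are hypothetical, and this is not the paper's hierarchical algorithm.

```python
import random
from collections import Counter

random.seed(9)
# Simulated ground truth and raters with unequal (hypothetical) reliabilities
true_labels = [random.choice("ABC") for _ in range(300)]
accuracies = [0.8, 0.75, 0.7, 0.7, 0.65]

def rate(label, acc):
    # a rater reports the true label with probability acc, else a wrong one
    return label if random.random() < acc else random.choice("ABC".replace(label, ""))

ratings = [[rate(l, a) for l in true_labels] for a in accuracies]

# majority-vote fusion: the most common label per item across raters
fused = [Counter(col).most_common(1)[0][0] for col in zip(*ratings)]

acc_fused = sum(f == t for f, t in zip(fused, true_labels)) / len(true_labels)
acc_best = sum(r == t for r, t in zip(ratings[0], true_labels)) / len(true_labels)
print(f"best single rater: {acc_best:.2f}   majority fusion: {acc_fused:.2f}")
```

Performance-model approaches such as the one above go further by estimating each rater's label-confusion behavior, here hierarchically, instead of weighting all raters and labels equally.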

  19. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    PubMed

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.

  20. A rigidity transition and glassy dynamics in a model for confluent 3D tissues

    NASA Astrophysics Data System (ADS)

    Merkel, Matthias; Manning, M. Lisa

    The origin of rigidity in disordered materials is an outstanding open problem in statistical physics. Recently, a new type of rigidity transition was discovered in a family of models for 2D biological tissues, but the mechanisms responsible for rigidity remain unclear. This is not just a statistical physics problem, but also relevant for embryonic development, cancer growth, and wound healing. To gain insight into this rigidity transition and make new predictions about biological bulk tissues, we have developed a fully 3D self-propelled Voronoi (SPV) model. The model takes into account shape, elasticity, and self-propelled motion of the individual cells. We find that in the absence of self-propulsion, this model exhibits a rigidity transition that is controlled by a dimensionless model parameter describing the preferred cell shape, with an accompanying structural order parameter. In the presence of self-propulsion, the rigidity transition appears as a glass-like transition featuring caging and aging effects. Given the similarities between this transition and jamming in particulate solids, it is natural to ask if the two transitions are related. By comparing statistics of Voronoi geometries, we show the transitions are surprisingly close but demonstrably distinct. Furthermore, an index theorem used to identify topologically protected mechanical modes in jammed systems can be extended to these vertex-type models. In our model, residual stresses govern the transition and enter the index theorem in a different way compared to jammed particles, suggesting the origin of rigidity may be different between the two.

  1. Psychological Pathways Linking Social Support to Health Outcomes: A Visit with the “Ghosts” of Research Past, Present, and Future

    PubMed Central

    Uchino, Bert N.; Bowen, Kimberly; Carlisle, McKenzie; Birmingham, Wendy

    2012-01-01

    Contemporary models postulate the importance of psychological mechanisms linking perceived and received social support to physical health outcomes. In this review, we examine studies that directly tested the potential psychological mechanisms responsible for links between social support and health-relevant physiological processes (1980s to 2010). Inconsistent with existing theoretical models, no evidence was found that psychological mechanisms such as depression, perceived stress, and other affective processes are directly responsible for links between support and health. We discuss the importance of considering statistical/design issues, emerging conceptual perspectives, and limitations of our existing models for future research aimed at elucidating the psychological mechanisms responsible for links between social support and physical health outcomes. PMID:22326104

  2. Statistical mechanical models for dissociative adsorption of O2 on metal(100) surfaces with blocking, steering, and funneling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, James W.; Liu, Da-Jiang

    We develop statistical mechanical models amenable to analytic treatment for the dissociative adsorption of O2 at hollow sites on fcc(100) metal surfaces. The models incorporate exclusion of nearest-neighbor pairs of adsorbed O. However, corresponding simple site-blocking models, where adsorption requires a large ensemble of available sites, exhibit an anomalously fast initial decrease in sticking. Thus, in addition to blocking, our models also incorporate more facile adsorption via orientational steering and funneling dynamics (features supported by ab initio Molecular Dynamics studies). Behavior for equilibrated adlayers is distinct from those with finite adspecies mobility. We focus on the low-temperature limited-mobility regime where analysis of the associated master equations readily produces exact results for both short- and long-time behavior. Kinetic Monte Carlo simulation is also utilized to provide a more complete picture of behavior. These models capture both the initial decrease and the saturation of the experimentally observed sticking versus coverage, as well as features of non-equilibrium adlayer ordering as assessed by surface-sensitive diffraction.
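The site-blocking ingredient can be isolated in a minimal random-sequential-adsorption sketch with nearest-neighbour exclusion (monomer adsorption rather than dissociative O2, no steering or funneling, zero adspecies mobility; lattice size and attempt counts are arbitrary):

```python
import random

random.seed(5)
# Random sequential adsorption on a periodic square lattice with
# nearest-neighbour exclusion: a crude caricature of site blocking
L = 50
occ = [[False] * L for _ in range(L)]

def blocked(i, j):
    # a site is unavailable if it or any of its four neighbours is occupied
    return occ[i][j] or any(
        occ[(i + di) % L][(j + dj) % L]
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))

adsorbed, attempts, checkpoints = 0, 0, []
while attempts < 40 * L * L:
    attempts += 1
    i, j = random.randrange(L), random.randrange(L)
    if not blocked(i, j):
        occ[i][j] = True
        adsorbed += 1
    if attempts % (10 * L * L) == 0:
        checkpoints.append(adsorbed / (L * L))

print("coverage after 10/20/30/40 attempts per site:", checkpoints)
```

The coverage saturates near the jamming value for this exclusion rule (about 0.36), illustrating why pure blocking alone suppresses sticking so quickly and why the models above add steering and funneling channels.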

  4. The Runners and Injury Longitudinal Study: Injury Recovery Supplement (TRAILS_IR)

    DTIC Science & Technology

    2013-08-01

    (2) develop statistical models that integrate biomechanical, behavioral, and psychological risk factors for injury, (3) determine the length of... “Differences in Running Mechanics and Flexibility Between Runners in Minimalist and Traditional Footwear”... annual meeting entitled “Differences in Running Mechanics and Flexibility between Runners in Minimalist and Traditional Footwear”. The following

  5. Analysis of Immune Complex Structure by Statistical Mechanics and Light Scattering Techniques.

    NASA Astrophysics Data System (ADS)

    Busch, Nathan Adams

    1995-01-01

    The size and structure of immune complexes determine their behavior in the immune system. The chemical physics of the complex formation is not well understood; this is due in part to inadequate characterization of the proteins involved, and in part to the lack of sufficiently well developed theoretical techniques. Understanding the complex formation will permit rational design of strategies for inhibiting tissue deposition of the complexes. A statistical mechanical model of the proteins, based upon the theory of associating fluids, was developed. The multipole electrostatic potential for each protein used in this study was characterized for net protein charge, dipole moment magnitude, and dipole moment direction. The binding sites between the model antigen and antibodies were characterized for their net surface area, energy, and position relative to the dipole moment of the protein. The equilibrium binding graphs generated with the protein statistical mechanical model compare favorably with experimental data obtained from radioimmunoassay, and the isothermal compressibility predicted by the model agrees with results obtained from dynamic light scattering. The statistical mechanics model was used to investigate association between the model antigen and selected pairs of antibodies. It was found that, in accordance with expectations from thermodynamic arguments, the highest total binding energy yielded complex distributions skewed to larger complex sizes. From examination of the simulated formation of ring structures from linear chain complexes, and from the joint shape probability surfaces, it was found that ring configurations form by the "folding" of linear chains until the ends are within binding distance. By comparing single antigen/two antibody systems that differ only in their respective binding site locations, it was found that binding site location influences complex size and shape distributions only when ring formation occurs. The internal potential energy of a ring complex is considerably less than that of the non-associating system; the ring complexes are therefore quite stable and show no evidence of breaking up and collapsing into smaller complexes. Ring formation occurs only in systems where the total free energy of each complex can be minimized: it proceeds even though entropically unfavorable conformations result, provided the total free energy is lowered by doing so.

  6. Early years of Computational Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Mareschal, Michel

    2018-05-01

    Evidence that a model of hard spheres exhibits a first-order solid-fluid phase transition was provided in the late fifties by two new numerical techniques known as Monte Carlo and Molecular Dynamics. This result can be considered as the starting point of computational statistical mechanics: at the time, it was a confirmation of a counter-intuitive (and controversial) theoretical prediction by J. Kirkwood. It necessitated an intensive collaboration between the Los Alamos team, with Bill Wood developing the Monte Carlo approach, and the Livermore group, where Berni Alder was inventing Molecular Dynamics. This article tells how it happened.
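The Monte Carlo side of that story rests on a very simple rule for hard particles: propose a random displacement and accept it if and only if no overlap results. A minimal hard-disk sketch (a dilute system with illustrative parameters, not a reconstruction of the original hard-sphere runs):

```python
import random

random.seed(6)
# Metropolis Monte Carlo for hard disks in a periodic box:
# every non-overlapping trial move is accepted, overlapping moves are rejected
N, box, radius, delta = 16, 1.0, 0.05, 0.03
# start from a non-overlapping 4x4 square lattice
pts = [((i + 0.5) * box / 4, (j + 0.5) * box / 4) for i in range(4) for j in range(4)]

def overlaps(k, p):
    # does disk k at trial position p overlap any other disk (periodic metric)?
    for m, q in enumerate(pts):
        if m == k:
            continue
        dx = (p[0] - q[0] + box / 2) % box - box / 2
        dy = (p[1] - q[1] + box / 2) % box - box / 2
        if dx * dx + dy * dy < (2 * radius) ** 2:
            return True
    return False

accepted, trials = 0, 4000
for _ in range(trials):
    k = random.randrange(N)
    x, y = pts[k]
    trial = ((x + random.uniform(-delta, delta)) % box,
             (y + random.uniform(-delta, delta)) % box)
    if not overlaps(k, trial):
        pts[k] = trial
        accepted += 1

print(f"acceptance ratio: {accepted / trials:.2f}")
```

Measuring pressure versus density in chains like this one, at far larger scale, is what revealed the hard-sphere solid-fluid transition discussed above.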

  7. Neocortical dynamics at multiple scales: EEG standing waves, statistical mechanics, and physical analogs.

    PubMed

    Ingber, Lester; Nunez, Paul L

    2011-02-01

    The dynamic behavior of scalp potentials (EEG) is apparently due to some combination of global and local processes with important top-down and bottom-up interactions across spatial scales. In treating global mechanisms, we stress the importance of myelinated axon propagation delays and periodic boundary conditions in the cortical-white matter system, which is topologically close to a spherical shell. By contrast, the proposed local mechanisms are multiscale interactions between cortical columns via short-ranged non-myelinated fibers. A mechanical model consisting of a stretched string with attached nonlinear springs demonstrates the general idea. The string produces standing waves analogous to large-scale coherent EEG observed in some brain states. The attached springs are analogous to the smaller (mesoscopic) scale columnar dynamics. Generally, we expect string displacement and EEG at all scales to result from both global and local phenomena. A statistical mechanics of neocortical interactions (SMNI) calculates oscillatory behavior consistent with typical EEG, within columns, between neighboring columns via short-ranged non-myelinated fibers, across cortical regions via myelinated fibers, and also derives a string equation consistent with the global EEG model. Copyright © 2010 Elsevier Inc. All rights reserved.

  8. Forecasting runout of rock and debris avalanches

    USGS Publications Warehouse

    Iverson, Richard M.; Evans, S.G.; Mugnozza, G.S.; Strom, A.; Hermanns, R.L.

    2006-01-01

    Physically based mathematical models and statistically based empirical equations each may provide useful means of forecasting runout of rock and debris avalanches. This paper compares the foundations, strengths, and limitations of a physically based model and a statistically based forecasting method, both of which were developed to predict runout across three-dimensional topography. The chief advantage of the physically based model results from its ties to physical conservation laws and well-tested axioms of soil and rock mechanics, such as the Coulomb friction rule and effective-stress principle. The output of this model provides detailed information about the dynamics of avalanche runout, at the expense of high demands for accurate input data, numerical computation, and experimental testing. In comparison, the statistical method requires relatively modest computation and no input data except identification of prospective avalanche source areas and a range of postulated avalanche volumes. Like the physically based model, the statistical method yields maps of predicted runout, but it provides no information on runout dynamics. Although the two methods differ significantly in their structure and objectives, insights gained from one method can aid refinement of the other.
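Statistical runout methods of this kind typically rest on power-law scaling of an inundation measure with postulated volume. The sketch below shows only the form of such a relation; the coefficient is purely hypothetical, not a calibrated value from this work.

```python
# Statistically based runout forecasts often use relations of the form
#   B = c * V**(2/3)
# where V is a postulated avalanche volume and B a planimetric inundation
# area. The coefficient c here is illustrative only, not a calibrated value.
c = 20.0
for V in (1e4, 1e5, 1e6, 1e7):  # postulated volumes, m^3
    B = c * V ** (2 / 3)        # predicted inundation area, m^2
    print(f"V = {V:10.0f} m^3  ->  B = {B:12.0f} m^2")
```

Sweeping a range of postulated volumes through such a relation, and intersecting the predicted areas with topography, yields the runout maps the statistical method produces without simulating avalanche dynamics.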

  9. [Mechanism study on leptin resistance in lung cancer cachexia rats treated by Xiaoyan Decoction].

    PubMed

    Zhang, Yun-Chao; Jia, Ying-Jie; Yang, Pei-Ying; Zhang, Xing; Li, Xiao-Jiang; Zhang, Ying; Zhu, Jin-Li; Sun, Yi-Yu; Chen, Jun; Duan, Hao-Guo; Guo, Hua; Li, Chao

    2014-12-01

    To study the leptin resistance mechanism of Xiaoyan Decoction (XD) in lung cancer cachexia (LCC) rats, an LCC rat model was established. Forty rats were randomly divided into the normal control group, the LCC model group, the XD group, and the positive control group, 10 in each group. After the LCC model was set up, rats in the LCC model group were administered normal saline, 2 mL each time. Rats in the XD group were administered XD at a daily dose of 2 mL. Those in the positive control group were administered Medroxyprogesterone Acetate suspension (20 mg/kg) by gastrogavage at a daily dose of 2 mL. All medication lasted for 14 days. The general condition and tumor growth were observed. Serum levels of leptin and leptin receptor levels in the hypothalamus were detected using enzyme-linked immunosorbent assay. Contents of neuropeptide Y (NPY) and the anorexigenic gene POMC were detected using real-time PCR. Serum leptin levels were lower in the LCC model group than in the normal control group with statistical significance (P < 0.05). Compared with the LCC model group, serum leptin levels significantly increased in the XD group (P < 0.01). Leptin receptor levels in the hypothalamus increased significantly in the LCC model group (P < 0.01); either XD or Medroxyprogesterone Acetate could effectively reduce leptin receptor levels with statistical significance (P < 0.01), with a statistical difference between the XD group and the positive control group (P < 0.05). Contents of NPY were higher in the LCC model group than in the other groups with statistical difference (P < 0.05). There was no statistical difference in NPY between the normal control group and the two treatment groups (P > 0.05). There was a statistical difference in POMC between the normal control group and the LCC model group (P < 0.05). POMC was decreased in the XD group and the positive control group with statistical significance (P < 0.05), and more markedly in the XD group (P < 0.05). Leptin resistance existed in LCC rats. XD could increase serum leptin levels and reduce leptin receptor levels in the hypothalamus. LCC could be improved by elevating NPY contents in the hypothalamus, reducing POMC contents, promoting the appetite, and increasing food intake through both peripheral and central pathways.

  10. Comparison of Artificial Neural Networks and ARIMA statistical models in simulations of target wind time series

    NASA Astrophysics Data System (ADS)

    Kolokythas, Kostantinos; Vasileios, Salamalikis; Athanassios, Argiriou; Kazantzidis, Andreas

    2015-04-01

    Wind results from complex interactions of numerous mechanisms taking place at small or large scales, so better knowledge of its behavior is essential in a variety of applications, especially in the field of power production from wind turbines. The literature contains a considerable number of models, physical or statistical, dealing with the simulation and prediction of wind speed. Among others, Artificial Neural Networks (ANNs) are widely used for wind forecasting and, in the great majority of cases, outperform other conventional statistical models. In this study, a number of ANNs with different architectures, created and applied to a dataset of wind time series, are compared to Auto Regressive Integrated Moving Average (ARIMA) statistical models. The data consist of mean hourly wind speeds from a wind farm in a hilly Greek region and cover a period of one year (2013). The main goal is to evaluate the models' ability to simulate successfully the wind speed at a significant point (target). Goodness-of-fit statistics are computed for the comparison of the different methods. In general, the ANNs showed the best performance in the estimation of wind speed, outperforming the ARIMA models.
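A much-reduced version of such a comparison can be run on synthetic data: fit an AR(1) coefficient (the simplest ARIMA relative) and check its one-step forecasts against naive persistence. The series parameters below are invented; this is not the study's dataset, and no ANN is included in the sketch.

```python
import math, random, statistics

random.seed(7)
# Synthetic autocorrelated "wind speed" series: AR(1) around a mean level
n, phi_true, mu = 2000, 0.8, 6.0
w = [mu]
for _ in range(n - 1):
    w.append(mu + phi_true * (w[-1] - mu) + random.gauss(0, 1.0))

train, test = w[:1500], w[1500:]
m = statistics.fmean(train)
# lag-1 autocorrelation of the training series estimates the AR(1) coefficient
num = sum((train[t] - m) * (train[t + 1] - m) for t in range(len(train) - 1))
den = sum((x - m) ** 2 for x in train)
phi_hat = num / den

def rmse(pred, actual):
    return math.sqrt(statistics.fmean([(p - a) ** 2 for p, a in zip(pred, actual)]))

ar1 = [m + phi_hat * (test[t] - m) for t in range(len(test) - 1)]
persist = test[:-1]          # naive forecast: next value equals current value
actual = test[1:]
print(f"phi_hat={phi_hat:.2f}  RMSE AR(1)={rmse(ar1, actual):.3f}  "
      f"persistence={rmse(persist, actual):.3f}")
```

Goodness-of-fit statistics computed on a held-out period, as here, are the basis on which the study ranks its ANN and ARIMA candidates.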

  11. Thermodynamic Model of Spatial Memory

    NASA Astrophysics Data System (ADS)

    Kaufman, Miron; Allen, P.

    1998-03-01

    We develop and test a thermodynamic model of spatial memory. Our model is an application of statistical thermodynamics to cognitive science, related to applications of the statistical mechanics framework in parallel distributed processing research. Our macroscopic model allows us to evaluate an entropy associated with spatial memory tasks. We find that older adults exhibit higher levels of entropy than younger adults. Thurstone's Law of Categorical Judgment, according to which the discriminal processes along the psychological continuum produced by presentations of a single stimulus are normally distributed, is explained by using a Hooke spring model of spatial memory. We have also analyzed a nonlinear modification of the ideal spring model of spatial memory. This work is supported by NIH/NIA grant AG09282-06.

  12. Micro-mechanics of hydro-mechanical coupled processes during hydraulic fracturing in sandstone

    NASA Astrophysics Data System (ADS)

    Caulk, R.; Tomac, I.

    2017-12-01

    This contribution presents a micro-mechanical study of hydraulic fracture initiation and propagation in sandstone. The Discrete Element Method (DEM) Yade software is used to model the fully coupled hydro-mechanical behavior of saturated sandstone under pressures typical of deep geo-reservoirs. Heterogeneity of the sandstone tensile and shear strength parameters is introduced using a statistical representation of cathodoluminescence (CL) images of the rock. A Weibull distribution of the strength parameter values was found to best match the CL scans of sandstone grains and of the cement between grains. Results of hydraulic fracturing stimulation from the well bore indicate a significant difference between models with bond strengths informed by CL scans and those with a uniform, homogeneous representation of sandstone parameters. Micro-mechanical insight reveals a hydraulic fracture typical of mode I (tensile) cracking in both cases. However, shear micro-cracks are abundant in the CL-informed model while they are absent in the standard model with uniform strength distribution. Most of the mode II, or shear, micro-cracks are not part of the main hydraulic fracture and occur in the near-tip and near-fracture areas. The occurrence of shear micro-cracks is characterized as a secondary effect which dissipates hydraulic fracturing energy. Additionally, the shear micro-crack locations qualitatively resemble the acoustic emission clouds of shear cracks frequently observed in hydraulic fracturing, sometimes interpreted as re-activation of existing fractures; our model contains no pre-existing cracks and is continuous prior to fracturing, an observation that is quantified in the paper. The shear particle contact force field reveals significant relaxation compared to the model with uniform strength distribution.
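Sampling heterogeneous bond strengths from a Weibull distribution, as the CL-informed model does, is straightforward; the shape and scale values below are hypothetical stand-ins, not the parameters calibrated to the CL scans.

```python
import random, statistics

random.seed(8)
# Heterogeneous DEM bond strengths drawn from a Weibull distribution
# (Python's weibullvariate takes scale first, then shape); the values
# here are illustrative only, loosely in an MPa-like range.
scale, shape = 10.0, 2.0
bond_strengths = [random.weibullvariate(scale, shape) for _ in range(10000)]
print("mean bond strength:", round(statistics.fmean(bond_strengths), 2))
```

Assigning such a sampled strength to each inter-particle bond, instead of one uniform value, is what lets the model reproduce the secondary shear micro-cracking described above.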

  13. Dynamics and Statistical Mechanics of Rotating and non-Rotating Vortical Flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Chjan

Three projects were analyzed with the overall aim of developing a computational/analytical model for estimating values of the energy, angular momentum, enstrophy and total variation of fluid height at phase transitions between disordered and self-organized flow states in planetary atmospheres. It is believed that these transitions in equilibrium statistical mechanics models play a role in the construction of large-scale, stable structures including super-rotation in the Venusian atmosphere and the formation of the Great Red Spot on Jupiter. Exact solutions of the spherical energy-enstrophy models for rotating planetary atmospheres by Kac's method of steepest descent predicted phase transitions to super-rotating solid-body flows at high energy-to-enstrophy ratio for all planetary spins, and to sub-rotating modes if the planetary spin is large enough. These canonical statistical ensembles are well-defined for the long-range energy interactions that arise from 2D fluid flows on compact oriented manifolds such as the surface of the sphere and the torus. This is because in the Fourier space available through Hodge theory, the energy terms are exactly diagonalizable and hence have zero range, leading to well-defined heat baths.

  14. Quantifying networks complexity from information geometry viewpoint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felice, Domenico, E-mail: domenico.felice@unicam.it; Mancini, Stefano; INFN-Sezione di Perugia, Via A. Pascoli, I-06123 Perugia

We consider a Gaussian statistical model whose parameter space is given by the variances of random variables. Underlying this model we identify networks by interpreting random variables as sitting on vertices and their correlations as weighted edges among vertices. We then associate to the parameter space a statistical manifold endowed with a Riemannian metric structure (that of Fisher-Rao). In analogy with the microcanonical definition of entropy in statistical mechanics, we introduce an entropic measure of network complexity. We prove that it is invariant under network isomorphism. Above all, considering networks as simplicial complexes, we evaluate this entropy on simplexes and find that it monotonically increases with their dimension.

  15. Statistical methods and neural network approaches for classification of data from multiple sources

    NASA Technical Reports Server (NTRS)

    Benediktsson, Jon Atli; Swain, Philip H.

    1990-01-01

Statistical methods for classification of data from multiple data sources are investigated and compared to neural network models. A general problem with conventional multivariate statistical approaches to classifying data of multiple types is that a single multivariate distribution cannot be assumed for the classes across the data sources. Another common problem with statistical classification methods is that the data sources are not equally reliable, so the sources need to be weighted according to their reliability, yet most statistical classification methods have no mechanism for such weighting. This research focuses on statistical methods which can overcome these problems: a method of statistical multisource analysis and consensus theory. Reliability measures for weighting the data sources in these methods are suggested and investigated. Secondly, this research focuses on neural network models. The neural networks are distribution-free, since no prior knowledge of the statistical distribution of the data is needed; this is an obvious advantage over most statistical classification methods. The neural networks also automatically determine how much weight each data source should have. On the other hand, their training process is iterative and can take a very long time. Methods to speed up the training procedure are introduced and investigated. Experimental results of classification using both neural network models and statistical methods are given, and the approaches are compared based on these results.
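The reliability-weighting idea from consensus theory can be illustrated with a simple linear opinion pool; the class names, per-source posteriors, and weights below are invented for illustration:

```python
def linear_opinion_pool(posteriors, weights):
    """Combine per-source class posteriors with reliability weights
    (consensus theory). posteriors: one dict class->probability per source;
    weights: non-negative source reliabilities summing to 1."""
    classes = posteriors[0].keys()
    return {c: sum(w * p[c] for w, p in zip(weights, posteriors))
            for c in classes}

# Two sources disagree; the more reliable source (weight 0.7) dominates.
combined = linear_opinion_pool(
    [{"water": 0.8, "forest": 0.2}, {"water": 0.3, "forest": 0.7}],
    weights=[0.7, 0.3],
)
```

A down-weighted, unreliable sensor thus shifts the combined posterior less than a trusted one.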

  16. Modeling the Development of Audiovisual Cue Integration in Speech Perception

    PubMed Central

    Getz, Laura M.; Nordeen, Elke R.; Vrabic, Sarah C.; Toscano, Joseph C.

    2017-01-01

    Adult speech perception is generally enhanced when information is provided from multiple modalities. In contrast, infants do not appear to benefit from combining auditory and visual speech information early in development. This is true despite the fact that both modalities are important to speech comprehension even at early stages of language acquisition. How then do listeners learn how to process auditory and visual information as part of a unified signal? In the auditory domain, statistical learning processes provide an excellent mechanism for acquiring phonological categories. Is this also true for the more complex problem of acquiring audiovisual correspondences, which require the learner to integrate information from multiple modalities? In this paper, we present simulations using Gaussian mixture models (GMMs) that learn cue weights and combine cues on the basis of their distributional statistics. First, we simulate the developmental process of acquiring phonological categories from auditory and visual cues, asking whether simple statistical learning approaches are sufficient for learning multi-modal representations. Second, we use this time course information to explain audiovisual speech perception in adult perceivers, including cases where auditory and visual input are mismatched. Overall, we find that domain-general statistical learning techniques allow us to model the developmental trajectory of audiovisual cue integration in speech, and in turn, allow us to better understand the mechanisms that give rise to unified percepts based on multiple cues. PMID:28335558
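The cue-combination idea can be sketched with a toy mixture-style classifier: each category is a Gaussian in each cue, and the two cues are combined under conditional independence. The category parameters below (a VOT-like auditory cue and a lip-aperture-like visual cue) are invented, not taken from the paper's simulations:

```python
import math

def gauss(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def category_posterior(aud, vis, cats):
    """Posterior over phonological categories given an auditory and a visual
    cue, assuming Gaussian cue distributions per category and conditionally
    independent cues (a mixture-model-style combination rule)."""
    likes = {name: p["prior"] * gauss(aud, *p["aud"]) * gauss(vis, *p["vis"])
             for name, p in cats.items()}
    z = sum(likes.values())
    return {name: v / z for name, v in likes.items()}

# Invented category parameters: (mean, sd) per cue.
cats = {
    "/b/": {"prior": 0.5, "aud": (0.0, 10.0), "vis": (0.2, 0.1)},
    "/p/": {"prior": 0.5, "aud": (50.0, 10.0), "vis": (0.8, 0.1)},
}
post = category_posterior(aud=10.0, vis=0.3, cats=cats)  # both cues favour /b/
```

Mismatched inputs (e.g. `aud=10.0, vis=0.8`) would pull the posterior in opposite directions, the situation the paper uses to model McGurk-style percepts.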

  17. Modeling the Development of Audiovisual Cue Integration in Speech Perception.

    PubMed

    Getz, Laura M; Nordeen, Elke R; Vrabic, Sarah C; Toscano, Joseph C

    2017-03-21

    Adult speech perception is generally enhanced when information is provided from multiple modalities. In contrast, infants do not appear to benefit from combining auditory and visual speech information early in development. This is true despite the fact that both modalities are important to speech comprehension even at early stages of language acquisition. How then do listeners learn how to process auditory and visual information as part of a unified signal? In the auditory domain, statistical learning processes provide an excellent mechanism for acquiring phonological categories. Is this also true for the more complex problem of acquiring audiovisual correspondences, which require the learner to integrate information from multiple modalities? In this paper, we present simulations using Gaussian mixture models (GMMs) that learn cue weights and combine cues on the basis of their distributional statistics. First, we simulate the developmental process of acquiring phonological categories from auditory and visual cues, asking whether simple statistical learning approaches are sufficient for learning multi-modal representations. Second, we use this time course information to explain audiovisual speech perception in adult perceivers, including cases where auditory and visual input are mismatched. Overall, we find that domain-general statistical learning techniques allow us to model the developmental trajectory of audiovisual cue integration in speech, and in turn, allow us to better understand the mechanisms that give rise to unified percepts based on multiple cues.

  18. Trends and fluctuations in the severity of interstate wars

    PubMed Central

    Clauset, Aaron

    2018-01-01

    Since 1945, there have been relatively few large interstate wars, especially compared to the preceding 30 years, which included both World Wars. This pattern, sometimes called the long peace, is highly controversial. Does it represent an enduring trend caused by a genuine change in the underlying conflict-generating processes? Or is it consistent with a highly variable but otherwise stable system of conflict? Using the empirical distributions of interstate war sizes and onset times from 1823 to 2003, we parameterize stationary models of conflict generation that can distinguish trends from statistical fluctuations in the statistics of war. These models indicate that both the long peace and the period of great violence that preceded it are not statistically uncommon patterns in realistic but stationary conflict time series. This fact does not detract from the importance of the long peace or the proposed mechanisms that explain it. However, the models indicate that the postwar pattern of peace would need to endure at least another 100 to 140 years to become a statistically significant trend. This fact places an implicit upper bound on the magnitude of any change in the true likelihood of a large war after the end of the Second World War. The historical patterns of war thus seem to imply that the long peace may be substantially more fragile than proponents believe, despite recent efforts to identify mechanisms that reduce the likelihood of interstate wars. PMID:29507877
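The logic of testing the long peace against a stationary null model can be sketched with a toy simulation; the per-year probability of a large war and the record length below are illustrative, not the authors' fitted values:

```python
import random

def peace_spell_probability(n_years=181, p_war=0.05, spell=70,
                            trials=5000, seed=1):
    """Under a stationary model in which a large war begins each year with
    fixed probability p_war, estimate how often a run of at least `spell`
    consecutive war-free years appears somewhere in an n_years record."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        run = longest = 0
        for _ in range(n_years):
            if rng.random() < p_war:
                run = 0            # a war resets the peace spell
            else:
                run += 1
                longest = max(longest, run)
        if longest >= spell:
            hits += 1
    return hits / trials

prob = peace_spell_probability()
```

If such spells are not rare under the stationary null, the observed long peace by itself is weak evidence of a changed conflict-generating process, which is the paper's point.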

  19. Development and Validation of a Statistical Shape Modeling-Based Finite Element Model of the Cervical Spine Under Low-Level Multiple Direction Loading Conditions

    PubMed Central

    Bredbenner, Todd L.; Eliason, Travis D.; Francis, W. Loren; McFarland, John M.; Merkle, Andrew C.; Nicolella, Daniel P.

    2014-01-01

    Cervical spinal injuries are a significant concern in all trauma injuries. Recent military conflicts have demonstrated the substantial risk of spinal injury for the modern warfighter. Finite element models used to investigate injury mechanisms often fail to examine the effects of variation in geometry or material properties on mechanical behavior. The goals of this study were to model geometric variation for a set of cervical spines, to extend this model to a parametric finite element model, and, as a first step, to validate the parametric model against experimental data for low-loading conditions. Individual finite element models were created using cervical spine (C3–T1) computed tomography data for five male cadavers. Statistical shape modeling (SSM) was used to generate a parametric finite element model incorporating variability of spine geometry, and soft-tissue material property variation was also included. The probabilistic loading response of the parametric model was determined under flexion-extension, axial rotation, and lateral bending and validated by comparison to experimental data. Based on qualitative and quantitative comparison of the experimental loading response and model simulations, we suggest that the model performs adequately under relatively low-level loading conditions in multiple loading directions. In conclusion, SSM methods coupled with finite element analyses within a probabilistic framework, along with the ability to statistically validate the overall model performance, provide innovative and important steps toward describing the differences in vertebral morphology, spinal curvature, and variation in material properties. We suggest that these methods, with additional investigation and validation under injurious loading conditions, will lead to understanding and mitigating the risks of injury in the spine and other musculoskeletal structures. PMID:25506051

  20. An analysis of a large dataset on immigrant integration in Spain. The Statistical Mechanics perspective on Social Action

    NASA Astrophysics Data System (ADS)

    Barra, Adriano; Contucci, Pierluigi; Sandell, Rickard; Vernia, Cecilia

    2014-02-01

How does immigrant integration in a country change with immigration density? Guided by a statistical mechanics perspective, we propose a novel approach to this problem. The analysis focuses on classical integration quantifiers such as the percentage of jobs (temporary and permanent) given to immigrants, mixed marriages, and newborns with parents of mixed origin. We find that the average values of different quantifiers may exhibit either linear or non-linear growth with immigrant density, and we suggest that social action, a concept identified by Max Weber, causes the observed non-linearity. Using the statistical mechanics notion of interaction to quantitatively emulate social action, a unified mathematical model for integration is proposed and shown to explain both growth behaviors observed. The linear theory, by contrast, which ignores the possibility of interaction effects, would underestimate the quantifiers by up to 30% when immigrant densities are low, and overestimate them by as much when densities are high. The capacity to quantitatively isolate different types of integration mechanisms makes our framework a suitable tool in the quest for more efficient integration policies.
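The contrast between linear and non-linear growth in immigrant density can be illustrated with a one-parameter least-squares fit on synthetic data; the square-root form below stands in for the interacting model and is an assumption of this sketch, as are the numbers:

```python
import math

def fit_scale(xs, ys, transform):
    """Least-squares fit of the single scale a in y ≈ a * transform(x)."""
    tx = [transform(x) for x in xs]
    a = sum(t * y for t, y in zip(tx, ys)) / sum(t * t for t in tx)
    sse = sum((y - a * t) ** 2 for t, y in zip(tx, ys))
    return a, sse

# Synthetic quantifier values that follow a square-root law by construction.
gammas = [0.01 * i for i in range(1, 21)]            # immigrant densities
quantifier = [0.5 * math.sqrt(g) for g in gammas]    # e.g. share of mixed marriages

a_lin, sse_lin = fit_scale(gammas, quantifier, lambda g: g)
a_sqrt, sse_sqrt = fit_scale(gammas, quantifier, math.sqrt)
# The square-root (interacting) law fits exactly; a linear law leaves residuals,
# underestimating at low density and overestimating at high density.
```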

  1. Damage and strength of composite materials: Trends, predictions, and challenges

    NASA Technical Reports Server (NTRS)

    Obrien, T. Kevin

    1994-01-01

Research on damage mechanisms and ultimate strength of composite materials relevant to scaling issues will be addressed in this viewgraph presentation. The use of fracture mechanics and Weibull statistics to predict scaling effects for the onset of isolated damage mechanisms will be highlighted. The ability of simple fracture mechanics models to predict trends that are useful in parametric or preliminary design studies will be reviewed. The limitations of these simple models for complex loading conditions will also be noted. The difficulty in developing generic criteria for the growth of these mechanisms, needed in progressive damage models to predict strength, will be addressed. A specific example of a problem where failure is a direct consequence of progressive delamination will be explored. A damage threshold/fail-safety concept for addressing composite damage tolerance will be discussed.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallavotti, G.

It is shown that the chaoticity hypothesis recently introduced in statistical mechanics, which is analogous to Ruelle's principle for turbulence, implies the Onsager reciprocity and the fluctuation-dissipation theorem in various reversible models for coexisting transport phenomena.

  3. Double-slit experiment with single wave-driven particles and its relation to quantum mechanics.

    PubMed

    Andersen, Anders; Madsen, Jacob; Reichelt, Christian; Rosenlund Ahl, Sonja; Lautrup, Benny; Ellegaard, Clive; Levinsen, Mogens T; Bohr, Tomas

    2015-07-01

In a thought-provoking paper, Couder and Fort [Phys. Rev. Lett. 97, 154101 (2006)] describe a version of the famous double-slit experiment performed with droplets bouncing on a vertically vibrated fluid surface. In the experiment, an interference pattern in the single-particle statistics is found even though it is possible to determine unambiguously which slit the walking droplet passes. Here we argue, however, that the single-particle statistics in such an experiment will be fundamentally different from the single-particle statistics of quantum mechanics. Quantum mechanical interference takes place between different classical paths with precise amplitude and phase relations. In the double-slit experiment with walking droplets, these relations are lost since one of the paths is singled out by the droplet. To support our conclusions, we have carried out our own double-slit experiment, and our results, in particular the long and variable slit passage times of the droplets, cast strong doubt on the feasibility of the interference claimed by Couder and Fort. To understand theoretically the limitations of wave-driven particle systems as analogs to quantum mechanics, we introduce a Schrödinger equation with a source term originating from a localized particle that generates a wave while being simultaneously guided by it. We show that the ensuing particle-wave dynamics can capture some characteristics of quantum mechanics such as orbital quantization. However, the particle-wave dynamics cannot reproduce quantum mechanics in general, and we show that the single-particle statistics for our model in a double-slit experiment with an additional splitter plate differs qualitatively from that of quantum mechanics.

  4. A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sinitskiy, Anton V.; Voth, Gregory A., E-mail: gavoth@uchicago.edu

    2015-09-07

Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman’s imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representations of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.

  5. A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals.

    PubMed

    Sinitskiy, Anton V; Voth, Gregory A

    2015-09-07

    Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman's imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representations of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.

  6. Generalized memory associativity in a network model for the neuroses

    NASA Astrophysics Data System (ADS)

    Wedemann, Roseli S.; Donangelo, Raul; de Carvalho, Luís A. V.

    2009-03-01

    We review concepts introduced in earlier work, where a neural network mechanism describes some mental processes in neurotic pathology and psychoanalytic working-through, as associative memory functioning, according to the findings of Freud. We developed a complex network model, where modules corresponding to sensorial and symbolic memories interact, representing unconscious and conscious mental processes. The model illustrates Freud's idea that consciousness is related to symbolic and linguistic memory activity in the brain. We have introduced a generalization of the Boltzmann machine to model memory associativity. Model behavior is illustrated with simulations and some of its properties are analyzed with methods from statistical mechanics.

  7. On the statistical distribution in a deformed solid

    NASA Astrophysics Data System (ADS)

    Gorobei, N. N.; Luk'yanenko, A. S.

    2017-09-01

A modification of the Gibbs distribution in a thermally insulated, mechanically deformed solid is proposed, in which the linear dimensions (shape parameters) of the body are excluded from statistical averaging and included among the macroscopic parameters of state alongside the temperature. Formally, this modification reduces to corresponding additional conditions when calculating the statistical sum. The shape parameters and the temperature themselves are found from the conditions of mechanical and thermal equilibrium of the body, and their change is determined using the first law of thermodynamics. Known thermodynamic phenomena are analyzed within the proposed formalism for a simple model of a solid, an ensemble of anharmonic oscillators, to first order in the anharmonicity constant. The modified distribution is considered separately for the classical and quantum temperature regions.

  8. Reading biological processes from nucleotide sequences

    NASA Astrophysics Data System (ADS)

    Murugan, Anand

    Cellular processes have traditionally been investigated by techniques of imaging and biochemical analysis of the molecules involved. The recent rapid progress in our ability to manipulate and read nucleic acid sequences gives us direct access to the genetic information that directs and constrains biological processes. While sequence data is being used widely to investigate genotype-phenotype relationships and population structure, here we use sequencing to understand biophysical mechanisms. We present work on two different systems. First, in chapter 2, we characterize the stochastic genetic editing mechanism that produces diverse T-cell receptors in the human immune system. We do this by inferring statistical distributions of the underlying biochemical events that generate T-cell receptor coding sequences from the statistics of the observed sequences. This inferred model quantitatively describes the potential repertoire of T-cell receptors that can be produced by an individual, providing insight into its potential diversity and the probability of generation of any specific T-cell receptor. Then in chapter 3, we present work on understanding the functioning of regulatory DNA sequences in both prokaryotes and eukaryotes. Here we use experiments that measure the transcriptional activity of large libraries of mutagenized promoters and enhancers and infer models of the sequence-function relationship from this data. For the bacterial promoter, we infer a physically motivated 'thermodynamic' model of the interaction of DNA-binding proteins and RNA polymerase determining the transcription rate of the downstream gene. For the eukaryotic enhancers, we infer heuristic models of the sequence-function relationship and use these models to find synthetic enhancer sequences that optimize inducibility of expression. 
Both projects demonstrate the utility of sequence information in conjunction with sophisticated statistical inference techniques for dissecting underlying biophysical mechanisms.

  9. Statistical distribution of mechanical properties for three graphite-epoxy material systems

    NASA Technical Reports Server (NTRS)

    Reese, C.; Sorem, J., Jr.

    1981-01-01

Graphite-epoxy composites are playing an increasing role as viable alternative materials in structural applications, necessitating thorough investigation into the predictability and reproducibility of their material strength properties. This investigation was concerned with tension, compression, and short-beam shear coupon testing of large samples from three different material suppliers to determine their statistical strength behavior. Statistical results indicate that a two-parameter Weibull distribution model provides better overall characterization of material behavior for the graphite-epoxy systems tested than does the standard Normal distribution model that is employed for most design work. While either a Weibull or Normal distribution model provides adequate predictions for average strength values, the Weibull model provides better characterization in the lower tail region, where the predictions are of maximum design interest. Two sets of the same material were found to have essentially the same material properties, indicating that repeatability can be achieved.
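The divergence between Weibull and Normal models in the design-critical lower tail can be illustrated analytically by matching the two models' means and standard deviations and comparing their 1st-percentile strengths; the shape and scale values below are invented, not the fitted parameters from this study:

```python
import math
from statistics import NormalDist

def weibull_quantile(p, shape, scale):
    """Inverse CDF of a two-parameter Weibull distribution."""
    return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

# Illustrative (not measured) strength parameters, e.g. in ksi.
shape, scale = 10.0, 100.0
mean = scale * math.gamma(1 + 1 / shape)
sd = scale * math.sqrt(math.gamma(1 + 2 / shape) - math.gamma(1 + 1 / shape) ** 2)

q_weibull = weibull_quantile(0.01, shape, scale)   # Weibull 1st percentile
q_normal = NormalDist(mean, sd).inv_cdf(0.01)      # Normal with matched moments
# Both models agree on the mean but disagree in the lower tail,
# which is exactly where design allowables are set.
```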

  10. A Quantum Shuffling Game for Teaching Statistical Mechanics

    ERIC Educational Resources Information Center

    Black, P. J.; And Others

    1971-01-01

A game simulating an Einstein model of a crystal that produces a Boltzmann distribution is described. Computer-made films present the results for large distributions, showing heat flow and some applications to entropy. (TS)
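A minimal version of such a shuffling game is easy to simulate: repeatedly move one quantum of energy from a randomly chosen site to another random site, and the site-energy histogram relaxes toward a Boltzmann-like (geometric) distribution. The site, quantum, and step counts below are arbitrary choices for this sketch:

```python
import random

def shuffle_quanta(n_sites=200, n_quanta=200, steps=100_000, seed=0):
    """Einstein-crystal shuffling game: pick a random site; if it holds at
    least one quantum, move one quantum to another randomly chosen site.
    Total energy is conserved; the per-site energies approach a
    Boltzmann-like (geometric) distribution."""
    rng = random.Random(seed)
    energy = [n_quanta // n_sites] * n_sites   # start from a flat distribution
    for _ in range(steps):
        donor = rng.randrange(n_sites)
        if energy[donor] > 0:
            energy[donor] -= 1
            energy[rng.randrange(n_sites)] += 1
    return energy

energy = shuffle_quanta()
frac_empty = energy.count(0) / len(energy)  # ≈ 1/2 at one quantum per site on average
```

Histogramming `energy` reproduces the exponentially decaying occupation curve the films display.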

  11. Statistical testing of association between menstruation and migraine.

    PubMed

    Barra, Mathias; Dahl, Fredrik A; Vetvik, Kjersti G

    2015-02-01

To repair and refine a previously proposed method for statistical analysis of association between migraine and menstruation. Menstrually related migraine (MRM) affects about 20% of female migraineurs in the general population. The exact pathophysiological link from menstruation to migraine is hypothesized to be through fluctuations in female reproductive hormones, but the exact mechanisms remain unknown. Therefore, the main diagnostic criterion today is concurrency of migraine attacks with menstruation. Methods aiming to exclude spurious associations are needed, so that further research into these mechanisms can be performed on a population with a true association. The statistical method is based on a simple two-parameter null model of MRM (which allows for simulation modeling), and Fisher's exact test (with mid-p correction) applied to standard 2 × 2 contingency tables derived from the patients' headache diaries. Our method is a corrected version of a previously published flawed framework. To the best of our knowledge, no other published methods for establishing a menstruation-migraine association by statistical means exist today. The probabilistic methodology shows good performance when subjected to receiver operating characteristic curve analysis. Quick-reference cutoff values for the clinical setting were tabulated for assessing association given a patient's headache history. In this paper, we correct a proposed method for establishing association between menstruation and migraine by statistical methods. We conclude that the proposed standard of 3-cycle observations prior to setting an MRM diagnosis should be extended with at least one perimenstrual window to obtain sufficient information for statistical processing. © 2014 American Headache Society.
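Fisher's exact test with a mid-p correction on a 2 × 2 diary-derived table can be sketched as follows; this is a generic implementation of the mid-p idea, not the authors' code:

```python
from math import comb

def fisher_mid_p(a, b, c, d):
    """Two-sided Fisher's exact test with mid-p correction for the table
    [[a, b], [c, d]]: sum the hypergeometric probabilities of all tables
    at least as extreme (probability <= observed), counting tables exactly
    as probable as the observed one with weight one-half."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d

    def p_table(x):  # P(top-left cell = x) under fixed margins
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    mid_p = 0.0
    for x in range(lo, hi + 1):
        p = p_table(x)
        if p < p_obs - 1e-12:
            mid_p += p
        elif abs(p - p_obs) <= 1e-12:
            mid_p += 0.5 * p
    return mid_p
```

For a diary, `a` would count perimenstrual windows with migraine, `b` those without, and `c`, `d` the same for non-perimenstrual windows.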

  12. SPATIO-TEMPORAL ANALYSIS OF TOTAL NITRATE CONCENTRATIONS USING DYNAMIC STATISTICAL MODELS

    EPA Science Inventory

    Atmospheric concentrations of total nitrate (TNO3), defined here as gas-phase nitric acid plus particle-phase nitrate, are difficult to simulate in numerical air quality models due to the presence of a variety of formation pathways and loss mechanisms, some of which ar...

  13. North American extreme temperature events and related large scale meteorological patterns: A review of statistical methods, dynamics, modeling, and trends

    DOE PAGES

    Grotjahn, Richard; Black, Robert; Leung, Ruby; ...

    2015-05-22

This paper reviews research approaches and open questions regarding data, statistical analyses, dynamics, modeling efforts, and trends in relation to temperature extremes. Our specific focus is upon extreme events of short duration (roughly less than 5 days) that affect parts of North America. These events are associated with large scale meteorological patterns (LSMPs). Methods used to define extreme events statistics and to identify and connect LSMPs to extreme temperatures are presented. Recent advances in statistical techniques can connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. A wide array of LSMPs, ranging from synoptic to planetary scale phenomena, have been implicated as contributors to extreme temperature events. Current knowledge about the physical nature of these contributions and the dynamical mechanisms leading to the implicated LSMPs is incomplete. There is a pressing need for (a) systematic study of the physics of LSMP life cycles and (b) comprehensive model assessment of LSMP-extreme temperature event linkages and LSMP behavior. Generally, climate models capture the observed heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency, underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Climate models have been used to investigate past changes and project future trends in extreme temperatures. Overall, modeling studies have identified important mechanisms, such as the effects of large-scale circulation anomalies and land-atmosphere interactions, on changes in extreme temperatures. However, few studies have examined changes in LSMPs more specifically to understand the role of LSMPs in past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated, so more research is needed to understand the limitations of climate models and improve model skill in simulating extreme temperatures and their associated LSMPs. The paper concludes with unresolved issues and research questions.

  14. TRACX2: a connectionist autoencoder using graded chunks to model infant visual statistical learning.

    PubMed

    Mareschal, Denis; French, Robert M

    2017-01-05

Even newborn infants are able to extract structure from a stream of sensory inputs; yet how this is achieved remains largely a mystery. We present a connectionist autoencoder model, TRACX2, that learns to extract sequence structure by gradually constructing chunks, storing these chunks in a distributed manner across its synaptic weights and recognizing these chunks when they re-occur in the input stream. Chunks are graded rather than all-or-nothing in nature. As chunks are learnt their component parts become more and more tightly bound together. TRACX2 successfully models the data from five experiments from the infant visual statistical learning literature, including tasks involving forward and backward transitional probabilities, low-salience embedded chunk items, part-sequences and illusory items. The model also captures performance differences across ages through the tuning of a single learning-rate parameter. These results suggest that infant statistical learning is underpinned by the same domain-general learning mechanism that operates in auditory statistical learning and, potentially, in adult artificial grammar learning. This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).
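The forward and backward transitional probabilities that such models are tested on can be computed directly from a symbol stream; the sequence below is an arbitrary example, not a stimulus set from the paper:

```python
from collections import Counter

def transitional_probabilities(seq):
    """Forward and backward transitional probabilities between adjacent
    items of a sequence: forward P(b|a) = count(ab) / count(a as a first
    element); backward P(a|b) = count(ab) / count(b as a second element)."""
    pairs = Counter(zip(seq, seq[1:]))
    firsts = Counter(seq[:-1])
    seconds = Counter(seq[1:])
    forward = {(a, b): n / firsts[a] for (a, b), n in pairs.items()}
    backward = {(a, b): n / seconds[b] for (a, b), n in pairs.items()}
    return forward, backward

fwd, bwd = transitional_probabilities("ABABCB")
# Every A is followed by B, so the forward probability P(B|A) is 1,
# while the backward probability P(A|B) is lower because B also follows C.
```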

  15. TRACX2: a connectionist autoencoder using graded chunks to model infant visual statistical learning

    PubMed Central

    French, Robert M.

    2017-01-01

    Even newborn infants are able to extract structure from a stream of sensory inputs; yet how this is achieved remains largely a mystery. We present a connectionist autoencoder model, TRACX2, that learns to extract sequence structure by gradually constructing chunks, storing these chunks in a distributed manner across its synaptic weights, and recognizing these chunks when they re-occur in the input stream. Chunks are graded rather than all-or-nothing in nature. As chunks are learnt, their component parts become more and more tightly bound together. TRACX2 successfully models the data from five experiments from the infant visual statistical learning literature, including tasks involving forward and backward transitional probabilities, low-salience embedded chunk items, part-sequences and illusory items. The model also captures performance differences across ages through the tuning of a single learning-rate parameter. These results suggest that infant statistical learning is underpinned by the same domain-general learning mechanism that operates in auditory statistical learning and, potentially, in adult artificial grammar learning. This article is part of the themed issue ‘New frontiers for statistical learning in the cognitive sciences’. PMID:27872375

  16. Statistical mechanics of influence maximization with thermal noise

    NASA Astrophysics Data System (ADS)

    Lynn, Christopher W.; Lee, Daniel D.

    2017-03-01

    The problem of optimally distributing a budget of influence among individuals in a social network, known as influence maximization, has typically been studied in the context of contagion models and deterministic processes, which fail to capture stochastic interactions inherent in real-world settings. Here, we show that by introducing thermal noise into influence models, the dynamics exactly resemble spins in a heterogeneous Ising system. In this way, influence maximization in the presence of thermal noise has a natural physical interpretation as maximizing the magnetization of an Ising system given a budget of external magnetic field. Using this statistical mechanical formulation, we demonstrate analytically that for small external-field budgets, the optimal influence solutions exhibit a highly non-trivial temperature dependence, focusing on high-degree hub nodes at high temperatures and on easily influenced peripheral nodes at low temperatures. For the general problem, we present a projected gradient ascent algorithm that uses the magnetic susceptibility to calculate locally optimal external-field distributions. We apply our algorithm to synthetic and real-world networks, demonstrating that our analytic results generalize qualitatively. Our work establishes a fruitful connection with statistical mechanics and demonstrates that influence maximization depends crucially on the temperature of the system, a fact that has not been appreciated by existing research.
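
    A minimal sketch of this formulation, assuming a toy star network and using a finite-difference estimate of the gradient as a crude stand-in for the susceptibility-based gradient described in the abstract:

```python
import numpy as np

def magnetization(J, h, beta, iters=500):
    """Mean-field self-consistency m_i = tanh(beta * (sum_j J_ij m_j + h_i))."""
    m = np.zeros(len(h))
    for _ in range(iters):
        m = np.tanh(beta * (J @ m + h))
    return m

def project_to_simplex(h, budget):
    """Euclidean projection onto {h >= 0, sum(h) = budget}."""
    u = np.sort(h)[::-1]
    css = np.cumsum(u) - budget
    rho = np.nonzero(u * np.arange(1, len(h) + 1) > css)[0][-1]
    return np.maximum(h - css[rho] / (rho + 1.0), 0.0)

def optimal_fields(J, beta, budget, steps=150, lr=0.5, eps=1e-4):
    """Projected gradient ascent on total magnetization M(h) = sum_i m_i(h),
    keeping the external-field allocation within the budget."""
    n = J.shape[0]
    h = np.full(n, budget / n)
    for _ in range(steps):
        base = magnetization(J, h, beta).sum()
        grad = np.empty(n)
        for i in range(n):
            hp = h.copy()
            hp[i] += eps
            grad[i] = (magnetization(J, hp, beta).sum() - base) / eps
        h = project_to_simplex(h + lr * grad, budget)
    return h

# Toy star network: node 0 is a high-degree hub
n = 5
J = np.zeros((n, n))
J[0, 1:] = J[1:, 0] = 1.0
h_opt = optimal_fields(J, beta=0.3, budget=1.0)
```

    Sweeping `beta` (inverse temperature) in this sketch is one way to observe the temperature dependence the authors describe: allocations shift between hub and peripheral nodes.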

  17. Coupled local facilitation and global hydrologic inhibition drive landscape geometry in a patterned peatland

    NASA Astrophysics Data System (ADS)

    Acharya, S.; Kaplan, D. A.; Casey, S.; Cohen, M. J.; Jawitz, J. W.

    2015-05-01

    Self-organized landscape patterning can arise in response to multiple processes. Discriminating among alternative patterning mechanisms, particularly where experimental manipulations are untenable, requires process-based models. Previous modeling studies have attributed patterning in the Everglades (Florida, USA) to sediment redistribution and anisotropic soil hydraulic properties. In this work, we tested an alternate theory, the self-organizing-canal (SOC) hypothesis, by developing a cellular automata model that simulates pattern evolution via local positive feedbacks (i.e., facilitation) coupled with a global negative feedback based on hydrology. The model is forced by global hydroperiod that drives stochastic transitions between two patch types: ridge (higher elevation) and slough (lower elevation). We evaluated model performance using multiple criteria based on six statistical and geostatistical properties observed in reference portions of the Everglades landscape: patch density, patch anisotropy, semivariogram ranges, power-law scaling of ridge areas, perimeter area fractal dimension, and characteristic pattern wavelength. Model results showed strong statistical agreement with reference landscapes, but only when anisotropically acting local facilitation was coupled with hydrologic global feedback, for which several plausible mechanisms exist. Critically, the model correctly generated fractal landscapes that had no characteristic pattern wavelength, supporting the invocation of global rather than scale-specific negative feedbacks.
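
    A stripped-down, isotropic toy version of such a cellular automaton (hypothetical parameters; the published model additionally includes anisotropic facilitation and a calibrated hydrologic forcing) illustrates coupling local facilitation with a global negative feedback:

```python
import numpy as np

rng = np.random.default_rng(0)

def step(grid, target_ridge_frac=0.5, strength=3.0, noise=0.1):
    """One synchronous update of a toy ridge (1) / slough (0) automaton:
    local facilitation from the four nearest neighbours plus a global
    negative feedback that pushes the landscape-wide ridge fraction
    toward a hydroperiod-set target."""
    nbrs = (np.roll(grid, 1, 0) + np.roll(grid, -1, 0) +
            np.roll(grid, 1, 1) + np.roll(grid, -1, 1))
    local = nbrs / 4.0 - 0.5                      # like begets like
    global_fb = target_ridge_frac - grid.mean()   # hydrologic inhibition
    p_ridge = 1.0 / (1.0 + np.exp(-strength * (local + global_fb)))
    p_ridge = (1.0 - noise) * p_ridge + noise * 0.5  # stochastic transitions
    return (rng.random(grid.shape) < p_ridge).astype(int)

grid = (rng.random((64, 64)) < 0.5).astype(int)
for _ in range(50):
    grid = step(grid)
```

    The local term coarsens patches while the global term holds the total ridge area near the target, which is the qualitative mechanism the SOC hypothesis invokes.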

  18. Coupled local facilitation and global hydrologic inhibition drive landscape geometry in a patterned peatland

    NASA Astrophysics Data System (ADS)

    Acharya, S.; Kaplan, D. A.; Casey, S.; Cohen, M. J.; Jawitz, J. W.

    2015-01-01

    Self-organized landscape patterning can arise in response to multiple processes. Discriminating among alternative patterning mechanisms, particularly where experimental manipulations are untenable, requires process-based models. Previous modeling studies have attributed patterning in the Everglades (Florida, USA) to sediment redistribution and anisotropic soil hydraulic properties. In this work, we tested an alternate theory, the self-organizing canal (SOC) hypothesis, by developing a cellular automata model that simulates pattern evolution via local positive feedbacks (i.e., facilitation) coupled with a global negative feedback based on hydrology. The model is forced by global hydroperiod that drives stochastic transitions between two patch types: ridge (higher elevation) and slough (lower elevation). We evaluated model performance using multiple criteria based on six statistical and geostatistical properties observed in reference portions of the Everglades landscape: patch density, patch anisotropy, semivariogram ranges, power-law scaling of ridge areas, perimeter area fractal dimension, and characteristic pattern wavelength. Model results showed strong statistical agreement with reference landscapes, but only when anisotropically acting local facilitation was coupled with hydrologic global feedback, for which several plausible mechanisms exist. Critically, the model correctly generated fractal landscapes that had no characteristic pattern wavelength, supporting the invocation of global rather than scale-specific negative feedbacks.

  19. Statistical model selection for better prediction and discovering science mechanisms that affect reliability

    DOE PAGES

    Anderson-Cook, Christine M.; Morzinski, Jerome; Blecker, Kenneth D.

    2015-08-19

    Understanding the impact of production, environmental exposure and age characteristics on the reliability of a population is frequently based on underlying science and empirical assessment. When there is incomplete science to prescribe which inputs should be included in a model of reliability to predict future trends, statistical model/variable selection techniques can be leveraged on a stockpile or population of units to improve reliability predictions as well as suggest new mechanisms affecting reliability to explore. We describe a five-step process for exploring relationships between available summaries of age, usage and environmental exposure and reliability. The process involves, first, identifying potential candidate inputs and, second, organizing data for the analysis. Third, a variety of models with different combinations of the inputs are estimated, and fourth, flexible metrics are used to compare them. Fifth, plots of the predicted relationships are examined to distill leading model contenders into a prioritized list for subject matter experts to understand and compare. The complexity of the model, quality of prediction and cost of future data collection are all factors to be considered by the subject matter experts when selecting a final model.
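
    The variable-selection idea can be sketched with a toy example, assuming invented stockpile data and AIC as the comparison metric (the paper does not prescribe a specific metric):

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stockpile data: three candidate inputs, of which only age
# and humidity actually drive the synthetic reliability response.
n = 200
X = {"age": rng.uniform(0, 20, n),
     "humidity": rng.uniform(0, 1, n),
     "vibration": rng.uniform(0, 5, n)}
y = 5.0 - 0.2 * X["age"] - 1.5 * X["humidity"] + rng.normal(0, 0.3, n)

def fit_aic(names):
    """Least-squares fit of y on an intercept plus the named inputs;
    AIC balances fit quality against model complexity."""
    A = np.column_stack([np.ones(n)] + [X[v] for v in names])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = ((y - A @ coef) ** 2).sum()
    return n * np.log(rss / n) + 2 * A.shape[1]

# Estimate every combination of inputs, then compare by AIC
models = sorted((fit_aic(c), c)
                for r in range(len(X) + 1)
                for c in itertools.combinations(X, r))
best_aic, best_inputs = models[0]
```

    With strong synthetic signals, the lowest-AIC model retains the inputs that truly affect the response; the sorted list plays the role of the prioritized list handed to subject matter experts.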

  20. Characterization of protein folding by a Φ-value calculation with a statistical-mechanical model.

    PubMed

    Wako, Hiroshi; Abe, Haruo

    2016-01-01

    The Φ-value analysis approach provides information about transition-state structures along the folding pathway of a protein by measuring the effects of an amino acid mutation on folding kinetics. Here we compared the theoretically calculated Φ values of 27 proteins with their experimentally observed Φ values; the theoretical values were calculated using a simple statistical-mechanical model of protein folding. The theoretically calculated Φ values reflected the corresponding experimentally observed Φ values with reasonable accuracy for many of the proteins, but not for all. The correlation between the theoretically calculated and experimentally observed Φ values strongly depends on whether the protein-folding mechanism assumed in the model holds true in real proteins. In other words, the correlation coefficient can be expected to illuminate the folding mechanisms of proteins, providing the answer to the question of which model more accurately describes protein folding: the framework model or the nucleation-condensation model. In addition, we tried to characterize protein folding with respect to various properties of each protein apart from the size and fold class, such as the free-energy profile, contact-order profile, and sensitivity to the parameters used in the Φ-value calculation. The results showed that any one of these properties alone was not enough to explain protein folding, although each one played a significant role in it. We have confirmed the importance of characterizing protein folding from various perspectives. Our findings have also highlighted that protein folding is highly variable and unique across different proteins, and this should be considered while pursuing a unified theory of protein folding.

  1. Characterization of protein folding by a Φ-value calculation with a statistical-mechanical model

    PubMed Central

    Wako, Hiroshi; Abe, Haruo

    2016-01-01

    The Φ-value analysis approach provides information about transition-state structures along the folding pathway of a protein by measuring the effects of an amino acid mutation on folding kinetics. Here we compared the theoretically calculated Φ values of 27 proteins with their experimentally observed Φ values; the theoretical values were calculated using a simple statistical-mechanical model of protein folding. The theoretically calculated Φ values reflected the corresponding experimentally observed Φ values with reasonable accuracy for many of the proteins, but not for all. The correlation between the theoretically calculated and experimentally observed Φ values strongly depends on whether the protein-folding mechanism assumed in the model holds true in real proteins. In other words, the correlation coefficient can be expected to illuminate the folding mechanisms of proteins, providing the answer to the question of which model more accurately describes protein folding: the framework model or the nucleation-condensation model. In addition, we tried to characterize protein folding with respect to various properties of each protein apart from the size and fold class, such as the free-energy profile, contact-order profile, and sensitivity to the parameters used in the Φ-value calculation. The results showed that any one of these properties alone was not enough to explain protein folding, although each one played a significant role in it. We have confirmed the importance of characterizing protein folding from various perspectives. Our findings have also highlighted that protein folding is highly variable and unique across different proteins, and this should be considered while pursuing a unified theory of protein folding. PMID:28409079

  2. Spin Glass a Bridge Between Quantum Computation and Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Ohzeki, Masayuki

    2013-09-01

    In this chapter, we present two fascinating topics lying between quantum information processing and statistical mechanics. First, we introduce an elaborate technique, the surface code, for preparing a particular quantum state with robustness against decoherence. Interestingly, the theoretical limitation of the surface code, the accuracy threshold for restoring the quantum state, has a close connection with a phase-transition problem in a special model known as a spin glass, one of the most active research areas in statistical mechanics. The phase transition in spin glasses is an intractable problem, since we must treat a many-body system with complicated interactions whose signs change depending on the distance between spins. Fortunately, recent progress in spin-glass theory enables us to predict the precise location of the critical point at which the phase transition occurs. This means that statistical mechanics can reveal one of the most interesting aspects of quantum information processing. We show how to import this specialized tool of statistical mechanics into the problem of the accuracy threshold in quantum computation. Second, we describe another interesting technique that employs quantum effects: quantum annealing. The purpose of quantum annealing is to search for the most favored solution of a multivariable function, that is, to solve an optimization problem. The most typical instance is the traveling salesman problem: finding the minimum-length tour that visits all the cities. In quantum annealing, we introduce quantum fluctuations to drive a system with an artificial Hamiltonian whose ground state represents the optimal solution of the specific problem we desire to solve. Inducing quantum fluctuations gives rise to the quantum tunneling effect, which allows nontrivial hopping from state to state. We then sketch a strategy for controlling the quantum fluctuations efficiently so as to reach the ground state. Such a generic framework is called quantum annealing; its most typical instance is quantum adiabatic computation, based on the adiabatic theorem. Unfortunately, as discussed in another chapter, quantum adiabatic computation has a crucial bottleneck for a subset of optimization problems. We introduce several recent attempts to overcome this weak point by using developments in statistical mechanics. Through both topics, we shed light on the birth of an interdisciplinary field between quantum mechanics and statistical mechanics.
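
    A classical simulated-annealing sketch conveys the same search idea with thermal rather than quantum fluctuations (toy spin-glass instance; all parameters are illustrative):

```python
import itertools
import math
import random

random.seed(0)

# Toy spin glass: random-sign pair couplings between n spins
n = 8
J = {(i, j): random.choice([-1.0, 1.0])
     for i in range(n) for j in range(i + 1, n)}

def energy(s):
    return -sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

def anneal(sweeps=2000, t_start=3.0, t_end=0.05):
    """Metropolis dynamics under a geometric cooling schedule: thermal
    fluctuations play the role that quantum fluctuations play in
    quantum annealing."""
    s = [random.choice([-1, 1]) for _ in range(n)]
    e = energy(s)
    for k in range(sweeps):
        t = t_start * (t_end / t_start) ** (k / (sweeps - 1))
        i = random.randrange(n)
        s[i] = -s[i]                      # propose a single spin flip
        e_new = energy(s)
        if e_new <= e or random.random() < math.exp(-(e_new - e) / t):
            e = e_new                     # accept
        else:
            s[i] = -s[i]                  # reject: undo the flip
    return s, e

best_s, best_e = anneal()

# Exact ground-state energy by brute force over all 2^n configurations
ground = min(energy(s) for s in itertools.product([-1, 1], repeat=n))
```

    Replacing the thermal acceptance rule with a schedule of quantum fluctuations (a transverse field) turns this classical loop into the quantum annealing protocol described above.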

  3. Validation of a Statistical Methodology for Extracting Vegetation Feedbacks: Focus on North African Ecosystems in the Community Earth System Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Yan; Notaro, Michael; Wang, Fuyao

    Generalized equilibrium feedback assessment (GEFA) is a potentially valuable multivariate statistical tool for extracting vegetation feedbacks to the atmosphere in either observations or coupled Earth system models. The reliability of GEFA at capturing the terrestrial impacts on regional climate is demonstrated in this paper using the National Center for Atmospheric Research Community Earth System Model (CESM), with focus on North Africa. The feedback is assessed statistically by applying GEFA to output from a fully coupled control run. To reduce the sampling error caused by short data records, the traditional or full GEFA is refined through stepwise GEFA by dropping unimportant forcings. Two ensembles of dynamical experiments are developed for the Sahel or West African monsoon region against which GEFA-based vegetation feedbacks are evaluated. In these dynamical experiments, regional leaf area index (LAI) is modified either alone or in conjunction with soil moisture, with the latter runs motivated by strong regional soil moisture–LAI coupling. Stepwise GEFA boasts higher consistency between statistically and dynamically assessed atmospheric responses to land surface anomalies than full GEFA, especially with short data records. GEFA-based atmospheric responses are more consistent with the coupled soil moisture–LAI experiments, indicating that GEFA is assessing the combined impacts of coupled vegetation and soil moisture. Finally, both the statistical and dynamical assessments reveal a negative vegetation–rainfall feedback in the Sahel associated with an atmospheric stability mechanism in CESM versus a weaker positive feedback in the West African monsoon region associated with a moisture recycling mechanism in CESM.
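
    The core full-GEFA estimator can be sketched on synthetic data. The lagged-covariance formula below follows the standard equilibrium-feedback construction; the series and coefficients are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic test bed: two slowly varying "surface" forcings (think LAI and
# soil moisture) and an atmospheric variable that responds to them with
# known coefficients, plus fast weather noise.
T, k = 5000, 2
F = np.zeros((T, k))
for t in range(1, T):
    F[t] = 0.9 * F[t - 1] + rng.normal(size=k)   # red-noise forcings
true_B = np.array([0.8, -0.5])
A = F @ true_B + rng.normal(size=T)              # response + weather noise

def gefa(A, F, lag=1):
    """Full GEFA estimator B = Cov(A(t), F(t-lag)) Cov(F(t), F(t-lag))^{-1}.
    Lagging removes contamination from fast atmospheric noise, which is
    uncorrelated with the earlier state of the slow forcings."""
    Al, Ft, Fl = A[lag:], F[lag:], F[:-lag]
    c_af = (Al - Al.mean()) @ (Fl - Fl.mean(0)) / len(Al)       # 1 x k
    c_ff = (Ft - Ft.mean(0)).T @ (Fl - Fl.mean(0)) / len(Al)    # k x k
    return np.linalg.solve(c_ff.T, c_af)

B_hat = gefa(A, F)
```

    Stepwise GEFA, as the abstract notes, would additionally drop forcings whose estimated coefficients are statistically unimportant before re-estimating.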

  4. Validation of a Statistical Methodology for Extracting Vegetation Feedbacks: Focus on North African Ecosystems in the Community Earth System Model

    DOE PAGES

    Yu, Yan; Notaro, Michael; Wang, Fuyao; ...

    2018-02-05

    Generalized equilibrium feedback assessment (GEFA) is a potentially valuable multivariate statistical tool for extracting vegetation feedbacks to the atmosphere in either observations or coupled Earth system models. The reliability of GEFA at capturing the terrestrial impacts on regional climate is demonstrated in this paper using the National Center for Atmospheric Research Community Earth System Model (CESM), with focus on North Africa. The feedback is assessed statistically by applying GEFA to output from a fully coupled control run. To reduce the sampling error caused by short data records, the traditional or full GEFA is refined through stepwise GEFA by dropping unimportant forcings. Two ensembles of dynamical experiments are developed for the Sahel or West African monsoon region against which GEFA-based vegetation feedbacks are evaluated. In these dynamical experiments, regional leaf area index (LAI) is modified either alone or in conjunction with soil moisture, with the latter runs motivated by strong regional soil moisture–LAI coupling. Stepwise GEFA boasts higher consistency between statistically and dynamically assessed atmospheric responses to land surface anomalies than full GEFA, especially with short data records. GEFA-based atmospheric responses are more consistent with the coupled soil moisture–LAI experiments, indicating that GEFA is assessing the combined impacts of coupled vegetation and soil moisture. Finally, both the statistical and dynamical assessments reveal a negative vegetation–rainfall feedback in the Sahel associated with an atmospheric stability mechanism in CESM versus a weaker positive feedback in the West African monsoon region associated with a moisture recycling mechanism in CESM.

  5. Exposure time independent summary statistics for assessment of drug dependent cell line growth inhibition.

    PubMed

    Falgreen, Steffen; Laursen, Maria Bach; Bødker, Julie Støve; Kjeldsen, Malene Krag; Schmitz, Alexander; Nyegaard, Mette; Johnsen, Hans Erik; Dybkær, Karen; Bøgsted, Martin

    2014-06-05

    In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves' dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested, consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) a resampling-based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time, so that fast growing cell lines are considered overly sensitive compared to slowly growing ones. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significantly diverse range of responses, ensuring it is useful for biological interpretations. Time independent summary statistics may aid the understanding of drugs' mechanisms of action on tumour cells and potentially renew previous drug sensitivity evaluation studies.
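
    Step 2 of the workflow, isotonic regression of a dose-response curve, can be sketched with a self-contained pool-adjacent-violators routine (hypothetical viability readings; the paper's actual pipeline first fits growth models to obtain the responses):

```python
import numpy as np

def pava_decreasing(y, w=None):
    """Pool-adjacent-violators: least-squares fit of a non-increasing
    sequence to y (viability should not rise with dose)."""
    y = list(map(float, y))
    w = [1.0] * len(y) if w is None else list(map(float, w))
    vals, wts, cnt = [], [], []
    for yi, wi in zip(y, w):
        vals.append(yi); wts.append(wi); cnt.append(1)
        # merge blocks while the non-increasing constraint is violated
        while len(vals) > 1 and vals[-2] < vals[-1]:
            wtot = wts[-2] + wts[-1]
            vals[-2] = (wts[-2] * vals[-2] + wts[-1] * vals[-1]) / wtot
            wts[-2] = wtot; cnt[-2] += cnt[-1]
            vals.pop(); wts.pop(); cnt.pop()
    return np.repeat(vals, cnt)

# Hypothetical relative-viability readings over increasing doses
doses = [0.01, 0.1, 1.0, 10.0, 100.0]
viability = [1.02, 0.95, 0.97, 0.40, 0.12]   # 0.97 violates monotonicity
fit = pava_decreasing(viability)
```

    The violating pair (0.95, 0.97) is pooled to its weighted mean, yielding a monotone curve suitable for the time-independent summary statistics.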

  6. Exposure time independent summary statistics for assessment of drug dependent cell line growth inhibition

    PubMed Central

    2014-01-01

    Background In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves’ dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. Results First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested, consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) a resampling-based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time, so that fast growing cell lines are considered overly sensitive compared to slowly growing ones. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significantly diverse range of responses, ensuring it is useful for biological interpretations. Conclusion Time independent summary statistics may aid the understanding of drugs’ mechanisms of action on tumour cells and potentially renew previous drug sensitivity evaluation studies. PMID:24902483

  7. Parametric Analysis to Study the Influence of Aerogel-Based Renders' Components on Thermal and Mechanical Performance.

    PubMed

    Ximenes, Sofia; Silva, Ana; Soares, António; Flores-Colen, Inês; de Brito, Jorge

    2016-05-04

    Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study's objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect.
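
    A minimal sketch of this kind of multiple linear regression, using invented component dosages and conductivities in place of the study's 85 laboratory mixes:

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented data: dosages of three components per mix and a synthetic
# thermal conductivity that aerial lime and perlite lower while
# expanded clay slightly raises (signs chosen to echo the abstract).
n = 85                                    # same number of mixes as the study
X = rng.uniform(0, 1, size=(n, 3))        # [aerial lime, perlite, expanded clay]
true_beta = np.array([-0.030, -0.015, 0.010])
conductivity = 0.08 + X @ true_beta + rng.normal(0, 0.002, n)

A = np.column_stack([np.ones(n), X])      # intercept + predictors
beta, *_ = np.linalg.lstsq(A, conductivity, rcond=None)
intercept, effects = beta[0], beta[1:]
```

    The signs and magnitudes of the fitted coefficients are what such an SPSS-style analysis reports: which components improve a property, which worsen it, and by how much per unit of dosage.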

  8. Parametric Analysis to Study the Influence of Aerogel-Based Renders’ Components on Thermal and Mechanical Performance

    PubMed Central

    Ximenes, Sofia; Silva, Ana; Soares, António; Flores-Colen, Inês; de Brito, Jorge

    2016-01-01

    Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study’s objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect. PMID:28773460

  9. Avalanches, loading and finite size effects in 2D amorphous plasticity: results from a finite element model

    NASA Astrophysics Data System (ADS)

    Sandfeld, Stefan; Budrikis, Zoe; Zapperi, Stefano; Fernandez Castellanos, David

    2015-02-01

    Crystalline plasticity is strongly interlinked with dislocation mechanics and is nowadays relatively well understood. Concepts and physical models of plastic deformation in amorphous materials, on the other hand, where the concept of linear lattice defects is not applicable, are still lagging behind. We introduce an eigenstrain-based finite element lattice model for simulations of shear band formation and strain avalanches. Our model allows us to study the influence of surfaces and finite size effects on the statistics of avalanches. We find that even with relatively complex loading conditions and open boundary conditions, critical exponents describing avalanche statistics are unchanged, which validates the use of simpler scalar lattice-based models to study these phenomena.
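
    Avalanche statistics of this kind are typically summarized by a power-law exponent; a sketch of the standard continuous maximum-likelihood estimator, run on synthetic sizes rather than simulation output:

```python
import numpy as np

rng = np.random.default_rng(4)

def powerlaw_mle(s, s_min=1.0):
    """Continuous maximum-likelihood estimate of the exponent tau in
    P(s) ~ s^(-tau) for s >= s_min (the Clauset-style estimator)."""
    s = np.asarray(s, float)
    s = s[s >= s_min]
    return 1.0 + len(s) / np.sum(np.log(s / s_min))

# Synthetic avalanche sizes drawn from a pure power law with tau = 2.5
# via inverse-transform sampling (s_min = 1).
tau_true = 2.5
u = rng.random(20000)
sizes = (1.0 - u) ** (-1.0 / (tau_true - 1.0))
tau_hat = powerlaw_mle(sizes)
```

    Comparing such exponents between open-boundary and bulk samples is how one checks, as the abstract does, whether the critical exponents remain unchanged.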

  10. Radiation detection method and system using the sequential probability ratio test

    DOEpatents

    Nelson, Karl E [Livermore, CA; Valentine, John D [Redwood City, CA; Beauchamp, Brock R [San Ramon, CA

    2007-07-17

    A method and system using the Sequential Probability Ratio Test to enhance the detection of an elevated level of radiation, by determining whether a set of observations is consistent with a specified model within given bounds of statistical significance. In particular, the SPRT is used in the present invention to maximize the range of detection by providing processing mechanisms for estimating the dynamic background radiation, adjusting the models to reflect the amount of background knowledge at the current point in time, analyzing the current sample using the models to determine statistical significance, and determining when the sample has returned to the expected background conditions.
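
    A minimal sketch of the SPRT decision loop for Poisson count data (illustrative rates and error bounds; the patented system additionally estimates a dynamic background):

```python
import math
import random

random.seed(5)

def sprt_poisson(counts, bg_rate, src_rate, alpha=0.001, beta=0.001):
    """Sequential probability ratio test on a stream of detector counts:
    H0 background-only (bg_rate) vs H1 background+source (src_rate).
    Returns the decision and the number of samples consumed."""
    upper = math.log((1 - beta) / alpha)    # accept H1 above this
    lower = math.log(beta / (1 - alpha))    # accept H0 below this
    llr = 0.0
    for i, c in enumerate(counts, 1):
        # Poisson log-likelihood ratio for one observation (c! cancels)
        llr += c * math.log(src_rate / bg_rate) - (src_rate - bg_rate)
        if llr >= upper:
            return "H1", i
        if llr <= lower:
            return "H0", i
    return "undecided", len(counts)

def poisson(lam):
    """Knuth's Poisson sampler, to keep the sketch dependency-free."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

# Simulated stream with an elevated true rate of 12 counts per interval
stream = [poisson(12.0) for _ in range(500)]
decision, n_used = sprt_poisson(stream, bg_rate=8.0, src_rate=12.0)
```

    The sequential character is the point: the test typically fires after only a handful of intervals at this rate separation, which is what maximizes detection range for a moving or brief source.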

  11. ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prigogine, I.; Balescu, R.; Henin, F.

    1960-12-01

    Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)

  12. Random bursts determine dynamics of active filaments.

    PubMed

    Weber, Christoph A; Suzuki, Ryo; Schaller, Volker; Aranson, Igor S; Bausch, Andreas R; Frey, Erwin

    2015-08-25

    Constituents of living or synthetic active matter have access to a local energy supply that serves to keep the system out of thermal equilibrium. The statistical properties of such fluctuating active systems differ from those of their equilibrium counterparts. Using the actin filament gliding assay as a model, we studied how nonthermal distributions emerge in active matter. We found that the basic mechanism involves the interplay between local and random injection of energy, acting as an analog of a thermal heat bath, and nonequilibrium energy dissipation processes associated with sudden jump-like changes in the system's dynamic variables. We show here how such a mechanism leads to a nonthermal distribution of filament curvatures with a non-Gaussian shape. The experimental curvature statistics and filament relaxation dynamics are reproduced quantitatively by stochastic computer simulations and a simple kinetic model.

  13. Random bursts determine dynamics of active filaments

    PubMed Central

    Weber, Christoph A.; Suzuki, Ryo; Schaller, Volker; Aranson, Igor S.; Bausch, Andreas R.; Frey, Erwin

    2015-01-01

    Constituents of living or synthetic active matter have access to a local energy supply that serves to keep the system out of thermal equilibrium. The statistical properties of such fluctuating active systems differ from those of their equilibrium counterparts. Using the actin filament gliding assay as a model, we studied how nonthermal distributions emerge in active matter. We found that the basic mechanism involves the interplay between local and random injection of energy, acting as an analog of a thermal heat bath, and nonequilibrium energy dissipation processes associated with sudden jump-like changes in the system’s dynamic variables. We show here how such a mechanism leads to a nonthermal distribution of filament curvatures with a non-Gaussian shape. The experimental curvature statistics and filament relaxation dynamics are reproduced quantitatively by stochastic computer simulations and a simple kinetic model. PMID:26261319

  14. The unrealized promise of infant statistical word-referent learning

    PubMed Central

    Smith, Linda B.; Suanda, Sumarga H.; Yu, Chen

    2014-01-01

    Recent theory and experiments offer a new solution as to how infant learners may break into word learning, by using cross-situational statistics to find the underlying word-referent mappings. Computational models demonstrate the in-principle plausibility of this statistical learning solution and experimental evidence shows that infants can aggregate and make statistically appropriate decisions from word-referent co-occurrence data. We review these contributions and then identify the gaps in current knowledge that prevent a confident conclusion about whether cross-situational learning is the mechanism through which infants break into word learning. We propose an agenda to address that gap that focuses on detailing the statistics in the learning environment and the cognitive processes that make use of those statistics. PMID:24637154
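
    The aggregation step can be sketched as simple word-referent co-occurrence counting over hypothetical scenes: no single scene is unambiguous, but the cross-situational statistics are.

```python
from collections import defaultdict

# Invented mini-corpus: each "scene" pairs heard words with visible referents
scenes = [
    ({"ball", "dog"}, {"BALL", "DOG"}),
    ({"ball", "cup"}, {"BALL", "CUP"}),
    ({"dog", "cup"},  {"DOG", "CUP"}),
    ({"ball", "dog"}, {"BALL", "DOG"}),
]

cooc = defaultdict(int)
word_count = defaultdict(int)
for words, referents in scenes:
    for w in words:
        word_count[w] += 1
        for r in referents:
            cooc[(w, r)] += 1

# Map each word to the referent with the highest co-occurrence proportion
lexicon = {}
for w in word_count:
    props = {r: c / word_count[w] for (w2, r), c in cooc.items() if w2 == w}
    lexicon[w] = max(props, key=props.get)
```

    This batch count-and-argmax decision rule is only one of several candidate mechanisms; part of the open question the authors raise is which online cognitive process actually exploits these statistics.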

  15. Fault Diagnosis for Rotating Machinery Using Vibration Measurement Deep Statistical Feature Learning.

    PubMed

    Li, Chuan; Sánchez, René-Vinicio; Zurita, Grover; Cerrada, Mariela; Cabrera, Diego

    2016-06-17

    Fault diagnosis is important for the maintenance of rotating machinery. The detection of faults and fault patterns is a challenging part of machinery fault diagnosis. To tackle this problem, a model for deep statistical feature learning from vibration measurements of rotating machinery is presented in this paper. Vibration sensor signals collected from rotating mechanical systems are represented in the time, frequency, and time-frequency domains, each of which is then used to produce a statistical feature set. For learning statistical features, real-valued Gaussian-Bernoulli restricted Boltzmann machines (GRBMs) are stacked to develop a Gaussian-Bernoulli deep Boltzmann machine (GDBM). The suggested approach is applied as a deep statistical feature learning tool for both gearbox and bearing systems. In experiments, the approach achieved fault classification rates of 95.17% for the gearbox and 91.75% for the bearing system. The proposed approach is compared with standard methods such as a support vector machine, a GRBM, and a combination model; the best fault classification rate was obtained with the proposed model. The results show that deep learning with statistical feature extraction has substantial potential for improving the diagnosis of rotating machinery faults.
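    As a rough illustration of the statistical-feature step described above (the paper's exact feature definitions and the GDBM training itself are not reproduced here), one can compute a feature vector from time- and frequency-domain representations of a simulated vibration signal:

```python
import numpy as np

def stat_features(x):
    """Basic statistical descriptors of a 1-D signal (an illustrative set;
    not the paper's exact feature definitions)."""
    x = np.asarray(x, dtype=float)
    mean, std = x.mean(), x.std()
    rms = np.sqrt(np.mean(x**2))
    skew = np.mean((x - mean)**3) / std**3
    kurt = np.mean((x - mean)**4) / std**4
    return np.array([mean, std, rms, skew, kurt, x.max(), x.min()])

# Simulated vibration signal: a tone plus noise (stand-in for sensor data).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2048, endpoint=False)
signal = np.sin(2 * np.pi * 50 * t) + 0.3 * rng.standard_normal(t.size)

time_feats = stat_features(signal)                       # time domain
freq_feats = stat_features(np.abs(np.fft.rfft(signal)))  # frequency domain
features = np.concatenate([time_feats, freq_feats])      # input to the deep net
print(features.shape)
```

In the paper's pipeline, vectors like `features` (plus a time-frequency set) would be fed to the stacked GRBMs; here they simply show how domain representations each yield a statistical feature set.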

  16. Fault Diagnosis for Rotating Machinery Using Vibration Measurement Deep Statistical Feature Learning

    PubMed Central

    Li, Chuan; Sánchez, René-Vinicio; Zurita, Grover; Cerrada, Mariela; Cabrera, Diego

    2016-01-01

    Fault diagnosis is important for the maintenance of rotating machinery. The detection of faults and fault patterns is a challenging part of machinery fault diagnosis. To tackle this problem, a model for deep statistical feature learning from vibration measurements of rotating machinery is presented in this paper. Vibration sensor signals collected from rotating mechanical systems are represented in the time, frequency, and time-frequency domains, each of which is then used to produce a statistical feature set. For learning statistical features, real-valued Gaussian-Bernoulli restricted Boltzmann machines (GRBMs) are stacked to develop a Gaussian-Bernoulli deep Boltzmann machine (GDBM). The suggested approach is applied as a deep statistical feature learning tool for both gearbox and bearing systems. In experiments, the approach achieved fault classification rates of 95.17% for the gearbox and 91.75% for the bearing system. The proposed approach is compared with standard methods such as a support vector machine, a GRBM, and a combination model; the best fault classification rate was obtained with the proposed model. The results show that deep learning with statistical feature extraction has substantial potential for improving the diagnosis of rotating machinery faults. PMID:27322273

  17. Modeling Selection and Extinction Mechanisms of Biological Systems

    NASA Astrophysics Data System (ADS)

    Amirjanov, Adil

    In this paper, the behavior of a genetic algorithm is modeled to enhance its applicability as a modeling tool for biological systems. A new description model for the selection mechanism is introduced, which operates on a portion of the individuals in a population. The extinction and recolonization mechanism is modeled, and solving the dynamics analytically shows that genetic drift in a population with extinction/recolonization is doubled. A mathematical analysis of the interaction between the selection and extinction/recolonization processes is carried out to assess the dynamics of the macroscopic statistical properties of the population. Computer simulations confirm that the theoretical predictions of the described models are good approximations. A mathematical model of GA dynamics describing anti-predator vigilance in an animal group was also examined against a known analytical solution of the problem, showing good agreement in finding the evolutionarily stable strategies.

  18. Kinetic exchange models: From molecular physics to social science

    NASA Astrophysics Data System (ADS)

    Patriarca, Marco; Chakraborti, Anirban

    2013-08-01

    We discuss several multi-agent models that have their origin in the kinetic exchange theory of statistical mechanics and have been recently applied to a variety of problems in the social sciences. This class of models can be easily adapted for simulations in areas other than physics, such as the modeling of income and wealth distributions in economics and opinion dynamics in sociology.
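    A minimal kinetic exchange simulation in the spirit of this model class (the random-sharing rule and parameters below are illustrative, not a specific model from the article): at each step two agents pool their wealth and split it at random, and the stationary wealth distribution is approximately exponential (Boltzmann-Gibbs-like).

```python
import numpy as np

# Minimal kinetic exchange ("random sharing") model of wealth.
rng = np.random.default_rng(1)
n_agents, steps = 1000, 200_000
wealth = np.ones(n_agents)  # everyone starts with one unit

for _ in range(steps):
    i, j = rng.integers(n_agents, size=2)
    if i == j:
        continue
    pool = wealth[i] + wealth[j]
    eps = rng.random()
    wealth[i], wealth[j] = eps * pool, (1 - eps) * pool

print(wealth.mean())        # total wealth is conserved: mean stays 1
print((wealth < 1).mean())  # for an exponential, P(w < mean) = 1 - 1/e ~ 0.63
```

The same skeleton adapts directly to the other applications mentioned: reinterpreting "wealth" as energy recovers the molecular-physics origin, and modified exchange rules (e.g. saving propensities) give the richer income-distribution variants.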

  19. Comparison of animal discs used in disc research to human lumbar disc: torsion mechanics and collagen content.

    PubMed

    Showalter, Brent L; Beckstein, Jesse C; Martin, John T; Beattie, Elizabeth E; Espinoza Orías, Alejandro A; Schaer, Thomas P; Vresilovic, Edward J; Elliott, Dawn M

    2012-07-01

    Experimental measurement and normalization of in vitro disc torsion mechanics and collagen content for several animal species used in intervertebral disc research, and comparison of these with the human disc. To aid in the selection of appropriate animal models for disc research by measuring torsional mechanical properties and collagen content. There is a lack of data, and variability in testing protocols, for comparing animal and human disc torsion mechanics and collagen content. Intervertebral disc torsion mechanics were measured and normalized by disc height and polar moment of inertia for 11 disc types in 8 mammalian species: the calf, pig, baboon, goat, sheep, rabbit, rat, and mouse lumbar discs, and cow, rat, and mouse caudal discs. Collagen content was measured and normalized by dry weight for the same discs except the rat and the mouse. Collagen fiber stretch in torsion was calculated using an analytical model. Measured torsion parameters varied by several orders of magnitude across the different species. After geometric normalization, only the sheep and pig discs were statistically different from human discs. Fiber stretch was found to be highly dependent on the assumed initial fiber angle. The collagen content of the discs was similar, especially in the outer annulus, where only the calf and goat discs were statistically different from human. Disc collagen content did not correlate with torsion mechanics. Disc torsion mechanics are comparable with human lumbar discs in 9 of 11 disc types after normalization by geometry. The normalized torsion mechanics and collagen content of the multiple animal discs presented are useful for selecting and interpreting results for animal disc models. Structural organization, such as the initial fiber angle, may explain the differences noted between species after geometric normalization.

  20. Comparison of Animal Discs Used in Disc Research to Human Lumbar Disc: Torsion Mechanics and Collagen Content

    PubMed Central

    Showalter, Brent L.; Beckstein, Jesse C.; Martin, John T.; Beattie, Elizabeth E.; Orías, Alejandro A. Espinoza; Schaer, Thomas P.; Vresilovic, Edward J.; Elliott, Dawn M.

    2012-01-01

    Study Design Experimental measurement and normalization of in vitro disc torsion mechanics and collagen content for several animal species used in intervertebral disc research, and comparison of these with the human disc. Objective To aid in the selection of appropriate animal models for disc research by measuring torsional mechanical properties and collagen content. Summary of Background Data There is a lack of data, and variability in testing protocols, for comparing animal and human disc torsion mechanics and collagen content. Methods Intervertebral disc torsion mechanics were measured and normalized by disc height and polar moment of inertia for 11 disc types in 8 mammalian species: the calf, pig, baboon, goat, sheep, rabbit, rat, and mouse lumbar, and cow, rat, and mouse caudal. Collagen content was measured and normalized by dry weight for the same discs except the rat and mouse. Collagen fiber stretch in torsion was calculated using an analytical model. Results Measured torsion parameters varied by several orders of magnitude across the different species. After geometric normalization, only the sheep and pig discs were statistically different from human. Fiber stretch was found to be highly dependent on the assumed initial fiber angle. The collagen content of the discs was similar, especially in the outer annulus, where only the calf and goat discs were statistically different from human. Disc collagen content did not correlate with torsion mechanics. Conclusion Disc torsion mechanics are comparable to human lumbar discs in 9 of 11 disc types after normalization by geometry. The normalized torsion mechanics and collagen content of the multiple animal discs presented are useful for selecting and interpreting results for animal models of the disc. Structural composition of the disc, such as initial fiber angle, may explain the differences that were noted between species after geometric normalization. PMID:22333953

  1. Guidelines for the Investigation of Mediating Variables in Business Research.

    PubMed

    MacKinnon, David P; Coxe, Stefany; Baraldi, Amanda N

    2012-03-01

    Business theories often specify the mediating mechanisms by which a predictor variable affects an outcome variable. In the last 30 years, investigations of mediating processes have become more widespread with corresponding developments in statistical methods to conduct these tests. The purpose of this article is to provide guidelines for mediation studies by focusing on decisions made prior to the research study that affect the clarity of conclusions from a mediation study, the statistical models for mediation analysis, and methods to improve interpretation of mediation results after the research study. Throughout this article, the importance of a program of experimental and observational research for investigating mediating mechanisms is emphasized.
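    As one concrete instance of the statistical models referred to above, the widely used product-of-coefficients estimate of a mediated effect can be sketched on simulated data (variable names, effect sizes, and the simulated data are illustrative, not from the article):

```python
import numpy as np

# Product-of-coefficients mediation sketch: X -> M -> Y.
# Regress M on X (slope a), then Y on X and M (slope of M is b);
# the mediated effect is estimated as a*b. Data are simulated.
rng = np.random.default_rng(42)
n = 5000
x = rng.standard_normal(n)
m = 0.5 * x + rng.standard_normal(n)            # true a = 0.5
y = 0.3 * x + 0.7 * m + rng.standard_normal(n)  # true b = 0.7

def ols(cols, target):
    """Ordinary least squares with an intercept; returns coefficients."""
    X = np.column_stack([np.ones(len(target))] + list(cols))
    return np.linalg.lstsq(X, target, rcond=None)[0]

a = ols([x], m)[1]     # slope of X in the regression M ~ X
b = ols([x, m], y)[2]  # slope of M in the regression Y ~ X + M
print(a * b)  # estimated mediated effect, close to 0.5 * 0.7 = 0.35
```

This is only the point estimate; the guidelines in the article concern the harder parts, such as design decisions and inference on a*b.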

  2. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    PubMed Central

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists. PMID:22582739
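    A toy version of the causal inference described above, comparing two causal hypotheses by Bayesian updating on co-occurrence data (blicket-detector-style; the hypotheses, probabilities, and trials are illustrative assumptions, not taken from the article):

```python
# Compare two hypotheses about whether object A activates a detector.
# H1: A is causal (detector fires with prob 0.9 when A is present,
# 0.1 otherwise); H0: A is not causal (fires with prob 0.1 regardless).
trials = [(True, True), (True, True), (False, False), (True, True), (False, False)]

def likelihood(h1, a_present, fired):
    p = 0.9 if (h1 and a_present) else 0.1
    return p if fired else 1 - p

post = {True: 0.5, False: 0.5}  # uniform prior over H1/H0
for a_present, fired in trials:
    for h in post:
        post[h] *= likelihood(h, a_present, fired)
total = sum(post.values())
post = {h: p / total for h, p in post.items()}
print(post[True])  # posterior that A is causal, near 1 after these trials
```

Even this caricature shows the key property the article stresses: graded, statistically appropriate belief revision from sparse observational data.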

  3. Reverse engineering systems models of regulation: discovery, prediction and mechanisms.

    PubMed

    Ashworth, Justin; Wurtmann, Elisabeth J; Baliga, Nitin S

    2012-08-01

    Biological systems can now be understood in comprehensive and quantitative detail using systems biology approaches. Putative genome-scale models can be built rapidly based upon biological inventories and strategic system-wide molecular measurements. Current models combine statistical associations, causative abstractions, and known molecular mechanisms to explain and predict quantitative and complex phenotypes. This top-down 'reverse engineering' approach generates useful organism-scale models despite noise and incompleteness in data and knowledge. Here we review and discuss the reverse engineering of biological systems using top-down data-driven approaches, in order to improve discovery, hypothesis generation, and the inference of biological properties.

  4. A statistical physics viewpoint on the dynamics of the bouncing ball

    NASA Astrophysics Data System (ADS)

    Chastaing, Jean-Yonnel; Géminard, Jean-Christophe; Bertin, Eric

    2016-06-01

    We compute, from a statistical physics perspective, the dynamics of a bouncing ball maintained in a chaotic regime by collisions with a plate undergoing an aperiodic vibration. We analyze in detail the energy exchanges between the bead and the vibrating plate, and show that the coupling between the bead and the plate can be modeled in terms of both a dissipative process and an injection mechanism by an energy reservoir. An analysis of the injection statistics in terms of a fluctuation relation is also provided.
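    The injection/dissipation balance described above can be caricatured by a one-variable stochastic recursion (a sketch under assumed parameters, not the authors' model of the bead-plate coupling): random kicks play the role of the energy reservoir, and inelastic collisions dissipate a fixed fraction of the energy.

```python
import numpy as np

# Toy injection/dissipation balance: a particle's kinetic energy receives
# random kicks (injection, heat-bath analog) and loses a fixed fraction
# at each inelastic "collision" (dissipation). Parameters are illustrative.
rng = np.random.default_rng(7)
r2 = 0.8    # fraction of energy retained per collision
kick = 1.0  # mean injected energy per step
E, energies = 0.0, []
for step in range(100_000):
    E = r2 * (E + kick * rng.exponential())  # inject, then dissipate
    energies.append(E)

energies = np.array(energies[1000:])  # discard the transient
print(energies.mean())  # near the balance value r2*kick/(1 - r2) = 4.0
```

A nonequilibrium steady state emerges where mean injection balances mean dissipation; the stationary mean follows from m = r2*(m + kick).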

  5. A probabilistic mechanical model for prediction of aggregates’ size distribution effect on concrete compressive strength

    NASA Astrophysics Data System (ADS)

    Miled, Karim; Limam, Oualid; Sab, Karam

    2012-06-01

    To predict the effect of aggregates' size distribution on concrete compressive strength, a probabilistic mechanical model is proposed. Within this model, a Voronoi tessellation of a set of non-overlapping, rigid spherical aggregates is used to describe the concrete microstructure. Moreover, aggregates' diameters are treated as statistical variables, and their size distribution function is identified with the experimental sieve curve. An inter-aggregate failure criterion is then proposed to describe the compressive-shear crushing of the hardened cement paste when concrete is subjected to uniaxial compression. Using an approach based on statistical homogenization and geometrical simplifications, an analytical formula predicting the concrete compressive strength is obtained. This formula highlights the effects of cement paste strength and of aggregates' size distribution and volume fraction on the concrete compressive strength. According to the proposed model, the concrete strength for a given cement paste and aggregates' volume fraction is increased by decreasing both the aggregates' maximum size and the percentage of coarse aggregates. Finally, the validity of the model is discussed through a comparison with experimental results (15 concrete compressive strengths ranging between 46 and 106 MPa) taken from the literature, which show good agreement with the model predictions.

  6. A Modified Mechanical Threshold Stress Constitutive Model for Austenitic Stainless Steels

    NASA Astrophysics Data System (ADS)

    Prasad, K. Sajun; Gupta, Amit Kumar; Singh, Yashjeet; Singh, Swadesh Kumar

    2016-12-01

    This paper presents a modified mechanical threshold stress (m-MTS) constitutive model. The m-MTS model incorporates variable athermal and dynamic strain aging (DSA) components to accurately predict the flow stress behavior of austenitic stainless steels (ASS) 316 and 304. Uniaxial tensile tests were conducted at strain rates between 0.0001 and 0.01 s-1 and temperatures ranging from 50 to 650 °C to evaluate the material constants of the constitutive models. The test results revealed the strong dependence of flow stress on strain, strain rate, and temperature. In addition, it was observed that DSA occurred at elevated temperatures and very low strain rates, causing an increase in flow stress. While the original MTS model is capable of predicting the flow stress behavior of ASS, statistical parameters point out the inefficiency of the model when compared with other models such as the Johnson-Cook model, the modified Zerilli-Armstrong (m-ZA) model, and modified Arrhenius-type equations (m-Arr). Therefore, in order to accurately model both the DSA and non-DSA regimes, the original MTS model was modified by incorporating variable athermal and DSA components. The suitability of the m-MTS model was assessed by comparing the statistical parameters. The m-MTS model was highly accurate in the DSA regime compared with the existing models, whereas models such as m-ZA and m-Arr showed better results in the non-DSA regime.

  7. Theory of Alike Selectivity in Biological Channels

    NASA Technical Reports Server (NTRS)

    Luchinsky, Dmitry G.; Gibby, Will A. T.; Kaufman, Igor Kh.; Eisenberg, Robert S.; McClintock, Peter V. E.

    2016-01-01

    We introduce a statistical mechanical model of the selectivity filter that accounts for the interaction between ions within the channel, and derive the Eisenman equation for filter selectivity directly from the condition of barrier-less conduction.

  8. Effect of heating rate and kinetic model selection on activation energy of nonisothermal crystallization of amorphous felodipine.

    PubMed

    Chattoraj, Sayantan; Bhugra, Chandan; Li, Zheng Jane; Sun, Changquan Calvin

    2014-12-01

    The nonisothermal crystallization kinetics of amorphous materials is routinely analyzed by statistically fitting crystallization data to kinetic models. In this work, we systematically evaluate how model-dependent crystallization kinetics is affected by variations in the heating rate and by the selection of the kinetic model, two key factors that can lead to significant differences in the crystallization activation energy (Ea) of an amorphous material. Using amorphous felodipine, we show that Ea decreases with increasing heating rate, irrespective of the kinetic model evaluated in this work. The model that best describes the crystallization phenomenon cannot be identified readily through the statistical fitting approach, because several kinetic models yield comparable R^2. Here, we propose an alternate paired model-fitting model-free (PMFMF) approach for identifying the most suitable kinetic model, in which Ea obtained from model-dependent kinetics is compared with that obtained from model-free kinetics. The most suitable kinetic model is identified as the one that yields Ea values comparable with the model-free kinetics. Through this PMFMF approach, nucleation and growth is identified as the main mechanism controlling the crystallization kinetics of felodipine. Using this PMFMF approach, we further demonstrate that the crystallization mechanism from the amorphous phase varies with heating rate.
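    One standard model-free estimate that could serve as the model-free side of such a comparison is the Kissinger method, in which ln(beta/Tp^2) is linear in 1/Tp with slope -Ea/R. The sketch below uses synthetic peak temperatures generated from an assumed activation energy, so the fit should recover it; it is not the felodipine data or the authors' chosen model-free method.

```python
import numpy as np

# Kissinger model-free activation energy from peak temperatures Tp
# observed at several heating rates beta: ln(beta/Tp^2) = C - Ea/(R*Tp).
R = 8.314        # J/(mol K)
Ea_true = 100e3  # J/mol, assumed for the synthetic data
C = 25.0         # intercept, arbitrary

Tp = np.array([390.0, 400.0, 410.0, 420.0])    # peak temperatures, K
beta = Tp**2 * np.exp(C - Ea_true / (R * Tp))  # implied heating rates

slope, intercept = np.polyfit(1.0 / Tp, np.log(beta / Tp**2), 1)
Ea_est = -slope * R
print(Ea_est / 1e3)  # ~100 kJ/mol, matching the assumed value
```

In the PMFMF spirit, an Ea obtained this way would be compared against the Ea values returned by fitting candidate kinetic models to the same data.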

  9. Proceedings: Conference on Computers in Chemical Education and Research, Dekalb, Illinois, 19-23 July 1971.

    ERIC Educational Resources Information Center

    1971

    Computers have effected a comprehensive transformation of chemistry. Computers have greatly enhanced the chemist's ability to do model building, simulations, data refinement and reduction, analysis of data in terms of models, on-line data logging, automated control of experiments, quantum chemistry and statistical and mechanical calculations, and…

  10. Statistical mechanics of binary mixture adsorption in metal-organic frameworks in the osmotic ensemble.

    PubMed

    Dunne, Lawrence J; Manos, George

    2018-03-13

    Although crucial for designing separation processes, little is known experimentally about multi-component adsorption isotherms in comparison with those of pure single components. Very few binary mixture adsorption isotherms are to be found in the literature, and information about isotherms over wide ranges of gas-phase composition, mechanical pressure, and temperature is lacking. Here, we present a quasi-one-dimensional statistical mechanical model of binary mixture adsorption in metal-organic frameworks (MOFs), treated exactly by a transfer matrix method in the osmotic ensemble. The experimental parameter space may be very complex, and investigations into multi-component mixture adsorption may be guided by theoretical insights. The approach successfully models breathing structural transitions induced by adsorption, giving a good account of the shape of the adsorption isotherms of CO2 and CH4 in MIL-53(Al). Binary mixture isotherms and co-adsorption phase diagrams are also calculated and found to give a good description of the experimental trends in these properties; the wide range of model parameters that reproduces this behaviour suggests that it is generic to MOFs. Finally, a study is made of the influence of mechanical pressure on the shape of CO2 and CH4 adsorption isotherms in MIL-53(Al). Quite modest mechanical pressures can induce significant changes in isotherm shapes in MOFs, with implications for binary mixture separation processes. This article is part of the theme issue 'Modern theoretical chemistry'.
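    The transfer-matrix machinery itself can be illustrated on a generic one-dimensional lattice-gas adsorption model (a sketch with assumed energies, not the authors' MOF Hamiltonian or their osmotic-ensemble treatment): each cell is empty or occupied, and the largest eigenvalue of a 2x2 transfer matrix gives the free energy, from which an isotherm follows.

```python
import numpy as np

# Generic 1-D transfer-matrix sketch: a periodic chain of N cells, each
# empty (0) or occupied (1), with adsorption energy eps, nearest-neighbour
# coupling J, and chemical potential mu (all in units of kT; illustrative).
def mean_occupancy(mu, eps=-1.0, J=0.5, T=1.0, N=200):
    b = 1.0 / T

    def transfer(mu):
        # Symmetric bond weight: site energies are split over the two
        # bonds each site belongs to.
        w = lambda s, sp: np.exp(-b * (eps - mu) * (s + sp) / 2 - b * J * s * sp)
        return np.array([[w(0, 0), w(0, 1)], [w(1, 0), w(1, 1)]])

    def lnZ(mu):
        lam = np.linalg.eigvalsh(transfer(mu)).max()  # matrix is symmetric
        return N * np.log(lam)  # largest eigenvalue dominates for large N

    # Per-site occupancy from d(lnZ)/d(b*mu)/N via a small finite difference.
    h = 1e-5
    return (lnZ(mu + h) - lnZ(mu - h)) / (2 * b * h * N)

# Occupancy rises from ~0 to ~1 as mu sweeps through the adsorption step.
print(mean_occupancy(-5.0), mean_occupancy(5.0))
```

The published model enlarges the cell state space (two guest species, narrow/large pore forms) and works in the osmotic ensemble, but the eigenvalue-based solution strategy is the same.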

  11. Statistical mechanics of binary mixture adsorption in metal-organic frameworks in the osmotic ensemble

    NASA Astrophysics Data System (ADS)

    Dunne, Lawrence J.; Manos, George

    2018-03-01

    Although crucial for designing separation processes, little is known experimentally about multi-component adsorption isotherms in comparison with those of pure single components. Very few binary mixture adsorption isotherms are to be found in the literature, and information about isotherms over wide ranges of gas-phase composition, mechanical pressure, and temperature is lacking. Here, we present a quasi-one-dimensional statistical mechanical model of binary mixture adsorption in metal-organic frameworks (MOFs), treated exactly by a transfer matrix method in the osmotic ensemble. The experimental parameter space may be very complex, and investigations into multi-component mixture adsorption may be guided by theoretical insights. The approach successfully models breathing structural transitions induced by adsorption, giving a good account of the shape of the adsorption isotherms of CO2 and CH4 in MIL-53(Al). Binary mixture isotherms and co-adsorption phase diagrams are also calculated and found to give a good description of the experimental trends in these properties; the wide range of model parameters that reproduces this behaviour suggests that it is generic to MOFs. Finally, a study is made of the influence of mechanical pressure on the shape of CO2 and CH4 adsorption isotherms in MIL-53(Al). Quite modest mechanical pressures can induce significant changes in isotherm shapes in MOFs, with implications for binary mixture separation processes. This article is part of the theme issue `Modern theoretical chemistry'.

  12. Quarks, Symmetries and Strings - a Symposium in Honor of Bunji Sakita's 60th Birthday

    NASA Astrophysics Data System (ADS)

    Kaku, M.; Jevicki, A.; Kikkawa, K.

    1991-04-01

    The Table of Contents for the full book PDF is as follows: * Preface * Evening Banquet Speech * I. Quarks and Phenomenology * From the SU(6) Model to Uniqueness in the Standard Model * A Model for Higgs Mechanism in the Standard Model * Quark Mass Generation in QCD * Neutrino Masses in the Standard Model * Solar Neutrino Puzzle, Horizontal Symmetry of Electroweak Interactions and Fermion Mass Hierarchies * State of Chiral Symmetry Breaking at High Temperatures * Approximate |ΔI| = 1/2 Rule from a Perspective of Light-Cone Frame Physics * Positronium (and Some Other Systems) in a Strong Magnetic Field * Bosonic Technicolor and the Flavor Problem * II. Strings * Supersymmetry in String Theory * Collective Field Theory and Schwinger-Dyson Equations in Matrix Models * Non-Perturbative String Theory * The Structure of Non-Perturbative Quantum Gravity in One and Two Dimensions * Noncritical Virasoro Algebra of d < 1 Matrix Model and Quantized String Field * Chaos in Matrix Models? * On the Non-Commutative Symmetry of Quantum Gravity in Two Dimensions * Matrix Model Formulation of String Field Theory in One Dimension * Geometry of the N = 2 String Theory * Modular Invariance from Gauge Invariance in the Non-Polynomial String Field Theory * Stringy Symmetry and Off-Shell Ward Identities * q-Virasoro Algebra and q-Strings * Self-Tuning Fields and Resonant Correlations in 2d-Gravity * III. Field Theory Methods * Linear Momentum and Angular Momentum in Quaternionic Quantum Mechanics * Some Comments on Real Clifford Algebras * On the Quantum Group p-adics Connection * Gravitational Instantons Revisited * A Generalized BBGKY Hierarchy from the Classical Path-Integral * A Quantum Generated Symmetry: Group-Level Duality in Conformal and Topological Field Theory * Gauge Symmetries in Extended Objects * Hidden BRST Symmetry and Collective Coordinates * Towards Stochastically Quantizing Topological Actions * IV. 
Statistical Methods * A Brief Summary of the s-Channel Theory of Superconductivity * Neural Networks and Models for the Brain * Relativistic One-Body Equations for Planar Particles with Arbitrary Spin * Chiral Property of Quarks and Hadron Spectrum in Lattice QCD * Scalar Lattice QCD * Semi-Superconductivity of a Charged Anyon Gas * Two-Fermion Theory of Strongly Correlated Electrons and Charge-Spin Separation * Statistical Mechanics and Error-Correcting Codes * Quantum Statistics

  13. A statistical model of brittle fracture by transgranular cleavage

    NASA Astrophysics Data System (ADS)

    Lin, Tsann; Evans, A. G.; Ritchie, R. O.

    A model for brittle fracture by transgranular cleavage cracking is presented, based on the application of weakest-link statistics to the critical microstructural fracture mechanisms. The model permits prediction of the macroscopic fracture toughness, KIc, in single-phase microstructures containing a known distribution of particles, and defines the critical distance from the crack tip at which the initial cracking event is most probable. The model is developed for unstable fracture ahead of a sharp crack, considering both linear elastic and nonlinear elastic ("elastic/plastic") crack tip stress fields. Predictions are evaluated by comparison with experimental results on the low-temperature flow and fracture behavior of a low-carbon mild steel with a simple ferrite/grain-boundary-carbide microstructure.
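    The weakest-link idea at the core of such models can be sketched directly: survival probabilities of independent microstructural elements multiply, so failure probability grows with the sampled volume. The Weibull element-strength law and all parameters below are illustrative, and the stress is taken uniform for simplicity (the paper works with the crack-tip stress field instead).

```python
import numpy as np

# Weakest-link sketch: a sampled volume fails when its weakest element
# (e.g. a cracked carbide particle) fails, so survival probabilities
# multiply. Element strengths follow a Weibull law with illustrative
# modulus m and scale s0.
def failure_probability(stress, n_elements, m=4.0, s0=100.0):
    p_surv_element = np.exp(-(stress / s0) ** m)  # one element survives
    return 1.0 - p_surv_element ** n_elements     # any element fails

# The size effect: larger volumes (more particles) fail at lower stress.
print(failure_probability(50.0, 10))    # small volume
print(failure_probability(50.0, 1000))  # large volume, much higher risk
```

Coupling such element statistics to the near-tip stress distribution is what lets the full model predict KIc and the most probable initiation distance.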

  14. Improved Rubin-Bodner Model for the Prediction of Soft Tissue Deformations

    PubMed Central

    Zhang, Guangming; Xia, James J.; Liebschner, Michael; Zhang, Xiaoyan; Kim, Daeseung; Zhou, Xiaobo

    2016-01-01

    In craniomaxillofacial (CMF) surgery, a reliable way of simulating the soft tissue deformation resulting from skeletal reconstruction is vitally important for preventing the risk of postoperative facial distortion. However, it is difficult to simulate the soft tissue behaviors that result from different types of CMF surgery. This study presents an integrated biomechanical and statistical learning model to improve the accuracy and reliability of predictions of soft facial tissue behavior. The Rubin-Bodner (RB) model is initially used to describe the biomechanical behavior of the soft facial tissue. Subsequently, a finite element model (FEM) computes the stress at each node of the soft facial tissue mesh resulting from bone displacement. Next, the generalized regression neural network (GRNN) method is implemented to obtain the relationship between the facial soft tissue deformation and the stress distribution corresponding to different CMF surgical types, and to improve the evaluation of the elastic parameters included in the RB model. The soft facial tissue deformation can therefore be predicted from biomechanical properties and the statistical model. Leave-one-out cross-validation was used on eleven patients. As a result, the average prediction error of our model (0.7035 mm) is lower than those resulting from other approaches. This also demonstrates that the more accurate the biomechanical information available to the model, the better the prediction performance it can achieve. PMID:27717593

  15. Bayesian comparison of conceptual models of abrupt climate changes during the last glacial period

    NASA Astrophysics Data System (ADS)

    Boers, Niklas; Ghil, Michael; Rousseau, Denis-Didier

    2017-04-01

    Records of oxygen isotope ratios and dust concentrations from the North Greenland Ice Core Project (NGRIP) provide accurate proxies for the evolution of Arctic temperature and atmospheric circulation during the last glacial period (12 ka to 100 ka b2k) [1]. The most distinctive features of these records are sudden transitions, called Dansgaard-Oeschger (DO) events, during which Arctic temperatures increased by up to 10 K within a few decades. These warming events are consistently followed by more gradual cooling in Antarctica [2]. The physical mechanisms responsible for these transitions and their out-of-phase relationship between the northern and southern hemispheres remain unclear. Substantial evidence hints at variations of the Atlantic Meridional Overturning Circulation as a key mechanism [2,3], but other mechanisms, such as variations of sea ice extent [4] or ice shelf coverage [5], may also play an important role. Here, we intend to shed more light on the relevance of the different mechanisms suggested to explain the abrupt climate changes and their inter-hemispheric coupling. For this purpose, several conceptual differential equation models are developed that represent the suggested physical mechanisms. Optimal parameters for each model candidate are then determined via maximum likelihood estimation with respect to the observed paleoclimatic data. Our approach is thus semi-empirical: while a model's general form is deduced from physical arguments about relevant climatic mechanisms, oceanic and atmospheric, its specific parameters are obtained by training the model on observed data. The distinct model candidates are evaluated by comparing statistical properties of time series simulated with these models to the observed statistics. In particular, Bayesian model selection criteria and maximum likelihood ratio tests are used to obtain a hierarchy of the different candidates in terms of their likelihood, given the observed oxygen isotope and dust time series.
[1] Kindler et al., Clim. Past (2014) [2] WAIS, Nature (2015) [3] Henry et al., Science (2016) [4] Gildor and Tziperman, Phil. Trans. R. Soc. (2003) [5] Petersen et al., Paleoceanography (2013)
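    The fit-and-compare workflow described above can be sketched generically: fit two candidate models by maximum likelihood and rank them with an information criterion (BIC here, as one common choice). The data below are synthetic two-state jumps plus noise, a stand-in for a proxy record, not NGRIP data, and the two candidate models are illustrative.

```python
import numpy as np

# Generic model-comparison sketch on a synthetic abrupt-transition series.
rng = np.random.default_rng(3)
state = np.repeat([0.0, 3.0, 0.0, 3.0], 250)  # abrupt two-state signal
data = state + rng.standard_normal(state.size)
n = data.size

def gauss_loglik(x, mu, sigma):
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2))

# Model A: one Gaussian state (2 params: mean, sigma).
llA = gauss_loglik(data, data.mean(), data.std())

# Model B: two states assigned by thresholding at the overall mean
# (4 params: two state means, sigma, threshold).
thr = data.mean()
lo, hi = data[data < thr], data[data >= thr]
resid = np.concatenate([lo - lo.mean(), hi - hi.mean()])
llB = gauss_loglik(data, np.where(data < thr, lo.mean(), hi.mean()), resid.std())

bicA = 2 * np.log(n) - 2 * llA
bicB = 4 * np.log(n) - 2 * llB
print(bicA > bicB)  # the two-state model wins despite its extra parameters
```

The published study applies the same logic with physically motivated differential equation models in place of these toy Gaussians, so the ranking carries mechanistic meaning.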

  16. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum-mechanics-based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite-valued ones, to express probability distributions and statistical correlations, a powerful feature for capturing relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM, with classical probability measures defined on the Herbrand base, and extending it to the quantum context. In the classical case, H-interpretations form the sample space, and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation of their probabilities. We discuss several examples that combine statistical ensembles and predicates of first-order logic to reason about situations involving uncertainty.

  17. [Micro-simulation of firms' heterogeneity on pollution intensity and regional characteristics].

    PubMed

    Zhao, Nan; Liu, Yi; Chen, Ji-Ning

    2009-11-01

    Within the same industrial sector, firms are heterogeneous in pollution intensity. Errors therefore arise if the sector's average pollution intensity, calculated from the limited number of firms in the environmental statistics database, is used to represent the sector's regional economic-environmental status. Based on a production function that includes environmental depletion as an input, a micro-simulation model of firms' operational decision making is proposed, with which the heterogeneity of firms' pollution intensity can be described mechanistically. Taking the mechanical manufacturing sector in Deyang city in 2005 as the case, the model's parameters were estimated, and the actual COD emission intensities of the firms in the environmental statistics database were properly matched by the simulation. The model's results also show that the regional average COD emission intensity calculated from the environmental statistics firms (0.0026 t per 10,000 yuan of fixed assets, 0.0015 t per 10,000 yuan of production value) is lower than the regional average intensity calculated from all firms in the region (0.0030 t per 10,000 yuan of fixed assets, 0.0023 t per 10,000 yuan of production value). The differences among the average intensities of the six counties are significant as well. These regional characteristics of pollution intensity are attributable to the sector's inner structure (the distributions of firm scale and technology) and its spatial variation.

  18. Shear band formation in plastic bonded explosive (PBX)

    NASA Astrophysics Data System (ADS)

    Dey, T. N.; Johnson, J. N.

    1998-07-01

    Adiabatic shear bands can be a source of ignition and lead to detonation. At low to moderate deformation rates, 10-1000 s^-1, two other mechanisms can also give rise to shear bands: 1) softening caused by micro-cracking, and 2) a constitutive response with a non-associated flow rule, as is observed in granular materials such as soil. Brittle behavior at small strains and the granular nature of HMX suggest that the constitutive behavior of PBX-9501 may be similar to that of sand. A constitutive model for the first of these mechanisms is studied in a series of calculations; this viscoelastic constitutive model for PBX-9501 softens via a statistical crack model. A sand model is used to provide a non-associated flow rule; detailed results will be reported elsewhere. Both models generate shear band formation at 1-2% strain at nominal strain rates at and below 1000 s^-1. Shear band formation is suppressed at higher strain rates. Both mechanisms may accelerate the formation of adiabatic shear bands.

  19. Is the hypothesis about a low entropy initial state of the Universe necessary for explaining the arrow of time?

    NASA Astrophysics Data System (ADS)

    Goldstein, Sheldon; Tumulka, Roderich; Zanghì, Nino

    2016-07-01

    According to statistical mechanics, microstates of an isolated physical system (say, a gas in a box) at time t0 in a given macrostate of less-than-maximal entropy typically evolve in such a way that the entropy at time t increases with |t - t0| in both time directions. In order to account for the observed entropy increase in only one time direction, the thermodynamic arrow of time, one usually appeals to the hypothesis that the initial state of the Universe was one of very low entropy. In certain recent models of cosmology, however, no hypothesis about the initial state of the Universe is invoked. We discuss how the emergence of a thermodynamic arrow of time in such models can nevertheless be compatible with the above-mentioned consequence of statistical mechanics, appearances to the contrary notwithstanding.

  20. Multilayer Statistical Intrusion Detection in Wireless Networks

    NASA Astrophysics Data System (ADS)

    Hamdi, Mohamed; Meddeb-Makhlouf, Amel; Boudriga, Noureddine

    2008-12-01

    The rapid proliferation of mobile applications and services has introduced new vulnerabilities that do not exist in fixed wired networks. Traditional security mechanisms, such as access control and encryption, turn out to be inefficient in modern wireless networks. Given the shortcomings of these protection mechanisms, an important line of research focuses on intrusion detection systems (IDSs). This paper proposes a multilayer statistical intrusion detection framework for wireless networks. The architecture is well suited to wireless networks because the underlying detection models rely on radio parameters and traffic models. Accurate correlation between radio and traffic anomalies enhances the efficiency of the IDS. A radio signal fingerprinting technique based on the maximal overlap discrete wavelet transform (MODWT) is developed. Moreover, a geometric clustering algorithm is presented. Depending on the characteristics of the fingerprinting technique, the clustering algorithm makes it possible to control the false positive and false negative rates. Finally, simulation experiments have been carried out to validate the proposed IDS.

  1. Multi-fidelity machine learning models for accurate bandgap predictions of solids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilania, Ghanshyam; Gubernatis, James E.; Lookman, Turab

    Here, we present a multi-fidelity co-kriging statistical learning framework that combines variable-fidelity quantum mechanical calculations of bandgaps to generate a machine-learned model that enables low-cost accurate predictions of the bandgaps at the highest fidelity level. Additionally, the adopted Gaussian process regression formulation allows us to predict the underlying uncertainties as a measure of our confidence in the predictions. Using a set of 600 elpasolite compounds as an example dataset and using semi-local and hybrid exchange-correlation functionals within density functional theory as two levels of fidelity, we demonstrate the excellent learning performance of the method against actual high-fidelity quantum mechanical calculations of the bandgaps. The presented statistical learning method is not restricted to bandgaps or electronic structure methods and extends the utility of high-throughput property predictions in a significant way.
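
    The autoregressive relation at the heart of two-level co-kriging can be sketched in scalar form: high-fidelity values are modeled as a scaling of the low-fidelity ones plus a learned correction. The bandgap numbers below are invented for illustration; the actual framework learns this relation with Gaussian processes over composition space, which also yields the predictive uncertainties.

```python
# Toy two-fidelity data: cheap estimates (e.g. semi-local DFT gaps) and a
# few expensive values (e.g. hybrid-functional gaps) for the same compounds.
# All numbers are invented for illustration.
low_gaps  = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5]    # low-fidelity bandgaps (eV)
high_gaps = [1.6, 2.25, 2.9, 3.55, 4.2, 4.85] # high-fidelity bandgaps (eV)

def fit_linear(xs, ys):
    """Least-squares fit ys ~ rho*xs + delta: the scalar core of the
    autoregressive (Kennedy-O'Hagan-style) co-kriging relation, where the
    high-fidelity response is a scaled low-fidelity response plus a
    discrepancy term."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    rho = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    delta = (sy - rho * sx) / n
    return rho, delta

rho, delta = fit_linear(low_gaps, high_gaps)

def predict_high(gap_low):
    """Predict a high-fidelity gap from a new low-fidelity calculation."""
    return rho * gap_low + delta
```

    In the full framework the discrepancy is itself a Gaussian process of the input features, so predictions interpolate the few expensive calculations while following the trend of the many cheap ones.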

  2. Statistical mechanics of an ideal active fluid confined in a channel

    NASA Astrophysics Data System (ADS)

    Wagner, Caleb; Baskaran, Aparna; Hagan, Michael

    The statistical mechanics of ideal active Brownian particles (ABPs) confined in a channel is studied by obtaining the exact solution of the steady-state Smoluchowski equation for the 1-particle distribution function. The solution is derived using results from the theory of two-way diffusion equations, combined with an iterative procedure that is justified by numerical results. Using this solution, we quantify the effects of confinement on the spatial and orientational order of the ensemble. Moreover, we rigorously show that both the bulk density and the fraction of particles on the channel walls obey simple scaling relations as a function of channel width. By considering a constant-flux steady state, an effective diffusivity for ABPs is derived which shows signatures of the persistent motion that characterizes ABP trajectories. Finally, we discuss how our techniques generalize to other active models, including systems whose activity is modeled in terms of an Ornstein-Uhlenbeck process.
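
    A minimal numerical counterpart to the analytical treatment, with illustrative parameters: an ideal active Brownian particle in a 2D channel with blocking walls, integrated by Euler-Maruyama. It exhibits the wall accumulation whose dependence on channel width the paper quantifies.

```python
import math
import random

def abp_wall_fraction(width, v0=1.0, Dr=1.0, dt=1e-3,
                      n_steps=4000, n_particles=50, seed=0):
    """Ideal 2D active Brownian particle in a channel y in [0, width]:
    self-propulsion at speed v0 along an orientation phi that diffuses
    rotationally with coefficient Dr; walls simply block motion (no
    aligning torque). Returns the fraction of time spent touching a wall.
    All parameter values are illustrative."""
    random.seed(seed)
    at_wall = total = 0
    for _ in range(n_particles):
        y = random.uniform(0.0, width)
        phi = random.uniform(0.0, 2.0 * math.pi)
        for _ in range(n_steps):
            phi += math.sqrt(2.0 * Dr * dt) * random.gauss(0.0, 1.0)
            y = min(max(y + v0 * math.sin(phi) * dt, 0.0), width)
            total += 1
            if y == 0.0 or y == width:
                at_wall += 1
    return at_wall / total

# Narrower channels (relative to the persistence length v0/Dr) trap a
# larger fraction of particles on the walls.
f_narrow = abp_wall_fraction(width=0.5)
f_wide = abp_wall_fraction(width=5.0)
```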

  3. Thermodynamics of Biological Processes

    PubMed Central

    Garcia, Hernan G.; Kondev, Jane; Orme, Nigel; Theriot, Julie A.; Phillips, Rob

    2012-01-01

    There is a long and rich tradition of using ideas from both equilibrium thermodynamics and its microscopic partner theory, equilibrium statistical mechanics, to understand biological systems. In this chapter, we provide some background on the origins of the seemingly unreasonable effectiveness of ideas from both thermodynamics and statistical mechanics in biology. After describing these foundational issues, we turn to a series of case studies, primarily focused on binding, that are intended to illustrate the broad biological reach of equilibrium thinking. These case studies include ligand-gated ion channels, thermodynamic models of transcription, and recent applications to the problem of bacterial chemotaxis. As part of the description of these case studies, we explore a number of different uses of the famed Monod–Wyman–Changeux (MWC) model as a generic tool for providing a mathematical characterization of two-state systems. These case studies should provide a template for tailoring equilibrium ideas to other problems of biological interest. PMID:21333788
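
    The MWC model mentioned above has a compact closed form. A sketch with illustrative parameter values: a molecule with n identical binding sites equilibrates between an active and an inactive conformation, and the probability of the active state follows from the two conformations' partition functions.

```python
def mwc_p_active(c, n=2, L=100.0, Kd_active=1.0, Kd_inactive=10.0):
    """Monod-Wyman-Changeux two-state model. L = [inactive]/[active] at
    zero ligand; ligand at concentration c binds the active conformation
    more tightly (Kd_active < Kd_inactive), so ligand shifts the
    conformational balance toward the active state. Parameter values here
    are illustrative, not taken from any specific system."""
    act = (1.0 + c / Kd_active) ** n
    inact = L * (1.0 + c / Kd_inactive) ** n
    return act / (act + inact)

p0 = mwc_p_active(0.0)     # 1/(1+L): mostly inactive without ligand
p_sat = mwc_p_active(1e6)  # saturating ligand: activity near its maximum
```

    At saturation the activity plateaus at 1/(1 + L*(Kd_active/Kd_inactive)**n), which is why changing either L or the Kd ratio tunes both the leakiness and the dynamic range of a two-state system.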

  4. Multi-fidelity machine learning models for accurate bandgap predictions of solids

    DOE PAGES

    Pilania, Ghanshyam; Gubernatis, James E.; Lookman, Turab

    2016-12-28

    Here, we present a multi-fidelity co-kriging statistical learning framework that combines variable-fidelity quantum mechanical calculations of bandgaps to generate a machine-learned model that enables low-cost accurate predictions of the bandgaps at the highest fidelity level. Additionally, the adopted Gaussian process regression formulation allows us to predict the underlying uncertainties as a measure of our confidence in the predictions. Using a set of 600 elpasolite compounds as an example dataset and using semi-local and hybrid exchange-correlation functionals within density functional theory as two levels of fidelity, we demonstrate the excellent learning performance of the method against actual high-fidelity quantum mechanical calculations of the bandgaps. The presented statistical learning method is not restricted to bandgaps or electronic structure methods and extends the utility of high-throughput property predictions in a significant way.

  5. Order statistics inference for describing topological coupling and mechanical symmetry breaking in multidomain proteins

    NASA Astrophysics Data System (ADS)

    Kononova, Olga; Jones, Lee; Barsegov, V.

    2013-09-01

    Cooperativity is a hallmark of proteins, many of which show a modular architecture comprising discrete structural domains. Detecting and describing dynamic couplings between structural regions is difficult in view of the many-body nature of protein-protein interactions. By utilizing GPU-based computational acceleration, we carried out simulations of forced protein unfolding for the WW-WW dimer of all-β-sheet WW domains, used as a model multidomain protein. We found that while the physically non-interacting identical protein domains (WW) show nearly symmetric mechanical properties at low tension, reflected, e.g., in the similarity of their distributions of unfolding times, these properties become distinctly different when tension is increased. Moreover, the uncorrelated unfolding transitions at a low pulling force become increasingly more correlated (dependent) at higher forces. Hence, the applied force not only breaks "the mechanical symmetry" but also couples the physically non-interacting protein domains forming a multi-domain protein. We call this effect "the topological coupling." We developed a new theory, inspired by order statistics, to characterize protein-protein interactions in multi-domain proteins. The method utilizes the squared-Gaussian model, but it can also be used in conjunction with other parametric models for the distribution of unfolding times. The formalism can be taken to the single-molecule experimental lab to probe mechanical cooperativity and domain communication in multi-domain proteins.
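
    The order-statistics viewpoint can be illustrated in the uncorrelated (low-force) limit, where domain unfolding times are independent: for exponential unfolding kinetics the mean time of the k-th unfolding event has a simple closed form. Parameters are illustrative, and the authors' squared-Gaussian parametric model is not reproduced here.

```python
import random

def empirical_kth_mean(n_domains, k, rate, n_trials=20000, seed=3):
    """Mean time of the k-th unfolding event among n_domains independent,
    exponentially distributed unfolding times (the uncorrelated limit)."""
    random.seed(seed)
    total = 0.0
    for _ in range(n_trials):
        times = sorted(random.expovariate(rate) for _ in range(n_domains))
        total += times[k - 1]
    return total / n_trials

def theoretical_kth_mean(n_domains, k, rate):
    """Closed form for iid exponentials:
    E[T_(k)] = (1/rate) * sum_{i=0}^{k-1} 1/(n_domains - i)."""
    return sum(1.0 / (rate * (n_domains - i)) for i in range(k))

# For a dimer (n=2), the first unfolding event happens at mean time
# 1/(2*rate); the second at 1/(2*rate) + 1/rate.
emp = empirical_kth_mean(n_domains=2, k=1, rate=1.0)
theo = theoretical_kth_mean(n_domains=2, k=1, rate=1.0)
```

    Deviations of measured order-statistic distributions from this independent-domain prediction are what signal inter-domain coupling.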

  6. The energetic cost of walking: a comparison of predictive methods.

    PubMed

    Kramer, Patricia Ann; Sylvester, Adam D

    2011-01-01

    The energy that animals devote to locomotion has been of intense interest to biologists for decades and two basic methodologies have emerged to predict locomotor energy expenditure: those based on metabolic and those based on mechanical energy. Metabolic energy approaches share the perspective that prediction of locomotor energy expenditure should be based on statistically significant proxies of metabolic function, while mechanical energy approaches, which derive from many different perspectives, focus on quantifying the energy of movement. Some controversy exists as to which mechanical perspective is "best", but from first principles all mechanical methods should be equivalent if the inputs to the simulation are of similar quality. Our goals in this paper are 1) to establish the degree to which the various methods of calculating mechanical energy are correlated, and 2) to investigate to what degree the prediction methods explain the variation in energy expenditure. We use modern humans as the model organism in this experiment because their data are readily attainable, but the methodology is appropriate for use in other species. Volumetric oxygen consumption and kinematic and kinetic data were collected on 8 adults while walking at their self-selected slow, normal and fast velocities. Using hierarchical statistical modeling via ordinary least squares and maximum likelihood techniques, the predictive ability of several metabolic and mechanical approaches were assessed. We found that all approaches are correlated and that the mechanical approaches explain similar amounts of the variation in metabolic energy expenditure. Most methods predict the variation within an individual well, but are poor at accounting for variation between individuals. Our results indicate that the choice of predictive method is dependent on the question(s) of interest and the data available for use as inputs. Although we used modern humans as our model organism, these results can be extended to other species.
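
    The core comparison, regressing measured metabolic cost on a mechanical-energy proxy and asking how much of the variation is explained, can be sketched with ordinary least squares. All data below are synthetic and invented for illustration.

```python
import random

def ols_r2(x, y):
    """Ordinary least squares fit y ~ a + b*x; returns (a, b, R^2),
    where R^2 is the fraction of variance in y explained by x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical walking-trial data: a mechanical power proxy (W/kg) versus
# measured metabolic power (W/kg); Gaussian noise stands in for
# between-subject scatter.
random.seed(7)
mech = [1.0 + 0.25 * i for i in range(24)]
metab = [3.0 + 1.2 * m + random.gauss(0.0, 0.3) for m in mech]
a, b, r2 = ols_r2(mech, metab)
```

    Comparing R^2 across different mechanical proxies, with and without subject-level terms, is the kind of hierarchical comparison the paper carries out with OLS and maximum likelihood.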

  7. A statistical mechanical theory for a two-dimensional model of water

    PubMed Central

    Urbic, Tomaz; Dill, Ken A.

    2010-01-01

    We develop a statistical mechanical model for the thermal and volumetric properties of waterlike fluids. Each water molecule is a two-dimensional disk with three hydrogen-bonding arms. Each water interacts with neighboring waters through a van der Waals interaction and an orientation-dependent hydrogen-bonding interaction. This model, which is largely analytical, is a variant of the Truskett and Dill (TD) treatment of the “Mercedes-Benz” (MB) model. The present model gives better predictions than TD for hydrogen-bond populations in liquid water by distinguishing strong cooperative hydrogen bonds from weaker ones. We explore properties versus temperature T and pressure p. We find that the volumetric and thermal properties follow the same trends with T as real water and are in good general agreement with Monte Carlo simulations of MB water, including the density anomaly, the minimum in the isothermal compressibility, and the decreased number of hydrogen bonds for increasing temperature. The model reproduces that pressure squeezes out water’s heat capacity and leads to a negative thermal expansion coefficient at low temperatures. In terms of water structuring, the variance in hydrogen-bonding angles increases with both T and p, while the variance in water density increases with T but decreases with p. Hydrogen bonding is an energy storage mechanism that leads to water’s large heat capacity (for its size) and to the fragility in its cagelike structures, which are easily melted by temperature and pressure to a more van der Waals-like liquid state. PMID:20550408
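
    A schematic version of the orientation-dependent hydrogen-bonding term in an MB-type model (the exact functional form and tolerance values used by the authors are not reproduced here): each 2D disk carries three arms at 120-degree spacing, and a bond is rewarded by Gaussian factors when an arm of each partner points at the other at the ideal bond length.

```python
import math

def hb_energy(r, theta_i, theta_j, eps_hb=-1.0, r_hb=1.0, sigma=0.085):
    """Schematic MB-style hydrogen bond between two 2D waters a distance r
    apart along the +x axis: strongest when one of particle i's three arms
    (at angles theta_i + k*120 deg) points at j, one of j's arms points
    back, and r is near r_hb. Gaussian tolerance sigma (illustrative) in
    both distance and arm alignment."""
    def best_alignment(theta, target):
        # cosine between the best-aligned of the three arms and the target
        return max(math.cos(theta + k * 2.0 * math.pi / 3.0 - target)
                   for k in range(3))
    g = lambda u: math.exp(-u * u / (2.0 * sigma ** 2))
    ci = best_alignment(theta_i, 0.0)       # i's arm toward j
    cj = best_alignment(theta_j, math.pi)   # j's arm back toward i
    return eps_hb * g(r - r_hb) * g(ci - 1.0) * g(cj - 1.0)

e_aligned = hb_energy(1.0, 0.0, math.pi)  # ideal geometry: full strength
e_rotated = hb_energy(1.0, 0.5, math.pi)  # misaligned arm: weaker bond
```

    The sharp angular dependence is what lets such models distinguish strongly cooperative hydrogen bonds from weak, distorted ones.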

  8. A statistical mechanical theory for a two-dimensional model of water

    NASA Astrophysics Data System (ADS)

    Urbic, Tomaz; Dill, Ken A.

    2010-06-01

    We develop a statistical mechanical model for the thermal and volumetric properties of waterlike fluids. Each water molecule is a two-dimensional disk with three hydrogen-bonding arms. Each water interacts with neighboring waters through a van der Waals interaction and an orientation-dependent hydrogen-bonding interaction. This model, which is largely analytical, is a variant of the Truskett and Dill (TD) treatment of the "Mercedes-Benz" (MB) model. The present model gives better predictions than TD for hydrogen-bond populations in liquid water by distinguishing strong cooperative hydrogen bonds from weaker ones. We explore properties versus temperature T and pressure p. We find that the volumetric and thermal properties follow the same trends with T as real water and are in good general agreement with Monte Carlo simulations of MB water, including the density anomaly, the minimum in the isothermal compressibility, and the decreased number of hydrogen bonds for increasing temperature. The model reproduces that pressure squeezes out water's heat capacity and leads to a negative thermal expansion coefficient at low temperatures. In terms of water structuring, the variance in hydrogen-bonding angles increases with both T and p, while the variance in water density increases with T but decreases with p. Hydrogen bonding is an energy storage mechanism that leads to water's large heat capacity (for its size) and to the fragility in its cagelike structures, which are easily melted by temperature and pressure to a more van der Waals-like liquid state.

  9. A statistical mechanical theory for a two-dimensional model of water.

    PubMed

    Urbic, Tomaz; Dill, Ken A

    2010-06-14

    We develop a statistical mechanical model for the thermal and volumetric properties of waterlike fluids. Each water molecule is a two-dimensional disk with three hydrogen-bonding arms. Each water interacts with neighboring waters through a van der Waals interaction and an orientation-dependent hydrogen-bonding interaction. This model, which is largely analytical, is a variant of the Truskett and Dill (TD) treatment of the "Mercedes-Benz" (MB) model. The present model gives better predictions than TD for hydrogen-bond populations in liquid water by distinguishing strong cooperative hydrogen bonds from weaker ones. We explore properties versus temperature T and pressure p. We find that the volumetric and thermal properties follow the same trends with T as real water and are in good general agreement with Monte Carlo simulations of MB water, including the density anomaly, the minimum in the isothermal compressibility, and the decreased number of hydrogen bonds for increasing temperature. The model reproduces that pressure squeezes out water's heat capacity and leads to a negative thermal expansion coefficient at low temperatures. In terms of water structuring, the variance in hydrogen-bonding angles increases with both T and p, while the variance in water density increases with T but decreases with p. Hydrogen bonding is an energy storage mechanism that leads to water's large heat capacity (for its size) and to the fragility in its cagelike structures, which are easily melted by temperature and pressure to a more van der Waals-like liquid state.

  10. Sex differences in mechanical allodynia: how can it be preclinically quantified and analyzed?

    PubMed Central

    Nicotra, Lauren; Tuke, Jonathan; Grace, Peter M.; Rolan, Paul E.; Hutchinson, Mark R.

    2014-01-01

    Translating promising preclinical drug discoveries into successful clinical trials remains a significant hurdle in pain research. Although animal models have contributed significantly to the understanding of chronic pain pathophysiology, the majority of research has focused on male rodents, using testing procedures that produce sex-difference data that do not align well with comparable clinical experiences. Additionally, the use of animal pain models presents ongoing ethical challenges, demanding continuing refinement of preclinical methods. To this end, this study sought to test a quantitative allodynia assessment technique and associated statistical analysis in a modified graded nerve-injury pain model, with the aim of further examining sex differences in allodynia. Graded allodynia was established in male and female Sprague Dawley rats by altering the number of sutures placed around the sciatic nerve and was quantified by the von Frey test. Linear mixed-effects modeling regressed response on each fixed effect (sex, oestrus cycle, pain treatment). In comparison with other common von Frey assessment techniques, utilizing lower-threshold filaments than those ordinarily tested, at 1 s intervals, successfully captured female mechanical allodynia, revealing significant sex and oestrus-cycle differences across the graded allodynia that other common behavioral methods were unable to detect. Utilizing this different von Frey approach and the graded allodynia model, a single suture inflicting less allodynia was sufficient to demonstrate exaggerated female mechanical allodynia throughout the phases of dioestrus and pro-oestrus. Refining the von Frey testing method and statistical analysis technique, together with the use of a graded model of chronic pain, allowed examination of the influences on female mechanical nociception that other von Frey methods cannot provide. PMID:24592221

  11. Relating triggering processes in lab experiments with earthquakes.

    NASA Astrophysics Data System (ADS)

    Baro Urbea, J.; Davidsen, J.; Kwiatek, G.; Charalampidou, E. M.; Goebel, T.; Stanchits, S. A.; Vives, E.; Dresen, G.

    2016-12-01

    Statistical relations such as Gutenberg-Richter's, Omori-Utsu's and the productivity of aftershocks were first observed in seismology, but are also common to other physical phenomena exhibiting avalanche dynamics such as solar flares, rock fracture, structural phase transitions and even stock market transactions. All these examples exhibit spatio-temporal correlations that can be explained as triggering processes: instead of being activated as a response to external driving or fluctuations, some events are consequences of previous activity. Although different plausible explanations have been suggested in each system, the reason for the ubiquity of such statistical laws remains unknown. However, the case of rock fracture may exhibit a physical connection with seismology. It has been suggested that some features of seismology have a microscopic origin and are reproducible over a vast range of scales. This hypothesis has motivated mechanical experiments to generate artificial catalogues of earthquakes at a laboratory scale (so-called labquakes) under controlled conditions. Microscopic fractures in lab tests release elastic waves that are recorded as ultrasonic (kHz-MHz) acoustic emission (AE) events by means of piezoelectric transducers. Here, we analyse the statistics of labquakes recorded during the failure of small samples of natural rocks and artificial porous materials under different controlled compression regimes. Temporal and spatio-temporal correlations are identified in certain cases. Specifically, we distinguish between the background and triggered events, revealing some differences in the statistical properties. We fit the data to statistical models of seismicity. As a particular case, we explore the branching process approach simplified in the Epidemic Type Aftershock Sequence (ETAS) model. We evaluate the empirical spatio-temporal kernel of the model and investigate the physical origins of triggering. Our analysis of the focal mechanisms implies that the occurrence of the empirical laws extends well beyond purely frictional sliding events, in contrast to what is often assumed.
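
    The ETAS conditional intensity referred to above separates the background rate from triggered activity. A sketch with illustrative parameter values and an invented toy catalog:

```python
import math

def etas_intensity(t, catalog, mu=0.02, K=0.05, alpha=1.0,
                   c=0.01, p=1.1, m0=2.0):
    """ETAS conditional intensity at time t:
    lambda(t) = mu + sum over past events (t_i, m_i) of
                K * exp(alpha*(m_i - m0)) / (t - t_i + c)**p.
    mu is the background (Poisson) rate; each past event contributes an
    Omori-Utsu aftershock rate scaled by a productivity term that grows
    exponentially with magnitude. Parameter values are illustrative."""
    rate = mu
    for t_i, m_i in catalog:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
    return rate

catalog = [(0.0, 4.0), (1.0, 2.5), (5.0, 3.0)]  # (time, magnitude), toy data
lam_after = etas_intensity(5.001, catalog)      # just after the m=3.0 event
lam_late = etas_intensity(50.0, catalog)        # long after: near background
```

    Fitting (mu, K, alpha, c, p) by maximum likelihood and then asking which events are best explained by the background term versus the triggered sum is the standard way of declustering a catalog into background and aftershock populations.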

  12. Using a coupled hydro-mechanical fault model to better understand the risk of induced seismicity in deep geothermal projects

    NASA Astrophysics Data System (ADS)

    Abe, Steffen; Krieger, Lars; Deckert, Hagen

    2017-04-01

    The changes of fluid pressures related to the injection of fluids into the deep underground, for example during geothermal energy production, can potentially reactivate faults and thus cause induced seismic events. Therefore, an important aspect in the planning and operation of such projects, in particular in densely populated regions such as the Upper Rhine Graben in Germany, is the estimation and mitigation of the induced seismic risk. The occurrence of induced seismicity depends on a combination of hydraulic properties of the underground, mechanical and geometric parameters of the fault, and the fluid injection regime. In this study we are therefore employing a numerical model to investigate the impact of fluid pressure changes on the dynamics of the faults and the resulting seismicity. The approach combines a model of the fluid flow around a geothermal well based on a 3D finite difference discretisation of the Darcy-equation with a 2D block-slider model of a fault. The models are coupled so that the evolving pore pressure at the relevant locations of the hydraulic model is taken into account in the calculation of the stick-slip dynamics of the fault model. Our modelling approach uses two subsequent modelling steps. Initially, the fault model is run by applying a fixed deformation rate for a given duration and without the influence of the hydraulic model in order to generate the background event statistics. Initial tests have shown that the response of the fault to hydraulic loading depends on the timing of the fluid injection relative to the seismic cycle of the fault. Therefore, multiple snapshots of the fault's stress- and displacement state are generated from the fault model. In a second step, these snapshots are then used as initial conditions in a set of coupled hydro-mechanical model runs including the effects of the fluid injection. 
This set of models is then compared with the background event statistics to evaluate the change in the probability of seismic events. The event data such as location, magnitude, and source characteristics can be used as input for numerical wave propagation models. This allows the translation of seismic event statistics generated by the model into ground shaking probabilities.

  13. Flexible kinematic earthquake rupture inversion of tele-seismic waveforms: Application to the 2013 Balochistan, Pakistan earthquake

    NASA Astrophysics Data System (ADS)

    Shimizu, K.; Yagi, Y.; Okuwaki, R.; Kasahara, A.

    2017-12-01

    Kinematic earthquake rupture models are useful for deriving statistics and scaling properties of large and great earthquakes. However, the kinematic rupture models for the same earthquake often differ from one another. Such sensitivity of the modeling prevents us from understanding the statistics and scaling properties of earthquakes. Yagi and Fukahata (2011) introduce the uncertainty of the Green's function into tele-seismic waveform inversion and show that a stable spatiotemporal distribution of slip-rate can be obtained by using an empirical Bayesian scheme. One of the unsolved problems of the inversion arises from the modeling error originating from the uncertainty of the fault-model setting. Green's functions near the nodal plane of the focal mechanism are known to be sensitive to slight changes in the assumed fault geometry, and thus the spatiotemporal distribution of slip-rate can be distorted by the modeling error originating from the uncertainty of the fault model. We propose a new method that accounts for complexity in the fault geometry by additionally solving for the focal mechanism on each space knot. Since a solution of finite source inversion becomes unstable as the flexibility of the model increases, we estimate a stable spatiotemporal distribution of focal mechanisms in the framework of Yagi and Fukahata (2011). We applied the proposed method to 52 tele-seismic P-waveforms of the 2013 Balochistan, Pakistan earthquake. The inverted-potency distribution shows unilateral rupture propagation toward the southwest of the epicenter, and the spatial variation of the focal mechanisms shares the same pattern as the fault curvature along the tectonic fabric. On the other hand, the broad pattern of the rupture process, including the direction of rupture propagation, cannot be reproduced by an inversion analysis under the assumption that the faulting occurred on a single flat plane. These results show that the modeling error caused by simplifying the fault model is non-negligible in the tele-seismic waveform inversion of the 2013 Balochistan, Pakistan earthquake.

  14. Fatigue Life Prediction of Fiber-Reinforced Ceramic-Matrix Composites with Different Fiber Preforms at Room and Elevated Temperatures

    PubMed Central

    Li, Longbiao

    2016-01-01

    In this paper, the fatigue life of fiber-reinforced ceramic-matrix composites (CMCs) with different fiber preforms, i.e., unidirectional, cross-ply, 2D (two dimensional), 2.5D and 3D CMCs at room and elevated temperatures in air and oxidative environments, has been predicted using the micromechanics approach. An effective coefficient of the fiber volume fraction along the loading direction (ECFL) was introduced to describe the fiber architecture of preforms. The statistical matrix multicracking model and a fracture mechanics interface debonding criterion were used to determine the matrix crack spacing and interface debonded length. Under cyclic fatigue loading, the fiber broken fraction was determined by combining the interface wear model and fiber statistical failure model at room temperature, and the interface/fiber oxidation model, interface wear model and fiber statistical failure model at elevated temperatures, based on the assumptions that the fiber strength follows a two-parameter Weibull distribution and that the load carried by broken and intact fibers satisfies the Global Load Sharing (GLS) criterion. When the broken fiber fraction approaches the critical value, the composite fails by fatigue fracture. PMID:28773332
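
    A simplified sketch of the Weibull/GLS closure mentioned above (a Curtin-type form; broken fibers are assumed here to carry no load, and all numbers are illustrative, not the paper's models): given the applied stress per fiber volume fraction, solve self-consistently for the intact-fiber stress and the broken fiber fraction.

```python
import math

def broken_fraction(sigma_over_vf, sigma_c=2000.0, m=5.0, tol=1e-10):
    """Fixed-point solve of T * (1 - q(T)) = sigma/Vf for the intact-fiber
    stress T, with the Weibull/GLS closure
    q(T) = 1 - exp(-(T/sigma_c)**(m+1))
    (a simplified Curtin-type form; sigma_c is a characteristic fiber
    strength and m a Weibull modulus, both illustrative). Returns (T, q)."""
    q_of = lambda T: 1.0 - math.exp(-((T / sigma_c) ** (m + 1.0)))
    T = sigma_over_vf
    for _ in range(10000):
        T_new = sigma_over_vf / (1.0 - q_of(T))
        if abs(T_new - T) < tol:
            break
        T = T_new
    return T, q_of(T)

T_low, q_low = broken_fraction(500.0)     # light load: few broken fibers
T_high, q_high = broken_fraction(1200.0)  # heavier load: more fiber breaks
```

    Composite fatigue failure in such a scheme corresponds to q reaching a critical fraction; cycling enters through the degradation of sigma_c and the interface terms, which this sketch omits.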

  15. Territorial Developments Based on Graffiti: a Statistical Mechanics Approach

    DTIC Science & Technology

    2011-10-28

    defined on a lattice. We introduce a two-gang Hamiltonian model where agents have red or blue affiliation but are otherwise indistinguishable. In this...ramifications of our results. Keywords: Territorial Formation, Spin Systems, Phase Transitions 1. Introduction Lattice models have been extensively used in...inconsequential. In short, lattice models have proved extremely useful in the context of the physical, biological and even chemical sciences. In more

  16. Multiscale Modeling of Intergranular Fracture in Aluminum: Constitutive Relation For Interface Debonding

    NASA Technical Reports Server (NTRS)

    Yamakov, V.; Saether, E.; Glaessgen, E. H.

    2008-01-01

    Intergranular fracture is a dominant mode of failure in ultrafine grained materials. In the present study, the atomistic mechanisms of grain-boundary debonding during intergranular fracture in aluminum are modeled using a coupled molecular dynamics finite element simulation. Using a statistical mechanics approach, a cohesive-zone law in the form of a traction-displacement constitutive relationship, characterizing the load transfer across the plane of a growing edge crack, is extracted from atomistic simulations and then recast in a form suitable for inclusion within a continuum finite element model. The cohesive-zone law derived by the presented technique is free of finite size effects and is statistically representative for describing the interfacial debonding of a grain boundary (GB) interface examined at atomic length scales. By incorporating the cohesive-zone law in cohesive-zone finite elements, the debonding of a GB interface can be simulated in a coupled continuum-atomistic model, in which a crack starts in the continuum environment, smoothly penetrates the continuum-atomistic interface, and continues its propagation in the atomistic environment. This study is a step towards relating atomistically derived decohesion laws to macroscopic predictions of fracture and constructing multiscale models for nanocrystalline and ultrafine grained materials.
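
    For illustration, a generic bilinear traction-displacement law of the kind used in cohesive-zone finite elements; the atomistically derived law of the paper has a different, simulation-extracted shape, and all numbers here are arbitrary.

```python
def bilinear_traction(delta, delta_0=0.1, delta_f=1.0, t_max=2.0):
    """Generic bilinear cohesive-zone law: traction rises linearly to
    t_max at separation delta_0, then softens linearly to zero at delta_f,
    where the interface is fully debonded. Units are illustrative."""
    if delta <= 0.0 or delta >= delta_f:
        return 0.0
    if delta < delta_0:
        return t_max * delta / delta_0
    return t_max * (delta_f - delta) / (delta_f - delta_0)

# The work of separation (fracture energy) is the area under the
# traction-separation curve, t_max*delta_f/2 for the bilinear law;
# check it by midpoint quadrature.
n = 20000
dx = 1.2 / n
energy = sum(bilinear_traction((i + 0.5) * dx) * dx for i in range(n))
```

    In a finite element setting each cohesive element evaluates such a law at its current opening, so the softening branch is what lets a crack advance smoothly through the mesh.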

  17. A generalized statistical model for the size distribution of wealth

    NASA Astrophysics Data System (ADS)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2012-12-01

    In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and find excellent agreement with the data, superior to that of any other model known in the literature.
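The κ-generalized framework mentioned above is built on the Kaniadakis κ-exponential. A minimal sketch, assuming the standard definition exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ) and hypothetical values for the shape parameters α, β, κ:

```python
import math

# Kaniadakis kappa-exponential, the building block of the kappa-generalized
# distribution; as kappa -> 0 it reduces to the ordinary exponential.
def exp_kappa(x, kappa):
    if kappa == 0.0:
        return math.exp(x)
    return (math.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

# Illustrative complementary CDF of a kappa-generalized model,
# P(X > x) = exp_kappa(-beta * x**alpha); alpha, beta, kappa are
# hypothetical values, not the paper's fitted parameters.
def ccdf(x, alpha=2.0, beta=1.0, kappa=0.75):
    return exp_kappa(-beta * x**alpha, kappa)

print(ccdf(0.0))  # 1.0
```

For large x the κ-exponential decays as a power law rather than exponentially, which is what lets the distribution capture heavy upper tails of income and wealth.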

  18. Application of statistical mechanical methods to the modeling of social networks

    NASA Astrophysics Data System (ADS)

    Strathman, Anthony Robert

    With the recent availability of large-scale social data sets, social networks have become open to quantitative analysis via the methods of statistical physics. We examine the statistical properties of a real large-scale social network, generated from cellular phone call-trace logs. We find this network, like many other social networks, to be assortative (r = 0.31) and clustered (i.e., strongly transitive, C = 0.21). We measure fluctuation scaling to identify the presence of internal structure in the network and find that structural inhomogeneity effectively disappears at the scale of a few hundred nodes, though there is no sharp cutoff. We introduce an agent-based model of social behavior, designed to model the formation and dissolution of social ties. The model is a modified Metropolis algorithm containing agents operating under the basic sociological constraints of reciprocity, communication need and transitivity. The model introduces the concept of a social temperature. We go on to show that this simple model reproduces the global statistical network features (incl. assortativity, connected fraction, mean degree, clustering, and mean shortest path length) of the real network data and undergoes two phase transitions, one from a "gas" to a "liquid" state and the second from a liquid to a glassy state, as a function of this social temperature.
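The clustering value quoted above (C = 0.21) is the global clustering coefficient, or transitivity. A minimal sketch of how such a statistic is computed from an edge list (the toy graph is made up for illustration):

```python
from itertools import combinations

# Global clustering coefficient (transitivity) of an undirected graph:
# the fraction of connected triplets that close into triangles.
def transitivity(edges):
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    closed = 0      # neighbour pairs that are themselves linked
    triplets = 0    # connected triplets centred on each node
    for node, nbrs in adj.items():
        for a, b in combinations(nbrs, 2):
            triplets += 1
            if b in adj[a]:
                closed += 1
    return closed / triplets if triplets else 0.0

# Triangle 0-1-2 with a pendant node 3: one triangle, five triplets.
print(transitivity([(0, 1), (1, 2), (0, 2), (2, 3)]))  # 0.6
```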

  19. Multiplicative point process as a model of trading activity

    NASA Astrophysics Data System (ADS)

    Gontis, V.; Kaulakys, B.

    2004-11-01

    Signals consisting of a sequence of pulses show that the inherent origin of 1/f noise is a Brownian fluctuation of the average interevent time between subsequent pulses of the pulse sequence. In this paper, we generalize the model of interevent time to reproduce a variety of self-affine time series exhibiting power spectral density S(f) scaling as a power of the frequency f. Furthermore, we analyze the relation between the power-law correlations and the origin of the power-law probability distribution of the signal intensity. We introduce a stochastic multiplicative model for the time intervals between point events and analyze the statistical properties of the signal analytically and numerically. Such a model system exhibits power-law spectral density S(f) ∼ 1/f^β for various values of β, including β = 1/2, 1 and 3/2. Explicit expressions for the power spectra in the low-frequency limit and for the distribution density of the interevent time are obtained. The counting statistics of the events is analyzed analytically and numerically as well. The specific interest of our analysis is related to the financial markets, where long-range correlations of price fluctuations largely depend on the number of transactions. We analyze the spectral density and counting statistics of the number of transactions. The model reproduces the spectral properties of real markets and explains the mechanism of the power-law distribution of trading activity. The study provides evidence that the statistical properties of the financial markets are enclosed in the statistics of the time intervals between trades. A multiplicative point process serves as a consistent model generating these statistics.
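A toy version of a multiplicative interevent-time process can be sketched as follows; the specific recursion and all parameter values are our own illustrative choices, not necessarily the authors' exact equations:

```python
import random

# Toy multiplicative process for interevent times tau_k: a small
# deterministic drift plus multiplicative Gaussian noise whose amplitude
# scales as tau**mu. Purely illustrative parameterization.
def simulate_interevent_times(n, mu=0.5, gamma=0.01, sigma=0.05, seed=1):
    rng = random.Random(seed)
    tau = 1.0
    times = []
    for _ in range(n):
        noise = rng.gauss(0.0, 1.0)
        tau = tau - gamma * tau ** (2 * mu - 1) + sigma * tau ** mu * noise
        # reflect at zero so intervals stay strictly positive
        tau = abs(tau) if tau != 0.0 else 1e-6
        times.append(tau)
    return times

taus = simulate_interevent_times(1000)
print(len(taus), min(taus) > 0.0)
```

The event times are the cumulative sums of these intervals; the signal intensity in a window is the number of events it contains, and its spectrum can then be estimated numerically.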

  20. Devil's staircases, quantum dimer models, and stripe formation in strong coupling models of quantum frustration.

    NASA Astrophysics Data System (ADS)

    Raman, Kumar; Papanikolaou, Stefanos; Fradkin, Eduardo

    2007-03-01

    We construct a two-dimensional microscopic model of interacting quantum dimers that displays an infinite number of periodic striped phases in its T=0 phase diagram. The phases form an incomplete devil's staircase and the period becomes arbitrarily large as the staircase is traversed. The Hamiltonian has purely short-range interactions, does not break any symmetries, and is generic in that it does not involve the fine tuning of a large number of parameters. Our model, a quantum mechanical analog of the Pokrovsky-Talapov model of fluctuating domain walls in two-dimensional classical statistical mechanics, provides a mechanism by which striped phases with periods large compared to the lattice spacing can, in principle, form in frustrated quantum magnetic systems with only short-ranged interactions and no explicitly broken symmetries. Please see cond-mat/0611390 for more details.

  1. Statistical model with two order parameters for ductile and soft fiber bundles in nanoscience and biomaterials.

    PubMed

    Rinaldi, Antonio

    2011-04-01

    Traditional fiber bundle models (FBMs) have been an effective tool to understand brittle heterogeneous systems. However, fiber bundles in modern nano- and bioapplications demand a new generation of FBMs capturing more complex deformation processes in addition to damage. In the context of loose bundle systems, and with reference to time-independent plasticity and soft biomaterials, we formulate a generalized statistical model for ductile fracture and nonlinear elastic problems capable of handling several simultaneous deformation mechanisms by means of two order parameters (as opposed to one). As the first rational FBM for coupled damage problems, it may be the cornerstone for advanced statistical models of heterogeneous systems in nanoscience and materials design, especially for exploring hierarchical and bio-inspired concepts in the arena of nanobiotechnology. Illustrative examples are provided at the end, discussing issues in inverse analysis (i.e., nonlinear elastic polymer fibers and ductile Cu submicron bar arrays) and direct design (i.e., strength prediction).

  2. Digital morphogenesis via Schelling segregation

    NASA Astrophysics Data System (ADS)

    Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew

    2018-04-01

    Schelling’s model of segregation looks to explain the way in which particles or agents of two types may come to arrange themselves spatially into configurations consisting of large homogeneous clusters, i.e. connected regions consisting of only one type. As one of the earliest agent-based models studied by economists and perhaps the most famous model of self-organising behaviour, it also has direct links to areas at the interface between computer science and statistical mechanics, such as the Ising model and the study of contagion and cascading phenomena in networks. While the model has been extensively studied, it has largely resisted rigorous analysis, prior results from the literature generally pertaining to variants of the model which are tweaked so as to be amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory. Brandt et al (2012 Proc. 44th Annual ACM Symp. on Theory of Computing) provided the first rigorous analysis of the unperturbed model, for a specific set of input parameters. Here we provide a rigorous analysis of the model’s behaviour much more generally and establish some surprising forms of threshold behaviour, notably the existence of situations where an increased level of intolerance for neighbouring agents of opposite type leads almost certainly to decreased segregation.
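A minimal one-dimensional Schelling-style dynamic can be sketched as below; the neighbourhood size, tolerance threshold, and random-swap rule are illustrative assumptions, not the exact variant analysed in the paper:

```python
import random

# Toy 1D Schelling dynamics on a ring: an agent is unhappy when fewer
# than a fraction `tau` of its neighbours within distance w share its
# type, and unhappy agents are paired up and swapped at random.
def schelling_step(state, w=2, tau=0.5, rng=None):
    rng = rng or random.Random(0)
    n = len(state)
    unhappy = []
    for i, t in enumerate(state):
        nbrs = [state[(i + d) % n] for d in range(-w, w + 1) if d != 0]
        if nbrs.count(t) / len(nbrs) < tau:
            unhappy.append(i)
    rng.shuffle(unhappy)
    for a, b in zip(unhappy[::2], unhappy[1::2]):  # pair up and swap
        state[a], state[b] = state[b], state[a]
    return state

rng = random.Random(42)
state = [rng.choice("RB") for _ in range(50)]
before = sorted(state)
for _ in range(20):
    state = schelling_step(state, rng=rng)
assert sorted(state) == before  # swaps conserve the count of each type
```

Because agents only swap, the number of each type is an invariant; what the dynamics changes is how they are arranged, i.e. the degree of clustering.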

  3. Process and metaphors in the evolutionary paradigm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, M.; Fox, S.

    1988-01-01

    Presents thinking on the processes and interpretation of biological evolution, emphasizing the study of biological processes as they occur in living organisms and their communities, rather than through mechanical or statistical models. Contributors explore processes and metaphors in evolution, the origin of the genetic code, new genetic mechanisms and their implications for the formation of new species, panbiogeography, the active role of behavior in evolution, sociobiology, and more.

  4. Symmetry and Degeneracy in Quantum Mechanics. Self-Duality in Finite Spin Systems

    ERIC Educational Resources Information Center

    Osacar, C.; Pacheco, A. F.

    2009-01-01

    The symmetry of self-duality (Savit 1980 "Rev. Mod. Phys. 52" 453) of some models of statistical mechanics and quantum field theory is discussed for finite spin blocks of the Ising chain in a transverse magnetic field. The existence of this symmetry in a specific type of these blocks, and not in others, is manifest by the degeneracy of their…

  5. Guidelines for the Investigation of Mediating Variables in Business Research

    PubMed Central

    Coxe, Stefany; Baraldi, Amanda N.

    2013-01-01

    Business theories often specify the mediating mechanisms by which a predictor variable affects an outcome variable. In the last 30 years, investigations of mediating processes have become more widespread with corresponding developments in statistical methods to conduct these tests. The purpose of this article is to provide guidelines for mediation studies by focusing on decisions made prior to the research study that affect the clarity of conclusions from a mediation study, the statistical models for mediation analysis, and methods to improve interpretation of mediation results after the research study. Throughout this article, the importance of a program of experimental and observational research for investigating mediating mechanisms is emphasized. PMID:25237213

  6. The need for conducting forensic analysis of decommissioned bridges.

    DOT National Transportation Integrated Search

    2014-01-01

    A limiting factor in current bridge management programs is a lack of detailed knowledge of bridge deterioration mechanisms and processes. The current state of the art is to predict future condition using statistical forecasting models based upon ...

  7. From Theory to Air Force Practice: Applications and Non-Binary Extensions of Probabilistic Model-Building Genetic Algorithms

    DTIC Science & Technology

    2006-05-31

    dynamics (MD) and kinetic Monte Carlo (KMC) procedures. In 2D surface modeling our calculations project speedups of 9 orders of magnitude at 300 degrees...programming is used to perform customized statistical mechanics by bridging the different time scales of MD and KMC quickly and well. Speedups in

  8. Power Law Patch Scaling and Lack of Characteristic Wavelength Suggest "Scale-Free" Processes Drive Pattern Formation in the Florida Everglades

    NASA Astrophysics Data System (ADS)

    Kaplan, D. A.; Casey, S. T.; Cohen, M. J.; Acharya, S.; Jawitz, J. W.

    2016-12-01

    A century of hydrologic modification has altered the physical and biological drivers of landscape processes in the Everglades (Florida, USA). Restoring the ridge-slough patterned landscape, a dominant feature of the historical system, is a priority, but requires an understanding of pattern genesis and degradation mechanisms. Physical experiments to evaluate alternative pattern formation mechanisms are limited by the long time scales of peat accumulation and loss, necessitating model-based comparisons, where support for a particular mechanism is based on model replication of extant patterning and trajectories of degradation. However, multiple mechanisms yield patch elongation in the direction of historical flow (a central feature of ridge-slough patterning), limiting the utility of that characteristic for discriminating among alternatives. Using data from vegetation maps, we investigated the statistical features of ridge-slough spatial patterning (ridge density, patch perimeter, elongation, patch-size distributions, and spatial periodicity) to establish more rigorous criteria for evaluating model performance and to inform controls on pattern variation across the contemporary system. Two independent analyses (2-D periodograms and patch size distributions) provide strong evidence against regular patterning, with the landscape exhibiting neither a characteristic wavelength nor a characteristic patch size, both of which are expected under conditions that produce regular patterns. Rather, landscape properties suggest robust scale-free patterning, indicating genesis from the coupled effects of local facilitation and a global negative feedback operating uniformly at the landscape-scale. This finding challenges widespread invocation of scale-dependent negative feedbacks for explaining ridge-slough pattern origins. 
These results help discern among genesis mechanisms and provide an improved statistical description of the landscape that can be used to compare among model outputs, as well as to assess the success of future restoration projects.
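A rough but standard way to read a scaling exponent off a patch-size distribution, as in analyses of this kind, is a least-squares fit on log-log axes. A sketch with synthetic data (real patch-size data would first be binned into a frequency distribution):

```python
import math

# Least-squares slope on log-log axes: for an exact power law
# y = x**(-2) the fitted exponent recovers -2.
def loglog_slope(xs, ys):
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

xs = [1, 2, 4, 8, 16, 32]
ys = [x ** -2.0 for x in xs]
print(loglog_slope(xs, ys))  # -2.0
```

A straight log-log line with no bend is the signature of scale-free patterning: there is no characteristic patch size at which the slope changes.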

  9. Statistical Analysis of Big Data on Pharmacogenomics

    PubMed Central

    Fan, Jianqing; Liu, Han

    2013-01-01

    This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
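One standard regularizer for large covariance estimation of the kind reviewed here is entrywise soft thresholding of the sample covariance matrix, which zeroes out small, likely spurious correlations. A minimal sketch with a hypothetical threshold value:

```python
# Entrywise soft-thresholding of a sample covariance matrix S:
# off-diagonal entries are shrunk toward zero by lam and zeroed when
# smaller than lam in magnitude; variances on the diagonal are kept.
def soft_threshold_cov(S, lam):
    p = len(S)
    T = [[0.0] * p for _ in range(p)]
    for i in range(p):
        for j in range(p):
            if i == j:
                T[i][j] = S[i][j]          # leave variances untouched
            elif abs(S[i][j]) > lam:
                sign = 1.0 if S[i][j] > 0 else -1.0
                T[i][j] = sign * (abs(S[i][j]) - lam)
            # else: entry stays 0.0
    return T

S = [[1.0, 0.30, 0.05],
     [0.30, 1.0, -0.40],
     [0.05, -0.40, 1.0]]
print(soft_threshold_cov(S, 0.10))
```

The threshold trades bias for sparsity; in practice it is chosen by cross-validation rather than fixed a priori.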

  10. The Statistical Basis of Chemical Equilibria.

    ERIC Educational Resources Information Center

    Hauptmann, Siegfried; Menger, Eva

    1978-01-01

    Describes a machine which demonstrates the statistical bases of chemical equilibrium, and in doing so conveys insight into the connections among statistical mechanics, quantum mechanics, Maxwell Boltzmann statistics, statistical thermodynamics, and transition state theory. (GA)

  11. Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model

    NASA Astrophysics Data System (ADS)

    Yuan, Zhongda; Deng, Junxiang; Wang, Dawei

    2018-02-01

    An aero-engine is a complex mechanical-electronic system, and in the reliability analysis of such systems the Weibull distribution model plays an irreplaceable role. To date, only the two-parameter and three-parameter Weibull distribution models are widely used. Owing to the diversity of engine failure modes, a single Weibull distribution model carries a large error. By contrast, a mixed Weibull distribution model can take a variety of engine failure modes into account, making it a good statistical analysis model. In addition to the concept of a dynamic weight coefficient, a three-parameter correlation-coefficient optimization method is applied to enhance the Weibull distribution model and make the reliability estimate more accurate, greatly improving the precision of the mixed-distribution reliability model. All of this is advantageous for popularizing the Weibull distribution model in engineering applications.
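A mixed Weibull model combines several Weibull components, one per failure mode. A sketch of the resulting reliability function, with hypothetical weights, shape, and scale parameters standing in for two failure modes (e.g. early wear-in versus wear-out):

```python
import math

# Two-component mixed Weibull reliability,
#   R(t) = sum_i w_i * exp(-(t / eta_i) ** beta_i),
# with weights w_i summing to 1. All parameter values are hypothetical.
def mixed_weibull_reliability(t, params=((0.4, 0.8, 500.0),
                                         (0.6, 3.0, 2000.0))):
    r = 0.0
    for w, beta, eta in params:   # weight, shape, scale of each mode
        r += w * math.exp(-((t / eta) ** beta))
    return r

print(mixed_weibull_reliability(0.0))  # 1.0
print(mixed_weibull_reliability(1000.0))
```

Fitting such a model means estimating the weights and the shape/scale pairs jointly, which is where the dynamic weight coefficient and correlation-coefficient optimization mentioned in the abstract come in.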

  12. Resolving Structural Variability in Network Models and the Brain

    PubMed Central

    Klimm, Florian; Bassett, Danielle S.; Carlson, Jean M.; Mucha, Peter J.

    2014-01-01

    Large-scale white matter pathways crisscrossing the cortex create a complex pattern of connectivity that underlies human cognitive function. Generative mechanisms for this architecture have been difficult to identify in part because little is known in general about mechanistic drivers of structured networks. Here we contrast network properties derived from diffusion spectrum imaging data of the human brain with 13 synthetic network models chosen to probe the roles of physical network embedding and temporal network growth. We characterize both the empirical and synthetic networks using familiar graph metrics, but presented here in a more complete statistical form, as scatter plots and distributions, to reveal the full range of variability of each measure across scales in the network. We focus specifically on the degree distribution, degree assortativity, hierarchy, topological Rentian scaling, and topological fractal scaling—in addition to several summary statistics, including the mean clustering coefficient, the shortest path-length, and the network diameter. The models are investigated in a progressive, branching sequence, aimed at capturing different elements thought to be important in the brain, and range from simple random and regular networks, to models that incorporate specific growth rules and constraints. We find that synthetic models that constrain the network nodes to be physically embedded in anatomical brain regions tend to produce distributions that are most similar to the corresponding measurements for the brain. We also find that network models hardcoded to display one network property (e.g., assortativity) do not in general simultaneously display a second (e.g., hierarchy). This relative independence of network properties suggests that multiple neurobiological mechanisms might be at play in the development of human brain network architecture. 
Together, the network models that we develop and employ provide a potentially useful starting point for the statistical inference of brain network structure from neuroimaging data. PMID:24675546

  13. A canonical neural mechanism for behavioral variability

    NASA Astrophysics Data System (ADS)

    Darshan, Ran; Wood, William E.; Peters, Susan; Leblois, Arthur; Hansel, David

    2017-05-01

    The ability to generate variable movements is essential for learning and adjusting complex behaviours. This variability has been linked to the temporal irregularity of neuronal activity in the central nervous system. However, how neuronal irregularity actually translates into behavioural variability is unclear. Here we combine modelling, electrophysiological and behavioural studies to address this issue. We demonstrate that a model circuit comprising topographically organized and strongly recurrent neural networks can autonomously generate irregular motor behaviours. Simultaneous recordings of neurons in singing finches reveal that neural correlations increase across the circuit driving song variability, in agreement with the model predictions. Analysing behavioural data, we find remarkable similarities in the babbling statistics of 5-6-month-old human infants and juveniles from three songbird species and show that our model naturally accounts for these `universal' statistics.

  14. Multifractality and freezing phenomena in random energy landscapes: An introduction

    NASA Astrophysics Data System (ADS)

    Fyodorov, Yan V.

    2010-10-01

    We start our lectures with introducing and discussing the general notion of multifractality spectrum for random measures on lattices, and how it can be probed using moments of that measure. Then we show that the Boltzmann-Gibbs probability distributions generated by logarithmically correlated random potentials provide a simple yet non-trivial example of disorder-induced multifractal measures. The typical values of the multifractality exponents can be extracted from calculating the free energy of the associated statistical mechanics problem. To succeed in such a calculation we introduce and discuss in some detail two analytically tractable models for logarithmically correlated potentials. The first model uses a special definition of distances between points in space and is based on the idea of multiplicative cascades, which originated in the theory of turbulent motion. It is essentially equivalent to the statistical mechanics of directed polymers on disordered trees studied long ago by Derrida and Spohn (1988) in Ref. [12]. In this way we introduce the notion of the freezing transition, which is identified with an abrupt change in the multifractality spectrum. The second model, which allows for explicit analytical evaluation of the free energy, is the infinite-dimensional version of the problem, which can be solved by employing the replica trick. In particular, the latter version allows one to identify the freezing phenomenon with a mechanism of replica symmetry breaking (RSB) and to elucidate its physical meaning. The corresponding one-step RSB solution turns out to be marginally stable everywhere in the low-temperature phase. We finish with a short discussion of recent developments and extensions of models with logarithmic correlations, in particular in the context of extreme value statistics.
The first appendix summarizes the standard elementary information about Gaussian integrals and related subjects, and introduces the notion of the Gaussian free field characterized by logarithmic correlations. Three other appendices provide the detailed exposition of a few technical details underlying the replica analysis of the model discussed in the lectures.

  15. Exploration in free word association networks: models and experiment.

    PubMed

    Ludueña, Guillermo A; Behzad, Mehran Djalali; Gros, Claudius

    2014-05-01

    Free association is a task that requires a subject to express the first word to come to their mind when presented with a certain cue. It is a task which can be used to expose the basic mechanisms by which humans connect memories. In this work, we have made use of a publicly available database of free associations to model the exploration of the averaged network of associations using both a statistical model and the adaptive control of thought-rational (ACT-R) model. In addition, we performed an online experiment asking participants to navigate the averaged network using their individual preferences for word associations. We have investigated the statistics of word repetitions in this guided association task. We find that the considered models mimic some of the statistical properties observed in the experiment, viz. the probability of word repetitions, the distance between repetitions and the distribution of association-chain lengths, with the ACT-R model showing a particularly good fit to the experimental data for the more intricate properties, for instance the ratio of repetitions per length of association chain.
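The guided-navigation task can be mimicked by a weighted random walk on an association network, counting word repetitions along the chain. The tiny graph and its weights below are invented for illustration and are not taken from the free-association database:

```python
import random

# Hypothetical toy word-association network: each cue maps to weighted
# candidate associations. A chain is built by repeatedly sampling the
# next word in proportion to its association strength.
assoc = {
    "dog":  [("cat", 0.6), ("bone", 0.4)],
    "cat":  [("dog", 0.5), ("milk", 0.5)],
    "bone": [("dog", 1.0)],
    "milk": [("cat", 1.0)],
}

def association_chain(start, steps, seed=0):
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        words, weights = zip(*assoc[chain[-1]])
        chain.append(rng.choices(words, weights=weights)[0])
    return chain

chain = association_chain("dog", 30)
repeats = len(chain) - len(set(chain))   # how many visits are revisits
print(len(chain), repeats)
```

Statistics such as the repetition probability and the distance between repetitions are then read off ensembles of such chains.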

  16. Bayesian Sensitivity Analysis of Statistical Models with Missing Data

    PubMed Central

    ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG

    2013-01-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718

  17. Defense Small Business Innovation Research Program (SBIR), Volume 4, Defense Agencies Abstracts of Phase 1 Awards 1991

    DTIC Science & Technology

    1991-01-01

    EXPERIENCE IN DEVELOPING INTEGRATED OPTICAL DEVICES, NONLINEAR MAGNETIC-OPTIC MATERIALS, HIGH FREQUENCY MODULATORS, COMPUTER-AIDED MODELING AND SOPHISTICATED... HIGH-LEVEL PRESENTATION AND DISTRIBUTED CONTROL MODELS FOR INTEGRATING HETEROGENEOUS MECHANICAL ENGINEERING APPLICATIONS AND TOOLS. THE DESIGN IS FOCUSED...STATISTICALLY ACCURATE WORST-CASE DEVICE MODELS FOR CIRCUIT SIMULATION. PRESENT METHODS OF WORST-CASE DEVICE DESIGN ARE AD HOC AND DO NOT ALLOW THE

  18. Prediction of the dollar to the ruble rate. A system-theoretic approach

    NASA Astrophysics Data System (ADS)

    Borodachev, Sergey M.

    2017-07-01

    We propose a simple state-space model of dollar-rate formation based on changes in oil prices and some mechanisms of money transfer between the monetary and stock markets. A comparison of predictions from an input-output model and the state-space model is made. We conclude that, with proper use of the statistical data (via a Kalman filter), the second approach provides more adequate predictions of the dollar rate.
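The role of the Kalman filter in such a state-space approach can be sketched in one dimension; the random-walk state model, the noise variances q and r, and the toy observation series below are hypothetical stand-ins for the paper's richer model:

```python
# One-dimensional Kalman filter for a random-walk state x_t observed
# with noise: predict, compute the gain, correct with the observation.
def kalman_1d(observations, q=1e-4, r=0.25, x0=0.0, p0=100.0):
    x, p = x0, p0                 # diffuse prior so the first update snaps
    estimates = []                # to the first observation
    for z in observations:
        p = p + q                 # predict: random-walk state adds q
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # correct with the new observation
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

obs = [60.2, 59.8, 60.1, 60.0, 59.9, 60.3, 60.0, 59.7, 60.1, 60.0]
est = kalman_1d(obs)
print(est[-1])
```

The filter smooths observation noise while still tracking genuine level changes, which is what makes the state-space formulation competitive with a static input-output regression.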

  19. Discrete Model for the Structure and Strength of Cementitious Materials

    NASA Astrophysics Data System (ADS)

    Balopoulos, Victor D.; Archontas, Nikolaos; Pantazopoulou, Stavroula J.

    2017-12-01

    Cementitious materials are characterized by brittle behavior in direct tension and by transverse dilatation (due to microcracking) under compression. Microcracking causes increasingly larger transverse strains and a phenomenological Poisson's ratio that gradually increases to about ν =0.5 and beyond, at the limit point in compression. This behavior is due to the underlying structure of cementitious pastes which is simulated here with a discrete physical model. The computational model is generic, assembled from a statistically generated, continuous network of flaky dendrites consisting of cement hydrates that emanate from partially hydrated cement grains. In the actual amorphous material, the dendrites constitute the solid phase of the cement gel and interconnect to provide the strength and stiffness against load. The idealized dendrite solid is loaded in compression and tension to compute values for strength and Poisson's effects. Parametric studies are conducted, to calibrate the statistical parameters of the discrete model with the physical and mechanical characteristics of the material, so that the familiar experimental trends may be reproduced. The model provides a framework for the study of the mechanical behavior of the material under various states of stress and strain and can be used to model the effects of additives (e.g., fibers) that may be explicitly simulated in the discrete structure.

  20. Dynamically biased statistical model for the ortho/para conversion in the H2 + H3+ → H3+ + H2 reaction.

    PubMed

    Gómez-Carrasco, Susana; González-Sánchez, Lola; Aguado, Alfredo; Sanz-Sanz, Cristina; Zanchet, Alexandre; Roncero, Octavio

    2012-09-07

    In this work we present a dynamically biased statistical model to describe the evolution of the title reaction from a statistical to a more direct mechanism, using quasi-classical trajectories (QCT). The method is based on the one previously proposed by Park and Light [J. Chem. Phys. 126, 044305 (2007)]. A recent global potential energy surface is used here to calculate the capture probabilities, instead of the long-range ion-induced dipole interactions. The dynamical constraints are introduced by considering a scrambling matrix which depends on energy and determines the probability of the identity/hop/exchange mechanisms. These probabilities are calculated using QCT. It is found that the high zero-point energy of the fragments is transferred to the rest of the degrees of freedom, which shortens the lifetime of H5+ complexes and, as a consequence, the exchange mechanism occurs in lower proportion. The zero-point energy (ZPE) is not properly described in quasi-classical trajectory calculations, and an approximation is made in which the initial ZPE of the reactants is reduced in the QCT calculations to obtain a new ZPE-biased scrambling matrix. This reduction of the ZPE is explained by the need to correct the purely classical level number of the H5+ complex, as done in classical simulations of unimolecular processes and to obtain equivalent quantum and classical rate constants using Rice-Ramsperger-Kassel-Marcus theory. This matrix allows one to obtain a ratio of hop/exchange mechanisms, α(T), in rather good agreement with recent experimental results by Crabtree et al. [J. Chem. Phys. 134, 194311 (2011)] at room temperature. At lower temperatures, however, the present simulations predict ratios that are too high, because the biased scrambling matrix is not statistical enough. This demonstrates the importance of applying quantum methods to simulate this reaction at the low temperatures of astrophysical interest.

  2. Understanding the drug release mechanism from a montmorillonite matrix and its binary mixture with a hydrophilic polymer using a compartmental modelling approach

    NASA Astrophysics Data System (ADS)

    Choiri, S.; Ainurofiq, A.

    2018-03-01

    Drug release from a montmorillonite (MMT) matrix is a complex mechanism controlled by the swelling of MMT and by drug-MMT interactions. The aim of this research was to identify a suitable model of the drug release mechanism from MMT and from its binary mixture with a hydrophilic polymer in a controlled release formulation, based on a compartmental modelling approach. Theophylline was used as a model drug and incorporated into MMT, and into a binary mixture with hydroxypropyl methylcellulose (HPMC) as a hydrophilic polymer, by a kneading method. The dissolution test was performed and the modelling of drug release was carried out with the WinSAAM software. Two models were proposed based on the swelling capability and basal spacing of the MMT compartments. Model evaluation considered goodness of fit and statistical parameters, and the models were validated by a cross-validation technique. Drug release from the MMT matrix was regulated by a burst release of unloaded drug, the swelling ability and basal spacing of the MMT compartment, and the equilibrium between the basal spacing and swelling compartments. Furthermore, the addition of HPMC to the MMT system altered the presence of the swelling compartment and the equilibrium between the swelling and basal spacing compartment systems. In addition, the hydrophilic polymer reduced the burst release of unloaded drug.

  3. Statistical mechanics of neocortical interactions: A scaling paradigm applied to electroencephalography

    NASA Astrophysics Data System (ADS)

    Ingber, Lester

    1991-09-01

    A series of papers has developed a statistical mechanics of neocortical interactions (SMNI), deriving aggregate behavior of experimentally observed columns of neurons from statistical electrical-chemical properties of synaptic interactions. While not useful to yield insights at the single-neuron level, SMNI has demonstrated its capability in describing large-scale properties of short-term memory and electroencephalographic (EEG) systematics. The necessity of including nonlinear and stochastic structures in this development has been stressed. In this paper, a more stringent test is placed on SMNI: The algebraic and numerical algorithms previously developed in this and similar systems are brought to bear to fit large sets of EEG and evoked-potential data being collected to investigate genetic predispositions to alcoholism and to extract brain ``signatures'' of short-term memory. Using the numerical algorithm of very fast simulated reannealing, it is demonstrated that SMNI can indeed fit these data within experimentally observed ranges of its underlying neuronal-synaptic parameters, and the quantitative modeling results are used to examine physical neocortical mechanisms to discriminate high-risk and low-risk populations genetically predisposed to alcoholism. Since this study is a control to span relatively long time epochs, similar to earlier attempts to establish such correlations, this discrimination is inconclusive because of other neuronal activity which can mask such effects. However, the SMNI model is shown to be consistent with EEG data during selective attention tasks and with neocortical mechanisms describing short-term memory previously published using this approach. This paper explicitly identifies similar nonlinear stochastic mechanisms of interaction at the microscopic-neuronal, mesoscopic-columnar, and macroscopic-regional scales of neocortical interactions. 
These results give strong quantitative support for an accurate intuitive picture, portraying neocortical interactions as having common algebraic or physics mechanisms that scale across quite disparate spatial scales and functional or behavioral phenomena, i.e., describing interactions among neurons, columns of neurons, and regional masses of neurons.

  4. Constitutive Modeling, Nonlinear Behavior, and the Stress-Optic Law

    DTIC Science & Technology

    2011-01-01

    estimates of D̂ from dynamic mechanical measurements. Some results are shown in Figure 58 for a filled EPDM rubber [116]. There is rough agreement with...elastomers and filler-reinforced rubber. 5.1 Linearity and the superposition principle The problem of analyzing viscoelastic mechanical behavior is greatly...deformation such as shear. For crosslinked rubber the strain can be defined in terms of the strain function suggested by the statistical theories of

  5. Response statistics of rotating shaft with non-linear elastic restoring forces by path integration

    NASA Astrophysics Data System (ADS)

    Gaidai, Oleg; Naess, Arvid; Dimentberg, Michael

    2017-07-01

    Extreme statistics of random vibrations is studied for a Jeffcott rotor under uniaxial white noise excitation. The restoring force is modelled as elastic and non-linear; a comparison is made with a linearized restoring force to assess the effect of the force non-linearity on the response statistics. While analytical solutions and stability conditions are available for the linear model, this is not generally the case for the non-linear system, except in some special cases. The statistics of the non-linear case are studied by applying the path integration (PI) method, which is based on the Markov property of the coupled dynamic system. The Jeffcott rotor response statistics can be obtained by solving the Fokker-Planck (FP) equation of the 4D dynamic system. An efficient implementation of the PI algorithm is applied; namely, the fast Fourier transform (FFT) is used to simulate the dynamic system's additive noise. The latter significantly reduces computational time compared to the classical PI. Excitation is modelled as Gaussian white noise; however, white noise with any distribution can be implemented with the same PI technique. Multidirectional Markov noise can also be modelled with PI in the same way as unidirectional noise. PI is accelerated by using a Monte Carlo (MC) estimated joint probability density function (PDF) as initial input. Symmetry of the dynamic system was utilized to afford higher mesh resolution. Both internal (rotating) and external damping are included in the mechanical model of the rotor. The main advantage of PI over MC is that PI offers high accuracy in the probability distribution tail. The latter is of critical importance for, e.g., extreme value statistics, system reliability, and first passage probability.
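
    The path-integration machinery itself is involved, but the underlying stochastic model can be illustrated with a plain Monte Carlo sketch (all parameters below are invented for illustration, not the paper's rotor model): a single-degree-of-freedom oscillator with a hardening cubic restoring force under Gaussian white noise, integrated with the Euler-Maruyama scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy model (not the paper's 4D Jeffcott system):
#   x'' + 2*zeta*x' + x + eps*x**3 = sigma * dW/dt
zeta, eps, sigma = 0.1, 0.5, 0.4
dt, n_steps, n_paths = 5e-3, 10_000, 400

x = np.zeros(n_paths)
v = np.zeros(n_paths)
for _ in range(n_steps):
    dw = rng.normal(0.0, np.sqrt(dt), n_paths)   # Wiener increments
    a = -2.0 * zeta * v - x - eps * x**3         # non-linear elastic restoring force
    x, v = x + v * dt, v + a * dt + sigma * dw   # Euler-Maruyama step

# For eps = 0 the stationary displacement variance would be
# sigma**2 / (4*zeta) = 0.4; the hardening term pulls it below that value.
print(round(float(np.var(x)), 3))
```

    Path integration refines exactly this picture: instead of sampling paths, it propagates the joint PDF of (x, v) step by step, which is what gives it its accuracy in the distribution tail.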

  6. A Knowledge Generation Model via the Hypernetwork

    PubMed Central

    Liu, Jian-Guo; Yang, Guang-Yong; Hu, Zhao-Long

    2014-01-01

    The influence of the statistical properties of the network on the knowledge diffusion has been extensively studied. However, the structure evolution and the knowledge generation processes are always integrated simultaneously. By introducing the Cobb-Douglas production function and treating the knowledge growth as a cooperative production of knowledge, in this paper, we present two knowledge-generation dynamic evolving models based on different evolving mechanisms. The first model, named “HDPH model,” adopts the hyperedge growth and the hyperdegree preferential attachment mechanisms. The second model, named “KSPH model,” adopts the hyperedge growth and the knowledge stock preferential attachment mechanisms. We investigate the effect of the parameters (α,β) on the total knowledge stock of the two models. The hyperdegree distribution of the HDPH model can be theoretically analyzed by the mean-field theory. The analytic result indicates that the hyperdegree distribution of the HDPH model obeys the power-law distribution and the exponent is γ = 2 + 1/m. Furthermore, we present the distributions of the knowledge stock for different parameters (α,β). The findings indicate that our proposed models could be helpful for a deeper understanding of scientific research cooperation. PMID:24626143

  7. A knowledge generation model via the hypernetwork.

    PubMed

    Liu, Jian-Guo; Yang, Guang-Yong; Hu, Zhao-Long

    2014-01-01

    The influence of the statistical properties of the network on the knowledge diffusion has been extensively studied. However, the structure evolution and the knowledge generation processes are always integrated simultaneously. By introducing the Cobb-Douglas production function and treating the knowledge growth as a cooperative production of knowledge, in this paper, we present two knowledge-generation dynamic evolving models based on different evolving mechanisms. The first model, named "HDPH model," adopts the hyperedge growth and the hyperdegree preferential attachment mechanisms. The second model, named "KSPH model," adopts the hyperedge growth and the knowledge stock preferential attachment mechanisms. We investigate the effect of the parameters (α,β) on the total knowledge stock of the two models. The hyperdegree distribution of the HDPH model can be theoretically analyzed by the mean-field theory. The analytic result indicates that the hyperdegree distribution of the HDPH model obeys the power-law distribution and the exponent is γ = 2 + 1/m. Furthermore, we present the distributions of the knowledge stock for different parameters (α,β). The findings indicate that our proposed models could be helpful for a deeper understanding of scientific research cooperation.
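
    The hyperdegree preferential attachment mechanism and the predicted exponent γ = 2 + 1/m can be sketched in a few lines. This is our own toy implementation of the mechanism described above, not the authors' code:

```python
import math
import random

random.seed(1)

# Each step adds one hyperedge joining one new node to m existing nodes,
# chosen with probability proportional to their hyperdegree (implemented by
# listing each node in `pool` once per hyperedge membership).
m, n_nodes = 3, 5000
hyperdegree = {i: 1 for i in range(m + 1)}   # seed hyperedge over m+1 nodes
pool = list(range(m + 1))

for new in range(m + 1, n_nodes):
    chosen = set()
    while len(chosen) < m:                   # preferential choice via the pool
        chosen.add(random.choice(pool))
    hyperdegree[new] = 1
    pool.append(new)
    for node in chosen:
        hyperdegree[node] += 1
        pool.append(node)

# Crude Hill estimate of the tail exponent; mean-field theory above predicts
# gamma = 2 + 1/m (about 2.33 for m = 3).
kmin = 10
tail = [k for k in hyperdegree.values() if k >= kmin]
gamma_hat = 1.0 + len(tail) / sum(math.log(k / kmin) for k in tail)
print(round(gamma_hat, 2))
```

    The Hill estimate is rough for a network of this size; larger networks and proper binning give a cleaner power law.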

  8. Study of pre-seismic kHz EM emissions by means of complex systems

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Papadimitriou, Constantinos; Eftaxias, Konstantinos

    2010-05-01

    The field of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. A corollary is that transferring ideas and results between investigators in hitherto disparate areas will cross-fertilize and lead to important new results. It is well known that Boltzmann-Gibbs statistical mechanics works best for systems composed of subsystems that are either independent or interacting via short-range forces, and whose subsystems can access all the available phase space. For systems exhibiting long-range correlations, memory, or fractal properties, non-extensive Tsallis statistical mechanics becomes the most appropriate mathematical framework. A central property of the magnetic storm, solar flare, and earthquake preparation process is the possible occurrence of coherent large-scale collective behavior with a very rich structure, resulting from the repeated nonlinear interactions among its constituents. Consequently, non-extensive statistical mechanics is an appropriate framework to investigate universality, if any, in magnetic storm, solar flare, earthquake and pre-failure EM emission occurrence. A model for earthquake dynamics derived from a non-extensive Tsallis formulation, starting from first principles, has recently been introduced. This approach leads to a Gutenberg-Richter type law for the magnitude distribution of earthquakes which provides an excellent fit to seismicities generated in various large geographic areas usually identified as "seismic regions". We examine whether the Gutenberg-Richter law corresponding to non-extensive Tsallis statistics is able to describe the distribution of amplitudes of earthquakes, pre-seismic kHz EM emissions (electromagnetic earthquakes), solar flares, and magnetic storms.
The analysis shows that the introduced non-extensive model provides an excellent fit to the experimental data, incorporating the characteristics of universality by means of non-extensive statistics into the extreme events under study.
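
    The q-exponential at the heart of non-extensive Tsallis statistics can be written down directly. The sketch below shows the generic formula, not the specific seismicity model of the record:

```python
import numpy as np

# exp_q(x) = [1 + (1 - q) * x]_+ ** (1 / (1 - q)) reduces to exp(x) as q -> 1
# and, for q > 1, decays as a power law: the heavy tail that non-extensive
# fits exploit for extreme events.
def q_exp(x, q):
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    # cut-off at base <= 0; np.abs avoids spurious NaN in the unused branch
    return np.where(base > 0.0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

x = np.linspace(0.0, 40.0, 5)
print(np.round(q_exp(-x, 1.5), 4))   # decays more slowly than np.exp(-x)
```

    For q = 1.5, exp_q(-x) = (1 + x/2)^(-2), a power-law tail, while the ordinary exponential vanishes much faster.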

  9. On-Orbit System Identification

    NASA Technical Reports Server (NTRS)

    Mettler, E.; Milman, M. H.; Bayard, D.; Eldred, D. B.

    1987-01-01

    Information derived from accelerometer readings benefits important engineering and control functions. Report discusses methodology for detection, identification, and analysis of motions within space station. Techniques of vibration and rotation analyses, control theory, statistics, filter theory, and transform methods integrated to form system for generating models and model parameters that characterize total motion of complicated space station, with respect to both control-induced and random mechanical disturbances.

  10. Neurocognitive mechanisms of statistical-sequential learning: what do event-related potentials tell us?

    PubMed Central

    Daltrozzo, Jerome; Conway, Christopher M.

    2014-01-01

    Statistical-sequential learning (SL) is the ability to process patterns of environmental stimuli, such as spoken language, music, or one’s motor actions, that unfold in time. The underlying neurocognitive mechanisms of SL and the associated cognitive representations are still not well understood as reflected by the heterogeneity of the reviewed cognitive models. The purpose of this review is: (1) to provide a general overview of the primary models and theories of SL, (2) to describe the empirical research – with a focus on the event-related potential (ERP) literature – in support of these models while also highlighting the current limitations of this research, and (3) to present a set of new lines of ERP research to overcome these limitations. The review is articulated around three descriptive dimensions in relation to SL: the level of abstractness of the representations learned through SL, the effect of the level of attention and consciousness on SL, and the developmental trajectory of SL across the life-span. We conclude with a new tentative model that takes into account these three dimensions and also point to several promising new lines of SL research. PMID:24994975

  11. Statistical Model Analysis of (n,p) Cross Sections and Average Energy For Fission Neutron Spectrum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Odsuren, M.; Khuukhenkhuu, G.

    2011-06-28

    Investigation of charged particle emission reaction cross sections for fast neutrons is important both to nuclear reactor technology and to the understanding of nuclear reaction mechanisms. In particular, the study of (n,p) cross sections is necessary to estimate radiation damage due to hydrogen production, nuclear heating and transmutations in the structural materials of fission and fusion reactors. On the other hand, it is often necessary in practice to evaluate the neutron cross sections of nuclides for which no experimental data are available. Because of this, we carried out a systematic analysis of known experimental (n,p) and (n,a) cross sections for fast neutrons and observed a systematic regularity over the wide energy interval of 6-20 MeV and for a broad mass range of target nuclei. To explain this effect using the compound, pre-equilibrium and direct reaction mechanisms, some formulae were deduced. In this paper, known experimental (n,p) cross sections averaged over the thermal fission neutron spectrum of U-235 are analyzed in the framework of the statistical model. It was shown that the experimental data are satisfactorily described by the statistical model. Also, for the (n,p) cross sections, the effective average neutron energy for the fission spectrum of U-235 was found to be around 3 MeV.

  12. A first-order statistical smoothing approximation for the coherent wave field in random porous media

    NASA Astrophysics Data System (ADS)

    Müller, Tobias M.; Gurevich, Boris

    2005-04-01

    An important dissipation mechanism for waves in randomly inhomogeneous poroelastic media is the effect of wave-induced fluid flow. In the framework of Biot's theory of poroelasticity, this mechanism can be understood as scattering from fast into slow compressional waves. To describe this conversion scattering effect in poroelastic random media, the dynamic characteristics of the coherent wavefield are analyzed using the theory of statistical wave propagation. In particular, the method of statistical smoothing is applied to Biot's equations of poroelasticity. Within the accuracy of first-order statistical smoothing, an effective wave number of the coherent field, which accounts for the effect of wave-induced flow, is derived. This wave number is complex and involves an integral over the correlation function of the medium's fluctuations. It is shown that the known one-dimensional (1-D) result can be obtained as a special case of the present 3-D theory. The expression for the effective wave number allows one to derive a model for elastic attenuation and dispersion due to wave-induced fluid flow. These wavefield attributes are analyzed in a companion paper.

  13. A Generalized Quantum Theory

    NASA Astrophysics Data System (ADS)

    Niestegge, Gerd

    2014-09-01

    In quantum mechanics, the selfadjoint Hilbert space operators play a triple role as observables, generators of the dynamical groups and statistical operators defining the mixed states. One might expect that this is typical of Hilbert space quantum mechanics, but it is not. The same triple role occurs for the elements of a certain ordered Banach space in a much more general theory based upon quantum logics and a conditional probability calculus (which is a quantum logical model of the Lueders-von Neumann measurement process). It is shown how positive groups, automorphism groups, Lie algebras and statistical operators emerge from one major postulate - the non-existence of third-order interference (third-order interference and its impossibility in quantum mechanics were discovered by R. Sorkin in 1994). This again underlines the power of the combination of the conditional probability calculus with the postulate that there is no third-order interference. In two earlier papers, its impact on contextuality and nonlocality had already been revealed.

  14. Science and Facebook: The same popularity law!

    PubMed

    Néda, Zoltán; Varga, Levente; Biró, Tamás S

    2017-01-01

    The distribution of scientific citations for publications selected with different rules (author, topic, institution, country, journal, etc.) collapses onto a single curve if one plots the citations relative to their mean value. We find that the distribution of "shares" for Facebook posts rescales in the same manner onto the very same curve as scientific citations. This finding suggests that citations are subject to the same growth mechanism as Facebook popularity measures, being influenced by a statistically similar social environment and selection mechanism. In a simple master-equation approach, the exponential growth of the number of publications and a preferential selection mechanism lead to a Tsallis-Pareto distribution offering an excellent description of the observed statistics. Based on our model and on data derived from PubMed we predict that, according to the present trend, the average number of citations per scientific publication relaxes exponentially to about 4.

  15. Science and Facebook: The same popularity law!

    PubMed Central

    Varga, Levente; Biró, Tamás S.

    2017-01-01

    The distribution of scientific citations for publications selected with different rules (author, topic, institution, country, journal, etc.) collapses onto a single curve if one plots the citations relative to their mean value. We find that the distribution of “shares” for Facebook posts rescales in the same manner onto the very same curve as scientific citations. This finding suggests that citations are subject to the same growth mechanism as Facebook popularity measures, being influenced by a statistically similar social environment and selection mechanism. In a simple master-equation approach, the exponential growth of the number of publications and a preferential selection mechanism lead to a Tsallis-Pareto distribution offering an excellent description of the observed statistics. Based on our model and on data derived from PubMed we predict that, according to the present trend, the average number of citations per scientific publication relaxes exponentially to about 4. PMID:28678796
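
    The rescaling collapse described above can be illustrated with synthetic data. The Lomax (Tsallis-Pareto) parameters below are invented for the demo, not fitted to PubMed or Facebook data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two samples with the same Tsallis-Pareto (Lomax) shape but scales that
# differ by a factor of 25, mimicking two very different selection rules.
a = 3.0                                  # tail exponent (> 1 for a finite mean)
small = 10.0 * rng.pareto(a, 200_000)    # "journal A": scale 10
large = 250.0 * rng.pareto(a, 200_000)   # "journal B": scale 250

# Divide each sample by its own mean: the distributions collapse.
r_small = small / small.mean()
r_large = large / large.mean()

qs = [0.5, 0.9, 0.99]
print(np.round(np.quantile(r_small, qs), 2))
print(np.round(np.quantile(r_large, qs), 2))
```

    After rescaling, the quantiles of the two samples nearly coincide even though the raw scales differ 25-fold, which is exactly the collapse onto a single curve reported in the record.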

  16. Therminator: Configuring the Underlying Statistical Mechanics Model

    DTIC Science & Technology

    2003-12-01

    1502, Oct. 7–10, 2002. [13] Ralph P. Grimaldi, Discrete and Combinatorial Mathematics, 4th Edition, Addison Wesley Longman, New York, 2000. [14...Eagle Co-Advisor John P. Powers Chairman, Department of Electrical and Computer Engineering Peter J. Denning Chairman, Department of...

  17. Non Kolmogorov Probability Models Outside Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Accardi, Luigi

    2009-03-01

    This paper is devoted to the analysis of the main conceptual problems in the interpretation of QM: reality, locality, determinism, physical state, the Heisenberg principle, "deterministic" and "exact" theories, laws of chance, the notion of event, statistical invariants, adaptive realism, EPR correlations and, finally, the EPR-chameleon experiment.

  18. Modeling equine race surface vertical mechanical behaviors in a musculoskeletal modeling environment.

    PubMed

    Symons, Jennifer E; Fyhrie, David P; Hawkins, David A; Upadhyaya, Shrinivasa K; Stover, Susan M

    2015-02-26

    Race surfaces have been associated with the incidence of racehorse musculoskeletal injury, the leading cause of racehorse attrition. Optimal race surface mechanical behaviors that minimize injury risk are unknown. Computational models are an economical method for determining optimal mechanical behaviors. Previously developed equine musculoskeletal models utilized ground reaction floor models designed to simulate a stiff, smooth floor appropriate for a human gait laboratory. Our objective was to develop a computational race surface model (two force-displacement functions, one linear and one nonlinear) that reproduced experimental race surface mechanical behaviors for incorporation in equine musculoskeletal models. Soil impact tests were simulated in a musculoskeletal modeling environment and compared to experimental force and displacement data collected during initial and repeat impacts at two racetracks with differing race surfaces: (i) dirt and (ii) synthetic. Best-fit model coefficients (7 total) were compared between surface types and between initial and repeat impacts using a mixed model ANCOVA. Model simulation results closely matched empirical force, displacement and velocity data (mean R(2)=0.930-0.997). Many model coefficients were statistically different between surface types and impacts. Principal component analysis of the model coefficients showed systematic differences based on surface type and impact. In the future, the race surface model may be used in conjunction with the previously developed equine musculoskeletal models to understand the effects of race surface mechanical behaviors on limb dynamics, and to determine race surface mechanical behaviors that reduce the incidence of racehorse musculoskeletal injury through modulation of limb dynamics. Copyright © 2015 Elsevier Ltd. All rights reserved.
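
    The idea of fitting competing force-displacement functions to impact data can be sketched with invented numbers; this is not the study's data, nor its 7-coefficient model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "soil impact" data from an assumed stiffening surface,
# F = c * x**n with c = 2e5 and n = 1.4, plus 2% multiplicative noise.
x = np.linspace(0.005, 0.05, 20)                 # displacement [m]
f = 2.0e5 * x**1.4 * (1 + 0.02 * rng.standard_normal(x.size))  # force [N]

# Linear model F = k * x: one coefficient, least squares in closed form.
k_lin = float(np.sum(f * x) / np.sum(x * x))

# Power-law model F = c * x**n: two coefficients, via log-log regression.
n_hat, log_c = np.polyfit(np.log(x), np.log(f), 1)

print(round(float(n_hat), 2), round(k_lin))      # exponent recovered near 1.4
```

    Comparing residuals of the two fits (as the study does with goodness-of-fit statistics) then tells you whether the nonlinear force-displacement function is warranted for a given surface.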

  19. Advanced statistical energy analysis

    NASA Astrophysics Data System (ADS)

    Heron, K. H.

    1994-09-01

    A high-frequency theory, advanced statistical energy analysis (ASEA), is developed which takes account of the mechanism of tunnelling, uses a ray theory approach to track the power flowing around a plate or a beam network, and then uses statistical energy analysis (SEA) to take care of any residual power. ASEA divides the energy of each sub-system into energy that is freely available for transfer to other sub-systems and energy that is fixed within the sub-system. ASEA can be interpreted as a series of mathematical models, the first of which is identical to standard SEA; subsequent higher-order models converge on an accurate prediction. Using a structural assembly of six rods as an example, ASEA is shown to converge onto the exact results while SEA is shown to overpredict by up to 60 dB.
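
    ASEA extends SEA, and the standard SEA power balance it builds on is compact enough to show directly. This is the textbook two-subsystem form with assumed loss factors, not the six-rod assembly of the paper:

```python
import numpy as np

# Two coupled subsystems; power balance in a frequency band:
#   P1 = omega * ((eta1 + eta12) * E1 - eta21 * E2)
#   0  = omega * (-eta12 * E1 + (eta2 + eta21) * E2)
omega = 2.0 * np.pi * 1000.0      # band centre frequency [rad/s]
eta1, eta2 = 0.01, 0.02           # damping loss factors (assumed values)
eta12, eta21 = 0.005, 0.003       # coupling loss factors (assumed values)
P1 = 1.0                          # input power to subsystem 1 [W]

A = omega * np.array([[eta1 + eta12, -eta21],
                      [-eta12,       eta2 + eta21]])
E = np.linalg.solve(A, np.array([P1, 0.0]))   # subsystem energies E1, E2
print(np.round(E, 6))
```

    ASEA's refinement is to split each E into a freely available part and a fixed part and to track the tunnelling power with ray tracing, which removes the overprediction that plain SEA shows in the six-rod example.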

  20. The non-equilibrium statistical mechanics of a simple geophysical fluid dynamics model

    NASA Astrophysics Data System (ADS)

    Verkley, Wim; Severijns, Camiel

    2014-05-01

    Lorenz [1] has devised a dynamical system that has proved to be very useful as a benchmark system in geophysical fluid dynamics. The system in its simplest form consists of a periodic array of variables that can be associated with an atmospheric field on a latitude circle. The system is driven by a constant forcing, is damped by linear friction and has a simple advection term that causes the model to behave chaotically if the forcing is large enough. Our aim is to predict the statistics of Lorenz' model on the basis of a given average value of its total energy - obtained from a numerical integration - and the assumption of statistical stationarity. Our method is the principle of maximum entropy [2] which in this case reads: the information entropy of the system's probability density function shall be maximal under the constraints of normalization, a given value of the average total energy and statistical stationarity. Statistical stationarity is incorporated approximately by using `stationarity constraints', i.e., by requiring that the average first and possibly higher-order time-derivatives of the energy are zero in the maximization of entropy. The analysis [3] reveals that, if the first stationarity constraint is used, the resulting probability density function rather accurately reproduces the statistics of the individual variables. If the second stationarity constraint is used as well, the correlations between the variables are also reproduced quite adequately. The method can be generalized straightforwardly and holds the promise of a viable non-equilibrium statistical mechanics of the forced-dissipative systems of geophysical fluid dynamics. [1] E.N. Lorenz, 1996: Predictability - A problem partly solved, in Proc. Seminar on Predictability (ECMWF, Reading, Berkshire, UK), Vol. 1, pp. 1-18. [2] E.T. Jaynes, 2003: Probability Theory - The Logic of Science (Cambridge University Press, Cambridge). [3] W.T.M. Verkley and C.A. 
Severijns, 2014: The maximum entropy principle applied to a dynamical system proposed by Lorenz, Eur. Phys. J. B, 87:7, http://dx.doi.org/10.1140/epjb/e2013-40681-2 (open access).

  1. Influence of Adsorption Orientation on the Statistical Mechanics Model of Type I Antifreeze Protein: The Thermal Hysteresis Temperature.

    PubMed

    Li, Li-Fen; Liang, Xi-Xia

    2017-10-19

    The antifreeze activity of type I antifreeze proteins (AFPIs) is studied on the basis of statistical mechanics theory, taking the AFP's adsorption orientation into account. The thermal hysteresis temperatures are calculated by determining the system Gibbs function as well as the AFP molecule coverage rate on the ice-crystal surface. The numerical results for the thermal hysteresis temperatures of AFP9, HPLC-6, and AAAA2kE are obtained for the cases with and without inclusion of the adsorption orientation. The results show that the influence of the adsorption orientation on the thermal hysteresis temperature cannot be neglected. The theoretical results agree reasonably well with the experimental data.

  2. A quantitative link between microplastic instability and macroscopic deformation behaviors in metallic glasses

    NASA Astrophysics Data System (ADS)

    Wu, Y.; Chen, G. L.; Hui, X. D.; Liu, C. T.; Lin, Y.; Shang, X. C.; Lu, Z. P.

    2009-10-01

    Based on the mechanical instability of individual shear transformation zones (STZs), a quantitative link between microplastic instability and the macroscopic deformation behavior of metallic glasses was proposed. Our analysis confirms that macroscopic metallic glasses comprise a statistical distribution of STZ embryos with distributed activation energies, and that the microplastic instability of the individual STZs dictates the macroscopic deformation behavior of amorphous solids. The statistical model presented in this paper successfully reproduces the macroscopic stress-strain curves determined experimentally and can readily be used to predict strain-rate effects on the macroscopic response once the material parameters at a given strain rate are available, offering new insights into the actual deformation mechanism in amorphous solids.

  3. Origin of the spike-timing-dependent plasticity rule

    NASA Astrophysics Data System (ADS)

    Cho, Myoung Won; Choi, M. Y.

    2016-08-01

    A biological synapse changes its efficacy depending on the difference between pre- and post-synaptic spike timings. Formulating spike-timing-dependent interactions in terms of the path integral, we establish a neural-network model, which makes it possible to predict relevant quantities rigorously by means of standard methods in statistical mechanics and field theory. In particular, the biological synaptic plasticity rule is shown to emerge as the optimal form for minimizing the free energy. It is further revealed that maximization of the entropy of neural activities gives rise to the competitive behavior of biological learning. This demonstrates that statistical mechanics helps to understand rigorously key characteristic behaviors of a neural network, thus providing the possibility of physics serving as a useful and relevant framework for probing life.

  4. On a full Bayesian inference for force reconstruction problems

    NASA Astrophysics Data System (ADS)

    Aucejo, M.; De Smet, O.

    2018-05-01

    In a previous paper, the authors introduced a flexible methodology for reconstructing mechanical sources in the frequency domain from prior local information on both their nature and location over a linear and time invariant structure. The proposed approach was derived from Bayesian statistics, because of its ability in mathematically accounting for experimenter's prior knowledge. However, since only the Maximum a Posteriori estimate was computed, the posterior uncertainty about the regularized solution given the measured vibration field, the mechanical model and the regularization parameter was not assessed. To answer this legitimate question, this paper fully exploits the Bayesian framework to provide, from a Markov Chain Monte Carlo algorithm, credible intervals and other statistical measures (mean, median, mode) for all the parameters of the force reconstruction problem.
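
    The move from a MAP point estimate to full posterior uncertainty can be illustrated on a toy scalar problem. This is a hedged sketch, not the paper's force reconstruction model: random-walk Metropolis sampling of a single force amplitude, followed by a 95% credible interval:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy data: noisy "measurements" of one force amplitude (invented numbers).
true_force = 2.5
y = true_force + 0.3 * rng.standard_normal(50)

def log_post(f):
    # Gaussian likelihood with known noise sigma = 0.3 and a flat prior on f
    return -0.5 * np.sum((y - f) ** 2) / 0.3**2

samples, f = [], 0.0
for _ in range(20_000):                    # random-walk Metropolis
    prop = f + 0.1 * rng.standard_normal()
    if np.log(rng.random()) < log_post(prop) - log_post(f):
        f = prop                           # accept the proposal
    samples.append(f)
post = np.array(samples[5_000:])           # discard burn-in

lo, hi = np.quantile(post, [0.025, 0.975]) # 95% credible interval
print(round(float(post.mean()), 2), round(float(lo), 2), round(float(hi), 2))
```

    The credible interval (and the posterior mean, median, and mode) is exactly the kind of summary the paper extracts from its Markov Chain Monte Carlo run, instead of reporting the MAP estimate alone.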

  5. Statistical Mechanics: A Concise Introduction for Chemists (by Benjamin Widom)

    NASA Astrophysics Data System (ADS)

    Kovac, Jeffrey

    2002-11-01

    This book should be in the hands of everyone who teaches undergraduate physical chemistry to provide a model for what can be taught in that course beyond the material contained in the standard textbooks. Graduate students and faculty who need to learn statistical mechanics can hardly find a better introduction. Even those who regularly teach a graduate course in this area will get some new ideas and inspiration from one of the leading practitioners of the field. For completeness, I must add that the book has one weakness. Although there are excellent in-chapter exercises with solutions, there are no end-of-chapter problems. Since there are many sources of good problems, this is a minor flaw in an otherwise wonderful book.

  6. Statistical physics of the symmetric group.

    PubMed

    Williams, Mobolaji

    2017-04-01

    Ordered chains (such as chains of amino acids) are ubiquitous in biological cells, and these chains perform specific functions contingent on the sequence of their components. Using the existence and general properties of such sequences as a theoretical motivation, we study the statistical physics of systems whose state space is defined by the possible permutations of an ordered list, i.e., the symmetric group, and whose energy is a function of how certain permutations deviate from some chosen correct ordering. Such a nonfactorizable state space is quite different from the state spaces typically considered in statistical physics systems and consequently has novel behavior in systems with interacting and even noninteracting Hamiltonians. Various parameter choices of a mean-field model reveal the system to contain five different physical regimes defined by two transition temperatures, a triple point, and a quadruple point. Finally, we conclude by discussing how the general analysis can be extended to state spaces with more complex combinatorial properties and to other standard questions of statistical mechanics models.

  7. Statistical physics of the symmetric group

    NASA Astrophysics Data System (ADS)

    Williams, Mobolaji

    2017-04-01

    Ordered chains (such as chains of amino acids) are ubiquitous in biological cells, and these chains perform specific functions contingent on the sequence of their components. Using the existence and general properties of such sequences as a theoretical motivation, we study the statistical physics of systems whose state space is defined by the possible permutations of an ordered list, i.e., the symmetric group, and whose energy is a function of how certain permutations deviate from some chosen correct ordering. Such a nonfactorizable state space is quite different from the state spaces typically considered in statistical physics systems and consequently has novel behavior in systems with interacting and even noninteracting Hamiltonians. Various parameter choices of a mean-field model reveal the system to contain five different physical regimes defined by two transition temperatures, a triple point, and a quadruple point. Finally, we conclude by discussing how the general analysis can be extended to state spaces with more complex combinatorial properties and to other standard questions of statistical mechanics models.

  8. Estimating Preferential Flow in Karstic Aquifers Using Statistical Mixed Models

    PubMed Central

    Anaya, Angel A.; Padilla, Ingrid; Macchiavelli, Raul; Vesper, Dorothy J.; Meeker, John D.; Alshawabkeh, Akram N.

    2013-01-01

    Karst aquifers are highly productive groundwater systems often associated with conduit flow. These systems can be highly vulnerable to contamination, resulting in a high potential for contaminant exposure to humans and ecosystems. This work develops statistical models to spatially characterize flow and transport patterns in karstified limestone and determines the effect of aquifer flow rates on these patterns. A laboratory-scale Geo-HydroBed model is used to simulate flow and transport processes in a karstic limestone unit. The model consists of stainless-steel tanks containing a karstified limestone block collected from a karst aquifer formation in northern Puerto Rico. Experimental work involves a series of flow and tracer injections, with hydraulic and tracer responses monitored spatially and temporally. Statistical mixed models are applied to the hydraulic data to determine likely pathways of preferential flow in the limestone units. The models indicate a highly heterogeneous system with dominant, flow-dependent preferential flow regions. Results indicate that regions of preferential flow tend to expand at higher groundwater flow rates, suggesting that a greater volume of the system is flushed by flowing water at higher rates. The spatial and temporal distribution of tracer concentrations indicates the presence of both conduit-like and diffuse transport in the system, supporting the notion that the two transport mechanisms act in combination in the limestone unit. The temporal response of tracer concentrations at different locations in the model coincides with, and confirms, the preferential flow distribution generated with the statistical mixed models used in the study. PMID:23802921

  9. Uniting statistical and individual-based approaches for animal movement modelling.

    PubMed

    Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel

    2014-01-01

    The dynamic nature of their internal states and the environment directly shape animals' spatial behaviours and give rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, due to the practical impossibility of accessing internal states through field work and the inability of current statistical models to produce dynamic outputs. To address these issues, we developed a robust method that combines statistical and individual-based modelling. Using a statistical technique for forward modelling of the IBM has the advantage of being faster to parameterize than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, we modelled caribou movements based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation, and was tested for two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provide projections of future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems.

  10. Uniting Statistical and Individual-Based Approaches for Animal Movement Modelling

    PubMed Central

    Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel

    2014-01-01

    The dynamic nature of their internal states and the environment directly shape animals' spatial behaviours and give rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, due to the practical impossibility of accessing internal states through field work and the inability of current statistical models to produce dynamic outputs. To address these issues, we developed a robust method that combines statistical and individual-based modelling. Using a statistical technique for forward modelling of the IBM has the advantage of being faster to parameterize than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, we modelled caribou movements based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation, and was tested for two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provide projections of future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems. PMID:24979047

  11. Emergent kink statistics at finite temperature

    DOE PAGES

    Lopez-Ruiz, Miguel Angel; Yepez-Martinez, Tochtli; Szczepaniak, Adam; ...

    2017-07-25

    In this paper we use 1D quantum mechanical systems with a Higgs-like interaction potential to study the emergence of topological objects at finite temperature. Two different model systems are studied: the standard double-well potential model and a newly introduced discrete kink model. Using Monte Carlo simulations as well as analytic methods, we demonstrate how kinks become abundant at low temperatures. These results may offer useful insights into how topological phenomena occur in QCD.
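    A minimal Monte Carlo sketch of kink counting, assuming a 1D lattice field with the standard double-well on-site potential V(x) = (x^2 - 1)^2 and an illustrative nearest-neighbour coupling (none of the parameters are from the paper); a kink is counted as a sign change between neighbouring sites.

```python
import math
import random

def kink_density(n=200, T=0.2, steps=40000, seed=5):
    # Metropolis sampling of a periodic 1D lattice field; each site sits in a
    # double well V(x) = (x^2 - 1)^2 coupled to its neighbours.
    rng = random.Random(seed)
    x = [rng.choice([-1.0, 1.0]) for _ in range(n)]
    def energy_at(i, v):
        # On-site double-well term plus harmonic coupling to both neighbours.
        return (v * v - 1.0) ** 2 + 0.5 * ((v - x[i - 1]) ** 2
                                           + (v - x[(i + 1) % n]) ** 2)
    for _ in range(steps):
        i = rng.randrange(n)
        new = x[i] + rng.uniform(-0.5, 0.5)
        dE = energy_at(i, new) - energy_at(i, x[i])
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            x[i] = new
    # A kink is a sign change between neighbouring sites.
    kinks = sum(1 for i in range(n) if x[i] * x[(i + 1) % n] < 0)
    return kinks / n

density = kink_density()
```

    Starting from a random configuration (density about 0.5), low-temperature dynamics coarsens the domains and drives the kink density down, qualitatively matching the paper's observation that kinks are sparse but present at low T.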

  12. Universal Algorithm for Identification of Fractional Brownian Motion. A Case of Telomere Subdiffusion

    PubMed Central

    Burnecki, Krzysztof; Kepten, Eldad; Janczura, Joanna; Bronshtein, Irena; Garini, Yuval; Weron, Aleksander

    2012-01-01

    We present a systematic statistical analysis of the recently measured individual trajectories of fluorescently labeled telomeres in the nucleus of living human cells. The experiments were performed in the U2OS cancer cell line. We propose an algorithm for identification of the telomere motion. By expanding the previously published data set, we are able to explore the dynamics over six orders of magnitude in time, a task not possible earlier. As a result, we establish a rigorous mathematical characterization of the stochastic process and identify the basic mathematical mechanisms behind the telomere motion. We find that the increments of the motion are stationary, Gaussian, ergodic, and even mixing, a stronger chaotic property. Moreover, the obtained memory parameter estimates, as well as the ensemble-average mean square displacement, reveal subdiffusive behavior at all time spans. All these findings statistically establish fractional Brownian motion for the telomere trajectories, which is confirmed by a generalized p-variation test. Taking into account the biophysical nature of telomeres as monomers in the chromatin chain, we suggest polymer dynamics as a sufficient framework for their motion, with no need to invoke other models. In addition, these results shed light on other studies of telomere motion and the alternative telomere-lengthening mechanism. We hope that identification of these mechanisms will allow the development of a proper physical and biological model for telomere subdynamics. This array of tests can easily be applied to other data sets to enable quick and accurate analysis of their statistical characteristics. PMID:23199912
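    The ensemble-averaged mean square displacement test used above can be sketched as follows. Since the telomere data are not available here, ordinary Brownian motion (Hurst exponent H = 0.5, hence MSD exponent alpha = 2H = 1) stands in for the trajectories; subdiffusion would give alpha < 1.

```python
import math
import random

def msd_exponent(trajectories, lags):
    # Ensemble-averaged mean square displacement at each lag, then a
    # log-log least-squares slope: MSD(t) ~ t**alpha (alpha = 2H for fBm).
    msd = []
    for lag in lags:
        disp2 = [(x[lag] - x[0]) ** 2 for x in trajectories]
        msd.append(sum(disp2) / len(disp2))
    lx = [math.log(l) for l in lags]
    ly = [math.log(m) for m in msd]
    n = len(lags)
    mx, my = sum(lx) / n, sum(ly) / n
    return (sum((a - mx) * (b - my) for a, b in zip(lx, ly))
            / sum((a - mx) ** 2 for a in lx))

random.seed(1)
# Ordinary Brownian motion: cumulative sums of unit-variance Gaussian steps.
walks = [[0.0] for _ in range(500)]
for w in walks:
    for _ in range(64):
        w.append(w[-1] + random.gauss(0, 1))

alpha = msd_exponent(walks, [1, 2, 4, 8, 16, 32])
# alpha should be close to 1 for normal diffusion.
```

    Replacing the synthetic walks with measured trajectories and checking alpha over several decades of lag is the spirit of the subdiffusion analysis described in the record.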

  13. The visual system’s internal model of the world

    PubMed Central

    Lee, Tai Sing

    2015-01-01

    The Bayesian paradigm has provided a useful conceptual theory for understanding perceptual computation in the brain. While the detailed neural mechanisms of Bayesian inference are not fully understood, recent computational and neurophysiological work has illuminated the underlying computational principles and representational architecture. The fundamental insights are that the visual system is organized as a modular hierarchy to encode an internal model of the world, and that perception is realized by statistical inference based on such an internal model. In this paper, I will discuss and analyze the varieties of representational schemes of these internal models and how they might be used to perform learning and inference. I will argue for a unified theoretical framework for relating the internal models to the observed neural phenomena and mechanisms in the visual cortex. PMID:26566294

  14. Estimation of trends

    NASA Technical Reports Server (NTRS)

    1981-01-01

    This report discusses the application of statistical methods to recorded ozone measurements. A long-term depletion of ozone at the magnitudes predicted by the NAS would be harmful to most forms of life. Empirical prewhitening filters, whose derivation is independent of the underlying physical mechanisms, were analyzed. Statistical analysis provides a checks-and-balances function: time series filtering separates variations into systematic and random parts, errors are left uncorrelated, and significant phase-lag dependencies are identified. The use of time series modeling to enhance the capability of detecting trends is discussed.
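    A minimal sketch of empirical prewhitening, assuming a simple AR(1) filter fitted to the series itself (the record does not specify the filter; this choice is illustrative): estimate the lag-1 autocorrelation phi and form residuals x[t] - phi * x[t-1], which are approximately uncorrelated.

```python
import random

def prewhiten(series):
    # Fit the lag-1 autocorrelation empirically (no physical mechanism
    # assumed) and remove it: residual[t] = x[t] - phi * x[t-1].
    mean = sum(series) / len(series)
    x = [v - mean for v in series]
    num = sum(a * b for a, b in zip(x[1:], x[:-1]))
    den = sum(v * v for v in x)
    phi = num / den
    return phi, [x[t] - phi * x[t - 1] for t in range(1, len(x))]

random.seed(0)
# Synthetic AR(1) series with phi = 0.8 standing in for autocorrelated
# ozone measurements.
x = [0.0]
for _ in range(2000):
    x.append(0.8 * x[-1] + random.gauss(0, 1))

phi, resid = prewhiten(x)
# phi should recover a value near the true 0.8.
```

    Trend tests applied to the whitened residuals, rather than the raw autocorrelated series, have more honest significance levels, which is the point of prewhitening before trend detection.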

  15. SurfKin: an ab initio kinetic code for modeling surface reactions.

    PubMed

    Le, Thong Nguyen-Minh; Liu, Bin; Huynh, Lam K

    2014-10-05

    In this article, we describe a C/C++ program called SurfKin (Surface Kinetics) to construct microkinetic mechanisms for modeling gas-surface reactions. Thermodynamic properties of reaction species are estimated based on density functional theory calculations and statistical mechanics. Rate constants for elementary steps (including adsorption, desorption, and chemical reactions on surfaces) are calculated using the classical collision theory and transition state theory. Methane decomposition and water-gas shift reaction on Ni(111) surface were chosen as test cases to validate the code implementations. The good agreement with literature data suggests this is a powerful tool to facilitate the analysis of complex reactions on surfaces, and thus it helps to effectively construct detailed microkinetic mechanisms for such surface reactions. SurfKin also opens a possibility for designing nanoscale model catalysts. Copyright © 2014 Wiley Periodicals, Inc.
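    The transition-state-theory rate constants mentioned above have the familiar Eyring form k = (kB*T/h) * exp(-Ea/(R*T)) when the transmission coefficient is taken as 1. A minimal numeric sketch (with an illustrative barrier, not a value from SurfKin):

```python
import math

KB = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34  # Planck constant, J*s
R = 8.314462618     # gas constant, J/(mol*K)

def tst_rate(barrier_kj_mol, temperature):
    # Transition-state-theory estimate with unit transmission coefficient:
    # k = (kB*T/h) * exp(-Ea / (R*T)); Ea given in kJ/mol.
    return KB * temperature / H * math.exp(
        -barrier_kj_mol * 1e3 / (R * temperature))

# Illustrative 90 kJ/mol barrier at 800 K, typical of surface reaction steps.
k = tst_rate(90.0, 800.0)
```

    The strong temperature dependence of such rate constants is why microkinetic mechanisms built this way must be evaluated at the reaction conditions of interest.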

  16. Cumulative (Dis)Advantage and the Matthew Effect in Life-Course Analysis

    PubMed Central

    Bask, Miia; Bask, Mikael

    2015-01-01

    To foster a deeper understanding of the mechanisms behind inequality in society, it is crucial to work with well-defined concepts associated with such mechanisms. The aim of this paper is to define cumulative (dis)advantage and the Matthew effect. We argue that cumulative (dis)advantage is an intra-individual micro-level phenomenon, that the Matthew effect is an inter-individual macro-level phenomenon and that an appropriate measure of the Matthew effect focuses on the mechanism or dynamic process that generates inequality. The Matthew mechanism is, therefore, a better name for the phenomenon, where we provide a novel measure of the mechanism, including a proof-of-principle analysis using disposable personal income data. Finally, because socio-economic theory should be able to explain cumulative (dis)advantage and the Matthew mechanism when they are detected in data, we discuss the types of models that may explain the phenomena. We argue that interactions-based models in the literature traditions of analytical sociology and statistical mechanics serve this purpose. PMID:26606386

  17. Evaluation of Chest Injury Mechanisms in Nearside Oblique Frontal Impacts

    PubMed Central

    Iraeus, Johan; Lindquist, Mats; Wistrand, Sofie; Sibgård, Elin; Pipkorn, Bengt

    2013-01-01

    Despite the use of seat belts and modern safety systems, many automobile occupants are still seriously injured or killed in car crashes. Common configurations in these crashes are oblique and small overlap frontal impacts that often lead to chest injuries. To evaluate the injury mechanism in these oblique impacts, an investigation was carried out using mathematical human body model simulations. A model of a simplified vehicle interior was developed and validated by means of mechanical sled tests with the Hybrid III dummy. The interior model was then combined with the human body model THUMS and validated by means of mechanical PMHS sled tests. Occupant kinematics as well as rib fracture patterns were predicted with reasonable accuracy. The final model was updated to conform to modern cars and a simulation matrix was run. In this matrix the boundary conditions, ΔV and PDOF, were varied and rib fracture risk as a function of the boundary conditions was evaluated using a statistical framework. In oblique frontal impacts, two injury producing mechanisms were found; (i) diagonal belt load and (ii) side structure impact. The second injury mechanism was found for PDOFs of 25°–35°, depending on ΔV. This means that for larger PDOFs, less ΔV is needed to cause a serious chest injury. PMID:24406957

  18. Capillarity theory for the fly-casting mechanism

    PubMed Central

    Trizac, Emmanuel; Levy, Yaakov; Wolynes, Peter G.

    2010-01-01

    Biomolecular folding and function are often coupled. During molecular recognition events, one of the binding partners may transiently or partially unfold, allowing more rapid access to a binding site. We describe a simple model for this fly-casting mechanism based on the capillarity approximation and polymer chain statistics. The model shows that fly casting is most effective when the protein unfolding barrier is small and the part of the chain which extends toward the target is relatively rigid. These features are often seen in known examples of fly casting in protein–DNA binding. Simulations of protein–DNA binding based on well-funneled native-topology models with electrostatic forces confirm the trends of the analytical theory. PMID:20133683

  19. A detailed heterogeneous agent model for a single asset financial market with trading via an order book.

    PubMed

    Mota Navarro, Roberto; Larralde, Hernán

    2017-01-01

    We present an agent-based model of a single-asset financial market that is capable of replicating most of the non-trivial statistical properties observed in real financial markets, generically referred to as stylized facts. In our model, agents employ strategies inspired by those used in real markets, and a realistic trade mechanism based on a double-auction order book. We study the role of the distinct types of traders on the return statistics: specifically, correlation properties (or lack thereof), volatility clustering, heavy tails, and the degree to which the distribution can be described by a log-normal. Further, by introducing the practice of "profit taking", our model is also capable of replicating the stylized fact related to an asymmetry in the distribution of losses and gains.
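    A minimal sketch of the double-auction order-book mechanism, assuming unit-size orders and price-only priority (a deliberate simplification of the trading mechanism the model uses): a crossing order executes against the best resting quote, otherwise it rests in the book.

```python
import heapq

class OrderBook:
    # Minimal continuous double auction: best bid is the highest resting
    # buy price, best ask the lowest resting sell price.
    def __init__(self):
        self.bids = []  # max-heap via negated prices
        self.asks = []  # min-heap

    def submit(self, side, price):
        # Each order is for one unit; a crossing order trades at the
        # resting price, otherwise it is added to the book.
        if side == "buy":
            if self.asks and self.asks[0] <= price:
                return heapq.heappop(self.asks)   # trade at the ask
            heapq.heappush(self.bids, -price)
        else:
            if self.bids and -self.bids[0] >= price:
                return -heapq.heappop(self.bids)  # trade at the bid
            heapq.heappush(self.asks, price)
        return None

book = OrderBook()
book.submit("sell", 101.0)
book.submit("sell", 100.5)
print(book.submit("buy", 101.0))  # → 100.5
```

    Return series in such models come from the sequence of executed trade prices; the heterogeneous agent strategies determine which orders arrive, and the book mechanism above turns them into prices.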

  20. A canonical neural mechanism for behavioral variability

    PubMed Central

    Darshan, Ran; Wood, William E.; Peters, Susan; Leblois, Arthur; Hansel, David

    2017-01-01

    The ability to generate variable movements is essential for learning and adjusting complex behaviours. This variability has been linked to the temporal irregularity of neuronal activity in the central nervous system. However, how neuronal irregularity actually translates into behavioural variability is unclear. Here we combine modelling, electrophysiological and behavioural studies to address this issue. We demonstrate that a model circuit comprising topographically organized and strongly recurrent neural networks can autonomously generate irregular motor behaviours. Simultaneous recordings of neurons in singing finches reveal that neural correlations increase across the circuit driving song variability, in agreement with the model predictions. Analysing behavioural data, we find remarkable similarities in the babbling statistics of 5–6-month-old human infants and juveniles from three songbird species and show that our model naturally accounts for these ‘universal' statistics. PMID:28530225

  1. A statistic-thermodynamic model for the DOM degradation in the estuary

    NASA Astrophysics Data System (ADS)

    Zheng, Quanan; Chen, Qin; Zhao, Haihong; Shi, Jiuxin; Cao, Yong; Wang, Dan

    2008-03-01

    This study aims to clarify the role that dissolved salts play in the degradation of terrestrial dissolved organic matter (DOM) at the scale of molecular movement. Molecular thermal movement is perpetual, and in a multi-molecular system this random motion causes collisions between molecules. Seawater is a multi-molecular system consisting of water, salt, and terrestrial DOM molecules. This study attributes DOM degradation in the estuary to inelastic collisions of DOM molecules with charged salt ions. From statistical-thermodynamic theories of molecular collision, a DOM degradation model and a DOM distribution model are derived. The models are validated against field observations and satellite data. We conclude that inelastic collision between terrestrial DOM molecules and dissolved salt ions in seawater is a decisive dynamic mechanism for the rapid loss of terrestrial DOM.

  2. A detailed heterogeneous agent model for a single asset financial market with trading via an order book

    PubMed Central

    2017-01-01

    We present an agent-based model of a single-asset financial market that is capable of replicating most of the non-trivial statistical properties observed in real financial markets, generically referred to as stylized facts. In our model, agents employ strategies inspired by those used in real markets, and a realistic trade mechanism based on a double-auction order book. We study the role of the distinct types of traders on the return statistics: specifically, correlation properties (or lack thereof), volatility clustering, heavy tails, and the degree to which the distribution can be described by a log-normal. Further, by introducing the practice of “profit taking”, our model is also capable of replicating the stylized fact related to an asymmetry in the distribution of losses and gains. PMID:28245251

  3. The statistical geometry of transcriptome divergence in cell-type evolution and cancer.

    PubMed

    Liang, Cong; Forrest, Alistair R R; Wagner, Günter P

    2015-01-14

    In evolution, body plan complexity increases due to an increase in the number of individualized cell types. Yet, there is very little understanding of the mechanisms that produce this form of organismal complexity. One model for the origin of novel cell types is the sister cell-type model. According to this model, each cell type arises together with a sister cell type through specialization from an ancestral cell type. A key prediction of the sister cell-type model is that gene expression profiles of cell types exhibit tree structure. Here we present a statistical model for detecting tree structure in transcriptomic data and apply it to transcriptomes from ENCODE and FANTOM5. We show that transcriptomes of normal cells harbour substantial amounts of hierarchical structure. In contrast, cancer cell lines have less tree structure, suggesting that the emergence of cancer cells follows different principles from that of evolutionary cell-type origination.

  4. Multiscale volatility duration characteristics on financial multi-continuum percolation dynamics

    NASA Astrophysics Data System (ADS)

    Wang, Min; Wang, Jun

    A random stock price model based on the multi-continuum percolation system is developed to investigate the nonlinear dynamics of stock price volatility duration, in an attempt to explain various statistical facts found in financial data and to gain a deeper understanding of mechanisms in the financial market. The continuum percolation system, usually referred to as a random coverage process or a Boolean model, is a member of a class of statistical physics systems. In this paper, multi-continuum percolation (with different values of the radius) is employed to model and reproduce the dispersal of information among investors. To test the validity of the proposed model, nonlinear analyses of return volatility duration series are performed by multifractal detrending moving average analysis and Zipf analysis. The empirical comparison indicates similar nonlinear behaviors for the proposed model and the actual Chinese stock market.

  5. The role of selection on evolutionary rescue

    NASA Astrophysics Data System (ADS)

    Amirjanov, Adil

    The paper investigates the role of selection in the evolutionary rescue of a population. A statistical mechanics technique is used to model the dynamics of a population experiencing natural selection and an abrupt change in the environment. The paper assesses the selective pressure produced by two different mechanisms: the strength of resistance and the strength of selection (intraspecific competition). It is shown that both mechanisms are capable of producing an evolutionary rescue of the population under particular conditions. However, for a small extinction rate, the population cannot be rescued without intraspecific competition.

  6. Statistical mechanical model of coupled transcription from multiple promoters due to transcription factor titration

    PubMed Central

    Rydenfelt, Mattias; Cox, Robert Sidney; Garcia, Hernan; Phillips, Rob

    2014-01-01

    Transcription factors (TFs) with regulatory action at multiple promoter targets are the rule rather than the exception, with examples ranging from the cAMP receptor protein (CRP) in E. coli, which regulates hundreds of different genes simultaneously, to situations involving multiple copies of the same gene, such as plasmids, retrotransposons, or highly replicated viral DNA. When the number of TFs greatly exceeds the number of binding sites, TF binding to each promoter can be regarded as independent. However, when the number of TF molecules is comparable to the number of binding sites, TF titration will result in correlation (“promoter entanglement”) between the transcription of different genes. We develop a statistical mechanical model which takes the TF titration effect into account and use it to predict both the level of gene expression for a general set of promoters and the resulting correlation in transcription rates of different genes. Our results show that the TF titration effect could be important for understanding gene expression in many regulatory settings. PMID:24580252
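    The titration effect can be sketched with a thermodynamic toy calculation: a finite pool of TF molecules shared among n identical sites with dissociation constant Kd. All quantities here are illustrative, and the paper's model is a full statistical mechanical treatment, not this two-parameter reduction.

```python
import math

def occupancy(total_tf, n_sites, kd):
    # Mean occupancy of each of n identical sites when a finite TF pool is
    # shared ("titrated") among them. The free concentration f solves
    # f = total - n * f / (f + kd), a quadratic in f.
    b = kd + n_sites - total_tf
    free = (-b + math.sqrt(b * b + 4.0 * kd * total_tf)) / 2.0
    return free / (free + kd)

# With far more TF than sites, binding is independent: occupancy -> T/(T+Kd).
independent_limit = 100.0 / (100.0 + 1.0)
```

    Increasing the number of competing sites at fixed TF pool lowers each site's occupancy, which is the coupling ("promoter entanglement") between promoters that the model quantifies.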

  7. cgDNAweb: a web interface to the cgDNA sequence-dependent coarse-grain model of double-stranded DNA.

    PubMed

    De Bruin, Lennart; Maddocks, John H

    2018-06-14

    The sequence-dependent statistical mechanical properties of fragments of double-stranded DNA are believed to be pertinent to its biological function at length scales from a few base pairs (bp) to a few hundred bp, e.g. indirect-readout protein binding sites, nucleosome positioning sequences, phased A-tracts, etc. In turn, the equilibrium statistical mechanics behaviour of DNA depends upon its ground-state configuration, or minimum free energy shape, as well as on its fluctuations as governed by its stiffness (in an appropriate sense). We here present cgDNAweb, which provides browser-based interactive visualization of the sequence-dependent ground states of double-stranded DNA molecules, as predicted by the underlying cgDNA coarse-grain rigid-base model for fragments of arbitrary sequence. The cgDNAweb interface is specifically designed to facilitate comparison between the ground-state shapes of different sequences. The server is freely available at cgDNAweb.epfl.ch with no login requirement.

  8. The impact of loss to follow-up on hypothesis tests of the treatment effect for several statistical methods in substance abuse clinical trials.

    PubMed

    Hedden, Sarra L; Woolson, Robert F; Carter, Rickey E; Palesch, Yuko; Upadhyaya, Himanshu P; Malcolm, Robert J

    2009-07-01

    "Loss to follow-up" can be substantial in substance abuse clinical trials. When extensive losses to follow-up occur, one must cautiously analyze and interpret the findings of a research study. Aims of this project were to introduce the types of missing data mechanisms and describe several methods for analyzing data with loss to follow-up. Furthermore, a simulation study compared Type I error and power of several methods when missing data amount and mechanism varies. Methods compared were the following: Last observation carried forward (LOCF), multiple imputation (MI), modified stratified summary statistics (SSS), and mixed effects models. Results demonstrated nominal Type I error for all methods; power was high for all methods except LOCF. Mixed effect model, modified SSS, and MI are generally recommended for use; however, many methods require that the data are missing at random or missing completely at random (i.e., "ignorable"). If the missing data are presumed to be nonignorable, a sensitivity analysis is recommended.

  9. An argument for mechanism-based statistical inference in cancer

    PubMed Central

    Ochs, Michael; Price, Nathan D.; Tomasetti, Cristian; Younes, Laurent

    2015-01-01

    Cancer is perhaps the prototypical systems disease, and as such has been the focus of extensive study in quantitative systems biology. However, translating these programs into personalized clinical care remains elusive and incomplete. In this perspective, we argue that realizing this agenda—in particular, predicting disease phenotypes, progression and treatment response for individuals—requires going well beyond standard computational and bioinformatics tools and algorithms. It entails designing global mathematical models over network-scale configurations of genomic states and molecular concentrations, and learning the model parameters from limited available samples of high-dimensional and integrative omics data. As such, any plausible design should accommodate: biological mechanism, necessary for both feasible learning and interpretable decision making; stochasticity, to deal with uncertainty and observed variation at many scales; and a capacity for statistical inference at the patient level. This program, which requires a close, sustained collaboration between mathematicians and biologists, is illustrated in several contexts, including learning bio-markers, metabolism, cell signaling, network inference and tumorigenesis. PMID:25381197

  10. Genetic algorithm dynamics on a rugged landscape

    NASA Astrophysics Data System (ADS)

    Bornholdt, Stefan

    1998-04-01

    The genetic algorithm is an optimization procedure motivated by biological evolution and is successfully applied to optimization problems in different areas. A statistical mechanics model for its dynamics is proposed based on the parent-child fitness correlation of the genetic operators, making it applicable to general fitness landscapes. It is compared to a recent model based on a maximum entropy ansatz. Finally it is applied to modeling the dynamics of a genetic algorithm on the rugged fitness landscape of the NK model.
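    A minimal genetic-algorithm sketch with fitness-proportionate selection, one-point crossover, and bitwise mutation. The smooth "one-max" landscape (fitness = number of 1-bits) used here is an illustrative stand-in for the rugged NK landscape studied in the paper; all parameter values are arbitrary.

```python
import random

def evolve(pop_size=40, length=24, generations=60, mut_rate=0.02, seed=3):
    # Minimal genetic algorithm on the one-max landscape.
    rng = random.Random(seed)
    fitness = lambda g: sum(g)
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Fitness-proportionate selection of parents.
        parents = rng.choices(pop, weights=[1 + fitness(g) for g in pop],
                              k=pop_size)
        nxt = []
        for i in range(0, pop_size, 2):
            cut = rng.randrange(1, length)  # one-point crossover
            a, b = parents[i], parents[i + 1]
            for child in (a[:cut] + b[cut:], b[:cut] + a[cut:]):
                # Bitwise mutation with probability mut_rate per bit.
                nxt.append([bit ^ (rng.random() < mut_rate) for bit in child])
        pop = nxt
    return max(fitness(g) for g in pop)

best = evolve()
# best should approach the optimum of 24 after 60 generations.
```

    The parent-child fitness correlation emphasized in the paper enters through the crossover and mutation operators above; on a rugged NK landscape the same operators yield much weaker correlation, which is exactly what the proposed statistical mechanics model tracks.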

  11. A statistical model for investigating binding probabilities of DNA nucleotide sequences using microarrays.

    PubMed

    Lee, Mei-Ling Ting; Bulyk, Martha L; Whitmore, G A; Church, George M

    2002-12-01

    There is considerable scientific interest in knowing the probability that a site-specific transcription factor will bind to a given DNA sequence. Microarray methods provide an effective means for assessing the binding affinities of a large number of DNA sequences as demonstrated by Bulyk et al. (2001, Proceedings of the National Academy of Sciences, USA 98, 7158-7163) in their study of the DNA-binding specificities of Zif268 zinc fingers using microarray technology. In a follow-up investigation, Bulyk, Johnson, and Church (2002, Nucleic Acid Research 30, 1255-1261) studied the interdependence of nucleotides on the binding affinities of transcription proteins. Our article is motivated by this pair of studies. We present a general statistical methodology for analyzing microarray intensity measurements reflecting DNA-protein interactions. The log probability of a protein binding to a DNA sequence on an array is modeled using a linear ANOVA model. This model is convenient because it employs familiar statistical concepts and procedures and also because it is effective for investigating the probability structure of the binding mechanism.
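    The additive (main-effects) structure of a linear ANOVA model for log binding probability can be sketched as follows: log p is an intercept plus one effect per sequence position and nucleotide, with no interaction terms. The effect values below are hypothetical, not estimates from the study.

```python
# Hypothetical position-specific effects on log binding probability for a
# 3-bp site (an additive, main-effects-only ANOVA-style model; the numbers
# are illustrative, not fitted values).
effects = [
    {"A": 0.0, "C": -1.2, "G": -0.4, "T": -2.0},
    {"A": -1.5, "C": 0.0, "G": -0.9, "T": -1.1},
    {"A": -0.3, "C": -2.2, "G": 0.0, "T": -1.7},
]
intercept = -0.5

def log_binding_prob(seq):
    # Main effects only: log p = intercept + sum of per-position terms.
    return intercept + sum(effects[i][b] for i, b in enumerate(seq))

# The highest-affinity sequence maximizes each position's effect.
best = max(("".join((a, b, c))
            for a in "ACGT" for b in "ACGT" for c in "ACGT"),
           key=log_binding_prob)
print(best)  # → ACG
```

    Testing nucleotide interdependence, as in the follow-up study cited, amounts to adding interaction terms between positions to this model and checking whether they improve the fit.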

  12. Optimal region of latching activity in an adaptive Potts model for networks of neurons

    NASA Astrophysics Data System (ADS)

    Abdollah-nia, Mohammad-Farshad; Saeedghalati, Mohammadkarim; Abbassian, Abdolhossein

    2012-02-01

    In statistical mechanics, the Potts model is a model for interacting spins with more than two discrete states. Neural networks which exhibit features of learning and associative memory can also be modeled by a system of Potts spins. A spontaneous behavior of hopping from one discrete attractor state to another (referred to as latching) has been proposed to be associated with higher cognitive functions. Here we propose a model in which both the stochastic dynamics of Potts models and an adaptive potential function are present. A latching dynamics is observed in a limited region of the noise (temperature) versus adaptation parameter space. We hence suggest noise as a fundamental factor in such alternations, alongside adaptation. From a dynamical systems point of view, the noise-adaptation alternations may be the underlying mechanism for multi-stability in attractor-based models. An optimality criterion for realistic models is finally inferred.
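    A minimal Metropolis sketch of the stochastic Potts dynamics underlying such models, for a 1D ferromagnetic q-state chain (the adaptive potential is omitted, and all parameters are illustrative):

```python
import math
import random

def metropolis_potts(n=30, q=5, J=1.0, T=0.5, steps=20000, seed=7):
    # Metropolis dynamics for a periodic 1D q-state Potts chain with
    # H = -J * sum over bonds of delta(s_i, s_j).
    rng = random.Random(seed)
    s = [rng.randrange(q) for _ in range(n)]
    def local_e(i, v):
        # Energy of site i holding state v, given its two neighbours.
        return -J * ((v == s[(i - 1) % n]) + (v == s[(i + 1) % n]))
    for _ in range(steps):
        i, new = rng.randrange(n), rng.randrange(q)
        dE = local_e(i, new) - local_e(i, s[i])
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            s[i] = new
    return s

state = metropolis_potts()
# At low temperature the chain orders into large aligned domains.
```

    Adding an adaptive potential that slowly penalizes the currently occupied state is what destabilizes each attractor and produces latching; at fixed potential, as here, the noise alone only sets how ordered the configuration is.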

  13. Statistical mechanics of neocortical interactions: Constraints on 40-Hz models of short-term memory

    NASA Astrophysics Data System (ADS)

    Ingber, Lester

    1995-10-01

    Calculations presented in L. Ingber and P.L. Nunez, Phys. Rev. E 51, 5074 (1995) detailed the evolution of short-term memory in the neocortex, supporting the empirical 7±2 rule of constraints on the capacity of neocortical processing. These results are given further support when other recent models of 40-Hz subcycles of low-frequency oscillations are considered.

  14. BOOK REVIEW: Statistical Mechanics of Turbulent Flows

    NASA Astrophysics Data System (ADS)

    Cambon, C.

    2004-10-01

    This is a handbook for a computational approach to reacting flows, including background material on statistical mechanics. In this sense, the title is somewhat misleading with respect to other books dedicated to the statistical theory of turbulence (e.g. Monin and Yaglom). In the present book, emphasis is placed on modelling (engineering closures) for computational fluid dynamics. The probabilistic (pdf) approach is applied to the local scalar field, motivated first by the nonlinearity of chemical source terms which appear in the transport equations of reacting species. The probabilistic and stochastic approaches are also used for the velocity field and particle position; nevertheless they are essentially limited to Lagrangian models for a local vector, with only single-point statistics, as for the scalar. Accordingly, conventional techniques, such as single-point closures for RANS (Reynolds-averaged Navier-Stokes) and subgrid-scale models for LES (large-eddy simulations), are described and in some cases reformulated using underlying Langevin models and filtered pdfs. Even if the theoretical approach to turbulence is not discussed in general, the essentials of probabilistic and stochastic-processes methods are described, with a useful reminder concerning statistics at the molecular level. The book comprises 7 chapters. Chapter 1 briefly states the goals and contents, with a very clear synoptic scheme on page 2. Chapter 2 presents definitions and examples of pdfs and related statistical moments. Chapter 3 deals with stochastic processes, pdf transport equations, from Kramers-Moyal to Fokker-Planck (for Markov processes), and moment equations. Stochastic differential equations are introduced and their relationship to pdfs described. This chapter ends with a discussion of stochastic modelling. The equations of fluid mechanics and thermodynamics are addressed in chapter 4. 
Classical conservation equations (mass, velocity, internal energy) are derived from their counterparts at the molecular level. In addition, equations are given for multicomponent reacting systems. The chapter ends with miscellaneous topics, including DNS, (the idea of) the energy cascade, and RANS. Chapter 5 is devoted to stochastic models for the large scales of turbulence. Langevin-type models for velocity (and particle position) are presented, and their various consequences for second-order single-point correlations (Reynolds stress components, Kolmogorov constant) are discussed. These models are then presented for the scalar. The chapter ends with compressible high-speed flows and various models, ranging from k-epsilon to hybrid RANS-pdf. Stochastic models for small-scale turbulence are addressed in chapter 6. These models are based on the concept of a filter density function (FDF) for the scalar, and a more conventional SGS (sub-grid-scale) model for the velocity in LES. The final chapter, chapter 7, is entitled `The unification of turbulence models' and aims at reconciling large-scale and small-scale modelling. This book offers a timely survey of techniques in modern computational fluid mechanics for turbulent flows with reacting scalars. It should be of interest to engineers, while the discussion of the underlying tools, namely pdfs, stochastic and statistical equations, should also be attractive to applied mathematicians and physicists. The book's emphasis on local pdfs and stochastic Langevin models gives it a consistent structure and allows the author to cover almost the whole spectrum of practical modelling in turbulent CFD. On the other hand, one might regret that non-local issues are not mentioned explicitly, or even briefly. These problems range from the presence of pressure-strain correlations in the Reynolds stress transport equations to the presence of two-point pdfs in the single-point pdf equation derived from the Navier-Stokes equations. 
(One may recall that, even without scalar transport, a general closure problem for turbulence statistics results from both non-linearity and non-locality of Navier-Stokes equations, the latter coming from, e.g., the nonlocal relationship of velocity and pressure in the quasi-incompressible case. These two aspects are often intricately linked. It is well known that non-linearity alone is not responsible for the `problem', as evidenced by 1D turbulence without pressure (`Burgulence' from the Burgers equation) and probably 3D (cosmological gas). A local description in terms of pdf for the velocity can resolve the `non-linear' problem, which instead yields an infinite hierarchy of equations in terms of moments. On the other hand, non-locality yields a hierarchy of unclosed equations, with the single-point pdf equation for velocity derived from NS incompressible equations involving a two-point pdf, and so on. The general relationship was given by Lundgren (1967, Phys. Fluids 10 (5), 969-975), with the equation for pdf at n points involving the pdf at n+1 points. The nonlocal problem appears in various statistical models which are not discussed in the book. The simplest example is full RST or ASM models, in which the closure of pressure-strain correlations is pivotal (their counterpart ought to be identified and discussed in equations (5-21) and the following ones). The book does not address more sophisticated non-local approaches, such as two-point (or spectral) non-linear closure theories and models, `rapid distortion theory' for linear regimes, not to mention scaling and intermittency based on two-point structure functions, etc. The book sometimes mixes theoretical modelling and pure empirical relationships, the empirical character coming from the lack of a nonlocal (two-point) approach.) 
In short, the book is orientated more towards applications than towards turbulence theory; it is written clearly and concisely and should be useful to a large community, interested either in the underlying stochastic formalism or in CFD applications.
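The Langevin models that give the book its structure can be illustrated with the simplest case, sketched here under standard assumptions (not taken from the book itself): an Ornstein-Uhlenbeck process for one velocity component, du = -(u/T_L) dt + sqrt(2 sigma^2 / T_L) dW, whose stationary pdf is Gaussian with variance sigma^2:

```python
import math
import random

# Euler-Maruyama integration of an Ornstein-Uhlenbeck velocity model.
# T_L plays the role of a Lagrangian timescale; parameters are arbitrary.
random.seed(2)
T_L, sigma, dt, n = 1.0, 1.0, 0.01, 200_000
u, samples = 0.0, []
for i in range(n):
    u += -(u / T_L) * dt + math.sqrt(2.0 * sigma**2 * dt / T_L) * random.gauss(0, 1)
    if i > n // 10:          # discard the initial transient
        samples.append(u)
var = sum(s * s for s in samples) / len(samples)   # should approach sigma^2
```

The sampled variance converging to sigma^2 is exactly the kind of single-point statistic these Lagrangian models are built to reproduce.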

  15. Occupation times and ergodicity breaking in biased continuous time random walks

    NASA Astrophysics Data System (ADS)

    Bel, Golan; Barkai, Eli

    2005-12-01

    Continuous time random walk (CTRW) models are widely used to model diffusion in condensed matter. There are two classes of such models, distinguished by the convergence or divergence of the mean waiting time. Systems with finite average sojourn time are ergodic, and thus Boltzmann-Gibbs statistics can be applied. We investigate the statistical properties of CTRW models with infinite average sojourn time; in particular, the occupation time probability density function is obtained. It is shown that in the non-ergodic phase the distribution of the occupation time of the particle on a given lattice point exhibits a bimodal U or trimodal W shape, related to the arcsine law. The key points are as follows. (a) In a CTRW with finite or infinite mean waiting time, the distribution of the number of visits on a lattice point is determined by the probability that a member of an ensemble of particles in equilibrium occupies the lattice point. (b) The asymmetry parameter of the probability distribution function of occupation times is related to the Boltzmann probability and to the partition function. (c) The ensemble average is given by Boltzmann-Gibbs statistics for either finite or infinite mean sojourn time, when detailed balance conditions hold. (d) A non-ergodic generalization of the Boltzmann-Gibbs statistical mechanics for systems with infinite mean sojourn time is found.
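The ergodicity breaking can be seen in a toy simulation (a sketch of the phenomenon, not the paper's derivation): a two-site CTRW with heavy-tailed waiting times psi(t) ~ t^(-1-alpha), alpha = 0.5 < 1, has a diverging mean sojourn time, and the per-trajectory occupation fraction of one site stays broadly spread (arcsine-like) instead of concentrating at the ensemble average 1/2:

```python
import random

# Two-site CTRW with Pareto-type waiting times of infinite mean (alpha < 1).
random.seed(3)
alpha, t_max = 0.5, 1e6

def waiting_time():
    # inverse-transform draw from psi(t) ~ t^(-1-alpha), t >= 1
    return (1.0 - random.random()) ** (-1.0 / alpha)

fractions = []
for _ in range(300):
    t, site, t_on_0 = 0.0, 0, 0.0
    while t < t_max:
        tau = min(waiting_time(), t_max - t)   # truncate at the horizon
        if site == 0:
            t_on_0 += tau
        t += tau
        site = 1 - site                        # alternate between the sites
    fractions.append(t_on_0 / t_max)

# For an ergodic walk all fractions would cluster near 1/2; here single
# trajectories get stuck, so the fractions spread across almost [0, 1].
spread = max(fractions) - min(fractions)
```

Repeating the run with a finite-mean waiting time (e.g. exponential) collapses the fractions onto 1/2, recovering the ergodic, Boltzmann-Gibbs behavior.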

  16. Mechanical characterization of structurally porous biomaterials built via additive manufacturing: experiments, predictive models, and design maps for load-bearing bone replacement implants.

    PubMed

    Melancon, D; Bagheri, Z S; Johnston, R B; Liu, L; Tanzer, M; Pasini, D

    2017-11-01

    Porous biomaterials can be additively manufactured with micro-architecture tailored to satisfy the stringent mechano-biological requirements imposed by bone replacement implants. In a previous investigation, we introduced structurally porous biomaterials, featuring strength five times that of commercially available porous materials, and confirmed their bone ingrowth capability in an in vivo canine model. While encouraging, the manufactured biomaterials showed geometric mismatches between their internal porous architecture and that of their as-designed counterparts, as well as discrepancies between predicted and tested mechanical properties, issues that were not fully elucidated. In this work, we propose a systematic approach integrating computed tomography, mechanical testing, and statistical analysis of geometric imperfections to generate statistics-based numerical models of high-strength additively manufactured porous biomaterials. The method is used to develop morphology and mechanical maps that illustrate the role played by pore size, porosity, strut thickness, and topology on the relations governing their elastic modulus and compressive yield strength. Overall, there are mismatches between the mechanical properties of ideal-geometry models and as-manufactured porous biomaterials, with average errors of 49% and 41% for compressive elastic modulus and yield strength, respectively. The proposed methodology gives more accurate predictions for the compressive stiffness and strength, reducing the average errors to 11% and 7.6%. The implications of the results and the methodology here introduced are discussed in the relevant biomechanical and clinical context, with insight that highlights promises and limitations of additively manufactured porous biomaterials for load-bearing bone replacement implants. 
In this work, we perform mechanical characterization of load-bearing porous biomaterials for bone replacement over their entire design space. Results capture the shift in geometry and mechanical properties between as-designed and as-manufactured biomaterials induced by additive manufacturing. Characterization of this shift is crucial to ensure appropriate manufacturing of bone replacement implants that enable biological fixation through bone ingrowth as well as mechanical property harmonization with the native bone tissue. In addition, we propose a method to include manufacturing imperfections in the numerical models that can reduce the discrepancy between predicted and tested properties. The results give insight into the use of structurally porous biomaterials for the design and additive fabrication of load-bearing implants for bone replacement. Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
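For orientation, the kind of property-versus-porosity relation such maps encode can be sketched with the classical Gibson-Ashby scaling laws for open-cell lattices. This is a hedged first-pass illustration, not the paper's statistics-based model; the solid (Ti-6Al-4V-like) properties and prefactors C_E, C_S below are assumed round numbers:

```python
# Gibson-Ashby scaling: stiffness ~ (relative density)^2,
# strength ~ (relative density)^1.5, times assumed solid properties.
E_SOLID = 110e9     # Pa, elastic modulus of the solid material (assumed)
S_SOLID = 900e6     # Pa, yield strength of the solid material (assumed)
C_E, C_S = 1.0, 0.3  # illustrative prefactors

def lattice_properties(relative_density):
    """Return (E, sigma_y) estimates for an open-cell lattice."""
    E = C_E * E_SOLID * relative_density ** 2.0
    sigma_y = C_S * S_SOLID * relative_density ** 1.5
    return E, sigma_y

E, sy = lattice_properties(0.3)   # a 30%-dense lattice
```

The paper's point is precisely that as-manufactured struts deviate from such ideal-geometry predictions, which is why the statistically characterized imperfections must be fed back into the numerical models.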

  17. Evolutionary model selection and parameter estimation for protein-protein interaction network based on differential evolution algorithm

    PubMed Central

    Huang, Lei; Liao, Li; Wu, Cathy H.

    2016-01-01

    Revealing the underlying evolutionary mechanism plays an important role in understanding protein interaction networks in the cell. While many evolutionary models have been proposed, applying these models to real network data, and in particular determining which model better describes the evolutionary process behind an observed network, remains a challenge. The traditional approach is to use a model with presumed parameters to generate a network and then evaluate the fit with summary statistics, which, however, cannot capture complete network-structure information or estimate parameter distributions. In this work we developed a novel method based on Approximate Bayesian Computation and modified Differential Evolution (ABC-DEP) that is capable of conducting model selection and parameter estimation simultaneously, and of detecting the underlying evolutionary mechanisms more accurately. We tested our method's power to differentiate models and estimate parameters on simulated data and found significant improvement in performance benchmarks compared with a previous method. We further applied our method to real protein interaction networks in human and yeast. Our results show the Duplication Attachment model as the predominant evolutionary mechanism for the human PPI network and the Scale-Free model as the predominant mechanism for the yeast PPI network. PMID:26357273
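The core ABC idea behind ABC-DEP can be shown with a toy rejection sampler (the simplest ABC variant, not the authors' modified differential-evolution scheme, and a scalar toy model rather than a network one): draw parameters from a prior, simulate, and keep draws whose summary statistic lands near the observed one:

```python
import random

# Toy ABC rejection sampling: estimate an attachment-like probability p
# from one observed count. The model and tolerance are invented for
# illustration only.
random.seed(4)
true_p, n_nodes = 0.3, 400
observed = sum(random.random() < true_p for _ in range(n_nodes))  # "data"

accepted = []
for _ in range(5000):
    p = random.random()                              # uniform prior on [0, 1]
    sim = sum(random.random() < p for _ in range(n_nodes))
    if abs(sim - observed) <= 5:                     # tolerance on the summary
        accepted.append(p)

post_mean = sum(accepted) / len(accepted)            # approximate posterior mean
```

The accepted draws approximate the posterior over p; ABC-DEP replaces the blind prior sampling with a differential-evolution population that concentrates proposals in high-acceptance regions, which is what makes simultaneous model selection and parameter estimation tractable on networks.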

  18. Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity

    PubMed Central

    Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny

    2015-01-01

    Recent experimental breakthroughs have finally allowed the implementation of in-vitro reaction kinetics (the so-called enzyme-based logic) which code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way to the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g. thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can serve this purpose: here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single- and double-ligand systems, with the purpose of exploring their practical capabilities to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing the resulting findings quantitatively on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences with classical cooperativity (and anti-cooperativity). PMID:25976626

  19. Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity.

    PubMed

    Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny

    2015-05-15

    Recent experimental breakthroughs have finally allowed the implementation of in-vitro reaction kinetics (the so-called enzyme-based logic) which code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way to the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g. thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can serve this purpose: here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single- and double-ligand systems, with the purpose of exploring their practical capabilities to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing the resulting findings quantitatively on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences with classical cooperativity (and anti-cooperativity).

  20. Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny

    2015-05-01

    Recent experimental breakthroughs have finally allowed the implementation of in-vitro reaction kinetics (the so-called enzyme-based logic) which code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way to the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g. thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can serve this purpose: here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single- and double-ligand systems, with the purpose of exploring their practical capabilities to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing the resulting findings quantitatively on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences with classical cooperativity (and anti-cooperativity).

  1. Using Statistical Mechanics and Entropy Principles to Interpret Variability in Power Law Models of the Streamflow Recession

    NASA Astrophysics Data System (ADS)

    Dralle, D.; Karst, N.; Thompson, S. E.

    2015-12-01

    Multiple competing theories suggest that power law behavior governs the observed first-order dynamics of streamflow recessions - the important process by which catchments dry out via the stream network, altering the availability of surface water resources and in-stream habitat. Frequently modeled as dq/dt = -aq^b, recessions typically exhibit a high degree of variability, even within a single catchment, as revealed by significant shifts in the values of "a" and "b" across recession events. One potential source of this variability lies in underlying, hard-to-observe fluctuations in how catchment water storage is partitioned amongst distinct storage elements, each having different discharge behaviors. Testing this and competing hypotheses with widely available streamflow time series, however, has been hindered by a power law scaling artifact that obscures meaningful covariation between the recession parameters "a" and "b". Here we briefly outline a technique that removes this artifact, revealing intriguing new patterns in the joint distribution of recession parameters. Using long-term flow data from catchments in Northern California, we explore temporal variations and find that the "a" parameter varies strongly with catchment wetness. Then we explore how the "b" parameter changes with "a", and find that measures of its variation are maximized at intermediate "a" values. We propose an interpretation of this pattern based on statistical mechanics, whereby "b" can be viewed as an indicator of the catchment "microstate" - i.e., the partitioning of storage - and "a" as a measure of the catchment macrostate (i.e., the total storage). In statistical mechanics, entropy (here the microstate variance, that is, the variance of "b") is maximized at intermediate values of extensive variables (i.e., wetness, "a"), as observed in the recession data. 
This interpretation of "a" and "b" was supported by model runs using a multiple-reservoir catchment toy model, and lends support to the hypothesis that power law streamflow recession dynamics, and their variations, have their origin in the multiple modalities of storage partitioning.
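How the recession parameters are extracted from a falling hydrograph can be sketched with synthetic data (illustrative values of a and b, not from the study): generate a recession obeying dq/dt = -aq^b, then recover both parameters from the linearized form log(-dq/dt) = log(a) + b log(q) by least squares:

```python
import math

# Synthetic recession with known parameters, explicit-Euler drawdown.
a_true, b_true, dt = 0.05, 1.5, 0.1
q, qs = 10.0, []
for _ in range(500):
    qs.append(q)
    q -= a_true * q ** b_true * dt

# Linear regression of log(-dq/dt) on log(q) recovers (log a, b).
x = [math.log(v) for v in qs[:-1]]
y = [math.log((qs[i] - qs[i + 1]) / dt) for i in range(len(qs) - 1)]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b_est = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
a_est = math.exp(my - b_est * mx)
```

With noise-free synthetic data the fit is exact; the scaling artifact discussed in the abstract arises because, on real events, estimates of log(a) and b from this same regression are mechanically anti-correlated.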

  2. Time-series panel analysis (TSPA): multivariate modeling of temporal associations in psychotherapy process.

    PubMed

    Ramseyer, Fabian; Kupper, Zeno; Caspar, Franz; Znoj, Hansjörg; Tschacher, Wolfgang

    2014-10-01

    Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary highly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective by its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. TSPA is based on vector auto-regression (VAR), an extension of univariate auto-regression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of psychotherapy process. In a second step, the associations between mechanisms of change (TSPA) and pre- to post-symptom change were explored. TSPA allowed a prototypical process pattern to be identified, in which the patient's alliance and self-efficacy were linked by a temporal feedback loop. Furthermore, the therapist's stability over time in both mastery and clarification interventions was positively associated with better outcomes. TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy. PsycINFO Database Record (c) 2014 APA, all rights reserved.
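The VAR(1) building block behind TSPA can be sketched as follows; the two process variables (think alliance and self-efficacy) and the coupling matrix A are invented for illustration. The variables evolve as z_t = A z_{t-1} + noise, and A is recovered by per-equation ordinary least squares:

```python
import random

# Simulate a stable bivariate VAR(1) and re-estimate its coefficient matrix.
random.seed(6)
A = [[0.6, 0.2],
     [0.3, 0.5]]
z, series = [0.0, 0.0], []
for _ in range(4000):
    z = [A[0][0] * z[0] + A[0][1] * z[1] + random.gauss(0, 1),
         A[1][0] * z[0] + A[1][1] * z[1] + random.gauss(0, 1)]
    series.append(z)

# OLS for each row of A: regress z_i(t) on the lagged vector z(t-1)
# (no intercept; the simulated process has mean zero).
X = series[:-1]
sxx = [[sum(x[j] * x[k] for x in X) for k in range(2)] for j in range(2)]
det = sxx[0][0] * sxx[1][1] - sxx[0][1] * sxx[1][0]
A_hat = []
for i in range(2):
    y = [s[i] for s in series[1:]]
    sxy = [sum(x[j] * yi for x, yi in zip(X, y)) for j in range(2)]
    A_hat.append([(sxy[0] * sxx[1][1] - sxy[1] * sxx[0][1]) / det,
                  (sxy[1] * sxx[0][0] - sxy[0] * sxx[1][0]) / det])
```

In TSPA the off-diagonal entries of A are the session-to-session cross-lagged effects (e.g., alliance predicting next-session self-efficacy), estimated per patient and then aggregated into prototypical models.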

  3. Detecting Selection on Protein Stability through Statistical Mechanical Models of Folding and Evolution

    PubMed Central

    Bastolla, Ugo

    2014-01-01

    The properties of biomolecules depend both on physics and on the evolutionary process that formed them. These two points of view produce a powerful synergism. Physics sets the stage and the constraints that molecular evolution has to obey, and evolutionary theory helps in rationalizing the physical properties of biomolecules, including protein folding thermodynamics. To complete the parallelism, protein thermodynamics is founded on statistical mechanics in the space of protein structures, and molecular evolution can be viewed as statistical mechanics in the space of protein sequences. In this review, we will integrate both points of view, applying them to detecting selection on the stability of the folded state of proteins. We will start by discussing positive design, which strengthens the stability of the folded against the unfolded state of proteins. Positive design justifies why statistical potentials for protein folding can be obtained from the frequencies of structural motifs. Stability against unfolding is easier to achieve for longer proteins. By contrast, negative design, which consists of destabilizing frequently formed misfolded conformations, is more difficult to achieve for longer proteins. The folding rate can be enhanced by strengthening short-range native interactions, but this requirement conflicts with negative design, and evolution has to trade off between them. Finally, selection can accelerate functional movements by favoring low-frequency normal modes of the dynamics of the native state that strongly correlate with the functional conformation change. PMID:24970217
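The link between motif frequencies and statistical potentials mentioned above is usually made by Boltzmann inversion, sketched here with invented contact frequencies (the pairs and numbers are illustrative, not from any database):

```python
import math

# Boltzmann inversion: contacts that are enriched relative to a reference
# state get negative (favorable) energies, depleted contacts positive ones:
#   U(c) = -kT * ln( f_obs(c) / f_ref(c) )
kT = 1.0
f_obs = {("C", "C"): 0.08, ("L", "V"): 0.05, ("D", "K"): 0.04}  # invented
f_ref = {("C", "C"): 0.02, ("L", "V"): 0.03, ("D", "K"): 0.05}  # invented

potential = {pair: -kT * math.log(f_obs[pair] / f_ref[pair]) for pair in f_obs}
```

Positive design is what licenses this inversion: if evolution enriches stabilizing motifs in folded structures, their observed frequencies carry energetic information.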

  4. Statistical mechanics based on fractional classical and quantum mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com

    2014-03-15

    The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. In the first stage, we presented the thermodynamical properties of the classical ideal gas and a system of N classical oscillators; in both cases, the Hamiltonian contains fractional exponents of the phase space variables (position and momentum). In the second stage, in the context of fractional quantum mechanics, we calculated the thermodynamical properties of black-body radiation, studied Bose-Einstein statistics with the related problem of condensation, and studied Fermi-Dirac statistics.
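The classical side of such calculations can be sketched numerically (a generic illustration with arbitrarily chosen exponents, not the paper's specific systems): for a one-particle Hamiltonian with fractional phase-space exponents, H = |p|^a + |x|^b, the partition function factorizes, each factor being the integral exp(-beta |u|^s) over u, which evaluates to 2 Gamma(1 + 1/s) beta^(-1/s):

```python
import math

beta, a, b = 1.0, 1.5, 3.0   # arbitrary fractional exponents

def factor(s):
    """Closed form of Integral_{-inf}^{inf} exp(-beta |u|**s) du."""
    return 2.0 * math.gamma(1.0 + 1.0 / s) * beta ** (-1.0 / s)

Z_exact = factor(a) * factor(b)   # factorized partition function

# Trapezoid-rule check of the momentum factor against the Gamma formula:
h, L = 0.001, 20.0
m = int(L / h)
I_num = 2.0 * h * (0.5
                   + sum(math.exp(-beta * (i * h) ** a) for i in range(1, m))
                   + 0.5 * math.exp(-beta * L ** a))
```

From Z, the usual thermodynamic quantities (internal energy, heat capacity) follow by differentiating log Z with respect to beta, which is how the fractional exponents end up modifying the equipartition result.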

  5. STARS DO NOT EAT THEIR YOUNG MIGRATING PLANETS: EMPIRICAL CONSTRAINTS ON PLANET MIGRATION HALTING MECHANISMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plavchan, Peter; Bilinski, Christopher

    The discovery of "hot Jupiters" very close to their parent stars confirmed that Jovian planets migrate inward via several potential mechanisms. We present empirical constraints on planet migration halting mechanisms. We compute model density functions of close-in exoplanets in the orbital semi-major axis-stellar mass plane to represent planet migration that is halted via several mechanisms, including the interior 1:2 resonance with the magnetospheric disk truncation radius, the interior 1:2 resonance with the dust sublimation radius, and several scenarios for tidal halting. The models differ in the predicted power-law dependence of the exoplanet orbital semi-major axis as a function of stellar mass, and thus we also include a power-law model with the exponent as a free parameter. We use a Bayesian analysis to assess the model success in reproducing empirical distributions of confirmed exoplanets and Kepler candidates that orbit interior to 0.1 AU. Our results confirm a correlation of the halting distance with stellar mass. Tidal halting provides the best fit to the empirical distribution of confirmed Jovian exoplanets at a statistically robust level, consistent with the Kozai mechanism and the spin-orbit misalignment of a substantial fraction of hot Jupiters. We can rule out migration halting at the interior 1:2 resonance with the magnetospheric disk truncation radius and the interior 1:2 resonance with the dust disk sublimation radius, a uniform random distribution, and a distribution with no dependence on stellar mass. Note that our results do not rule out Type-II migration, but rather eliminate the role of a circumstellar disk in stopping exoplanet migration. For Kepler candidates, which have a more restricted range in stellar mass compared to confirmed planets, we are unable to discern between the tidal dissipation and magnetospheric disk truncation braking mechanisms at a statistically significant level. 
The power-law model favors exponents in the range of 0.38-0.9. This is larger than the range predicted for tidal halting (0.23-0.33), which suggests that additional physics may be missing from the tidal halting theory.

  6. The Energetic Cost of Walking: A Comparison of Predictive Methods

    PubMed Central

    Kramer, Patricia Ann; Sylvester, Adam D.

    2011-01-01

    Background The energy that animals devote to locomotion has been of intense interest to biologists for decades and two basic methodologies have emerged to predict locomotor energy expenditure: those based on metabolic and those based on mechanical energy. Metabolic energy approaches share the perspective that prediction of locomotor energy expenditure should be based on statistically significant proxies of metabolic function, while mechanical energy approaches, which derive from many different perspectives, focus on quantifying the energy of movement. Some controversy exists as to which mechanical perspective is “best”, but from first principles all mechanical methods should be equivalent if the inputs to the simulation are of similar quality. Our goals in this paper are 1) to establish the degree to which the various methods of calculating mechanical energy are correlated, and 2) to investigate to what degree the prediction methods explain the variation in energy expenditure. Methodology/Principal Findings We use modern humans as the model organism in this experiment because their data are readily attainable, but the methodology is appropriate for use in other species. Volumetric oxygen consumption and kinematic and kinetic data were collected on 8 adults while walking at their self-selected slow, normal and fast velocities. Using hierarchical statistical modeling via ordinary least squares and maximum likelihood techniques, the predictive ability of several metabolic and mechanical approaches were assessed. We found that all approaches are correlated and that the mechanical approaches explain similar amounts of the variation in metabolic energy expenditure. Most methods predict the variation within an individual well, but are poor at accounting for variation between individuals. Conclusion Our results indicate that the choice of predictive method is dependent on the question(s) of interest and the data available for use as inputs. 
Although we used modern humans as our model organism, these results can be extended to other species. PMID:21731693

  7. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
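The statistical-decomposition step can be illustrated with a toy example (layout regions, offsets, and noise level are all invented): each ILD thickness measurement is split into a die-level mean, a systematic pattern-dependent offset, and random noise:

```python
import random
import statistics

# Synthetic measurements: thickness = die mean + layout offset + noise.
random.seed(7)
die_mean = 800.0                              # nm, assumed target thickness
pattern = {"dense": -30.0, "sparse": +30.0}   # nm, invented layout offsets
meas = [(region, die_mean + pattern[region] + random.gauss(0, 5.0))
        for region in ("dense", "sparse") for _ in range(100)]

# Region means recover the systematic component; residuals estimate noise.
by_region = {r: [t for rr, t in meas if rr == r] for r in pattern}
systematic = {r: statistics.mean(v) - die_mean for r, v in by_region.items()}
residuals = [t - statistics.mean(by_region[r]) for r, t in meas]
noise_sd = statistics.pstdev(residuals)
```

Separating the systematic (pattern-dependent) term from the random term is what lets the calibrated models predict where dummy fill will actually improve post-CMP uniformity.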

  8. North American Extreme Temperature Events and Related Large Scale Meteorological Patterns: A Review of Statistical Methods, Dynamics, Modeling, and Trends

    NASA Technical Reports Server (NTRS)

    Grotjahn, Richard; Black, Robert; Leung, Ruby; Wehner, Michael F.; Barlow, Mathew; Bosilovich, Michael G.; Gershunov, Alexander; Gutowski, William J., Jr.; Gyakum, John R.; Katz, Richard W.; et al.

    2015-01-01

    The objective of this paper is to review statistical methods, dynamics, modeling efforts, and trends related to temperature extremes, with a focus upon extreme events of short duration that affect parts of North America. These events are associated with large scale meteorological patterns (LSMPs). The statistics, dynamics, and modeling sections of this paper are written to be autonomous and so can be read separately. Methods to define extreme event statistics and to identify and connect LSMPs to extreme temperature events are presented. Recent advances in statistical techniques connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. Various LSMPs, ranging from synoptic to planetary scale structures, are associated with extreme temperature events. Current knowledge about the synoptics and the dynamical mechanisms leading to the associated LSMPs is incomplete. Systematic studies of the physics of LSMP life cycles, comprehensive model assessment of LSMP-extreme temperature event linkages, and LSMP properties are needed. Generally, climate models capture observed properties of heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency, underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Modeling studies have identified the impact of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs to more specifically understand the role of LSMPs in past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated. The paper concludes with unresolved issues and research questions.

  9. Polymer models of interphase chromosomes

    PubMed Central

    Vasquez, Paula A; Bloom, Kerry

    2014-01-01

    Clear organizational patterns on the genome have emerged from the statistics of population studies of fixed cells. However, how these results translate into the dynamics of individual living cells remains unexplored. We use statistical mechanics models derived from polymer physics to inquire into the effects that chromosome properties and dynamics have on the temporal and spatial behavior of the genome. Overall, changes in the properties of individual chains affect the behavior of all other chains in the domain. We explore two modifications of chain behavior: single chain motion and chain-chain interactions. We show that there is no direct relation between these effects: an increase in motion does not necessarily translate into an increase in chain interactions. PMID:25482191

  10. Deterministic Impulsive Vacuum Foundations for Quantum-Mechanical Wavefunctions

    NASA Astrophysics Data System (ADS)

    Valentine, John S.

    2013-09-01

    By assuming that a fermion de-constitutes immediately at source, that its constituents, as bosons, propagate uniformly as scalar vacuum terms with phase (radial) symmetry, and that fermions are unique solutions for specific phase conditions, we find a model that self-quantizes matter from continuous waves, unifying bosons and fermion ontologies in a single basis, in a constitution-invariant process. Vacuum energy has a wavefunction context, as a mass-energy term that enables wave collapse and increases its amplitude, with gravitational field as the gradient of the flux density. Gravitational and charge-based force effects emerge as statistics without special treatment. Confinement, entanglement, vacuum statistics, forces, and wavefunction terms emerge from the model's deterministic foundations.

  11. Relative strength of tailor's bunion osteotomies and fixation techniques.

    PubMed

    Haddon, Todd B; LaPointe, Stephan J

    2013-01-01

    A paucity of data is available on the mechanical strength of fifth metatarsal osteotomies. The present study was designed to provide that information. Five osteotomies were mechanically tested to failure using a materials testing machine and compared with an intact fifth metatarsal using a hollow saw bone model with a sample size of 10 for each construct. The osteotomies tested were the distal reverse chevron fixated with a Kirschner wire, the long plantar reverse chevron osteotomy fixated with 2 screws, a mid-diaphyseal sagittal plane osteotomy fixated with 2 screws, the same mid-diaphyseal sagittal plane osteotomy fixated with 2 screws plus an additional cerclage wire, and a transverse closing wedge osteotomy fixated with a box wire technique. Analysis of variance was performed, resulting in a statistically significant difference among the data at p < .0001. The Tukey-Kramer honestly significant difference test with least significant differences was performed post hoc to separate out the pairs at a minimum α of 0.05. The chevron was statistically the strongest construct at 130 N, followed by the long plantar osteotomy at 78 N. The chevron compared well with the control at 114 N, and both fractured at the proximal model-to-fixture interface. The other osteotomies were statistically significantly weaker than both the chevron and the long plantar constructs, with no statistically significant difference among them at 36, 39, and 48 N. In conclusion, the chevron osteotomy was superior in strength to the sagittal and transverse plane osteotomies and similar in strength and failure to the intact model. Copyright © 2013 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  12. An Information Theory Approach to Nonlinear, Nonequilibrium Thermodynamics

    NASA Astrophysics Data System (ADS)

    Rogers, David M.; Beck, Thomas L.; Rempe, Susan B.

    2011-10-01

    Using the problem of ion channel thermodynamics as an example, we illustrate the idea of building up complex thermodynamic models by successively adding physical information. We present a new formulation of information algebra that generalizes methods of both information theory and statistical mechanics. From this foundation we derive a theory for ion channel kinetics, identifying a nonequilibrium `process' free energy functional in addition to the well-known integrated work functionals. The Gibbs-Maxwell relation for the free energy functional is a Green-Kubo relation, applicable arbitrarily far from equilibrium, that captures the effect of non-local and time-dependent behavior from transient thermal and mechanical driving forces. Comparing the physical significance of the Lagrange multipliers to the canonical ensemble suggests definitions of nonequilibrium ensembles at constant capacitance or inductance in addition to constant resistance. Our result is that statistical mechanical descriptions derived from a few primitive algebraic operations on information can be used to create experimentally-relevant and computable models. By construction, these models may use information from more detailed atomistic simulations. Two surprising consequences to be explored in further work are that (in)distinguishability factors are automatically predicted from the problem formulation and that a direct analogue of the second law for thermodynamic entropy production is found by considering information loss in stochastic processes. The information loss identifies a novel contribution from the instantaneous information entropy that ensures non-negative loss.

  13. Truth, models, model sets, AIC, and multimodel inference: a Bayesian perspective

    USGS Publications Warehouse

    Barker, Richard J.; Link, William A.

    2015-01-01

    Statistical inference begins with viewing data as realizations of stochastic processes. Mathematical models provide partial descriptions of these processes; inference is the process of using the data to obtain a more complete description of the stochastic processes. Wildlife and ecological scientists have become increasingly concerned with the conditional nature of model-based inference: what if the model is wrong? Over the last 2 decades, Akaike's Information Criterion (AIC) has been widely and increasingly used in wildlife statistics for 2 related purposes, first for model choice and second to quantify model uncertainty. We argue that for the second of these purposes, the Bayesian paradigm provides the natural framework for describing uncertainty associated with model choice and provides the most easily communicated basis for model weighting. Moreover, Bayesian arguments provide the sole justification for interpreting model weights (including AIC weights) as coherent (mathematically self consistent) model probabilities. This interpretation requires treating the model as an exact description of the data-generating mechanism. We discuss the implications of this assumption, and conclude that more emphasis is needed on model checking to provide confidence in the quality of inference.
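    The model-weighting step the authors discuss can be sketched in a few lines. The AIC values below are hypothetical, and interpreting the resulting weights as coherent model probabilities holds only under the Bayesian assumptions described in the abstract.

    ```python
    import math

    def akaike_weights(aic_values):
        """Turn AIC scores into normalized model weights (lower AIC = larger weight)."""
        best = min(aic_values)
        # Relative likelihood of each model, exp(-0.5 * delta_AIC)
        rel = [math.exp(-0.5 * (a - best)) for a in aic_values]
        total = sum(rel)
        return [r / total for r in rel]

    # Hypothetical AIC scores for three candidate models.
    weights = akaike_weights([100.0, 102.0, 110.0])
    print(weights)  # weights sum to 1; the lowest-AIC model receives the largest weight
    ```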

  14. Many-Body Localization and Thermalization in Quantum Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Nandkishore, Rahul; Huse, David A.

    2015-03-01

    We review some recent developments in the statistical mechanics of isolated quantum systems. We provide a brief introduction to quantum thermalization, paying particular attention to the eigenstate thermalization hypothesis (ETH) and the resulting single-eigenstate statistical mechanics. We then focus on a class of systems that fail to quantum thermalize and whose eigenstates violate the ETH: These are the many-body Anderson-localized systems; their long-time properties are not captured by the conventional ensembles of quantum statistical mechanics. These systems can forever locally remember information about their local initial conditions and are thus of interest for possibilities of storing quantum information. We discuss key features of many-body localization (MBL) and review a phenomenology of the MBL phase. Single-eigenstate statistical mechanics within the MBL phase reveal dynamically stable ordered phases, and phase transitions among them, that are invisible to equilibrium statistical mechanics and can occur at high energy and low spatial dimensionality, where equilibrium ordering is forbidden.

  15. Collective Behaviors in Spatially Extended Systems with Local Interactions and Synchronous Updating

    NASA Astrophysics Data System (ADS)

    ChatÉ, H.; Manneville, P.

    1992-01-01

    Assessing the extent to which dynamical systems with many degrees of freedom can be described within a thermodynamic formalism is a problem that currently attracts much attention. In this context, synchronously updated regular lattices of identical, chaotic elements with local interactions are promising models for which statistical mechanics may be hoped to provide some insights. This article presents a large class of cellular automata rules and coupled map lattices of the above type in space dimensions d = 2 to 6. Such simple models can be approached by a mean-field approximation which usually reduces the dynamics to that of a map governing the evolution of some extensive density. While this approximation is exact in the d = ∞ limit, where macroscopic variables must display the time-dependent behavior of the mean-field map, basic intuition from equilibrium statistical mechanics rules out any such behavior in low-dimensional systems, since it would involve the collective motion of locally disordered elements. The models studied are chosen to be as close as possible to mean-field conditions, i.e., rather high space dimension, large connectivity, and equal-weight coupling between sites. While the mean-field evolution is never observed, a new type of non-trivial collective behavior is found, at odds with the predictions of equilibrium statistical mechanics. Both in the cellular automata models and in the coupled map lattices, macroscopic variables frequently display a non-transient, time-dependent, low-dimensional dynamics emerging out of local disorder. Striking examples are period-3 cycles in two-state cellular automata and a Hopf bifurcation for a d = 5 lattice of coupled logistic maps. An extensive account of the phenomenology is given, including a catalog of behaviors, classification tables for the cellular automata rules, and bifurcation diagrams for the coupled map lattices. The observed underlying dynamics is accompanied by an intrinsic quasi-Gaussian noise (stemming from the local disorder) which disappears in the infinite-size limit. The collective behaviors constitute a robust phenomenon, resisting external noise, small changes in the local dynamics, and modifications of the initial and boundary conditions. Synchronous updating, high space dimension, and the regularity of connections are shown to be crucial ingredients in the subtle build-up of correlations giving rise to the collective motion. The discussion stresses the need for a theoretical understanding that neither equilibrium statistical mechanics nor higher-order mean-field approximations are able to provide.
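    A minimal sketch of a synchronously updated coupled map lattice of the kind described, reduced to d = 2 so it runs quickly. The lattice size, coupling strength `eps`, and logistic parameter `mu` are illustrative choices rather than the paper's d = 5 setting, so this shows the model setup and the macroscopic mean-field observable, not the reported Hopf bifurcation.

    ```python
    import numpy as np

    def step(x, mu=4.0, eps=0.3):
        """One synchronous update of a d=2 diffusively coupled logistic map lattice.

        Each site first applies its own chaotic logistic map, then mixes with the
        mean of its 4 nearest neighbours (periodic boundaries, equal-weight coupling).
        """
        f = mu * x * (1.0 - x)  # local chaotic dynamics, stays in [0, 1] for mu <= 4
        neigh = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                 np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4.0
        return (1.0 - eps) * f + eps * neigh

    rng = np.random.default_rng(0)
    x = rng.random((64, 64))          # disordered initial condition
    means = []
    for _ in range(200):
        x = step(x)
        means.append(x.mean())        # macroscopic (mean-field-like) observable
    print(min(means[100:]), max(means[100:]))  # spread of the collective variable
    ```

    Tracking `x.mean()` over time is the simplest way to look for the non-trivial collective motion the article describes; in higher dimensions and larger lattices this observable can remain time-dependent instead of settling to a fixed point.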

  16. Novel Mechanism for Reducing Acute and Chronic Neurodegeneration After Traumatic Brain Injury

    DTIC Science & Technology

    2017-07-01

    glutamate from the brain. Scope: We will test this novel and powerful neuroprotective treatment in a rat model of repetitive mild (concussive) TBIs...variability. 2. Completed statistical analysis of behavioral experiments examining effects of rGOT and rGOT + OxAc on outcome on rotarod and Morris water ...neuroprotective treatment in a rat model of a single moderate TBI and in a rat model of repetitive mild (concussive) TBIs. Outcome measures include blood and

  17. Atomistic Cohesive Zone Models for Interface Decohesion in Metals

    NASA Technical Reports Server (NTRS)

    Yamakov, Vesselin I.; Saether, Erik; Glaessgen, Edward H.

    2009-01-01

    Using a statistical mechanics approach, a cohesive-zone law in the form of a traction-displacement constitutive relationship characterizing the load transfer across the plane of a growing edge crack is extracted from atomistic simulations for use within a continuum finite element model. The methodology for the atomistic derivation of a cohesive-zone law is presented. This procedure can be implemented to build cohesive-zone finite element models for simulating fracture in nanocrystalline or ultrafine grained materials.

  18. A Comparative Data-Based Modeling Study on Respiratory CO2 Gas Exchange during Mechanical Ventilation

    PubMed Central

    Kim, Chang-Sei; Ansermino, J. Mark; Hahn, Jin-Oh

    2016-01-01

    The goal of this study is to derive a minimally complex but credible model of respiratory CO2 gas exchange that may be used in systematic design and pilot testing of closed-loop end-tidal CO2 controllers in mechanical ventilation. We first derived a candidate model that captures the essential mechanisms involved in the respiratory CO2 gas exchange process. Then, we simplified the candidate model to derive two lower-order candidate models. We compared these candidate models for predictive capability and reliability using experimental data collected from 25 pediatric subjects undergoing dynamically varying mechanical ventilation during surgical procedures. A two-compartment model equipped with transport delay to account for CO2 delivery between the lungs and the tissues showed modest but statistically significant improvement in predictive capability over the same model without transport delay. Aggregating the lungs and the tissues into a single compartment further degraded the predictive fidelity of the model. In addition, the model equipped with transport delay demonstrated superior reliability to the one without transport delay. Further, the respiratory parameters derived from the model equipped with transport delay, but not the one without transport delay, were physiologically plausible. The results suggest that gas transport between the lungs and the tissues must be taken into account to accurately reproduce the respiratory CO2 gas exchange process under wide-ranging and dynamically varying mechanical ventilation conditions. PMID:26870728

  19. Molecular vibrational energy flow

    NASA Astrophysics Data System (ADS)

    Gruebele, M.; Bigwood, R.

    This article reviews some recent work in molecular vibrational energy flow (IVR), with emphasis on our own computational and experimental studies. We consider the problem in various representations, and use these to develop a family of simple models that combine specific molecular properties (e.g. size, vibrational frequencies) with statistical properties of the potential energy surface and wavefunctions. This marriage of molecular detail and statistical simplification captures trends of IVR mechanisms and survival probabilities beyond the abilities of purely statistical models or the computational limitations of full ab initio approaches. Of particular interest is IVR in the intermediate time regime, where heavy-atom skeletal modes take over the IVR process from hydrogenic motions even upon X-H bond excitation. Experiments and calculations on prototype heavy-atom systems show that intermediate time IVR differs in many aspects from the early stages of hydrogenic mode IVR. As a result, IVR can be coherently frozen, with potential applications to selective chemistry.

  20. Statistical text classifier to detect specific type of medical incidents.

    PubMed

    Wong, Zoie Shui-Yee; Akiyama, Masanori

    2013-01-01

    WHO Patient Safety has placed a focus on increasing the coherence and expressiveness of patient safety classification through the foundation of the International Classification for Patient Safety (ICPS). Text classification and statistical approaches have been shown to successfully identify safety problems in the aviation industry using incident text information. It has been challenging to comprehend the taxonomy of medical incidents in a structured manner. Independent reporting mechanisms for patient safety incidents have been established in the UK, Canada, Australia, Japan, Hong Kong, etc. This research demonstrates the potential to construct statistical text classifiers to detect specific types of medical incidents using incident text data. An illustrative example for classifying look-alike sound-alike (LASA) medication incidents using structured text from 227 advisories related to medication errors from Global Patient Safety Alerts (GPSA) is shown in this poster presentation. The classifier was built using a logistic regression model. The ROC curve and the AUC value indicated that this is a satisfactorily good model.
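    A toy sketch of the kind of classifier described: a bag-of-words logistic regression trained by gradient descent on a handful of invented incident texts standing in for the 227 GPSA advisories (which are not reproduced here). The documents, vocabulary, learning rate, and epoch count are all illustrative assumptions.

    ```python
    import math

    # Invented stand-ins for incident advisories: label 1 marks a
    # look-alike/sound-alike (LASA) medication incident, 0 marks other incidents.
    docs = [
        ("confused hydroxyzine with hydralazine due to similar name", 1),
        ("dispensed celebrex instead of celexa sound alike error", 1),
        ("vinblastine given in place of vincristine name mix up", 1),
        ("patient fall from bed during night shift", 0),
        ("wrong dose calculated for infusion pump", 0),
        ("delay in transfer to intensive care unit", 0),
    ]

    vocab = sorted({w for text, _ in docs for w in text.split()})
    idx = {w: i for i, w in enumerate(vocab)}

    def featurize(text):
        v = [0.0] * len(vocab)      # bag-of-words counts
        for w in text.split():
            if w in idx:
                v[idx[w]] += 1.0
        return v

    # Plain logistic regression trained by stochastic gradient descent.
    w = [0.0] * len(vocab)
    b = 0.0
    for _ in range(300):
        for text, y in docs:
            x = featurize(text)
            p = 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            g = p - y               # gradient of the log-loss w.r.t. the logit
            w = [wi - 0.5 * g * xi for wi, xi in zip(w, x)]
            b -= 0.5 * g

    def predict(text):
        """Probability that a new incident text is a LASA incident."""
        x = featurize(text)
        return 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))

    print(predict("celexa celebrex sound alike mix up"))  # high probability
    print(predict("patient fall during transfer"))        # low probability
    ```

    A real evaluation would, as in the poster, hold out data and summarize discrimination with an ROC curve and its AUC rather than spot-checking two examples.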

  1. Rigorous force field optimization principles based on statistical distance minimization

    DOE PAGES

    Vlcek, Lukas; Chialvo, Ariel A.

    2015-10-12

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. Here we exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.

  2. Discrete-element modeling of nacre-like materials: Effects of random microstructures on strain localization and mechanical performance

    NASA Astrophysics Data System (ADS)

    Abid, Najmul; Mirkhalaf, Mohammad; Barthelat, Francois

    2018-03-01

    Natural materials such as nacre, collagen, and spider silk are composed of staggered stiff and strong inclusions in a softer matrix. This type of hybrid microstructure results in remarkable combinations of stiffness, strength, and toughness and it now inspires novel classes of high-performance composites. However, the analytical and numerical approaches used to predict and optimize the mechanics of staggered composites often neglect statistical variations and inhomogeneities, which may have significant impacts on modulus, strength, and toughness. Here we present an analysis of localization using small representative volume elements (RVEs) and large scale statistical volume elements (SVEs) based on the discrete element method (DEM). DEM is an efficient numerical method which enabled the evaluation of more than 10,000 microstructures in this study, each including about 5,000 inclusions. The models explore the combined effects of statistics, inclusion arrangement, and interface properties. We find that statistical variations have a negative effect on all properties, in particular on the ductility and energy absorption because randomness precipitates the localization of deformations. However, the results also show that the negative effects of random microstructures can be offset by interfaces with large strain at failure accompanied by strain hardening. More specifically, this quantitative study reveals an optimal range of interface properties where the interfaces are the most effective at delaying localization. These findings show how carefully designed interfaces in bioinspired staggered composites can offset the negative effects of microstructural randomness, which is inherent to most current fabrication methods.

  3. Methods of comparing associative models and an application to retrospective revaluation.

    PubMed

    Witnauer, James E; Hutchings, Ryan; Miller, Ralph R

    2017-11-01

    Contemporary theories of associative learning are increasingly complex, which necessitates the use of computational methods to reveal predictions of these models. We argue that comparisons across multiple models in terms of goodness of fit to empirical data from experiments often reveal more about the actual mechanisms of learning and behavior than do simulations of only a single model. Such comparisons are best made when the values of free parameters are discovered through some optimization procedure based on the specific data being fit (e.g., hill climbing), so that the comparisons hinge on the psychological mechanisms assumed by each model rather than being biased by using parameters that differ in quality across models with respect to the data being fit. Statistics like the Bayesian information criterion facilitate comparisons among models that have different numbers of free parameters. These issues are examined using retrospective revaluation data. Copyright © 2017 Elsevier B.V. All rights reserved.
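    The role of the Bayesian information criterion in such comparisons can be illustrated with hypothetical fits: model B attains a slightly higher log-likelihood than model A, but its two extra free parameters cost more than the improved fit gains. All numbers below are invented for illustration.

    ```python
    import math

    def bic(log_likelihood, k, n):
        """Bayesian information criterion: lower is better.

        k: number of free parameters; n: number of observations.
        The k*ln(n) term penalizes extra parameters.
        """
        return k * math.log(n) - 2.0 * log_likelihood

    n = 120  # hypothetical number of data points being fit
    bic_a = bic(log_likelihood=-310.0, k=3, n=n)  # simpler model
    bic_b = bic(log_likelihood=-308.5, k=5, n=n)  # better fit, more parameters
    print(bic_a, bic_b)  # here the parameter penalty favours the simpler model
    ```

    This is why, as the abstract argues, parameter optimization should be done per model against the same data before comparing: otherwise the comparison reflects parameter quality rather than the psychological mechanisms the models assume.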

  4. Reinterpreting maximum entropy in ecology: a null hypothesis constrained by ecological mechanism.

    PubMed

    O'Dwyer, James P; Rominger, Andrew; Xiao, Xiao

    2017-07-01

    Simplified mechanistic models in ecology have been criticised for the fact that a good fit to data does not imply the mechanism is true: pattern does not equal process. In parallel, the maximum entropy principle (MaxEnt) has been applied in ecology to make predictions constrained by just a handful of state variables, like total abundance or species richness. But an outstanding question remains: what principle tells us which state variables to constrain? Here we attempt to solve both problems simultaneously, by translating a given set of mechanisms into the state variables to be used in MaxEnt, and then using this MaxEnt theory as a null model against which to compare mechanistic predictions. In particular, we identify the sufficient statistics needed to parametrise a given mechanistic model from data and use them as MaxEnt constraints. Our approach isolates exactly what mechanism is telling us over and above the state variables alone. © 2017 John Wiley & Sons Ltd/CNRS.
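    A minimal illustration of the MaxEnt step, assuming a single mean-value constraint over a small discrete state space: the maximum-entropy distribution has the form p(n) proportional to exp(-lam * n), and the Lagrange multiplier lam is found numerically by bisection. This is a generic toy, not the ecological models or sufficient statistics of the paper.

    ```python
    import math

    def maxent_dist(states, mean_target, tol=1e-10):
        """Maximum-entropy distribution over `states` subject to a fixed mean."""
        def mean_for(lam):
            ws = [math.exp(-lam * s) for s in states]
            z = sum(ws)                       # partition function
            return sum(s * w for s, w in zip(states, ws)) / z

        lo, hi = -5.0, 5.0                    # bracket for the multiplier
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if mean_for(mid) > mean_target:   # mean decreases as lam grows
                lo = mid
            else:
                hi = mid
        lam = 0.5 * (lo + hi)
        ws = [math.exp(-lam * s) for s in states]
        z = sum(ws)
        return [w / z for w in ws]

    # E.g. abundances 1..10 constrained to have mean 3 (illustrative numbers).
    states = list(range(1, 11))
    p = maxent_dist(states, mean_target=3.0)
    print(sum(p))  # normalized to 1
    ```

    In the authors' framework, the constrained quantities would instead be the sufficient statistics of the mechanistic model under test, and the resulting MaxEnt distribution serves as the null against which the mechanism's extra predictions are judged.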

  5. Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.

    DTIC Science & Technology

    1983-09-01

    research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems have grown markedly...probability, statistics, soil mechanics, rock mechanics, and engineering geology. 2. The purpose of this report is to make available to the U. S...Deformation Dynamic Response Analysis Seepage, Soil Permeability and Piping Earthquake Engineering, Seismology, Settlement and Heave Seismic Risk Analysis

  6. Range of interaction in an opinion evolution model of ideological self-positioning: Contagion, hesitance and polarization

    NASA Astrophysics Data System (ADS)

    Gimenez, M. Cecilia; Paz García, Ana Pamela; Burgos Paci, Maxi A.; Reinaudi, Luis

    2016-04-01

    The evolution of public opinion using tools and concepts borrowed from Statistical Physics is an emerging area within the field of Sociophysics. In the present paper, a Statistical Physics model was developed to study the evolution of the ideological self-positioning of an ensemble of agents. The model consists of an array of L components, each one of which represents the ideology of an agent. The proposed mechanism is based on the "voter model", in which one agent can adopt the opinion of another one if the difference of their opinions lies within a certain range. The existence of "undecided" agents (i.e. agents with no definite opinion) was implemented in the model. The possibility of radicalization of an agent's opinion upon interaction with another one was also implemented. The results of our simulations are compared to statistical data taken from the Latinobarómetro databank for the cases of Argentina, Chile, Brazil and Uruguay in the last decade. Among other results, the effect of taking into account the undecided agents is the formation of a single peak at the middle of the ideological spectrum (which corresponds to a centrist ideological position), in agreement with the real cases studied.
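    A minimal sketch of bounded-range voter dynamics with undecided agents, in the spirit of the model described; the system size, interaction range, step count, and update details are illustrative assumptions rather than the authors' exact specification (in particular, radicalization is omitted).

    ```python
    import random

    random.seed(1)
    L = 200            # number of agents on a 1D periodic lattice
    R = 2              # opinions differing by at most R can spread (contagion range)
    STEPS = 20000
    UNDECIDED = None

    # Ideological self-positioning on a 1..10 scale; some agents start undecided.
    ops = [random.choice([UNDECIDED] + list(range(1, 11))) for _ in range(L)]

    for _ in range(STEPS):
        i = random.randrange(L)
        j = (i + random.choice((-1, 1))) % L  # pick a lattice neighbour
        if ops[j] is UNDECIDED:
            continue
        if ops[i] is UNDECIDED:
            ops[i] = ops[j]                   # undecided agents adopt any opinion
        elif abs(ops[i] - ops[j]) <= R:
            ops[i] = ops[j]                   # contagion only within range R

    decided = [o for o in ops if o is not UNDECIDED]
    hist = {k: decided.count(k) for k in range(1, 11)}
    print(hist)  # final distribution of opinions across the ideological spectrum
    ```

    Inspecting `hist` for different initial fractions of undecided agents is the natural way to probe the central-peak effect the abstract reports, though this stripped-down sketch makes no claim to reproduce it.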

  7. Statistical mechanics of unsupervised feature learning in a restricted Boltzmann machine with binary synapses

    NASA Astrophysics Data System (ADS)

    Huang, Haiping

    2017-05-01

    Revealing hidden features in unlabeled data is called unsupervised feature learning, which plays an important role in pretraining a deep neural network. Here we provide a statistical mechanics analysis of unsupervised learning in a restricted Boltzmann machine with binary synapses. A message passing equation to infer the hidden feature is derived, and furthermore, variants of this equation are analyzed. A statistical analysis by replica theory describes the thermodynamic properties of the model. Our analysis confirms an entropy crisis preceding the non-convergence of the message passing equation, suggesting a discontinuous phase transition as a key characteristic of the restricted Boltzmann machine. A continuous phase transition is also confirmed, depending on the embedded feature strength in the data. The mean-field result under the replica symmetric assumption agrees with that obtained by running message passing algorithms on single instances of finite sizes. Interestingly, in an approximate Hopfield model, the entropy crisis is absent, and a continuous phase transition is observed instead. We also develop an iterative equation to infer the hyper-parameter (temperature) hidden in the data, which in physics corresponds to iteratively imposing the Nishimori condition. Our study provides insights toward understanding the thermodynamic properties of restricted Boltzmann machine learning and, moreover, an important theoretical basis for building simplified deep networks.

  8. Moral foundations in an interacting neural networks society: A statistical mechanics analysis

    NASA Astrophysics Data System (ADS)

    Vicente, R.; Susemihl, A.; Jericó, J. P.; Caticha, N.

    2014-04-01

    The moral foundations theory supports that people, across cultures, tend to consider a small number of dimensions when classifying issues on a moral basis. The data also show that the statistics of weights attributed to each moral dimension are related to self-declared political affiliation, which in turn has been connected to cognitive learning styles by the recent literature in neuroscience and psychology. Inspired by these data, we propose a simple statistical mechanics model with interacting neural networks classifying vectors and learning from members of their social neighbourhood about their average opinion on a large set of issues. The purpose of learning is to reduce dissension among agents when disagreeing. We consider a family of learning algorithms parametrized by δ, which represents the importance given to corroborating (same sign) opinions. We define an order parameter that quantifies the diversity of opinions in a group with homogeneous learning style. Using Monte Carlo simulations and a mean field approximation we find the relation between the order parameter and the learning parameter δ at a temperature we associate with the importance of social influence in a given group. In concordance with the data, groups that rely more strongly on corroborating evidence sustain less opinion diversity. We discuss predictions of the model and propose possible experimental tests.

  9. Statistical mechanical estimation of the free energy of formation of E. coli biomass for use with macroscopic bioreactor balances.

    PubMed

    Grosz, R; Stephanopoulos, G

    1983-09-01

    The need for the determination of the free energy of formation of biomass in bioreactor second law balances is well established. A statistical mechanical method for the calculation of the free energy of formation of E. coli biomass is introduced. In this method, biomass is modelled to consist of a system of biopolymer networks. The partition function of this system is proposed to consist of acoustic and optical modes of vibration. Acoustic modes are described by Tarasov's model, the parameters of which are evaluated with the aid of low-temperature calorimetric data for the crystalline protein bovine chymotrypsinogen A. The optical modes are described by considering the low-temperature thermodynamic properties of biological monomer crystals such as amino acid crystals. Upper and lower bounds are placed on the entropy to establish the maximum error associated with the statistical method. The upper bound is determined by endowing the monomers in biomass with ideal gas properties. The lower bound is obtained by limiting the monomers to complete immobility. On this basis, the free energy of formation is fixed to within 10%. Proposals are made with regard to experimental verification of the calculated value and extension of the calculation to other types of biomass.

  10. Sources and characteristics of acoustic emissions from mechanically stressed geologic granular media — A review

    NASA Astrophysics Data System (ADS)

    Michlmayr, Gernot; Cohen, Denis; Or, Dani

    2012-05-01

    The formation of cracks and emergence of shearing planes and other modes of rapid macroscopic failure in geologic granular media involve numerous grain scale mechanical interactions often generating high frequency (kHz) elastic waves, referred to as acoustic emissions (AE). These acoustic signals have been used primarily for monitoring and characterizing fatigue and progressive failure in engineered systems, with only a few applications concerning geologic granular media reported in the literature. Similar to the monitoring of seismic events preceding an earthquake, AE may offer a means for non-invasive, in-situ assessment of mechanical precursors associated with imminent landslides or other types of rapid mass movements (debris flows, rock falls, snow avalanches, glacier stick-slip events). Despite diverse applications and potential usefulness, a systematic description of the AE method and its relevance to mechanical processes in Earth sciences is lacking. This review is aimed at providing a sound foundation for linking observed AE with various micro-mechanical failure events in geologic granular materials, not only for monitoring of triggering events preceding mass mobilization, but also as a non-invasive tool in its own right for probing the rich spectrum of mechanical processes at scales ranging from a single grain to a hillslope. We first review studies reporting the use of AE for monitoring failure in various geologic materials, and describe AE-generating source mechanisms in mechanically stressed geologic media (e.g., frictional sliding, micro-cracking, particle collisions, rupture of water bridges, etc.), including AE statistical features such as frequency content and occurrence probabilities. We summarize available AE sensors and measurement principles.
The high sampling rates of advanced AE systems enable detection of numerous discrete failure events within a volume and thus provide access to statistical descriptions of progressive collapse of systems with many interacting mechanical elements such as the fiber bundle model (FBM). We highlight intrinsic links between AE characteristics and established statistical models often used in structural engineering and material sciences, and outline potential applications for failure prediction and early-warning using the AE method in combination with the FBM. The biggest challenge to field application of the AE method is strong signal attenuation. We provide an outlook for overcoming such limitations considering emergence of a class of fiber-optic based distributed AE sensors and deployment of acoustic waveguides as part of monitoring networks.
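The fiber bundle analogy can be made concrete with a minimal equal-load-sharing FBM, in which each load-redistribution cascade plays the role of an AE burst. This is a hypothetical sketch of the generic model, not the review's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10000
thresholds = np.sort(rng.uniform(0.0, 1.0, N))   # fiber failure thresholds

# Quasi-static loading of an equal-load-sharing fiber bundle: raise the
# external force just enough to break the weakest intact fiber, then let
# the redistributed load cascade. Each cascade is one avalanche, the FBM
# analogue of an AE burst.
avalanche_sizes = []
k = 0  # number of broken fibers
while k < N:
    start = k
    F = thresholds[k] * (N - k)         # force at the next breaking point
    while k < N and F / (N - k) >= thresholds[k]:
        k += 1                          # fiber k fails; survivors share F
    avalanche_sizes.append(k - start)

sizes = np.array(avalanche_sizes)
print("avalanches:", len(sizes), "largest:", sizes.max())
```

For uniformly distributed thresholds the avalanche-size distribution of this model is known to develop a power-law regime, which is the statistical signature the review links to AE event statistics.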

  11. In silico environmental chemical science: properties and processes from statistical and computational modelling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tratnyek, Paul G.; Bylaska, Eric J.; Weber, Eric J.

    2017-01-01

Quantitative structure–activity relationships (QSARs) have long been used in the environmental sciences. More recently, molecular modeling and chemoinformatic methods have become widespread. These methods have the potential to expand and accelerate advances in environmental chemistry because they complement observational and experimental data with “in silico” results and analysis. The opportunities and challenges that arise at the intersection between statistical and theoretical in silico methods are most apparent in the context of properties that determine the environmental fate and effects of chemical contaminants (degradation rate constants, partition coefficients, toxicities, etc.). The main example of this is the calibration of QSARs using descriptor variable data calculated from molecular modeling, which can make QSARs more useful for predicting property data that are unavailable, but also can make them more powerful tools for diagnosis of fate determining pathways and mechanisms. Emerging opportunities for “in silico environmental chemical science” are to move beyond the calculation of specific chemical properties using statistical models and toward more fully in silico models, prediction of transformation pathways and products, incorporation of environmental factors into model predictions, integration of databases and predictive models into more comprehensive and efficient tools for exposure assessment, and extending the applicability of all the above from chemicals to biologicals and materials.

  12. Probabilistic models for reactive behaviour in heterogeneous condensed phase media

    NASA Astrophysics Data System (ADS)

    Baer, M. R.; Gartling, D. K.; DesJardin, P. E.

    2012-02-01

    This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.

  13. Addressing issues associated with evaluating prediction models for survival endpoints based on the concordance statistic.

    PubMed

    Wang, Ming; Long, Qi

    2016-09-01

Prediction models for disease risk and prognosis play an important role in biomedical research, and evaluating their predictive accuracy in the presence of censored data is of substantial interest. The standard concordance (c) statistic has been extended to provide a summary measure of predictive accuracy for survival models. Motivated by a prostate cancer study, we address several issues associated with evaluating survival prediction models based on the c-statistic, with a focus on estimators using the technique of inverse probability of censoring weighting (IPCW). Compared to the existing work, we provide complete results on the asymptotic properties of the IPCW estimators under the assumption of coarsening at random (CAR), and propose a sensitivity analysis under the mechanism of noncoarsening at random (NCAR). In addition, we extend the IPCW approach as well as the sensitivity analysis to high-dimensional settings. The predictive accuracy of prediction models for cancer recurrence after prostatectomy is assessed by applying the proposed approaches. We find that the estimated predictive accuracy for the models under consideration is sensitive to the NCAR assumption, and thus identify the best predictive model. Finally, we further evaluate the performance of the proposed methods in both low-dimensional and high-dimensional settings under CAR and NCAR through simulations. © 2016, The International Biometric Society.
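For context, the unweighted (Harrell-type) form of the concordance statistic that IPCW estimators generalize can be sketched as follows; the function name and toy data are illustrative, and the paper's IPCW variant additionally reweights comparable pairs by the censoring distribution:

```python
import numpy as np

def harrell_c(time, event, risk):
    """Harrell's concordance for right-censored data.
    A pair (i, j) is comparable when the shorter time is an observed event;
    it is concordant when the higher predicted risk belongs to that subject."""
    time, event, risk = map(np.asarray, (time, event, risk))
    conc = comp = 0.0
    n = len(time)
    for i in range(n):
        if not event[i]:
            continue                    # censored subjects cannot anchor a pair
        for j in range(n):
            if time[i] < time[j]:       # i fails first while j is still at risk
                comp += 1
                if risk[i] > risk[j]:
                    conc += 1
                elif risk[i] == risk[j]:
                    conc += 0.5         # ties in risk count half
    return conc / comp

# toy check: a perfectly ordered risk score gives c = 1
t = [2, 4, 6, 8]; e = [1, 1, 0, 1]; r = [4, 3, 2, 1]
print(harrell_c(t, e, r))  # → 1.0
```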

  14. Statistical mechanics and scaling of fault populations with increasing strain in the Corinth Rift

    NASA Astrophysics Data System (ADS)

    Michas, Georgios; Vallianatos, Filippos; Sammonds, Peter

    2015-12-01

Scaling properties of fracture/fault systems are studied in order to characterize the mechanical properties of rocks and to provide insight into the mechanisms that govern fault growth. A comprehensive image of the fault network in the Corinth Rift, Greece, obtained through numerous field studies and marine geophysical surveys, allows for the first time such a study over the entire area of the Rift. We compile a detailed fault map of the area and analyze the scaling properties of fault trace-lengths by using a statistical mechanics model, derived in the framework of generalized statistical mechanics and the associated maximum entropy principle. Within this framework, a range of asymptotic power-law to exponential-like distributions is derived that can well describe the observed scaling patterns of fault trace-lengths in the Rift. Systematic variations, and in particular a transition from asymptotic power-law to exponential-like scaling, are observed as a function of increasing strain in distinct strain regimes in the Rift, providing quantitative evidence for such crustal processes in a single tectonic setting. These results indicate the organization of the fault system as a function of brittle strain in the Earth's crust and suggest there are different mechanisms for fault growth in the distinct parts of the Rift. In addition, other factors such as fault interactions and the thickness of the brittle layer affect how the fault system evolves in time. The results suggest that regional strain, fault interactions and the boundary condition of the brittle layer may control fault growth and the fault network evolution in the Corinth Rift.
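The family of distributions interpolating between power-law and exponential scaling that such generalized (Tsallis-type) frameworks produce can be illustrated with the q-exponential; the parameterization below is one common convention and is not necessarily the exact form used in the paper:

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential e_q(x) = [1 + (1-q)x]^(1/(1-q)).
    Reduces to exp(x) as q -> 1; for q > 1 and x < 0 it develops a
    power-law tail ~ |x|^(1/(1-q))."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)

# cumulative distribution of trace-lengths, P(>l) = e_q(-l / l0):
# exponential-like for q = 1, asymptotic power law for q > 1
l = np.linspace(0, 50, 6)
l0 = 5.0
for q in (1.0, 1.2):
    print(q, np.round(q_exp(-l / l0, q), 5))
```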

  15. Application of activated barrier hopping theory to viscoplastic modeling of glassy polymers

    NASA Astrophysics Data System (ADS)

    Sweeney, J.; Spencer, P. E.; Vgenopoulos, D.; Babenko, M.; Boutenel, F.; Caton-Rose, P.; Coates, P. D.

    2018-05-01

    An established statistical mechanical theory of amorphous polymer deformation has been incorporated as a plastic mechanism into a constitutive model and applied to a range of polymer mechanical deformations. The temperature and rate dependence of the tensile yield of PVC, as reported in early studies, has been modeled to high levels of accuracy. Tensile experiments on PET reported here are analyzed similarly and good accuracy is also achieved. The frequently observed increase in the gradient of the plot of yield stress against logarithm of strain rate is an inherent feature of the constitutive model. The form of temperature dependence of the yield that is predicted by the model is found to give an accurate representation. The constitutive model is developed in two-dimensional form and implemented as a user-defined subroutine in the finite element package ABAQUS. This analysis is applied to the tensile experiments on PET, in some of which strain is localized in the form of shear bands and necks. These deformations are modeled with partial success, though adiabatic heating of the instability causes inaccuracies for this isothermal implementation of the model. The plastic mechanism has advantages over the Eyring process, is equally tractable, and presents no particular difficulties in implementation with finite elements.
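As a point of comparison for the barrier-hopping mechanism, the Eyring process mentioned above predicts near-linear growth of yield stress with the logarithm of strain rate. A sketch of the standard Eyring inversion, with purely illustrative parameter values:

```python
import numpy as np

# Eyring-type yield: the plastic strain rate follows a stress-biased
# activated rate, edot = edot0 * exp(-dH/kT) * sinh(sigma*V/kT),
# which inverts to sigma_y(edot). All parameter values are illustrative.
k = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                 # temperature, K
V = 5e-27                 # activation volume, m^3 (illustrative)
dH = 1.5e-19              # activation enthalpy, J (illustrative)
edot0 = 1e12              # attempt-rate prefactor, 1/s (illustrative)

edot = np.logspace(-5, 0, 6)      # applied strain rates, 1/s
sigma_y = (k * T / V) * np.arcsinh((edot / edot0) * np.exp(dH / (k * T)))

# yield stress grows almost linearly in log(strain rate) once the
# sinh argument is large
for e, s in zip(edot, sigma_y):
    print(f"{e:10.1e}  {s / 1e6:8.3f} MPa")
```

The gradient change in the yield-stress-versus-log-rate plot that the abstract highlights is what the barrier-hopping model captures and this single-process Eyring form does not.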

  16. Application of activated barrier hopping theory to viscoplastic modeling of glassy polymers

    NASA Astrophysics Data System (ADS)

    Sweeney, J.; Spencer, P. E.; Vgenopoulos, D.; Babenko, M.; Boutenel, F.; Caton-Rose, P.; Coates, P. D.

    2017-10-01

    An established statistical mechanical theory of amorphous polymer deformation has been incorporated as a plastic mechanism into a constitutive model and applied to a range of polymer mechanical deformations. The temperature and rate dependence of the tensile yield of PVC, as reported in early studies, has been modeled to high levels of accuracy. Tensile experiments on PET reported here are analyzed similarly and good accuracy is also achieved. The frequently observed increase in the gradient of the plot of yield stress against logarithm of strain rate is an inherent feature of the constitutive model. The form of temperature dependence of the yield that is predicted by the model is found to give an accurate representation. The constitutive model is developed in two-dimensional form and implemented as a user-defined subroutine in the finite element package ABAQUS. This analysis is applied to the tensile experiments on PET, in some of which strain is localized in the form of shear bands and necks. These deformations are modeled with partial success, though adiabatic heating of the instability causes inaccuracies for this isothermal implementation of the model. The plastic mechanism has advantages over the Eyring process, is equally tractable, and presents no particular difficulties in implementation with finite elements.

  17. Fit reduced GUTS models online: From theory to practice.

    PubMed

    Baudrot, Virgile; Veber, Philippe; Gence, Guillaume; Charles, Sandrine

    2018-05-20

Mechanistic modeling approaches, such as the toxicokinetic-toxicodynamic (TKTD) framework, are promoted by international institutions such as the European Food Safety Authority and the Organization for Economic Cooperation and Development to assess the environmental risk of chemical products generated by human activities. TKTD models can encompass a large set of mechanisms describing the kinetics of compounds inside organisms (e.g., uptake and elimination) and their effect at the level of individuals (e.g., damage accrual, recovery, and death mechanism). Compared to classical dose-response models, TKTD approaches have many advantages, including accounting for temporal aspects of exposure and toxicity, considering data points all along the experiment and not only at the end, and making predictions for untested situations such as realistic exposure scenarios. Among TKTD models, the general unified threshold model of survival (GUTS) is among the most recent and innovative frameworks but is still underused in practice, especially by risk assessors, because specialist programming and statistical skills are necessary to run it. Making GUTS models easier to use through a new module freely available from the web platform MOSAIC (standing for MOdeling and StAtistical tools for ecotoxICology) should promote GUTS operability in support of the daily work of environmental risk assessors. This paper presents the main features of MOSAIC_GUTS: uploading of the experimental data, GUTS fitting analysis, and LCx estimates with their uncertainty. These features will be exemplified from literature data. Integr Environ Assess Manag 2018;00:000-000. © 2018 SETAC.

  18. Statistical Mechanics-Inspired Modeling of Heterogeneous Packet Transmission in Communication Networks

    DTIC Science & Technology

    2012-08-01

Soumik Sarkar, Kushal Mukherjee, and Asok Ray. [Only author and affiliation front matter (United Technologies Research Center, East Hartford, CT) is recoverable from this record; the abstract is not included.]

  19. Statistical Mechanical Derivation of Jarzynski's Identity for Thermostated Non-Hamiltonian Dynamics

    NASA Astrophysics Data System (ADS)

    Cuendet, Michel A.

    2006-03-01

    The recent Jarzynski identity (JI) relates thermodynamic free energy differences to nonequilibrium work averages. Several proofs of the JI have been provided on the thermodynamic level. They rely on assumptions such as equivalence of ensembles in the thermodynamic limit or weakly coupled infinite heat baths. However, the JI is widely applied to NVT computer simulations involving finite numbers of particles, whose equations of motion are strongly coupled to a few extra degrees of freedom modeling a thermostat. In this case, the above assumptions are no longer valid. We propose a statistical mechanical approach to the JI solely based on the specific equations of motion, without any further assumption. We provide a detailed derivation for the non-Hamiltonian Nosé-Hoover dynamics, which is routinely used in computer simulations to produce canonical sampling.

  20. A statistical approach to quasi-extinction forecasting.

    PubMed

    Holmes, Elizabeth Eli; Sabo, John L; Viscido, Steven Vincent; Fagan, William Fredric

    2007-12-01

Forecasting population decline to a certain critical threshold (the quasi-extinction risk) is one of the central objectives of population viability analysis (PVA), and such predictions figure prominently in the decisions of major conservation organizations. In this paper, we argue that accurate forecasting of a population's quasi-extinction risk does not necessarily require knowledge of the underlying biological mechanisms. Because of the stochastic and multiplicative nature of population growth, the ensemble behaviour of population trajectories converges to common statistical forms across a wide variety of stochastic population processes. This paper provides a theoretical basis for this argument. We show that the quasi-extinction surfaces of a variety of complex stochastic population processes (including age-structured, density-dependent and spatially structured populations) can be modelled by a simple stochastic approximation: the stochastic exponential growth process overlaid with Gaussian errors. Using simulated and real data, we show that this model can be estimated with 20-30 years of data and can provide relatively unbiased quasi-extinction risk estimates with confidence intervals considerably smaller than (0,1). This was found to be true even for simulated data derived from some of the noisiest population processes (density-dependent feedback, species interactions and strong age-structure cycling). A key advantage of statistical models is that their parameters and the uncertainty of those parameters can be estimated from time series data using standard statistical methods. In contrast, for most species of conservation concern, biologically realistic models must often be specified rather than estimated because of the limited data available for all the various parameters.
Biologically realistic models will always have a prominent place in PVA for evaluating specific management options which affect a single segment of a population, a single demographic rate, or different geographic areas. However, for forecasting quasi-extinction risk, statistical models that are based on the convergent statistical properties of population processes offer many advantages over biologically realistic models.
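The diffusion approximation at the heart of this argument, stochastic exponential growth overlaid with Gaussian errors, lends itself to a short Monte Carlo sketch of quasi-extinction risk; all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Diffusion approximation: log-abundance performs a Gaussian random walk
# with drift mu and process-noise scale sigma (illustrative values).
mu, sigma = -0.02, 0.1
n0, threshold = 1000.0, 100.0    # initial abundance and quasi-extinction level
horizon, reps = 50, 20000        # forecast horizon (steps) and replicates

logx = np.full(reps, np.log(n0))
hit = np.zeros(reps, dtype=bool)
for _ in range(horizon):
    logx += mu + sigma * rng.standard_normal(reps)
    hit |= logx <= np.log(threshold)   # record first passage below threshold

print(f"quasi-extinction risk within {horizon} steps: {hit.mean():.3f}")
```

In practice mu and sigma would be estimated from 20-30 years of census data (e.g., from the mean and variance of log-abundance increments), which is exactly the statistical tractability the abstract emphasizes.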

  1. Nonlinear dynamic mechanism of vocal tremor from voice analysis and model simulations

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Jiang, Jack J.

    2008-09-01

    Nonlinear dynamic analysis and model simulations are used to study the nonlinear dynamic characteristics of vocal folds with vocal tremor, which can typically be characterized by low-frequency modulation and aperiodicity. Tremor voices from patients with disorders such as paresis, Parkinson's disease, hyperfunction, and adductor spasmodic dysphonia show low-dimensional characteristics, differing from random noise. Correlation dimension analysis statistically distinguishes tremor voices from normal voices. Furthermore, a nonlinear tremor model is proposed to study the vibrations of the vocal folds with vocal tremor. Fractal dimensions and positive Lyapunov exponents demonstrate the evidence of chaos in the tremor model, where amplitude and frequency play important roles in governing vocal fold dynamics. Nonlinear dynamic voice analysis and vocal fold modeling may provide a useful set of tools for understanding the dynamic mechanism of vocal tremor in patients with laryngeal diseases.

  2. A new fracture mechanics model for multiple matrix cracks of SiC fiber reinforced brittle-matrix composites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okabe, T.; Takeda, N.; Komotori, J.

    1999-11-26

A new model is proposed for multiple matrix cracking in order to take into account the role of matrix-rich regions in the cross section in initiating crack growth. The model is used to predict the matrix cracking stress and the total number of matrix cracks. The model converts the matrix-rich regions into equivalent penny shape crack sizes and predicts the matrix cracking stress with a fracture mechanics crack-bridging model. The estimated distribution of matrix cracking stresses is used as statistical input to predict the number of matrix cracks. The results show good agreement with the experimental results from replica observations. Therefore, it is found that the matrix cracking behavior mainly depends on the distribution of matrix-rich regions in the composite.

  3. Eigenvector centrality is a metric of elastomer modulus, heterogeneity, and damage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welch, Jr., Paul Michael; Welch, Cynthia F.

Here, we present an application of eigenvector centrality to encode the connectivity of polymer networks resolved at the micro- and meso-scopic length scales. This method captures the relative importance of different nodes within the network structure and provides a route toward the development of a statistical mechanics model that correlates connectivity with mechanical response. This scheme may be informed by analytical and semi-analytical models for the network structure, or through direct experimental examination. It may be used to predict the reduction in mechanical performance for heterogeneous materials subjected to specific modes of damage. Here, we develop the method and demonstrate that it leads to the prediction of established trends in elastomers. We also apply the model to the case of a self-healing polymer network reported in the literature, extracting insight about the fraction of bonds broken and re-formed during strain and recovery.

  4. Sensitivity to imputation models and assumptions in receiver operating characteristic analysis with incomplete data

    PubMed Central

    Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M.

    2015-01-01

Modern statistical methods using incomplete data have been increasingly applied in a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used in evaluating diagnostic tests or biomarkers in medical research, has also become increasingly popular in both its development and application. While missing-data methods have been applied in ROC analysis, the impact of model mis-specification and/or assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. Particularly, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Depending on the coherency of the imputation model with the underlying data generation mechanism, our results show that MI generally leads to well-calibrated inferences under ignorable missingness mechanisms. PMID:26379316

  5. Eigenvector centrality is a metric of elastomer modulus, heterogeneity, and damage

    DOE PAGES

    Welch, Jr., Paul Michael; Welch, Cynthia F.

    2017-04-27

Here, we present an application of eigenvector centrality to encode the connectivity of polymer networks resolved at the micro- and meso-scopic length scales. This method captures the relative importance of different nodes within the network structure and provides a route toward the development of a statistical mechanics model that correlates connectivity with mechanical response. This scheme may be informed by analytical and semi-analytical models for the network structure, or through direct experimental examination. It may be used to predict the reduction in mechanical performance for heterogeneous materials subjected to specific modes of damage. Here, we develop the method and demonstrate that it leads to the prediction of established trends in elastomers. We also apply the model to the case of a self-healing polymer network reported in the literature, extracting insight about the fraction of bonds broken and re-formed during strain and recovery.

  6. Hydrologic controls on aperiodic spatial organization of the ridge-slough patterned landscape

    NASA Astrophysics Data System (ADS)

    Casey, Stephen T.; Cohen, Matthew J.; Acharya, Subodh; Kaplan, David A.; Jawitz, James W.

    2016-11-01

    A century of hydrologic modification has altered the physical and biological drivers of landscape processes in the Everglades (Florida, USA). Restoring the ridge-slough patterned landscape, a dominant feature of the historical system, is a priority but requires an understanding of pattern genesis and degradation mechanisms. Physical experiments to evaluate alternative pattern formation mechanisms are limited by the long timescales of peat accumulation and loss, necessitating model-based comparisons, where support for a particular mechanism is based on model replication of extant patterning and trajectories of degradation. However, multiple mechanisms yield a central feature of ridge-slough patterning (patch elongation in the direction of historical flow), limiting the utility of that characteristic for discriminating among alternatives. Using data from vegetation maps, we investigated the statistical features of ridge-slough spatial patterning (ridge density, patch perimeter, elongation, patch size distributions, and spatial periodicity) to establish more rigorous criteria for evaluating model performance and to inform controls on pattern variation across the contemporary system. Mean water depth explained significant variation in ridge density, total perimeter, and length : width ratios, illustrating an important pattern response to existing hydrologic gradients. Two independent analyses (2-D periodograms and patch size distributions) provide strong evidence against regular patterning, with the landscape exhibiting neither a characteristic wavelength nor a characteristic patch size, both of which are expected under conditions that produce regular patterns. Rather, landscape properties suggest robust scale-free patterning, indicating genesis from the coupled effects of local facilitation and a global negative feedback operating uniformly at the landscape scale. 
Critically, this challenges widespread invocation of scale-dependent negative feedbacks for explaining ridge-slough pattern origins. These results help discern among genesis mechanisms and provide an improved statistical description of the landscape that can be used to compare among model outputs, as well as to assess the success of future restoration projects.

  7. Beyond δ: Tailoring marked statistics to reveal modified gravity

    NASA Astrophysics Data System (ADS)

    Valogiannis, Georgios; Bean, Rachel

    2018-01-01

Models that seek to explain cosmic acceleration through modifications to general relativity (GR) evade stringent Solar System constraints through a restoring, screening mechanism. Down-weighting the high-density, screened regions in favor of the low-density, unscreened ones offers the potential to enhance the amount of information carried in such modified gravity models. In this work, we assess the performance of a new "marked" transformation and perform a systematic comparison with the clipping and logarithmic transformations, in the context of ΛCDM and the symmetron and f(R) modified gravity models. Performance is measured in terms of the fractional boost in the Fisher information and the signal-to-noise ratio (SNR) for these models relative to the statistics derived from the standard density distribution. We find that all three statistics provide improved Fisher boosts over the basic density statistics. The model parameters for the marked and clipped transformation that best enhance signals and the Fisher boosts are determined. We also show that the mark is useful both as a Fourier and real-space transformation; a marked correlation function also enhances the SNR relative to the standard correlation function, and can on mildly nonlinear scales show a significant difference between the ΛCDM and the modified gravity models. Our results demonstrate how a series of simple analytical transformations could dramatically increase the predicted information extracted on deviations from GR, from large-scale surveys, and give the prospect for a much more feasible potential detection.

  8. Quantum Statistical Mechanics on a Quantum Computer

    NASA Astrophysics Data System (ADS)

    Raedt, H. D.; Hams, A. H.; Michielsen, K.; Miyashita, S.; Saito, K.

    We describe a quantum algorithm to compute the density of states and thermal equilibrium properties of quantum many-body systems. We present results obtained by running this algorithm on a software implementation of a 21-qubit quantum computer for the case of an antiferromagnetic Heisenberg model on triangular lattices of different size.
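As a classical reference for what such a quantum algorithm estimates, the spectrum and thermal energy of the smallest antiferromagnetic Heisenberg unit of a triangular lattice, a single three-spin triangle, can be obtained by direct diagonalization. This sketch is a conventional classical computation, not the authors' quantum algorithm:

```python
import numpy as np

# Spin-1/2 operators (Pauli matrices / 2)
sx = np.array([[0, 1], [1, 0]]) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]]) / 2
I2 = np.eye(2)

def op(single, site, n=3):
    """Embed a single-spin operator at `site` in an n-spin Hilbert space."""
    mats = [I2] * n
    mats[site] = single
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

# Antiferromagnetic Heisenberg triangle: H = J * sum_<ij> S_i . S_j
J = 1.0
H = sum(J * (op(s, i) @ op(s, j))
        for i, j in [(0, 1), (1, 2), (0, 2)]
        for s in (sx, sy, sz))

E = np.linalg.eigvalsh(H)            # spectrum (density of states = levels + degeneracies)
beta = 2.0
Z = np.exp(-beta * E).sum()          # partition function
U = (E * np.exp(-beta * E)).sum() / Z  # thermal energy at inverse temperature beta
print(np.round(E, 6))
print(U)
```

The two fourfold-degenerate levels at E = ±3J/4 reflect the total-spin decomposition of the frustrated triangle; for larger lattices this exact approach scales exponentially, which is the motivation for the quantum algorithm.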

  9. A statistical framework for protein quantitation in bottom-up MS-based proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas

    2009-08-15

Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative, missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used to illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al., 2008) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu

  10. Is quantum theory a form of statistical mechanics?

    NASA Astrophysics Data System (ADS)

    Adler, S. L.

    2007-05-01

    We give a review of the basic themes of my recent book: Adler S L 2004 Quantum Theory as an Emergent Phenomenon (Cambridge: Cambridge University Press). We first give motivations for considering the possibility that quantum mechanics is not exact, but is instead an accurate asymptotic approximation to a deeper level theory. For this deeper level, we propose a non-commutative generalization of classical mechanics, that we call "trace dynamics", and we give a brief survey of how it works, considering for simplicity only the bosonic case. We then discuss the statistical mechanics of trace dynamics and give our argument that with suitable approximations, the Ward identities for trace dynamics imply that ensemble averages in the canonical ensemble correspond to Wightman functions in quantum field theory. Thus, quantum theory emerges as the statistical thermodynamics of trace dynamics. Finally, we argue that Brownian motion corrections to this thermodynamics lead to stochastic corrections to the Schrödinger equation, of the type that have been much studied in the "continuous spontaneous localization" model of objective state vector reduction. In appendices to the talk, we give details of the existence of a conserved operator in trace dynamics that encodes the structure of the canonical algebra, of the derivation of the Ward identities, and of the proof that the stochastically-modified Schrödinger equation leads to state vector reduction with Born rule probabilities.

  11. Towards a more accurate microscopic description of the moving contact line problem - incorporating nonlocal effects through a statistical mechanics framework

    NASA Astrophysics Data System (ADS)

    Nold, Andreas; Goddard, Ben; Sibley, David; Kalliadasis, Serafim

    2014-03-01

    Multiscale effects play a predominant role in wetting phenomena such as the moving contact line. An accurate description is of paramount interest for a wide range of industrial applications, yet it is a matter of ongoing research, due to the difficulty of incorporating different physical effects in one model. Important small-scale phenomena are corrections to the attractive fluid-fluid and wall-fluid forces in inhomogeneous density distributions, which often previously have been accounted for by the disjoining pressure in an ad-hoc manner. We systematically derive a novel model for the description of a single-component liquid-vapor multiphase system which inherently incorporates these nonlocal effects. This derivation, which is inspired by statistical mechanics in the framework of colloidal density functional theory, is critically discussed with respect to its assumptions and restrictions. The model is then employed numerically to study a moving contact line of a liquid fluid displacing its vapor phase. We show how nonlocal physical effects are inherently incorporated by the model and describe how classical macroscopic results for the contact line motion are retrieved. We acknowledge financial support from ERC Advanced Grant No. 247031 and Imperial College through a DTG International Studentship.

  12. Thrombectomy for ischemic stroke: meta-analyses of recurrent strokes, vasospasms, and subarachnoid hemorrhages.

    PubMed

    Emprechtinger, Robert; Piso, Brigitte; Ringleb, Peter A

    2017-03-01

    Mechanical thrombectomy with stent retrievers is an effective treatment for patients with ischemic stroke. Results of recent meta-analyses report that the treatment is safe. However, the endpoints recurrent stroke, vasospasms, and subarachnoid hemorrhage have not been evaluated sufficiently. Hence, we extracted data on these outcomes from the five recent thrombectomy trials (MR CLEAN, ESCAPE, REVASCAT, SWIFT PRIME, and EXTEND IA, published in 2015). Subsequently, we conducted meta-analyses for each outcome, reporting results from both the fixed and the random effects models. Three studies reported data on recurrent strokes. While the results did not reach statistical significance in the random effects model (despite a threefold elevated risk), the fixed effects model revealed a significantly higher rate of recurrent strokes after thrombectomy. Four studies reported data on subarachnoid hemorrhage. The higher pooled rates in the intervention groups were statistically significant in both the fixed and the random effects models. One study reported on vasospasms. We recorded 14 events in the intervention group and none in the control group. The efficacy of mechanical thrombectomy is not questioned, yet our results indicate an increased risk for recurrent strokes, subarachnoid hemorrhage, and vasospasms post-treatment. We therefore strongly recommend thorough surveillance of these adverse events in future clinical trials and routine registries.
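The discrepancy between the fixed and random effects results can be sketched numerically. Below is a minimal illustration of inverse-variance (fixed effect) versus DerSimonian-Laird (random effects) pooling; the log risk ratios and variances are invented for illustration and are not the actual trial data:

```python
import math

# Hypothetical per-trial log risk ratios and their variances (illustrative
# only, not data from MR CLEAN / ESCAPE / REVASCAT / SWIFT PRIME / EXTEND IA).
log_rr = [0.2, 1.4, 0.5]
var = [0.1, 0.15, 0.12]

def fixed_effect(y, v):
    """Inverse-variance weighted pooled estimate and its standard error."""
    w = [1.0 / vi for vi in v]
    pooled = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return pooled, se

def random_effects(y, v):
    """DerSimonian-Laird: add between-study variance tau^2 to each variance."""
    w = [1.0 / vi for vi in v]
    fe, _ = fixed_effect(y, v)
    q = sum(wi * (yi - fe) ** 2 for wi, yi in zip(w, y))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)
    w_star = [1.0 / (vi + tau2) for vi in v]
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se

fe, fe_se = fixed_effect(log_rr, var)
re, re_se = random_effects(log_rr, var)
print(f"fixed:  pooled RR = {math.exp(fe):.2f}, SE (log scale) = {fe_se:.2f}")
print(f"random: pooled RR = {math.exp(re):.2f}, SE (log scale) = {re_se:.2f}")
```

With heterogeneous trials the random effects standard error is wider, which is how an effect can be significant under the fixed model yet non-significant under the random model, as for the recurrent-stroke endpoint above.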

  13. Noisy coupled logistic maps in the vicinity of chaos threshold.

    PubMed

    Tirnakli, Ugur; Tsallis, Constantino

    2016-04-01

    We focus on a linear chain of N first-neighbor-coupled logistic maps in the vicinity of their edge of chaos in the presence of a common noise. This model, characterised by the coupling strength ϵ and the noise width σmax, was recently introduced by Pluchino et al. [Phys. Rev. E 87, 022910 (2013)]. They detected, for the time-averaged returns with characteristic return time τ, possible connections with q-Gaussians, the distributions which optimise, under appropriate constraints, the nonadditive entropy Sq, basis of nonextensive statistical mechanics. Here, we take a closer look at this model and numerically obtain probability distributions which exhibit a slight asymmetry for some parameter values, at variance with simple q-Gaussians. Nevertheless, over many decades, the fitting with q-Gaussians turns out to be numerically very satisfactory for wide regions of the parameter values, and we illustrate how the index q evolves with (N, τ, ϵ, σmax). It is nevertheless instructive to see how careful one must be in such numerical analyses. Overall, this work shows that physical and/or biological systems that are correctly mimicked by this model are thermostatistically related to nonextensive statistical mechanics when time-averaged relevant quantities are studied.
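The dynamical system itself is easy to sketch. The following is a minimal simulation of a chain of first-neighbor-coupled logistic maps near the chaos threshold, driven by a common additive noise; the map form f(x) = 1 - a x², the periodic boundary, and all parameter values are illustrative assumptions, not the published specification:

```python
import random

def simulate(N=50, steps=200, a=1.4011551890, eps=0.8, sigma=0.002, seed=1):
    """Chain of N first-neighbor-coupled logistic maps f(x) = 1 - a*x**2,
    with a near the chaos threshold a_c ~ 1.40115..., coupled with strength
    eps and driven by a common noise of width sigma (same realization for
    every site), on a ring."""
    rng = random.Random(seed)
    x = [rng.uniform(-1, 1) for _ in range(N)]
    for _ in range(steps):
        f = [1.0 - a * xi * xi for xi in x]
        noise = rng.uniform(-sigma, sigma)   # common noise shared by all maps
        x = [(1 - eps) * f[i]
             + 0.5 * eps * (f[(i - 1) % N] + f[(i + 1) % N])
             + noise
             for i in range(N)]
    return x

final = simulate()
print(min(final), max(final))
```

Time-averaged returns over a window τ of such trajectories are the quantities whose distributions were compared against q-Gaussians in the study above.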

  14. Retrieval Capabilities of Hierarchical Networks: From Dyson to Hopfield

    NASA Astrophysics Data System (ADS)

    Agliari, Elena; Barra, Adriano; Galluzzi, Andrea; Guerra, Francesco; Tantari, Daniele; Tavani, Flavia

    2015-01-01

    We consider statistical-mechanics models for spin systems built on hierarchical structures, which provide a simple example of non-mean-field framework. We show that the coupling decay with spin distance can give rise to peculiar features and phase diagrams much richer than their mean-field counterpart. In particular, we consider the Dyson model, mimicking ferromagnetism in lattices, and we prove the existence of a number of metastabilities, beyond the ordered state, which become stable in the thermodynamic limit. Such a feature is retained when the hierarchical structure is coupled with the Hebb rule for learning, hence mimicking the modular architecture of neurons, and gives rise to an associative network able to perform single pattern retrieval as well as multiple-pattern retrieval, depending crucially on the external stimuli and on the rate of interaction decay with distance; however, those emergent multitasking features reduce the network capacity with respect to the mean-field counterpart. The analysis is accomplished through statistical mechanics, Markov chain theory, signal-to-noise ratio technique, and numerical simulations in full consistency. Our results shed light on the biological complexity shown by real networks, and suggest future directions for understanding more realistic models.

  15. Noisy coupled logistic maps in the vicinity of chaos threshold

    NASA Astrophysics Data System (ADS)

    Tirnakli, Ugur; Tsallis, Constantino

    2016-04-01

    We focus on a linear chain of N first-neighbor-coupled logistic maps in the vicinity of their edge of chaos in the presence of a common noise. This model, characterised by the coupling strength ɛ and the noise width σmax, was recently introduced by Pluchino et al. [Phys. Rev. E 87, 022910 (2013)]. They detected, for the time-averaged returns with characteristic return time τ, possible connections with q-Gaussians, the distributions which optimise, under appropriate constraints, the nonadditive entropy Sq, basis of nonextensive statistical mechanics. Here, we take a closer look at this model and numerically obtain probability distributions which exhibit a slight asymmetry for some parameter values, at variance with simple q-Gaussians. Nevertheless, over many decades, the fitting with q-Gaussians turns out to be numerically very satisfactory for wide regions of the parameter values, and we illustrate how the index q evolves with (N, τ, ɛ, σmax). It is nevertheless instructive to see how careful one must be in such numerical analyses. Overall, this work shows that physical and/or biological systems that are correctly mimicked by this model are thermostatistically related to nonextensive statistical mechanics when time-averaged relevant quantities are studied.

  16. Generalized Models for Rock Joint Surface Shapes

    PubMed Central

    Du, Shigui; Hu, Yunjin; Hu, Xiaofei

    2014-01-01

    Generalized models of joint surface shapes are the foundation for mechanism studies on the mechanical effects of rock joint surface shapes. Based on extensive field investigations of rock joint surface shapes, generalized models for three level shapes, named macroscopic outline, surface undulating shape, and microcosmic roughness, were established through statistical analyses of 20,078 rock joint surface profiles. The relative amplitude of profile curves was used as a borderline for the division of different level shapes. The study results show that the macroscopic outline has three basic forms: planar, arc-shaped, and stepped; the surface undulating shape has three basic forms: planar, undulating, and stepped; and the microcosmic roughness has two basic forms: smooth and rough. PMID:25152901

  17. Torsion of DNA modeled as a heterogeneous fluctuating rod

    NASA Astrophysics Data System (ADS)

    Argudo, David; Purohit, Prashant K.

    2014-01-01

    We discuss the statistical mechanics of a heterogeneous elastic rod with bending, twisting and stretching. Our model goes beyond earlier works where only homogeneous rods were considered in the limit of high forces and long lengths. Our methods allow us to consider shorter fluctuating rods for which boundary conditions can play an important role. We use our theory to study structural transitions in torsionally constrained DNA where there is coexistence of states with different effective properties. In particular, we examine whether a newly discovered left-handed DNA conformation called L-DNA is a mixture of two known states. We also use our model to investigate the mechanical effects of the binding of small molecules to DNA. For both these applications we make experimentally falsifiable predictions.

  18. Shear Band Formation in Plastic-Bonded Explosives (PBX)

    NASA Astrophysics Data System (ADS)

    Dey, Thomas N.; Johnson, James N.

    1997-07-01

    Adiabatic shear bands can be a source of ignition and lead to detonation. At low to moderate deformation rates, 10--1000 s-1, two other mechanisms can also give rise to shear bands. These mechanisms are: (1) softening caused by micro-cracking and (2) a constitutive response with a non-associated flow rule, as is observed in granular materials such as soil. Brittle behavior at small strains and the granular nature of HMX suggest that PBX-9501 constitutive behavior may be similar to sand. A constitutive model for each of these mechanisms is studied in a series of calculations. A viscoelastic constitutive model for PBX-9501 softens via a statistical crack model, based on the work of Dienes (1986). A sand model is used to provide a non-associated flow rule. Both models generate shear band formation at 1--2% strain at nominal strain rates at and below 1000 s-1. Shear band formation is suppressed at higher strain rates. The sand model gives qualitative agreement for location and orientation of shear bands observed in a punch experiment. Both mechanisms may accelerate the formation of adiabatic shear bands.

  19. Occupational Injury and Illness Surveillance: Conceptual Filters Explain Underreporting

    PubMed Central

    Azaroff, Lenore S.; Levenstein, Charles; Wegman, David H.

    2002-01-01

    Occupational health surveillance data are key to effective intervention. However, the US Bureau of Labor Statistics survey significantly underestimates the incidence of work-related injuries and illnesses. Researchers supplement these statistics with data from other systems not designed for surveillance. The authors apply the filter model of Webb et al. to underreporting by the Bureau of Labor Statistics, workers’ compensation wage-replacement documents, physician reporting systems, and medical records of treatment charged to workers’ compensation. Mechanisms are described for the loss of cases at successive steps of documentation. Empirical findings indicate that workers repeatedly risk adverse consequences for attempting to complete these steps, while systems for ensuring their completion are weak or absent. PMID:12197968

  20. Statistical Mechanics of the US Supreme Court

    NASA Astrophysics Data System (ADS)

    Lee, Edward D.; Broedersz, Chase P.; Bialek, William

    2015-07-01

    We build simple models for the distribution of voting patterns in a group, using the Supreme Court of the United States as an example. The maximum entropy model consistent with the observed pairwise correlations among justices' votes, an Ising spin glass, agrees quantitatively with the data. While all correlations (perhaps surprisingly) are positive, the effective pairwise interactions in the spin glass model have both signs, recovering the intuition that ideologically opposite justices negatively influence each other. Despite the competing interactions, a strong tendency toward unanimity emerges from the model, organizing the voting patterns in a relatively simple "energy landscape." Besides unanimity, other energy minima in this landscape, or maxima in probability, correspond to prototypical voting states, such as the ideological split or a tightly correlated, conservative core. The model correctly predicts the correlation of justices with the majority and gives us a measure of their influence on the majority decision. These results suggest that simple models, grounded in statistical physics, can capture essential features of collective decision making quantitatively, even in a complex political context.
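The maximum entropy construction can be illustrated at toy scale. The sketch below enumerates a pairwise Ising model for four hypothetical voters with invented fields and couplings (the real model fits these parameters to the observed pairwise vote correlations of nine justices) and shows a unanimous state emerging as the most probable configuration:

```python
import itertools
import math

# Toy pairwise maximum-entropy (Ising) model; n = 4 for full enumeration.
# The fields h and couplings J (note one negative coupling) are made up.
n = 4
h = [0.1, -0.1, 0.2, 0.0]
J = {(0, 1): 0.5, (0, 2): -0.3, (1, 2): 0.4,
     (2, 3): 0.6, (0, 3): 0.2, (1, 3): 0.1}

def energy(s):
    """E(s) = -sum_i h_i s_i - sum_{i<j} J_ij s_i s_j, with s_i in {-1, +1}."""
    e = -sum(hi * si for hi, si in zip(h, s))
    e -= sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
    return e

# Boltzmann distribution over all 2^n vote configurations.
states = list(itertools.product([-1, 1], repeat=n))
weights = [math.exp(-energy(s)) for s in states]
Z = sum(weights)
probs = {s: w / Z for s, w in zip(states, weights)}

# Despite the competing (mixed-sign) couplings, the most probable state
# here is unanimous, echoing the tendency toward unanimity in the model.
top = max(probs, key=probs.get)
print(top, round(probs[top], 3))
```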

  1. Universal algorithm for identification of fractional Brownian motion. A case of telomere subdiffusion.

    PubMed

    Burnecki, Krzysztof; Kepten, Eldad; Janczura, Joanna; Bronshtein, Irena; Garini, Yuval; Weron, Aleksander

    2012-11-07

    We present a systematic statistical analysis of the recently measured individual trajectories of fluorescently labeled telomeres in the nucleus of living human cells. The experiments were performed in the U2OS cancer cell line. We propose an algorithm for identification of the telomere motion. By expanding the previously published data set, we are able to explore the dynamics across six orders of magnitude in time, a task not possible earlier. As a result, we establish a rigorous mathematical characterization of the stochastic process and identify the basic mathematical mechanisms behind the telomere motion. We find that the increments of the motion are stationary, Gaussian, ergodic, and, even more strongly, mixing. Moreover, the obtained memory parameter estimates, as well as the ensemble average mean square displacement, reveal subdiffusive behavior at all time spans. All these findings statistically prove a fractional Brownian motion for the telomere trajectories, which is confirmed by a generalized p-variation test. Taking into account the biophysical nature of telomeres as monomers in the chromatin chain, we suggest polymer dynamics as a sufficient framework for their motion, with no need to invoke other models. In addition, these results shed light on other studies of telomere motion and the alternative telomere lengthening mechanism. We hope that identification of these mechanisms will allow the development of a proper physical and biological model for telomere subdynamics. This array of tests can be easily implemented on other data sets to enable quick and accurate analysis of their statistical characteristics. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.
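One ingredient of such an analysis, the scaling of the mean squared displacement, can be sketched simply. The snippet below estimates the anomalous-diffusion exponent α from MSD(lag) ∝ lag^α for a simulated ordinary Brownian trajectory (H = 1/2, so α ≈ 1); for the subdiffusive fBm identified above one would expect α = 2H < 1. This is a simplified check, not the generalized p-variation test of the paper:

```python
import math
import random

def msd_exponent(traj, max_lag=50):
    """Estimate alpha in MSD(lag) ~ lag**alpha from the time-averaged MSD,
    via a least-squares slope in log-log coordinates."""
    logs = []
    for lag in range(1, max_lag + 1):
        msd = sum((traj[i + lag] - traj[i]) ** 2
                  for i in range(len(traj) - lag)) / (len(traj) - lag)
        logs.append((math.log(lag), math.log(msd)))
    n = len(logs)
    mx = sum(x for x, _ in logs) / n
    my = sum(y for _, y in logs) / n
    return (sum((x - mx) * (y - my) for x, y in logs)
            / sum((x - mx) ** 2 for x, _ in logs))

rng = random.Random(0)
# Ordinary Brownian motion (H = 1/2): the estimate should come out near 1.
bm = [0.0]
for _ in range(20000):
    bm.append(bm[-1] + rng.gauss(0, 1))
print(round(msd_exponent(bm), 2))
```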

  2. Entropic Repulsion Between Fluctuating Surfaces

    NASA Astrophysics Data System (ADS)

    Janke, W.

    The statistical mechanics of fluctuating surfaces plays an important role in a variety of physical systems, ranging from biological membranes to world sheets of strings in theories of fundamental interactions. In many applications it is a good approximation to assume that the surfaces possess no tension. Their statistical properties are then governed by curvature energies only, which allow for gigantic out-of-plane undulations. These fluctuations are the “entropic” origin of long-range repulsive forces in layered surface systems. Theoretical estimates of these forces for simple model surfaces are surveyed and compared with recent Monte Carlo simulations.

  3. Path statistics, memory, and coarse-graining of continuous-time random walks on networks

    PubMed Central

    Kion-Crosby, Willow; Morozov, Alexandre V.

    2015-01-01

    Continuous-time random walks (CTRWs) on discrete state spaces, ranging from regular lattices to complex networks, are ubiquitous across physics, chemistry, and biology. Models with coarse-grained states (for example, those employed in studies of molecular kinetics) or spatial disorder can give rise to memory and non-exponential distributions of waiting times and first-passage statistics. However, existing methods for analyzing CTRWs on complex energy landscapes do not address these effects. Here we use statistical mechanics of the nonequilibrium path ensemble to characterize first-passage CTRWs on networks with arbitrary connectivity, energy landscape, and waiting time distributions. Our approach can be applied to calculating higher moments (beyond the mean) of path length, time, and action, as well as statistics of any conservative or non-conservative force along a path. For homogeneous networks, we derive exact relations between length and time moments, quantifying the validity of approximating a continuous-time process with its discrete-time projection. For more general models, we obtain recursion relations, reminiscent of transfer matrix and exact enumeration techniques, to efficiently calculate path statistics numerically. We have implemented our algorithm in PathMAN (Path Matrix Algorithm for Networks), a Python script that users can apply to their model of choice. We demonstrate the algorithm on a few representative examples which underscore the importance of non-exponential distributions, memory, and coarse-graining in CTRWs. PMID:26646868
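A Monte Carlo version of first-passage statistics on a small network is easy to sketch (the PathMAN tool described above instead computes such moments exactly via recursion relations). The network, exit rates, and exponential waiting times below are illustrative assumptions:

```python
import random

# Small ring network: node -> neighbors, jumps chosen uniformly at random.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
# Exit rate per node: waiting times are exponential (memoryless CTRW).
rates = {0: 1.0, 1: 2.0, 2: 1.0, 3: 0.5}

def first_passage(start, target, rng):
    """One CTRW realization: return (path length in steps, elapsed time)."""
    node, steps, t = start, 0, 0.0
    while node != target:
        t += rng.expovariate(rates[node])   # exponential waiting time
        node = rng.choice(adj[node])        # unbiased jump to a neighbor
        steps += 1
    return steps, t

rng = random.Random(42)
samples = [first_passage(0, 2, rng) for _ in range(20000)]
mean_len = sum(s for s, _ in samples) / len(samples)
mean_t = sum(t for _, t in samples) / len(samples)
print(round(mean_len, 2), round(mean_t, 2))
```

For this ring the exact mean path length from node 0 to the opposite node is 4 steps, and solving the linear first-passage equations with these rates gives a mean time of 4.5; replacing `expovariate` with a heavy-tailed waiting-time draw is the kind of change that breaks the simple proportionality between length and time moments discussed above.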

  4. Beyond δ: Tailoring marked statistics to reveal modified gravity

    NASA Astrophysics Data System (ADS)

    Valogiannis, Georgios; Bean, Rachel

    2018-01-01

    Models which attempt to explain the accelerated expansion of the universe through large-scale modifications to General Relativity (GR), must satisfy the stringent experimental constraints of GR in the solar system. Viable candidates invoke a “screening” mechanism, that dynamically suppresses deviations in high density environments, making their overall detection challenging even for ambitious future large-scale structure surveys. We present methods to efficiently simulate the non-linear properties of such theories, and consider how a series of statistics that reweight the density field to accentuate deviations from GR can be applied to enhance the overall signal-to-noise ratio in differentiating the models from GR. Our results demonstrate that the cosmic density field can yield additional, invaluable cosmological information, beyond the simple density power spectrum, that will enable surveys to more confidently discriminate between modified gravity models and ΛCDM.

  5. Scenarios for Evolving Seismic Crises: Possible Communication Strategies

    NASA Astrophysics Data System (ADS)

    Steacy, S.

    2015-12-01

    Recent advances in operational earthquake forecasting mean that we are very close to being able to confidently compute changes in earthquake probability as seismic crises develop. For instance, we now have statistical models such as ETAS and STEP which demonstrate considerable skill in forecasting earthquake rates, and recent advances in Coulomb-based models are also showing much promise. Communicating changes in earthquake probability is likely to be very difficult, however, as the absolute probability of a damaging event is likely to remain quite small despite a significant increase in the relative value. Here, we use a hybrid Coulomb/statistical model to compute probability changes for a series of earthquake scenarios in New Zealand. We discuss the strengths and limitations of the forecasts and suggest a number of possible mechanisms that might be used to communicate results in an actual developing seismic crisis.
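The kind of probability statement involved can be sketched with an ETAS-type conditional intensity, where the rate is a background term plus Omori-law aftershock contributions, and the window probability follows from a Poisson assumption. All parameter values and the event list below are invented for illustration and are not calibrated to New Zealand seismicity:

```python
import math

# Illustrative ETAS parameters: background rate mu, productivity K, alpha,
# Omori constants c and p, reference magnitude m0 (all made up).
mu, K, alpha, c, p, m0 = 0.02, 0.05, 1.0, 0.01, 1.2, 4.0
events = [(0.0, 6.5), (0.5, 5.1), (2.0, 4.8)]   # (time in days, magnitude)

def etas_rate(t):
    """Conditional intensity lambda(t) = mu + sum of Omori-type terms."""
    rate = mu
    for ti, mi in events:
        if ti < t:
            rate += K * math.exp(alpha * (mi - m0)) / (t - ti + c) ** p
    return rate

def prob_event(t0, t1, n=1000):
    """P(at least one event in [t0, t1]) = 1 - exp(-integral of lambda),
    with the integral approximated by the midpoint rule."""
    dt = (t1 - t0) / n
    integral = sum(etas_rate(t0 + (k + 0.5) * dt) for k in range(n)) * dt
    return 1.0 - math.exp(-integral)

print(round(prob_event(3.0, 4.0), 3))
```

The output illustrates the communication problem noted above: the relative rate increase after the mainshock is large, yet the absolute probability of an event in a given window stays small and decays as the sequence ages.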

  6. Distinguishing synchronous and time-varying synergies using point process interval statistics: motor primitives in frog and rat

    PubMed Central

    Hart, Corey B.; Giszter, Simon F.

    2013-01-01

    We present and apply a method that uses point process statistics to discriminate the forms of synergies in motor pattern data, prior to explicit synergy extraction. The method uses electromyogram (EMG) pulse peak timing or onset timing. Peak timing is preferable in complex patterns where pulse onsets may be overlapping. An interval statistic derived from the point processes of EMG peak timings distinguishes time-varying synergies from synchronous synergies (SS). Model data show that the statistic is robust for most conditions. Its application to both frog hindlimb EMG and rat locomotion hindlimb EMG shows that data from these preparations are clearly most consistent with synchronous synergy models (p < 0.001). Additional direct tests of pulse and interval relations in frog data further bolster the support for synchronous synergy mechanisms in these data. Our method and analyses support separated control of rhythm and pattern of motor primitives, with the low-level execution primitives comprising pulsed SS in both frog and rat, and both episodic and rhythmic behaviors. PMID:23675341

  7. Annual West Coast Theoretical Chemistry/Statistical Mechanics Conference (14th), held at Los Angeles, California on 17-19 June 1993

    DTIC Science & Technology

    1993-06-19

    of California, Los Angeles, CA AB-INITIO STUDIES OF WATER CLUSTERS Sotiris S. Xantheas and Thom H. Dunning Jr., Molecular Theory Group, Molecular...mechanics, of the solvation properties of a chloride ion in polarizable water. In these studies, we employed the polarizable water model developed recently by...acetaldehyde, hydrogen sulfide with 2-hydroxy-3-methylbutan-2-yl, and lithium hydride with methyl isopropyl ketone. The largest system studied...contains

  8. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions as well as Monte Carlo analysis capability are included to enable statistical performance evaluations.

  9. Drainage characteristics of the 3F MicroStent using a novel film occlusion anchoring mechanism.

    PubMed

    Lange, Dirk; Hoag, Nathan A; Poh, Beow Kiong; Chew, Ben H

    2011-06-01

    To determine whether the overall ureteral flow through an obstructed ureter using the 3F MicroStent™, which uses a novel film occlusion anchoring mechanism, is comparable to the flow using a conventional 3F or 4.7F Double-J stent. An in vitro silicone ureter model and an ex vivo porcine urinary model (kidney and ureter) were used to measure the overall flow through obstructed and unobstructed ureters with either a 3F Double-J stent (Cook), 3F MicroStent (PercSys), or 4.7F Double-J stent (Cook). Mean flow rates were compared with descriptive statistics. Mean flow rates through the obstructed silicone ureter (12-mm stone) for the 3F MicroStent, 3F Double-J stent, and 4.7F Double-J stent were 326.7±13.3 mL/min, 283.3±19.2 mL/min, and 356.7±14.1 mL/min, respectively. In the obstructed ex vivo porcine ureter model, the flow as a percentage of free flow was 60%, 53%, and 50%, respectively. In both ureteral models, flow rates of the 3F MicroStent and 4.7F Double-J stents were not statistically different. The 3F MicroStent demonstrated drainage equivalent to a 4.7F Double-J stent in both in vitro silicone and ex vivo porcine obstructed urinary models. We have demonstrated the crucial first step that this 3F stent, using a novel film occlusion anchoring mechanism, has equivalent, if not slightly improved, drainage rates when compared with its larger counterpart.

  10. The role of internal duplication in the evolution of multi-domain proteins.

    PubMed

    Nacher, J C; Hayashida, M; Akutsu, T

    2010-08-01

    Many proteins consist of several structural domains. These multi-domain proteins have likely been generated by selective genome growth dynamics during evolution to perform new functions as well as to create structures that fold on a biologically feasible time scale. Domain units frequently evolved through a variety of genetic shuffling mechanisms. Here we examine the protein domain statistics of more than 1000 organisms including eukaryotic, archaeal and bacterial species. The analysis extends earlier findings on asymmetric statistical laws for proteomes to a wider variety of species. While proteins are composed of a wide range of domains, displaying a power-law decay, the computation of domain families for each protein reveals an exponential distribution, characterizing a protein universe composed of a small number of unique families. Structural studies in proteomics have shown that domain repeats, or internally duplicated domains, represent a small but significant fraction of the genome. In spite of its importance, this observation has been largely overlooked until recently. We model the evolutionary dynamics of the proteome and demonstrate that these distinct distributions are in fact rooted in an internal duplication mechanism. This process generates the contemporary protein structural domain universe, limits its breadth, and tames its growth. These findings have important implications, ranging from protein interaction network modeling to evolutionary studies based on fundamental mechanisms governing genome expansion.

  11. Epidemics in Ming and Qing China: Impacts of changes of climate and economic well-being.

    PubMed

    Pei, Qing; Zhang, David D; Li, Guodong; Winterhalder, Bruce; Lee, Harry F

    2015-07-01

    We investigated the mechanism of epidemics under the impacts of climate change and socio-economic fluctuations in the Ming and Qing Dynasties in China (AD 1368-1901). Using long-term and high-quality datasets, this study is the first quantitative research to verify the 'climate change → economy → epidemics' mechanism in historical China by statistical methods, including correlation analysis, Granger causality analysis, ARX, and Poisson-ARX modeling. The analysis provides evidence that climate change is only the fundamental driver of epidemic spread and occurrence, while depressed economic well-being is the direct trigger, at the national and long-term scale, in historical China. Moreover, statistical modeling shows that economic well-being is more important than population pressure in the mechanism of epidemics. However, population pressure remains a key element in determining the social vulnerability to epidemic occurrence under climate change. Notably, the findings not only support adaptation theories but also strengthen our confidence that climatic shocks can be addressed if economic buffering capacity is promoted steadily. The findings can serve as a basis for scientists and policymakers in addressing global and regional environmental changes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Cooperativity and modularity in protein folding

    PubMed Central

    Sasai, Masaki; Chikenji, George; Terada, Tomoki P.

    2016-01-01

    A simple statistical mechanical model proposed by Wako and Saitô has explained the aspects of protein folding surprisingly well. This model was systematically applied to multiple proteins by Muñoz and Eaton and has since been referred to as the Wako-Saitô-Muñoz-Eaton (WSME) model. The success of the WSME model in explaining the folding of many proteins has verified the hypothesis that the folding is dominated by native interactions, which makes the energy landscape globally biased toward native conformation. Using the WSME and other related models, Saitô emphasized the importance of the hierarchical pathway in protein folding; folding starts with the creation of contiguous segments having a native-like configuration and proceeds as growth and coalescence of these segments. The Φ-values calculated for barnase with the WSME model suggested that segments contributing to the folding nucleus are similar to the structural modules defined by the pattern of native atomic contacts. The WSME model was extended to explain folding of multi-domain proteins having a complex topology, which opened the way to comprehensively understanding the folding process of multi-domain proteins. The WSME model was also extended to describe allosteric transitions, indicating that the allosteric structural movement does not occur as a deterministic sequential change between two conformations but as a stochastic diffusive motion over the dynamically changing energy landscape. Statistical mechanical viewpoint on folding, as highlighted by the WSME model, has been renovated in the context of modern methods and ideas, and will continue to provide insights on equilibrium and dynamical features of proteins. PMID:28409080
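The flavor of the WSME model can be sketched in the single-sequence approximation, in which at most one contiguous stretch of residues is native and a native contact contributes energy only when the whole intervening segment is native. The contact map, contact energy, and entropy cost below are invented for illustration, not taken from a real protein:

```python
import math

# Toy Wako-Saito-Munoz-Eaton (WSME) sketch, single-sequence approximation.
contacts = {(0, 3), (1, 4), (2, 5), (4, 7), (5, 8), (6, 9)}  # hypothetical map
n = 10
eps = -2.0   # energy per formed native contact (favorable)
d_s = 1.0    # entropy lost per residue fixed in its native conformation

def segment_weight(i, j, T):
    """Boltzmann weight of the stretch [i, j] being native at temperature T:
    a contact (a, b) forms only if the whole interval a..b lies inside [i, j]."""
    e = eps * sum(1 for (a, b) in contacts if i <= a and b <= j)
    return math.exp(-(e + T * d_s * (j - i + 1)) / T)

def native_fraction(T):
    """Thermal average of the fraction of native residues."""
    z, avg = 1.0, 0.0            # the "1" is the fully unfolded state
    for i in range(n):
        for j in range(i, n):
            w = segment_weight(i, j, T)
            z += w
            avg += w * (j - i + 1)
    return avg / (z * n)

for T in (0.5, 1.0, 3.0):
    print(T, round(native_fraction(T), 3))
```

The sweep over temperature shows the cooperative character the model is known for: the native fraction is close to 1 at low T and drops off as the entropy term overwhelms the contact energies.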

  13. Validating Lung Models Using the ASL 5000 Breathing Simulator.

    PubMed

    Dexter, Amanda; McNinch, Neil; Kaznoch, Destiny; Volsko, Teresa A

    2018-04-01

    This study sought to validate pediatric models with normal and altered pulmonary mechanics. PubMed and CINAHL databases were searched for studies directly measuring pulmonary mechanics of healthy infants and children, infants with severe bronchopulmonary dysplasia, and children with neuromuscular disease. The ASL 5000 was used to construct models using tidal volume (VT), inspiratory time (TI), respiratory rate, resistance, compliance, and esophageal pressure gleaned from the literature. Data were collected for a 1-minute period and the collection repeated three times for each model. t tests compared modeled data with data abstracted from the literature. Repeated measures analyses evaluated model performance over multiple iterations. Statistical significance was established at P < 0.05. Maximum differences of means (experimental iteration mean - clinical standard mean) for TI and VT were as follows: term infant without lung disease (TI = 0.09 s, VT = 0.29 mL), severe bronchopulmonary dysplasia (TI = 0.08 s, VT = 0.17 mL), child without lung disease (TI = 0.10 s, VT = 0.17 mL), and child with neuromuscular disease (TI = 0.09 s, VT = 0.57 mL). One-sample testing demonstrated statistically significant differences between clinical controls and the VT and TI values produced by the ASL 5000 for each iteration and model (P < 0.01). However, the greatest magnitude of these differences was small (VT < 1.6%, TI = 18%) and not clinically relevant. Although minor inconsistencies occurred with the models constructed on the ASL 5000, it was deemed accurate for the study purposes. It is therefore essential to test models and evaluate the magnitude of differences before use.

  14. Probabilistic Material Strength Degradation Model for Inconel 718 Components Subjected to High Temperature, Mechanical Fatigue, Creep and Thermal Fatigue Effects

    NASA Technical Reports Server (NTRS)

    Bast, Callie Corinne Scheidt

    1994-01-01

    This thesis presents the on-going development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes four effects that typically reduce lifetime strength: high temperature, mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for four variables, namely, high temperature, mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using the current version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of mechanical fatigue, creep, and thermal fatigue was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing a combination of mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.

  15. Statistical steady states in turbulent droplet condensation

    NASA Astrophysics Data System (ADS)

    Bec, Jeremie; Krstulovic, Giorgio; Siewert, Christoph

    2017-11-01

We investigate the general problem of turbulent condensation. Using direct numerical simulations we show that the fluctuations of the supersaturation field offer different conditions for the growth of droplets which evolve in time due to turbulent transport and mixing. This leads us to propose a Lagrangian stochastic model consisting of a set of integro-differential equations for the joint evolution of the squared radius and the supersaturation along droplet trajectories. The model has two parameters fixed by the total amount of water and the thermodynamic properties, as well as the Lagrangian integral timescale of the turbulent supersaturation. The model reproduces very well the droplet size distributions obtained from direct numerical simulations and their time evolution. A noticeable result is that, after a stage where the squared radius simply diffuses, the system converges exponentially fast to a statistical steady state independent of the initial conditions. The main mechanism involved in this convergence is a loss of memory induced by a significant number of droplets undergoing a complete evaporation before growing again. The statistical steady state is characterised by an exponential tail in the droplet mass distribution.
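A minimal numerical sketch of such a Lagrangian model, assuming an Ornstein-Uhlenbeck surrogate for the supersaturation along trajectories and linear growth of the squared radius (the coefficients are illustrative, not the paper's calibrated values):

```python
import numpy as np

rng = np.random.default_rng(1)
n, dt, steps = 2000, 0.01, 5000
tau = 1.0      # Lagrangian integral timescale of the supersaturation (assumed)
kappa = 1.0    # condensational growth constant (assumed)
r2 = np.full(n, 1.0)     # squared radii of the droplet ensemble
s = np.zeros(n)          # supersaturation along each trajectory
for _ in range(steps):
    # squared radius grows (or shrinks) linearly with local supersaturation;
    # droplets that evaporate completely are floored at zero and may regrow
    r2 = np.maximum(r2 + kappa * s * dt, 0.0)
    # Ornstein-Uhlenbeck surrogate for the turbulent supersaturation
    s += -s / tau * dt + np.sqrt(2.0 / tau * dt) * rng.normal(size=n)
mean_r2 = float(r2.mean())
```

The floor at zero is the "complete evaporation before growing again" mechanism responsible for the loss of memory described in the abstract.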

  16. Towards a statistical mechanical theory of active fluids.

    PubMed

    Marini Bettolo Marconi, Umberto; Maggi, Claudio

    2015-12-07

We present a stochastic description of a model of N mutually interacting active particles in the presence of external fields and characterize its steady state behavior in the absence of currents. To reproduce the effects of the experimentally observed persistence of the trajectories of the active particles we consider a Gaussian force having a non-vanishing correlation time τ, whose finiteness is a measure of the activity of the system. With these ingredients we show that it is possible to develop a statistical mechanical approach similar to the one employed in the study of equilibrium liquids and to obtain the explicit form of the many-particle distribution function by means of the multidimensional unified colored noise approximation. Such a distribution plays a role analogous to the Gibbs distribution in equilibrium statistical mechanics and provides complete information about the microscopic state of the system. From here we develop a method to determine the one- and two-particle distribution functions in the spirit of the Born-Green-Yvon (BGY) equations of equilibrium statistical mechanics. The resulting equations, which contain extra correlations induced by the activity, allow us to determine the stationary density profiles in the presence of external fields, the pair correlations and the pressure of active fluids. In the low density regime we obtain the effective pair potential ϕ(r) acting between two isolated particles separated by a distance r, showing the existence of an effective attraction between them induced by activity. Based on these results, in the second half of the paper we propose a mean field theory as an approach simpler than the BGY hierarchy and use it to derive a van der Waals expression of the equation of state.
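The basic ingredient can be illustrated with a single active particle in one dimension: a harmonic trap plus a Gaussian force with correlation time τ. For this linear case the colored-noise stationary variance D/(k(1+kτ)) is exact, and a direct simulation should reproduce it (parameters are illustrative; this is a sketch, not the paper's many-body calculation):

```python
import numpy as np

rng = np.random.default_rng(2)
k, tau, D = 1.0, 0.5, 1.0   # trap stiffness, noise correlation time, noise strength
dt, steps, n = 0.001, 20_000, 5000
x = np.zeros(n)             # particle positions
eta = np.zeros(n)           # exponentially correlated "active" forces
for _ in range(steps):
    x += (-k * x + eta) * dt
    eta += -eta / tau * dt + np.sqrt(2.0 * D / tau**2 * dt) * rng.normal(size=n)
var_x = float(x.var())
# colored-noise prediction for the harmonic trap; tau -> 0 recovers D/k
var_predicted = D / (k * (1.0 + k * tau))
```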

  17. Numerical model for thermodynamical behaviors of unsaturated soil

    NASA Astrophysics Data System (ADS)

    Miyamoto, Yuji; Yamada, Mitsuhide; Sako, Kazunari; Araki, Kohei; Kitamura, Ryosuke

Kitamura et al. have proposed numerical models to establish an unsaturated soil mechanics aided by probability theory and statistics, and to apply it to a geo-simulator, for which a numerical model of the thermodynamical behavior of unsaturated soil is essential. In this paper thermodynamics is introduced to investigate the heat transfer through unsaturated soil and the evaporation of pore water in soil, based on the first and second laws of thermodynamics, i.e., the conservation of energy and the increase of entropy. In addition, lysimeter equipment is used to obtain data on the evaporation of pore water during fine days and the seepage of rain water during rainy days. The numerical simulation is carried out using the proposed numerical model, and the results are compared with those obtained from the lysimeter test.

  18. HINDERED DIFFUSION OF COAL LIQUIDS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theodore T. Tsotsis; Muhammad Sahimi; Ian A. Webster

    1996-01-01

    It was the purpose of the project described here to carry out careful and detailed investigations of petroleum and coal asphaltene transport through model porous systems under a broad range of temperature conditions. The experimental studies were to be coupled with detailed, in-depth statistical and molecular dynamics models intended to provide a fundamental understanding of the overall transport mechanisms and a more accurate concept of the asphaltene structure. The following discussion describes some of our accomplishments.

  19. Internet Addiction and Psychosocial Maladjustment: Avoidant Coping and Coping Inflexibility as Psychological Mechanisms.

    PubMed

    Cheng, Cecilia; Sun, Peizhen; Mak, Kwok-Kei

    2015-09-01

This 6-month prospective study systematically tested multivariate models that advance the understanding of the psychological mechanisms underlying Internet addiction and psychosocial maladjustment. On the basis of previous theories, avoidant coping and coping inflexibility were proposed as underlying mechanisms. Participants were 271 Chinese undergraduates (75% women, Mage=20.49) who took part in both phases of this study. Structural equation modeling was performed to obtain the best fit models for both the cross-sectional and the prospective data. The cross-sectional model testing revealed statistically significant mediating effects for both avoidant coping (β=0.149 [95% CI 0.071-0.226], p=0.002) and coping flexibility (β=0.048 [95% CI 0.013-0.081], p=0.032). The prospective model testing further showed that avoidant coping mediated the relationship between Internet addiction and Time 2 psychosocial maladjustment (β=0.141 [95% CI 0.065-0.216], p=0.005), as well as that between coping flexibility and Time 2 psychosocial maladjustment (β=-0.096 [95% CI -0.161 to -0.031], p=0.015). This study was the first to establish theory-driven models that unveil an inflexible, avoidant coping style as a psychological mechanism explaining the link between Internet addiction and psychosocial maladjustment.

  20. An application of statistical mechanics for representing equilibrium perimeter distributions of tropical convective clouds

    NASA Astrophysics Data System (ADS)

    Garrett, T. J.; Alva, S.; Glenn, I. B.; Krueger, S. K.

    2015-12-01

There are two possible approaches for parameterizing sub-grid cloud dynamics in a coarser grid model. The most common is to use a fine scale model to explicitly resolve the mechanistic details of clouds to the best extent possible, and then to parameterize the resulting cloud state for the coarser grid. A second is to invoke physical intuition and some very general theoretical principles from equilibrium statistical mechanics. This approach avoids any requirement to resolve time-dependent processes in order to arrive at a suitable solution. The second approach is widely used elsewhere in the atmospheric sciences: for example the Planck function for blackbody radiation is derived this way, where no mention is made of the complexities of modeling a large ensemble of time-dependent radiation-dipole interactions in order to obtain the "grid-scale" spectrum of thermal emission by the blackbody as a whole. We find that this statistical approach may be equally suitable for modeling convective clouds. Specifically, we make the physical argument that the dissipation of buoyant energy in convective clouds is done through mixing across a cloud perimeter. From thermodynamic reasoning, one might then anticipate that vertically stacked isentropic surfaces are characterized by a power law dlnN/dlnP = -1, where N(P) is the number of clouds of perimeter P. In a Giga-LES simulation of convective clouds within a 100 km square domain we find that such a power law does appear to characterize simulated cloud perimeters along isentropes, provided a sufficiently large sample of clouds. The suggestion is that it may be possible to parameterize certain important aspects of cloud state without appealing to computationally expensive dynamic simulations.
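The claimed scaling can be checked numerically as a self-consistency exercise: sampling perimeters from the implied density n(P) ∝ 1/P and fitting the logarithmic slope of the binned counts recovers dlnN/dlnP = −1 (synthetic samples, not the Giga-LES data):

```python
import numpy as np

rng = np.random.default_rng(3)
# draw perimeters from n(P) ∝ 1/P (i.e. log-uniform) between 1 and 1000,
# the distribution implied by d ln N / d ln P = -1
P = np.exp(rng.uniform(np.log(1.0), np.log(1000.0), size=200_000))
bins = np.logspace(0.0, 3.0, 16)
counts, edges = np.histogram(P, bins=bins)
centers = np.sqrt(edges[:-1] * edges[1:])     # geometric bin centers
density = counts / np.diff(edges)             # convert counts to density n(P)
slope = float(np.polyfit(np.log(centers), np.log(density), 1)[0])
```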

  1. Statistical use of argonaute expression and RISC assembly in microRNA target identification.

    PubMed

    Stanhope, Stephen A; Sengupta, Srikumar; den Boon, Johan; Ahlquist, Paul; Newton, Michael A

    2009-09-01

MicroRNAs (miRNAs) posttranscriptionally regulate targeted messenger RNAs (mRNAs) by inducing cleavage or otherwise repressing their translation. We address the problem of detecting m/miRNA targeting relationships in Homo sapiens from microarray data by developing statistical models that are motivated by the biological mechanisms used by miRNAs. The focus of our modeling is the construction, activity, and mediation of RNA-induced silencing complexes (RISCs) competent for targeted mRNA cleavage. We demonstrate that regression models accommodating RISC abundance and controlling for other mediating factors fit the expression profiles of known target pairs substantially better than models based on m/miRNA expressions alone, and lead to verifications of computational target pair predictions that are more sensitive than those based on marginal expression levels. Because our models are fully independent of exogenous results from sequence-based computational methods, they are appropriate for use as either a primary or secondary source of information regarding m/miRNA target pair relationships, especially in conjunction with high-throughput expression studies.
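The modeling idea, that target repression tracks RISC-loaded miRNA rather than miRNA expression alone, can be sketched with synthetic data and ordinary least squares. All variable names and effect sizes below are invented for illustration; only the model comparison mirrors the abstract:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
mirna = rng.normal(size=n)    # miRNA expression (synthetic)
ago = rng.normal(size=n)      # Argonaute/RISC abundance (synthetic)
# synthetic target mRNA: repression scales with the amount of miRNA loaded
# into RISC, i.e. with the product of miRNA expression and RISC abundance
mrna = 5.0 - 1.2 * mirna * ago + rng.normal(scale=0.5, size=n)

def rss(X, y):
    """Residual sum of squares of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(((y - X @ beta) ** 2).sum())

ones = np.ones(n)
rss_mirna_only = rss(np.column_stack([ones, mirna]), mrna)
rss_with_risc = rss(np.column_stack([ones, mirna, mirna * ago]), mrna)
```

When the RISC-mediated term drives the response, the model accommodating RISC abundance fits far better, which is the signature the paper exploits.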

  2. A marked correlation function for constraining modified gravity models

    NASA Astrophysics Data System (ADS)

    White, Martin

    2016-11-01

    Future large scale structure surveys will provide increasingly tight constraints on our cosmological model. These surveys will report results on the distance scale and growth rate of perturbations through measurements of Baryon Acoustic Oscillations and Redshift-Space Distortions. It is interesting to ask: what further analyses should become routine, so as to test as-yet-unknown models of cosmic acceleration? Models which aim to explain the accelerated expansion rate of the Universe by modifications to General Relativity often invoke screening mechanisms which can imprint a non-standard density dependence on their predictions. This suggests density-dependent clustering as a `generic' constraint. This paper argues that a density-marked correlation function provides a density-dependent statistic which is easy to compute and report and requires minimal additional infrastructure beyond what is routinely available to such survey analyses. We give one realization of this idea and study it using low order perturbation theory. We encourage groups developing modified gravity theories to see whether such statistics provide discriminatory power for their models.
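One concrete realization of a density-marked correlation function: weight each point by an assumed decreasing function of its local density, and compare the mark-weighted pair counts at a given separation with the unweighted ones. Uniform mock points and the mark form are illustrative choices, not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 400
pts = rng.uniform(0.0, 1.0, size=(n, 2))      # mock 2D positions in a unit box
d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
# mark each point by an (assumed) decreasing function of its local density
local = (d < 0.05).sum(axis=1) - 1            # neighbour count within r = 0.05
marks = 1.0 / (1.0 + local)

def marked_correlation(d, marks, r_lo, r_hi):
    """M(r): mean mark product over pairs separated by [r_lo, r_hi),
    normalized by the squared mean mark; M != 1 signals density dependence."""
    i, j = np.where((d >= r_lo) & (d < r_hi))
    keep = i < j                              # count each pair once, no self-pairs
    mm = float((marks[i[keep]] * marks[j[keep]]).mean())
    return mm / float(marks.mean()) ** 2

M_small = marked_correlation(d, marks, 0.01, 0.05)
```

Because close pairs preferentially sit in dense regions where these marks are small, M at small separations falls below one, which is exactly the kind of density-dependent signal the statistic is designed to report.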

  3. One-dimensional turbulence modeling of a turbulent counterflow flame with comparison to DNS

    DOE PAGES

    Jozefik, Zoltan; Kerstein, Alan R.; Schmidt, Heiko; ...

    2015-06-01

The one-dimensional turbulence (ODT) model is applied to a reactant-to-product counterflow configuration and results are compared with DNS data. The model employed herein solves conservation equations for momentum, energy, and species on a one dimensional (1D) domain corresponding to the line spanning the domain between nozzle orifice centers. The effects of turbulent mixing are modeled via a stochastic process, while the Kolmogorov and reactive length and time scales are explicitly resolved and a detailed chemical kinetic mechanism is used. Comparisons between model and DNS results for spatial mean and root-mean-square (RMS) velocity, temperature, and major and minor species profiles are shown. The ODT approach shows qualitatively and quantitatively reasonable agreement with the DNS data. Scatter plots and statistics conditioned on temperature are also compared for heat release rate and all species. ODT is able to capture the range of results depicted by DNS. As a result, conditional statistics show signs of underignition.

  4. In vivo serial MRI-based models and statistical methods to quantify sensitivity and specificity of mechanical predictors for carotid plaque rupture: location and beyond.

    PubMed

    Wu, Zheyang; Yang, Chun; Tang, Dalin

    2011-06-01

    It has been hypothesized that mechanical risk factors may be used to predict future atherosclerotic plaque rupture. Truly predictive methods for plaque rupture and methods to identify the best predictor(s) from all the candidates are lacking in the literature. A novel combination of computational and statistical models based on serial magnetic resonance imaging (MRI) was introduced to quantify sensitivity and specificity of mechanical predictors to identify the best candidate for plaque rupture site prediction. Serial in vivo MRI data of carotid plaque from one patient was acquired with follow-up scan showing ulceration. 3D computational fluid-structure interaction (FSI) models using both baseline and follow-up data were constructed and plaque wall stress (PWS) and strain (PWSn) and flow maximum shear stress (FSS) were extracted from all 600 matched nodal points (100 points per matched slice, baseline matching follow-up) on the lumen surface for analysis. Each of the 600 points was marked "ulcer" or "nonulcer" using follow-up scan. Predictive statistical models for each of the seven combinations of PWS, PWSn, and FSS were trained using the follow-up data and applied to the baseline data to assess their sensitivity and specificity using the 600 data points for ulcer predictions. Sensitivity of prediction is defined as the proportion of the true positive outcomes that are predicted to be positive. Specificity of prediction is defined as the proportion of the true negative outcomes that are correctly predicted to be negative. Using probability 0.3 as a threshold to infer ulcer occurrence at the prediction stage, the combination of PWS and PWSn provided the best predictive accuracy with (sensitivity, specificity) = (0.97, 0.958). Sensitivity and specificity given by PWS, PWSn, and FSS individually were (0.788, 0.968), (0.515, 0.968), and (0.758, 0.928), respectively. 
The proposed computational-statistical process provides a novel method and a framework to assess the sensitivity and specificity of various risk indicators and offers the potential to identify the optimized predictor for plaque rupture using serial MRI with follow-up scan showing ulceration as the gold standard for method validation. While serial MRI data with actual rupture are hard to acquire, this single-case study suggests that combination of multiple predictors may provide potential improvement to existing plaque assessment schemes. With large-scale patient studies, this predictive modeling process may provide more solid ground for rupture predictor selection strategies and methods for image-based plaque vulnerability assessment.
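Given per-point rupture probabilities and the follow-up ulcer labels, sensitivity and specificity at the 0.3 threshold reduce to simple counts. A toy six-point example (not the paper's 600 nodal points):

```python
import numpy as np

def sens_spec(prob, truth, threshold=0.3):
    """Sensitivity: fraction of true positives predicted positive.
    Specificity: fraction of true negatives predicted negative."""
    pred = prob >= threshold
    sens = float(np.sum(pred & truth)) / float(truth.sum())
    spec = float(np.sum(~pred & ~truth)) / float((~truth).sum())
    return sens, spec

# six toy nodal points: predicted rupture probabilities vs follow-up ulcer labels
prob = np.array([0.90, 0.25, 0.50, 0.10, 0.35, 0.05])
truth = np.array([True, True, False, False, True, False])
sens, spec = sens_spec(prob, truth)   # 2 of 3 ulcer points, 2 of 3 non-ulcer
```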

  5. Boosting Bayesian parameter inference of stochastic differential equation models with methods from statistical physics

    NASA Astrophysics Data System (ADS)

    Albert, Carlo; Ulzega, Simone; Stoop, Ruedi

    2016-04-01

Measured time-series of both precipitation and runoff are known to exhibit highly non-trivial statistical properties. For making reliable probabilistic predictions in hydrology, it is therefore desirable to have stochastic models with output distributions that share these properties. When parameters of such models have to be inferred from data, we also need to quantify the associated parametric uncertainty. For non-trivial stochastic models, however, this latter step is typically very demanding, both conceptually and numerically, and almost never done in hydrology. Here, we demonstrate that methods developed in statistical physics make a large class of stochastic differential equation (SDE) models amenable to a full-fledged Bayesian parameter inference. For concreteness we demonstrate these methods by means of a simple yet non-trivial toy SDE model. We consider a natural catchment that can be described by a linear reservoir, at the scale of observation. All the neglected processes are assumed to happen at much shorter time-scales and are therefore modeled with a Gaussian white noise term, the standard deviation of which is assumed to scale linearly with the system state (water volume in the catchment). Even for constant input, the outputs of this simple non-linear SDE model show a wealth of desirable statistical properties, such as fat-tailed distributions and long-range correlations. Standard algorithms for Bayesian inference fail for models of this kind because their likelihood functions are extremely high-dimensional intractable integrals over all possible model realizations. The use of Kalman filters is illegitimate due to the non-linearity of the model. Particle filters could be used but become increasingly inefficient with a growing number of data points.
Hamiltonian Monte Carlo algorithms allow us to translate this inference problem to the problem of simulating the dynamics of a statistical mechanics system and give us access to the most sophisticated methods that have been developed in the statistical physics community over the last few decades. We demonstrate that such methods, along with automated differentiation algorithms, allow us to perform a full-fledged Bayesian inference, for a large class of SDE models, in a highly efficient and largely automated manner. Furthermore, our algorithm is highly parallelizable. For our toy model, discretized with a few hundred points, a full Bayesian inference can be performed in a matter of seconds on a standard PC.
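The toy model itself, a linear reservoir with constant input and noise scaling linearly in the state, is cheap to simulate forward with Euler-Maruyama (parameters illustrative). Even this forward simulation shows the skewed, fat-tailed output distribution that makes the likelihood intractable for standard inference:

```python
import numpy as np

rng = np.random.default_rng(6)
dt, steps = 0.01, 100_000
r, k, sigma = 1.0, 1.0, 0.5   # constant input, outflow rate, noise scale
V = np.empty(steps)
V[0] = r / k                  # deterministic steady state
for t in range(1, steps):
    dW = rng.normal(scale=np.sqrt(dt))
    # Euler-Maruyama; the noise standard deviation scales linearly with V
    V[t] = max(V[t - 1] + (r - k * V[t - 1]) * dt + sigma * V[t - 1] * dW, 0.0)
# state-dependent noise yields a right-skewed, fat-tailed stationary density
skewness = float(((V - V.mean()) ** 3).mean() / V.std() ** 3)
```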

  6. Stress transmission through a model system of cohesionless elastic grains

    NASA Astrophysics Data System (ADS)

    Da Silva, Miguel; Rajchenbach, Jean

    2000-08-01

Understanding the mechanical properties of granular materials is important for applications in civil and chemical engineering, geophysical sciences and the food industry, as well as for the control or prevention of avalanches and landslides. Unlike continuous media, granular materials lack cohesion, and cannot resist tensile stresses. Current descriptions of the mechanical properties of collections of cohesionless grains have relied either on elasto-plastic models classically used in civil engineering, or on a recent model involving hyperbolic equations. The former models suggest that collections of elastic grains submitted to a compressive load will behave elastically. Here we present the results of an experiment on a two-dimensional model system, made of discrete square cells submitted to a point load, in which the region where the stress is confined is photoelastically visualized as a parabola. These results, which can be interpreted within a statistical framework, demonstrate that the collective response of the pile contradicts the standard elastic predictions and supports a diffusive description of stress transmission. We expect that these findings will be applicable to problems in soil mechanics, such as the behaviour of cohesionless soils or sand piles.

  7. InGaAs tunnel diodes for the calibration of semi-classical and quantum mechanical band-to-band tunneling models

    NASA Astrophysics Data System (ADS)

    Smets, Quentin; Verreck, Devin; Verhulst, Anne S.; Rooyackers, Rita; Merckling, Clément; Van De Put, Maarten; Simoen, Eddy; Vandervorst, Wilfried; Collaert, Nadine; Thean, Voon Y.; Sorée, Bart; Groeseneken, Guido; Heyns, Marc M.

    2014-05-01

Promising predictions have been made for III-V tunnel field-effect transistors (TFETs), but there is still uncertainty about the parameters used in the band-to-band tunneling models. Therefore, two simulators are calibrated in this paper: the first uses a semi-classical tunneling model based on Kane's formalism, and the second is a quantum mechanical simulator implemented with an envelope function formalism. The calibration is done for In0.53Ga0.47As using several p+/intrinsic/n+ diodes with different intrinsic region thicknesses. The dopant profile is determined by SIMS and capacitance-voltage measurements. Error bars are used based on statistical and systematic uncertainties in the measurement techniques. The obtained parameters are in close agreement with theoretically predicted values and validate the semi-classical and quantum mechanical models. Finally, the models are applied to predict the input characteristics of In0.53Ga0.47As n- and p-line TFETs, with the n-line TFET showing competitive performance compared to the MOSFET.
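Kane's semi-classical band-to-band tunneling rate for a direct-gap semiconductor has the generic two-parameter form G = A·F²·exp(−B/F); calibration fixes A and B. The field values and parameter magnitudes below are placeholders for illustration, not the calibrated In0.53Ga0.47As fit:

```python
import numpy as np

def kane_btbt_rate(F, A, B):
    """Kane-type band-to-band tunneling generation rate for a direct-gap
    semiconductor: G = A * F**2 * exp(-B / F), with F the electric field.
    A and B are the two parameters fixed by calibration."""
    return A * F**2 * np.exp(-B / F)

# hypothetical fields (V/cm) and placeholder parameters, not the paper's fit
F = np.array([0.5e6, 1.0e6, 2.0e6])
G = kane_btbt_rate(F, A=4.0e14, B=1.9e6)
```

The steep exponential dependence on field is why small uncertainties in the calibrated parameters translate into large uncertainties in predicted TFET currents.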

  8. Applications of squeezed states: Bogoliubov transformations and wavelets to the statistical mechanics of water and its bubbles

    NASA Technical Reports Server (NTRS)

    Defacio, Brian; Kim, S.-H.; Vannevel, A.

    1994-01-01

The squeezed states or Bogoliubov transformations and wavelets are applied to two problems in nonrelativistic statistical mechanics: the dielectric response of liquid water, epsilon(q-vector,w), and the bubble formation in water during insonification. The wavelets are special phase-space windows which cover the domain and range of L(exp 1) intersection of L(exp 2) of classical causal, finite energy solutions. The multiresolution of discrete wavelets in phase space gives a decomposition into regions of time and scales of frequency, thereby allowing the renormalization group to be applied to new systems in addition to the tired 'usual suspects' of the Ising models and lattice gases. The Bogoliubov (squeeze) transformation is applied to the dipolaron collective mode in water and to the gas produced by the explosive cavitation process in bubble formation.

  9. A Lattice Boltzmann Method for Turbomachinery Simulations

    NASA Technical Reports Server (NTRS)

    Hsu, A. T.; Lopez, I.

    2003-01-01

The lattice Boltzmann (LB) method is a relatively new method for flow simulations. Its starting point is statistical mechanics and the Boltzmann equation. The LB method sets up its model at the molecular scale and simulates the flow at the macroscopic scale. LBM has so far been applied mostly to incompressible flows and simple geometries.
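A minimal D2Q9 lattice Boltzmann sketch with BGK collision illustrates the molecular-scale setup and the macroscopic readout, and checks that collide-and-stream conserves total mass. This is an illustrative toy on a periodic grid, not a turbomachinery solver:

```python
import numpy as np

# D2Q9 lattice: nine discrete velocities with their quadrature weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, ux, uy):
    # Maxwell-Boltzmann expansion to second order in the macroscopic velocity
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    usq = ux**2 + uy**2
    return rho * w[:, None, None] * (1 + 3 * cu + 4.5 * cu**2 - 1.5 * usq)

nx = ny = 32
rho = np.ones((nx, ny))
rho[16, 16] = 1.1                       # small density perturbation
f = equilibrium(rho, np.zeros((nx, ny)), np.zeros((nx, ny)))
tau = 0.8                               # BGK relaxation time
for _ in range(50):
    rho = f.sum(axis=0)                 # macroscopic moments from populations
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f += -(f - equilibrium(rho, ux, uy)) / tau      # collide toward equilibrium
    for i in range(9):                  # stream along each lattice velocity
        f[i] = np.roll(f[i], shift=(c[i, 0], c[i, 1]), axis=(0, 1))
mass = float(f.sum())                   # collide + stream conserve total mass
```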

  10. Option Pricing with a Levy-Type Stochastic Dynamic Model for Stock Price Process Under Semi-Markovian Structural Perturbations

    DTIC Science & Technology

    2015-11-30

of interest are currently being investigated: (1) an evaluation of the effects of the backward recurrence time, the sojourn time distribution and the...Statistical Mechanics and Its Applications 407, 350–359. W. Schachermayer (2010) Fundamental theorem of asset pricing, Encyclopedia of Quantitative

  11. Markov Random Fields, Stochastic Quantization and Image Analysis

    DTIC Science & Technology

    1990-01-01

Markov random fields based on the lattice Z2 have been extensively used in image analysis in a Bayesian framework as a priori models for the...of Image Analysis can be given some fundamental justification then there is a remarkable connection between Probabilistic Image Analysis, Statistical Mechanics and Lattice-based Euclidean Quantum Field Theory.
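The Bayesian use of an Ising-type MRF prior can be sketched with iterated conditional modes (ICM) for denoising a binary image: each sweep greedily minimizes an energy combining neighbor agreement (the Ising coupling) and fidelity to the observed pixel. The image, coupling and fidelity weights are ad hoc illustrations:

```python
import numpy as np

rng = np.random.default_rng(7)
# ground truth: binary image with a square; observation flips 15% of pixels
truth = np.zeros((24, 24), dtype=int)
truth[6:18, 6:18] = 1
noisy = np.where(rng.random(truth.shape) < 0.15, 1 - truth, truth)

beta, lam = 1.5, 2.0   # smoothness (Ising coupling) and data-fidelity weights
x = noisy.copy()
for _ in range(10):                      # ICM: greedy sweeps over the lattice
    for i in range(24):
        for j in range(24):
            nb = (x[(i - 1) % 24, j] + x[(i + 1) % 24, j]
                  + x[i, (j - 1) % 24] + x[i, (j + 1) % 24])
            # energy of each label: disagreement with neighbors + data term
            e0 = beta * nb + lam * (noisy[i, j] != 0)
            e1 = beta * (4 - nb) + lam * (noisy[i, j] != 1)
            x[i, j] = 0 if e0 <= e1 else 1
errors_before = int((noisy != truth).sum())
errors_after = int((x != truth).sum())
```

Replacing the greedy update with a temperature-dependent Gibbs sample turns this into the stochastic-quantization-style sampler the connection to statistical mechanics refers to.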

  12. Dysregulated Fear in Toddlerhood Predicts Kindergarten Social Withdrawal through Protective Parenting

    ERIC Educational Resources Information Center

    Kiel, Elizabeth J.; Buss, Kristin A.

    2014-01-01

    Two recent advances in the study of fearful temperament (behavioural inhibition) include the validation of dysregulated fear as a temperamental construct that more specifically predicts later social withdrawal and anxiety, and the use of conceptual and statistical models that place parenting as a mechanism of development from temperament to these…

  13. Lagrange thermodynamic potential and intrinsic variables for He-3 He-4 dilute solutions

    NASA Technical Reports Server (NTRS)

    Jackson, H. W.

    1983-01-01

    For a two-fluid model of dilute solutions of He-3 in liquid He-4, a thermodynamic potential is constructed that provides a Lagrangian for deriving equations of motion by a variational procedure. This Lagrangian is defined for uniform velocity fields as a (negative) Legendre transform of total internal energy, and its primary independent variables, together with their thermodynamic conjugates, are identified. Here, similarities between relations in classical physics and quantum statistical mechanics serve as a guide for developing an alternate expression for this function that reveals its character as the difference between apparent kinetic energy and intrinsic internal energy. When the He-3 concentration in the mixtures tends to zero, this expression reduces to Zilsel's formula for the Lagrangian for pure liquid He-4. An investigation of properties of the intrinsic internal energy leads to the introduction of intrinsic chemical potentials along with other intrinsic variables for the mixtures. Explicit formulas for these variables are derived for a noninteracting elementary excitation model of the fluid. Using these formulas and others also derived from quantum statistical mechanics, another equivalent expression for the Lagrangian is generated.

  14. Bootstrap calculation of ultimate strength temperature maxima for neutron irradiated ferritic/martensitic steels

    NASA Astrophysics Data System (ADS)

    Obraztsov, S. M.; Konobeev, Yu. V.; Birzhevoy, G. A.; Rachkov, V. I.

    2006-12-01

The dependence of mechanical properties of ferritic/martensitic (F/M) steels on irradiation temperature is of interest because these steels are used as structural materials for fast reactors, fusion reactors and accelerator driven systems. Experimental data demonstrating temperature peaks in physical and mechanical properties of neutron irradiated pure iron, nickel, vanadium, and austenitic stainless steels are available in the literature. The lack of such information for F/M steels forces one to apply computational mathematical-statistical modeling methods. The bootstrap procedure is one such method that allows us to obtain the necessary statistical characteristics using only a sample of limited size. In the present work this procedure is used for modeling the frequency distribution histograms of ultimate strength temperature peaks in pure iron and the Russian F/M steels EP-450 and EP-823. Results of fitting sums of Lorentz or Gauss functions to the calculated distributions are presented. It is concluded that there are two temperature peaks of the ultimate strength in EP-450 steel (at 360 and 390 °C) and a single peak at 390 °C in EP-823.
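The bootstrap core is simple: resample the limited-size sample with replacement and build the empirical distribution of the statistic of interest. The strength values below are invented placeholders, not the EP-450/EP-823 data:

```python
import numpy as np

rng = np.random.default_rng(8)
# hypothetical ultimate-strength measurements (MPa) at one irradiation temperature
sample = np.array([612., 598., 640., 625., 603., 618., 631., 607., 622., 615.])

def bootstrap_ci(data, n_boot=10_000, alpha=0.05):
    """Percentile bootstrap: resample with replacement, recompute the mean,
    and read the confidence interval off the empirical distribution."""
    idx = rng.integers(0, len(data), size=(n_boot, len(data)))
    stats = data[idx].mean(axis=1)
    return np.quantile(stats, [alpha / 2.0, 1.0 - alpha / 2.0])

lo, hi = bootstrap_ci(sample)
```

Applying the same resampling to peak positions rather than means yields the frequency distribution histograms of temperature peaks described in the abstract.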

  15. Strength statistics of single crystals and metallic glasses under small stressed volumes

    DOE PAGES

    Gao, Yanfei; Bei, Hongbin

    2016-05-13

It has been well documented that plastic deformation of crystalline and amorphous metals/alloys shows a general trend of "smaller is stronger". The majority of the experimental and modeling studies along this line have been focused on finding and reasoning about the scaling slope or exponent in the logarithmic plot of strength versus size. In contrast to this view, here we show that the universal picture should be the thermally activated nucleation mechanisms in small stressed volumes, the stochastic behavior of finding the weakest links at intermediate sizes of the stressed volume, and the convolution of these two mechanisms with respect to variables such as indenter radius in nanoindentation pop-in, crystallographic orientation, pre-strain level, sample length as in uniaxial tests, and others. Furthermore, experiments that cover the entire spectrum of length scales and a unified model that treats both thermal activation and spatial stochasticity have opened new perspectives in understanding and correlating the strength statistics in a vast range of observations in nanoindentation, micro-pillar compression, and fiber/whisker tension tests of single crystals and metallic glasses.

  16. Edwards statistical mechanics for jammed granular matter

    NASA Astrophysics Data System (ADS)

    Baule, Adrian; Morone, Flaviano; Herrmann, Hans J.; Makse, Hernán A.

    2018-01-01

In 1989, Sir Sam Edwards made the visionary proposition to treat jammed granular materials using a volume ensemble of equiprobable jammed states in analogy to thermal equilibrium statistical mechanics, despite their inherent athermal features. Since then, the statistical mechanics approach for jammed matter—one of the very few generalizations of Gibbs-Boltzmann statistical mechanics to out-of-equilibrium matter—has garnered an extraordinary amount of attention by both theorists and experimentalists. Its importance stems from the fact that jammed states of matter are ubiquitous in nature appearing in a broad range of granular and soft materials such as colloids, emulsions, glasses, and biomatter. Indeed, despite being one of the simplest states of matter—primarily governed by the steric interactions between the constitutive particles—a theoretical understanding based on first principles has proved exceedingly challenging. Here a systematic approach to jammed matter based on the Edwards statistical mechanical ensemble is reviewed. The construction of microcanonical and canonical ensembles based on the volume function, which replaces the Hamiltonian in jammed systems, is discussed. The importance of approximation schemes at various levels is emphasized leading to quantitative predictions for ensemble averaged quantities such as packing fractions and contact force distributions. An overview of the phenomenology of jammed states and experiments, simulations, and theoretical models scrutinizing the strong assumptions underlying the Edwards approach is given, including recent results suggesting the validity of the Edwards ergodic hypothesis for jammed states. 
A theoretical framework for packings whose constitutive particles range from spherical to nonspherical shapes such as dimers, polymers, ellipsoids, spherocylinders or tetrahedra, hard and soft, frictional, frictionless and adhesive, monodisperse, and polydisperse particles in any dimensions is discussed providing insight into a unifying phase diagram for all jammed matter. Furthermore, the connection between the Edwards ensemble of metastable jammed states and metastability in spin glasses is established. This highlights the fact that the packing problem can be understood as a constraint satisfaction problem for excluded volume and force and torque balance leading to a unifying framework between the Edwards ensemble of equiprobable jammed states and out-of-equilibrium spin glasses.

  17. A statistical approach to develop a detailed soot growth model using PAH characteristics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raj, Abhijeet; Celnik, Matthew; Shirley, Raphael

A detailed PAH growth model is developed, which is solved using a kinetic Monte Carlo algorithm. The model describes the structure and growth of planar PAH molecules, and is referred to as the kinetic Monte Carlo-aromatic site (KMC-ARS) model. A detailed PAH growth mechanism based on reactions at radical sites available in the literature, and additional reactions obtained from quantum chemistry calculations, are used to model the PAH growth processes. New rates for the reactions involved in the cyclodehydrogenation process for the formation of 6-member rings on PAHs are calculated in this work based on density functional theory simulations. The KMC-ARS model is validated by comparing experimentally observed ensembles of PAHs with the computed ensembles for a C{sub 2}H{sub 2} and a C{sub 6}H{sub 6} flame at different heights above the burner. The motivation for this model is the development of a detailed soot particle population balance model which describes the evolution of an ensemble of soot particles based on their PAH structure. However, at present incorporating such a detailed model into a population balance is computationally unfeasible. Therefore, a simpler model referred to as the site-counting model has been developed, which replaces the structural information of the PAH molecules by their functional groups augmented with statistical closure expressions. This closure is obtained from the KMC-ARS model, which is used to develop correlations and statistics in different flame environments which describe such PAH structural information. These correlations and statistics are implemented in the site-counting model, and results from the site-counting model and the KMC-ARS model are in good agreement. Additionally the effect of steric hindrance in large PAH structures is investigated and correlations for sites unavailable for reaction are presented. (author)
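The kinetic Monte Carlo core of such a model is the Gillespie loop: draw an exponential waiting time from the total rate, then pick the next event in proportion to its rate. The two site types and their rates below are invented placeholders standing in for the ARS reaction set:

```python
import numpy as np

rng = np.random.default_rng(9)
# toy site-resolved growth: two PAH site types (hypothetical) plus a ring count
state = {"edge": 20, "armchair": 5, "rings": 0}

def rates(s):
    # per-process rates proportional to the number of available sites
    return {"edge_to_armchair": 0.1 * s["edge"],
            "ring_closure": 0.5 * s["armchair"]}

t, t_end = 0.0, 50.0
while t < t_end:
    r = rates(state)
    total = sum(r.values())
    if total == 0.0:
        break
    t += rng.exponential(1.0 / total)   # exponential waiting time to next event
    pick = rng.uniform(0.0, total)      # choose an event with probability ∝ rate
    for name, rate in r.items():
        if pick < rate:
            break
        pick -= rate
    if name == "edge_to_armchair":
        state["edge"] -= 1
        state["armchair"] += 1
    else:                               # ring closure consumes an armchair site
        state["armchair"] -= 1
        state["rings"] += 1
```

Tracking site counts rather than full molecular structures, as done here, is the same reduction the site-counting model makes, with closure statistics supplied from the detailed model.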

  18. Motoneuron membrane potentials follow a time inhomogeneous jump diffusion process.

    PubMed

    Jahn, Patrick; Berg, Rune W; Hounsgaard, Jørn; Ditlevsen, Susanne

    2011-11-01

    Stochastic leaky integrate-and-fire models are popular due to their simplicity and statistical tractability. They have been widely applied to gain understanding of the underlying mechanisms for spike timing in neurons, and have served as building blocks for more elaborate models. The Ornstein-Uhlenbeck process in particular is popular for describing the stochastic fluctuations in the membrane potential of a neuron, but other models such as the square-root model or models with a non-linear drift are sometimes applied. Data that can be described by such models have to be stationary, and thus the simple models can only be applied over short time windows. However, experimental data show varying time constants, state-dependent noise, a graded firing threshold and time-inhomogeneous input. In the present study we build a jump diffusion model that incorporates these features, and introduce a firing mechanism with a state-dependent intensity. In addition, we suggest statistical methods to estimate all unknown quantities and apply these to analyze turtle motoneuron membrane potentials. Finally, simulated and real data are compared and discussed. We find that a square-root diffusion describes the data much better than an Ornstein-Uhlenbeck process with constant diffusion coefficient. Further, the membrane time constant decreases with increasing depolarization, as expected from the increase in synaptic conductance. The network activity to which the neuron is exposed can be reasonably estimated as a threshold version of the nerve output from the network. Moreover, the spiking characteristics are well described by a Poisson spike train with an intensity depending exponentially on the membrane potential.
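    The contrast between a constant-diffusion Ornstein-Uhlenbeck model and a square-root (state-dependent noise) diffusion can be illustrated with a minimal Euler-Maruyama sketch; the parameter values are illustrative, not values fitted to motoneuron data:

```python
import math
import random

def simulate(x0, theta, mu, sigma, sqrt_diffusion, n=1000, dt=0.001, seed=1):
    """Euler-Maruyama path of dX = theta*(mu - X) dt + sigma*g(X) dW,
    with g(X) = 1 (Ornstein-Uhlenbeck) or g(X) = sqrt(X) (square-root
    model, i.e. state-dependent noise)."""
    rng = random.Random(seed)
    x, path = x0, [x0]
    for _ in range(n):
        g = math.sqrt(max(x, 0.0)) if sqrt_diffusion else 1.0
        x += theta * (mu - x) * dt + sigma * g * rng.gauss(0.0, math.sqrt(dt))
        path.append(x)
    return path

# Illustrative parameters only, not fitted motoneuron values.
ou_path = simulate(1.0, 5.0, 1.0, 0.3, sqrt_diffusion=False)
sr_path = simulate(1.0, 5.0, 1.0, 0.3, sqrt_diffusion=True)
```

    In the square-root model the local noise amplitude grows with the state, which is the kind of state-dependent noise the abstract refers to.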

  19. Co-occurrence statistics as a language-dependent cue for speech segmentation.

    PubMed

    Saksida, Amanda; Langus, Alan; Nespor, Marina

    2017-05-01

    To what extent can language acquisition be explained in terms of different associative learning mechanisms? It has been hypothesized that distributional regularities in spoken languages are strong enough to elicit statistical learning about dependencies among speech units. Distributional regularities could be a useful cue for word learning even without rich language-specific knowledge. However, it is not clear how strong and reliable the distributional cues are that humans might use to segment speech. We investigate cross-linguistic viability of different statistical learning strategies by analyzing child-directed speech corpora from nine languages and by modeling possible statistics-based speech segmentations. We show that languages vary as to which statistical segmentation strategies are most successful. The variability of the results can be partially explained by systematic differences between languages, such as rhythmical differences. The results confirm previous findings that different statistical learning strategies are successful in different languages and suggest that infants may have to primarily rely on non-statistical cues when they begin their process of speech segmentation. © 2016 John Wiley & Sons Ltd.
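    A minimal transitional-probability segmenter illustrates the kind of statistics-based segmentation modeled in the paper; the syllable stream, the made-up "words", and the threshold are toy assumptions:

```python
from collections import Counter

def segment_by_tp(syllables, threshold):
    """Place a word boundary wherever the forward transitional
    probability TP(x -> y) = count(xy) / count(x) dips below threshold."""
    pair = Counter(zip(syllables, syllables[1:]))
    uni = Counter(syllables)
    words, current = [], [syllables[0]]
    for a, b in zip(syllables, syllables[1:]):
        if pair[(a, b)] / uni[a] < threshold:
            words.append("".join(current))
            current = []
        current.append(b)
    words.append("".join(current))
    return words

# Toy stream built from the invented words "baby" and "dogo";
# within-word TPs stay high, cross-boundary TPs are lower.
stream = ["ba", "by", "do", "go", "ba", "by", "ba", "by", "do", "go"]
words = segment_by_tp(stream, threshold=0.8)
```

    On this stream the segmenter recovers the word boundaries; on real child-directed corpora, the paper's point is precisely that the best threshold and statistic vary across languages.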

  20. Markov Logic Networks in the Analysis of Genetic Data

    PubMed Central

    Sakhanenko, Nikita A.

    2010-01-01

    Complex, non-additive genetic interactions are common and can be critical in determining phenotypes. Genome-wide association studies (GWAS) and similar statistical studies of linkage data, however, assume additive models of gene interactions in looking for genotype-phenotype associations. These statistical methods view the compound effects of multiple genes on a phenotype as a sum of influences of each gene and often miss a substantial part of the heritable effect. Such methods do not use any biological knowledge about underlying mechanisms. Modeling approaches from the artificial intelligence (AI) field that incorporate deterministic knowledge into models to perform statistical analysis can be applied to include prior knowledge in genetic analysis. We chose to use the most general such approach, Markov Logic Networks (MLNs), for combining deterministic knowledge with statistical analysis. Using simple, logistic regression-type MLNs we can replicate the results of traditional statistical methods, but we also show that we are able to go beyond finding independent markers linked to a phenotype by using joint inference without an independence assumption. The method is applied to genetic data on yeast sporulation, a complex phenotype with gene interactions. In addition to detecting all of the previously identified loci associated with sporulation, our method identifies four loci with smaller effects. Since their effect on sporulation is small, these four loci were not detected with methods that do not account for dependence between markers due to gene interactions. We show how gene interactions can be detected using more complex models, which can be used as a general framework for incorporating systems biology with genetics. PMID:20958249

  1. Literature review of models on tire-pavement interaction noise

    NASA Astrophysics Data System (ADS)

    Li, Tan; Burdisso, Ricardo; Sandu, Corina

    2018-04-01

    Tire-pavement interaction noise (TPIN) becomes dominant at speeds above 40 km/h for passenger vehicles and 70 km/h for trucks. Several models have been developed to describe and predict the TPIN. However, these models do not fully reveal the physical mechanisms or predict TPIN accurately. It is well known that all the models have both strengths and weaknesses, and different models fit different investigation purposes or conditions. The numerous papers that present these models are widely scattered among thousands of journals, and it is difficult to get the complete picture of the status of research in this area. This review article aims at presenting the history and current state of TPIN models systematically, making it easier to identify and distribute the key knowledge and opinions, and providing insight into the future research trend in this field. In this work, over 2000 references related to TPIN were collected, and 74 models were reviewed from nearly 200 selected references; these were categorized into deterministic models (37), statistical models (18), and hybrid models (19). The sections explaining the models are self-contained with key principles, equations, and illustrations included. The deterministic models were divided into three sub-categories: conventional physics models, finite element and boundary element models, and computational fluid dynamics models; the statistical models were divided into three sub-categories: traditional regression models, principal component analysis models, and fuzzy curve-fitting models; the hybrid models were divided into three sub-categories: tire-pavement interface models, mechanism separation models, and noise propagation models. At the end of each category of models, a summary table is presented to compare these models with the key information extracted. Readers may refer to these tables to find models of their interest. The strengths and weaknesses of the models in different categories were then analyzed. 
Finally, the modeling trend and future direction in this area are given.

  2. Contemporary New Zealand coefficients for the Trauma Injury Severity Score: TRISS(NZ).

    PubMed

    Schluter, Philip J; Cameron, Cate M; Davey, Tamzyn M; Civil, Ian; Orchard, Jodie; Dansey, Rangi; Hamill, James; Naylor, Helen; James, Carolyn; Dorrian, Jenny; Christey, Grant; Pollard, Cliff; McClure, Rod J

    2009-09-11

    To develop local contemporary coefficients for the Trauma Injury Severity Score in New Zealand, TRISS(NZ), and to evaluate their performance at predicting survival against the original TRISS coefficients. Retrospective cohort study of adults who sustained a serious traumatic injury, and who survived until presentation at Auckland City, Middlemore, Waikato, or North Shore Hospitals between 2002 and 2006. Coefficients were estimated using ordinary and multilevel mixed-effects logistic regression models. 1735 eligible patients were identified, 1672 (96%) injured from a blunt mechanism and 63 (4%) from a penetrating mechanism. For blunt mechanism trauma, 1250 (75%) were male and average age was 38 years (range: 15-94 years). TRISS information was available for 1565 patients of whom 204 (13%) died. Areas under the Receiver Operating Characteristic (ROC) curve were 0.901 (95% CI: 0.879-0.923) for the TRISS(NZ) model and 0.890 (95% CI: 0.866-0.913) for TRISS (P<0.001). Insufficient data were available to determine coefficients for penetrating mechanism TRISS(NZ) models. Both TRISS models accurately predicted survival for blunt mechanism trauma. However, TRISS(NZ) coefficients were statistically superior to TRISS coefficients. A strong case exists for replacing TRISS coefficients in the New Zealand benchmarking software with these updated TRISS(NZ) estimates.
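    For reference, the TRISS survival probability is a logistic function of the Revised Trauma Score (RTS), the Injury Severity Score (ISS) and an age index; recalibrations such as TRISS(NZ) replace the coefficients. A sketch with the commonly quoted blunt-mechanism coefficients, shown for illustration only (verify against the current benchmarking software before any real use):

```python
import math

def triss_ps(rts, iss, age_index, coeffs):
    """TRISS probability of survival: Ps = 1 / (1 + exp(-b)) with
    b = b0 + b1*RTS + b2*ISS + b3*AgeIndex, where AgeIndex is 0 for
    age < 55 and 1 otherwise."""
    b0, b1, b2, b3 = coeffs
    b = b0 + b1 * rts + b2 * iss + b3 * age_index
    return 1.0 / (1.0 + math.exp(-b))

# Widely quoted blunt-mechanism coefficients, used here for illustration;
# a TRISS(NZ) recalibration would substitute locally estimated values.
blunt = (-0.4499, 0.8085, -0.0835, -1.7430)
ps = triss_ps(rts=7.84, iss=9, age_index=0, coeffs=blunt)
```
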

  3. Quantum mechanics/molecular mechanics modeling of photoelectron spectra: the carbon 1s core-electron binding energies of ethanol-water solutions.

    PubMed

    Löytynoja, T; Niskanen, J; Jänkälä, K; Vahtras, O; Rinkevicius, Z; Ågren, H

    2014-11-20

    Using ethanol-water solutions as illustration, we demonstrate the capability of the hybrid quantum mechanics/molecular mechanics (QM/MM) paradigm to simulate core photoelectron spectroscopy: the binding energies and the chemical shifts. An integrated approach with QM/MM binding energy calculations coupled to preceding molecular dynamics sampling is adopted to generate binding energies averaged over the solute-solvent configurations available at a particular temperature and pressure, thus allowing for a statistical assessment with confidence levels for the final binding energies. The results are analyzed in terms of the contributions in the molecular mechanics model (electrostatic, polarization, and van der Waals) with atom or bond granulation of the corresponding MM charge and polarizability force fields. The role of extramolecular charge transfer screening of the core-hole and explicit hydrogen bonding is studied by extending the QM core to cover the first solvation shell. The results are compared to those obtained from pure electrostatic and polarizable continuum models. In particular, the dependence of the carbon 1s binding energies on the ethanol concentration is studied. Our results indicate that QM/MM can be used as an all-encompassing model to study photoelectron binding energies and chemical shifts in solvent environments.

  4. Interactions and triggering in a 3D rate and state asperity model

    NASA Astrophysics Data System (ADS)

    Dublanchet, P.; Bernard, P.

    2012-12-01

    Precise relocation of micro-seismicity and careful analysis of seismic source parameters have progressively imposed the concept of seismic asperities embedded in a creeping fault segment as being one of the most important aspects that should appear in a realistic representation of micro-seismic sources. Another important issue concerning micro-seismic activity is the existence of robust empirical laws describing the temporal and magnitude distribution of earthquakes, such as the Omori law, the distribution of inter-event time and the Gutenberg-Richter law. In this framework, this study aims at understanding the statistical properties of earthquakes by generating synthetic catalogs with a 3D, quasi-dynamic continuous rate and state asperity model, that takes into account a realistic geometry of asperities. Our approach contrasts with ETAS models (Kagan and Knopoff, 1981) usually implemented to produce earthquake catalogs, in the sense that the non-linearity observed in rock friction experiments (Dieterich, 1979) is fully taken into account by the use of the rate and state friction law. Furthermore, our model differs from discrete models of faults (Ziv and Cochard, 2006) because the continuity allows us to define realistic geometries and distributions of asperities by the assembling of sub-critical computational cells that always fail in a single event. Moreover, this model allows us to address the question of the influence of barriers and distribution of asperities on the event statistics. After recalling the main observations of asperities in the specific case of the Parkfield segment of the San Andreas Fault, we analyse earthquake statistical properties computed for this area. Then, we present synthetic statistics obtained by our model that allow us to discuss the role of barriers on clustering and triggering phenomena among a population of sources. 
It appears that an effective size of barrier, that depends on its frictional strength, controls the presence or the absence, in the synthetic catalog, of statistical laws that are similar to what is observed for real earthquakes. As an application, we attempt to draw a comparison between synthetic statistics and the observed statistics of Parkfield in order to characterize what could be a realistic frictional model of Parkfield area. More generally, we obtained synthetic statistical properties that are in agreement with power-law decays characterized by exponents that match the observations at a global scale, showing that our mechanical model is able to provide new insights into the understanding of earthquake interaction processes in general.
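    One of the statistical laws mentioned, the Gutenberg-Richter magnitude distribution, can be checked on a synthetic catalog with Aki's maximum-likelihood b-value estimator; the catalog below is a stand-in generated with b = 1, not output of the asperity model:

```python
import math
import random

def gutenberg_richter_b(magnitudes, m_c):
    """Aki's maximum-likelihood estimate of the Gutenberg-Richter b-value
    from events at or above the completeness magnitude m_c:
    b = log10(e) / (mean(M) - m_c)."""
    m = [x for x in magnitudes if x >= m_c]
    return math.log10(math.e) / (sum(m) / len(m) - m_c)

# Synthetic catalog: magnitude excesses above m_c are exponential with
# rate b*ln(10), i.e. b = 1 here.
rng = random.Random(0)
m_c = 2.0
mags = [m_c + rng.expovariate(math.log(10.0)) for _ in range(5000)]
b_value = gutenberg_richter_b(mags, m_c)
```

    Applied to a synthetic catalog from the asperity model, the same estimator quantifies whether the simulated magnitude statistics match the observed power-law decay.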

  5. Origin of Pareto-like spatial distributions in ecosystems.

    PubMed

    Manor, Alon; Shnerb, Nadav M

    2008-12-31

    Recent studies of cluster distribution in various ecosystems revealed Pareto statistics for the size of spatial colonies. These results were supported by cellular automata simulations that yield robust criticality for endogenous pattern formation based on positive feedback. We show that these patch statistics are a manifestation of the law of proportionate effect. Mapping the stochastic model to a Markov birth-death process shows that the transition rates scale linearly with cluster size. This mapping provides a connection between patch statistics and the dynamics of the ecosystem; the "first passage time" for different colonies emerges as a powerful tool that discriminates between endogenous and exogenous clustering mechanisms. Imminent catastrophic shifts (such as desertification) manifest themselves in a drastic change of the stability properties of spatial colonies.
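    A minimal sketch of the proportionate-effect idea: a birth-death chain whose transition rates both scale linearly with cluster size, so the jump direction is size-independent and only the (unsimulated) waiting times depend on size. The rate constants and horizon are illustrative:

```python
import random

def cluster_walk(steps, rng, birth=1.0, death=1.0):
    """Birth-death chain with rates birth*n and death*n for a cluster of
    size n (law of proportionate effect); n = 0 is absorbing (the colony
    has vanished)."""
    n = 1
    for _ in range(steps):
        if n == 0:
            break
        # With both rates linear in n, the next jump is +1 with
        # probability birth/(birth+death), independent of n.
        n += 1 if rng.random() < birth / (birth + death) else -1
    return n

rng = random.Random(42)
sizes = [cluster_walk(200, rng) for _ in range(2000)]
surviving = [n for n in sizes if n > 0]
```

    Conditioned on survival, the size distribution is far broader than Gaussian; adding the size-dependent waiting times turns the sketch into the Markov birth-death process discussed in the abstract.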

  6. Origin of the correlations between exit times in pedestrian flows through a bottleneck

    NASA Astrophysics Data System (ADS)

    Nicolas, Alexandre; Touloupas, Ioannis

    2018-01-01

    Robust statistical features have emerged from the microscopic analysis of dense pedestrian flows through a bottleneck, notably with respect to the time gaps between successive passages. We pinpoint the mechanisms at the origin of these features thanks to simple models that we develop and analyse quantitatively. We disprove the idea that anticorrelations between successive time gaps (i.e. an alternation between shorter ones and longer ones) are a hallmark of a zipper-like intercalation of pedestrian lines and show that they simply result from the possibility that pedestrians from distinct ‘lines’ or directions cross the bottleneck within a short time interval. A second feature concerns the bursts of escapes, i.e. egresses that come in fast succession. Despite the ubiquity of exponential distributions of burst sizes, entailed by a Poisson process, we argue that anomalous (power-law) statistics arise if the bottleneck is nearly congested, albeit only in a tiny portion of parameter space. The generality of the proposed mechanisms implies that similar statistical features should also be observed for other types of particulate flows.
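    The Poisson baseline for burst statistics is easy to reproduce: when time gaps are exponential, burst sizes (runs of gaps below a threshold) are geometrically distributed, i.e. exponential in the discrete sense. A sketch with an assumed unit rate and gap threshold:

```python
import random

def burst_sizes(times, gap_threshold):
    """Group successive passage times into bursts: a burst continues while
    the gap to the next passage stays below gap_threshold."""
    bursts, size = [], 1
    for t0, t1 in zip(times, times[1:]):
        if t1 - t0 < gap_threshold:
            size += 1
        else:
            bursts.append(size)
            size = 1
    bursts.append(size)
    return bursts

# Passages from a unit-rate Poisson process: burst sizes are then
# geometric with termination probability exp(-gap_threshold).
rng = random.Random(5)
t, times = 0.0, []
for _ in range(20000):
    t += rng.expovariate(1.0)
    times.append(t)
sizes = burst_sizes(times, gap_threshold=0.5)
mean_size = sum(sizes) / len(sizes)
```

    Departures from this geometric baseline, such as the power-law burst sizes reported near congestion, signal correlations absent from a plain Poisson process.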

  7. Mechanical model for simulating the conditioning of air in the respiratory tract.

    PubMed

    Bergonse Neto, Nelson; Von Bahten, Luiz Carlos; Moura, Luís Mauro; Coelho, Marlos de Souza; Stori Junior, Wilson de Souza; Bergonse, Gilberto da Fontoura Rey

    2007-01-01

    To create a mechanical model that could be regulated to simulate the conditioning of inspired and expired air with the same normal values of temperature, pressure, and relative humidity as those of the respiratory system of a healthy young man on mechanical ventilation. Using several types of materials, a mechanical device was built and regulated using normal values of vital capacity, tidal volume, maximal inspiratory pressure, positive end-expiratory pressure, and gas temperature in the system. The device was submitted to mechanical ventilation for a period of 29.8 min. The changes in the temperature of the air circulating in the system were recorded every two seconds. The statistical analysis of the data collected revealed that the device was approximately as efficient in the conditioning of air as is the respiratory system of a human being. By the study endpoint, we had developed a mechanical device capable of simulating the conditioning of air in the respiratory tract. The device mimics the conditions of temperature, pressure, and relative humidity seen in the respiratory system of healthy individuals.

  8. ZERODUR strength modeling with Weibull statistical distributions

    NASA Astrophysics Data System (ADS)

    Hartmann, Peter

    2016-07-01

    The decisive influence on breakage strength of brittle materials such as the low expansion glass ceramic ZERODUR is the surface condition. For polished or etched surfaces it is essential whether micro cracks are present and how deep they are. Ground surfaces have many micro cracks caused by the generation process. Here only the depths of the micro cracks are relevant. In any case, the presence and depth of micro cracks are statistical in nature. The Weibull distribution is the model used traditionally for the representation of such data sets. It is based on the weakest link ansatz. The use of the two- or three-parameter Weibull distribution for data representation and reliability prediction depends on the underlying crack generation mechanisms. Before choosing the model for a specific evaluation, some checks should be done. Is there only one mechanism present or is it to be expected that an additional mechanism might contribute deviating results? For ground surfaces the main mechanism is the diamond grains' action on the surface. However, grains breaking from their bonding might be moved by the tool across the surface, introducing a slightly deeper crack. It is not to be expected that these scratches follow the same statistical distribution as the grinding process. Hence, their description with the same distribution parameters is not adequate. Before including them, a dedicated discussion should be performed. If there is additional information available influencing the selection of the model, for example the existence of a maximum crack depth, this should also be taken into account. Micro cracks introduced by small diamond grains on tools working with limited forces cannot be arbitrarily deep. For data obtained with such surfaces the existence of a threshold breakage stress should be part of the hypothesis. This leads to the use of the three-parameter Weibull distribution. 
A differentiation based on the data set alone, without preexisting information, is possible but requires a large data set. With only 20 specimens per sample such differentiation is not possible; it requires 100 or more specimens per set, the more the better. The validity of the statistical evaluation methods is discussed with several examples. These considerations are of special importance because of their consequences for the prognosis methods and results. In particular, the use of the two-parameter Weibull distribution for high-strength surfaces has led to unrealistic results. Extrapolation down to a low acceptable probability of failure covers a wide range in which no data points exist and is mainly influenced by the slope determined by the high-strength specimens. In the past this misconception has prevented the use of brittle materials for stress loads which they could have endured easily.
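    A two-parameter Weibull fit by least squares on the linearized distribution (a Weibull probability plot with median-rank estimates) can be sketched as follows; the breakage-stress data are hypothetical, not ZERODUR measurements:

```python
import math

def weibull_fit_2p(strengths):
    """Two-parameter Weibull fit via the linearized form
    ln(-ln(1-F)) = m*ln(s) - m*ln(s0), using Bernard's median-rank
    estimate for the failure probability F of each ordered specimen."""
    s = sorted(strengths)
    n = len(s)
    xs, ys = [], []
    for i, si in enumerate(s, start=1):
        f = (i - 0.3) / (n + 0.4)          # Bernard's median rank
        xs.append(math.log(si))
        ys.append(math.log(-math.log(1.0 - f)))
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    m = slope                              # Weibull modulus
    s0 = math.exp(-intercept / slope)      # characteristic strength
    return m, s0

# Hypothetical breakage stresses in MPa (illustration only).
data = [52, 58, 61, 64, 66, 69, 71, 74, 78, 85]
m, s0 = weibull_fit_2p(data)
```

    A three-parameter fit would additionally estimate a threshold stress below which no failure occurs, which is exactly the hypothesis the abstract recommends for surfaces with a bounded maximum crack depth.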

  9. A Computational Study of Plastic Deformation in AISI 304 Induced by Surface Mechanical Attrition Treatment

    NASA Astrophysics Data System (ADS)

    Zhang, X. C.; Lu, J.; Shi, S. Q.

    2010-05-01

    As a grain refinement technique based on plastic deformation, surface mechanical attrition treatment (SMAT) has been developed into one of the most effective ways to optimize the mechanical properties of various materials including pure metals and alloys. SMAT can significantly reduce grain size into the nanometer regime in the surface layer of bulk materials, providing tremendous opportunities for improving physical, chemical and mechanical properties of the materials. In this work, a computational model of the SMAT process is presented, in which the Johnson-Cook plasticity model and the finite element method were employed to study the high strain rate, elastic-plastic dynamic process of ball impact on a metallic target. AISI 304 steel with low stacking fault energy was chosen as the target material. First, a random impact model was used to analyze the statistical characteristics of ball impact, and then the plastic deformation behavior and residual stress distribution in AISI 304 stainless steel during SMAT were studied. The simulation results show that the compressive residual stress and vertical deformation of the surface structures were directly affected by the ball impact frequency, incident impact angle and ball diameter used in the SMAT process.

  10. Modeling the atmospheric chemistry of TICs

    NASA Astrophysics Data System (ADS)

    Henley, Michael V.; Burns, Douglas S.; Chynwat, Veeradej; Moore, William; Plitz, Angela; Rottmann, Shawn; Hearn, John

    2009-05-01

    An atmospheric chemistry model that describes the behavior and disposition of environmentally hazardous compounds discharged into the atmosphere was coupled with the transport and diffusion model, SCIPUFF. The atmospheric chemistry model was developed by reducing a detailed atmospheric chemistry mechanism to a simple empirical effective degradation rate term (keff) that is a function of important meteorological parameters such as solar flux, temperature, and cloud cover. Empirically derived keff functions that describe the degradation of target toxic industrial chemicals (TICs) were derived by statistically analyzing data generated from the detailed chemistry mechanism run over a wide range of (typical) atmospheric conditions. To assess and identify areas to improve the developed atmospheric chemistry model, sensitivity and uncertainty analyses were performed to (1) quantify the sensitivity of the model output (TIC concentrations) with respect to changes in the input parameters and (2) improve, where necessary, the quality of the input data based on sensitivity results. The model predictions were evaluated against experimental data. Chamber data were used to remove the complexities of dispersion in the atmosphere.
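    The structure of such an empirical keff model can be sketched as follows; the functional form and constants are invented for illustration and are not the fitted correlations of this work:

```python
import math

def keff(solar_flux, temperature, cloud_cover, a=1e-5, b=0.01, t_ref=298.0):
    """Hypothetical effective degradation rate (1/s) as an empirical
    function of solar flux (W/m^2), temperature (K) and cloud fraction.
    Both the form and the constants a, b are illustrative placeholders."""
    attenuated_flux = solar_flux * (1.0 - 0.75 * cloud_cover)
    return a * attenuated_flux * math.exp(b * (temperature - t_ref))

def concentration(c0, k, t):
    """First-order decay of a TIC concentration: C(t) = C0 * exp(-k*t),
    the term a transport model like SCIPUFF would apply along a plume."""
    return c0 * math.exp(-k * t)

k = keff(solar_flux=500.0, temperature=298.0, cloud_cover=0.2)
c = concentration(1.0, k, t=600.0)   # relative concentration after 10 min
```

    Collapsing the detailed mechanism into a single keff(meteorology) term is what makes the chemistry cheap enough to evaluate inside the dispersion model.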

  11. STEP-TRAMM - A modeling interface for simulating localized rainfall induced shallow landslides and debris flow runout pathways

    NASA Astrophysics Data System (ADS)

    Or, D.; von Ruette, J.; Lehmann, P.

    2017-12-01

    Landslides and subsequent debris-flows initiated by rainfall represent a common natural hazard in mountainous regions. We integrated a landslide hydro-mechanical triggering model with a simple model for debris flow runout pathways and developed a graphical user interface (GUI) to represent these natural hazards at catchment scale at any location. The STEP-TRAMM GUI provides process-based estimates of the initiation locations and sizes of landslide patterns based on digital elevation models (SRTM) linked with high-resolution global soil maps (SoilGrids 250 m resolution) and satellite-based information on rainfall statistics for the selected region. In the preprocessing phase the STEP-TRAMM model estimates soil depth distribution to supplement other soil information for delineating key hydrological and mechanical properties relevant to representing local soil failure. We will illustrate this publicly available GUI and modeling platform to simulate effects of deforestation on landslide hazards in several regions and compare model outcomes with satellite-based information.

  12. Properties of J^P = 1/2^+ baryon octets at low energy

    NASA Astrophysics Data System (ADS)

    Kaur, Amanpreet; Gupta, Pallavi; Upadhyay, Alka

    2017-06-01

    The statistical model in combination with the detailed balance principle is able to phenomenologically calculate and analyze spin- and flavor-dependent properties like magnetic moments (with effective masses, with effective charge, or with both effective mass and effective charge), quark spin polarization and distribution, the strangeness suppression factor, and \overline{d}-\overline{u} asymmetry incorporating the strange sea. The s\overline{s} in the sea is said to be generated via the basic quark mechanism but suppressed by the strange quark mass factor m_s > m_{u,d}. The magnetic moments of the octet baryons are analyzed within the statistical model, by putting emphasis on the SU(3) symmetry-breaking effects generated by the mass difference between the strange and non-strange quarks. The work presented here assumes hadrons with a sea having an admixture of quark gluon Fock states. The results obtained have been compared with theoretical models and experimental data.

  13. Statistical Modeling of Robotic Random Walks on Different Terrain

    NASA Astrophysics Data System (ADS)

    Naylor, Austin; Kinnaman, Laura

    Issues of public safety, especially with crowd dynamics and pedestrian movement, have been modeled by physicists using methods from statistical mechanics over the last few years. Complex decision making of humans moving on different terrains can be modeled using random walks (RW) and correlated random walks (CRW). The effect of different terrains, such as a constant increasing slope, on RW and CRW was explored. LEGO robots were programmed to make RW and CRW with uniform step sizes. Level ground tests demonstrated that the robots had the expected step size distribution and correlation angles (for CRW). The mean square displacement was calculated for each RW and CRW on different terrains and matched expected trends. The step size distribution was determined to change based on the terrain; theoretical predictions for the step size distribution were made for various simple terrains.
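    The RW/CRW comparison can be sketched by estimating the mean square displacement of uncorrelated versus correlated 2D walks with uniform step length; the turning-angle model for the CRW is an assumption of this sketch:

```python
import math
import random

def squared_displacement(steps, rng, step_len=1.0, correlation=0.0):
    """2D walk. correlation = 0 gives a plain RW (fresh uniform heading
    each step); correlation > 0 gives a CRW whose turning angles are
    Gaussian around the previous heading, narrower for higher correlation."""
    x = y = 0.0
    heading = rng.uniform(-math.pi, math.pi)
    for _ in range(steps):
        if correlation == 0.0:
            heading = rng.uniform(-math.pi, math.pi)
        else:
            heading += rng.gauss(0.0, (1.0 - correlation) * math.pi)
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
    return x * x + y * y

rng = random.Random(7)
msd_rw = sum(squared_displacement(100, rng) for _ in range(500)) / 500
msd_crw = sum(squared_displacement(100, rng, correlation=0.8)
              for _ in range(500)) / 500
```

    Directional persistence makes the CRW spread much faster than the RW at the same step count, which is the baseline against which terrain effects would be measured.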

  14. Automatic stage identification of Drosophila egg chamber based on DAPI images

    PubMed Central

    Jia, Dongyu; Xu, Qiuping; Xie, Qian; Mio, Washington; Deng, Wu-Min

    2016-01-01

    The Drosophila egg chamber, whose development is divided into 14 stages, is a well-established model for developmental biology. However, visual stage determination can be a tedious, subjective and time-consuming task prone to errors. Our study presents an objective, reliable and repeatable automated method for quantifying cell features and classifying egg chamber stages based on DAPI images. The proposed approach is composed of two steps: 1) a feature extraction step and 2) a statistical modeling step. The egg chamber features used are egg chamber size, oocyte size, egg chamber ratio and distribution of follicle cells. Methods for determining the onset of the polytene stage and centripetal migration are also discussed. The statistical model uses linear and ordinal regression to explore the stage-feature relationships and classify egg chamber stages. Combined with machine learning, our method has great potential to enable discovery of hidden developmental mechanisms. PMID:26732176

  15. Statistical summaries of fatigue data for design purposes

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.

    1983-01-01

    Two methods are discussed for constructing a design curve on the safe side of fatigue data. Both the tolerance interval and equivalent prediction interval (EPI) concepts provide such a curve while accounting for both the distribution of the estimators in small samples and the data scatter. The EPI is also useful as a mechanism for providing necessary statistics on S-N data for a full reliability analysis which includes uncertainty in all fatigue design factors. Examples of statistical analyses of the general strain life relationship are presented. The tolerance limit and EPI techniques for defining a design curve are demonstrated. Examples using WASPALOY B and RQC-100 data demonstrate that a reliability model could be constructed by considering the fatigue strength and fatigue ductility coefficients as two independent random variables. A technique given for establishing the fatigue strength for high cycle lives relies on extrapolation and also accounts for "run-outs." A reliability model or design value can be specified.
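    The tolerance-limit construction of a design value can be sketched for a single stress level; the data and the tolerance factor k are illustrative (the appropriate k for a given sample size, confidence and population coverage comes from noncentral-t tables):

```python
import math

def lower_tolerance_limit(data, k):
    """One-sided lower tolerance limit xbar - k*s: with the appropriate
    tabulated factor k, it bounds, at a stated confidence, the quantile
    below which only a small fraction of the population falls."""
    n = len(data)
    xbar = sum(data) / n
    s = math.sqrt(sum((x - xbar) ** 2 for x in data) / (n - 1))
    return xbar - k * s

# Hypothetical log10-life data at one strain level (not the WASPALOY B or
# RQC-100 data sets); k = 2.91 is approximately the tabulated one-sided
# factor for n = 10, 95% confidence, 95% coverage.
log_lives = [4.9, 5.0, 5.1, 5.1, 5.2, 5.2, 5.3, 5.3, 5.4, 5.5]
design_value = lower_tolerance_limit(log_lives, k=2.91)
```

    The resulting design value sits below every observation, which is the "safe side" property the abstract requires of a design curve.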

  16. A statistical mechanics approach to computing rare transitions in multi-stable turbulent geophysical flows

    NASA Astrophysics Data System (ADS)

    Laurie, J.; Bouchet, F.

    2012-04-01

    Many turbulent flows undergo sporadic random transitions, after long periods of apparent statistical stationarity. For instance, paths of the Kuroshio [1], the Earth's magnetic field reversal, atmospheric flows [2], MHD experiments [3], 2D turbulence experiments [4,5], and 3D flows [6] show this kind of behavior. The understanding of this phenomenon is extremely difficult due to the complexity, the large number of degrees of freedom, and the non-equilibrium nature of these turbulent flows. It is however a key issue for many geophysical problems. A straightforward study of these transitions, through a direct numerical simulation of the governing equations, is nearly always impracticable. This is mainly a complexity problem, due to the large number of degrees of freedom involved for genuine turbulent flows, and the extremely long time between two transitions. In this talk, we consider two-dimensional and geostrophic turbulent models, with stochastic forces. We consider regimes where two or more attractors coexist. As an alternative to direct numerical simulation, we propose a non-equilibrium statistical mechanics approach to the computation of this phenomenon. Our strategy is based on large deviation theory [7], derived from a path integral representation of the stochastic process. Among the trajectories connecting two non-equilibrium attractors, we determine the most probable one. Moreover, we also determine the transition rates, and in which cases this most probable trajectory is a typical one. Interestingly, we prove that in the class of models we consider, a mechanism exists for diffusion over sets of connected attractors. For the type of stochastic forces that allows this diffusion, the transition between attractors is not a rare event. It is then very difficult to characterize the flow as bistable. However for another class of stochastic forces, this diffusion mechanism is prevented, and genuine bistability or multi-stability is observed. 
We discuss how these results are probably connected to the long debated existence of multi-stability in the atmosphere and oceans.

  17. Applied immuno-epidemiological research: an approach for integrating existing knowledge into the statistical analysis of multiple immune markers.

    PubMed

    Genser, Bernd; Fischer, Joachim E; Figueiredo, Camila A; Alcântara-Neves, Neuza; Barreto, Mauricio L; Cooper, Philip J; Amorim, Leila D; Saemann, Marcus D; Weichhart, Thomas; Rodrigues, Laura C

    2016-05-20

    Immunologists often measure several correlated immunological markers, such as concentrations of different cytokines produced by different immune cells and/or measured under different conditions, to draw insights from complex immunological mechanisms. Although there have been recent methodological efforts to improve the statistical analysis of immunological data, a framework is still needed for the simultaneous analysis of multiple, often correlated, immune markers. This framework would allow the immunologists' hypotheses about the underlying biological mechanisms to be integrated. We present an analytical approach for statistical analysis of correlated immune markers, such as those commonly collected in modern immuno-epidemiological studies. We demonstrate i) how to deal with interdependencies among multiple measurements of the same immune marker, ii) how to analyse association patterns among different markers, iii) how to aggregate different measures and/or markers to immunological summary scores, iv) how to model the inter-relationships among these scores, and v) how to use these scores in epidemiological association analyses. We illustrate the application of our approach to multiple cytokine measurements from 818 children enrolled in a large immuno-epidemiological study (SCAALA Salvador), which aimed to quantify the major immunological mechanisms underlying atopic diseases or asthma. We demonstrate how to aggregate systematically the information captured in multiple cytokine measurements to immunological summary scores aimed at reflecting the presumed underlying immunological mechanisms (Th1/Th2 balance and immune regulatory network). We show how these aggregated immune scores can be used as predictors in regression models with outcomes of immunological studies (e.g. specific IgE) and compare the results to those obtained by a traditional multivariate regression approach. 
The proposed analytical approach may be especially useful to quantify complex immune responses in immuno-epidemiological studies, where investigators examine the relationship among epidemiological patterns, immune response, and disease outcomes.
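    The score-building step (iii above) can be sketched as a simple z-score aggregation. The cytokine names and values below are hypothetical, and the real framework uses model-based weighting rather than a plain average:

```python
import statistics

def z_scores(values):
    # standardize one marker across children (sample standard deviation)
    mu, sd = statistics.mean(values), statistics.stdev(values)
    return [(v - mu) / sd for v in values]

# hypothetical Th2-type cytokine levels for five children
il4  = [1.2, 3.4, 2.2, 0.8, 4.1]
il5  = [0.9, 2.8, 2.0, 1.1, 3.5]
il13 = [2.1, 5.0, 3.9, 1.5, 6.2]

# per-child summary score: average of the standardized markers
th2_score = [sum(z) / 3 for z in zip(z_scores(il4), z_scores(il5), z_scores(il13))]
```

    Such a score could then enter a regression model as a single predictor in place of the correlated raw measurements.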

  18. A Theory of Density Layering in Stratified Turbulence using Statistical State Dynamics

    NASA Astrophysics Data System (ADS)

    Fitzgerald, J.; Farrell, B.

    2016-12-01

    Stably stratified turbulent fluids commonly develop density structures that are layered in the vertical direction (e.g., Manucharyan et al., 2015). Within layers, density is approximately constant and stratification is weak. Between layers, density varies rapidly and stratification is strong. A common explanation for the existence of layers invokes the negative diffusion mechanism of Phillips (1972) and Posmentier (1977). The physical principle underlying this mechanism is that the flux-gradient relationship connecting the turbulent buoyancy fluxes to the background stratification must have the special property that the fluxes weaken as the gradient strengthens. Under these conditions, the evolution of the stratification is governed by a negative diffusion problem, which gives rise to spontaneous layer formation. In previous work on stratified layering, this flux-gradient property is often assumed (e.g., Posmentier, 1977) or drawn from phenomenological models of turbulence (e.g., Balmforth et al., 1998). In this work we develop the theoretical underpinnings of layer formation by applying stochastic turbulence modeling and statistical state dynamics (SSD) to predict the flux-gradient relation and analyze layer formation directly from the equations of motion. We show that for stochastically forced homogeneous 2D Boussinesq turbulence, the flux-gradient relation can be obtained analytically and indicates that the fluxes always strengthen with stratification. The Phillips mechanism thus does not operate in this maximally simplified scenario. However, when the problem is augmented to include a large-scale background shear, we show that the flux-gradient relationship is modified so that the fluxes weaken with stratification. Sheared and stratified 2D Boussinesq turbulence thus spontaneously forms density layers through the Phillips mechanism.
Using SSD (Farrell & Ioannou 2003), we obtain a closed, deterministic dynamics for the stratification and the statistical turbulent state. We show that density layers form as a linear instability of the sheared turbulence, associated with a supercritical bifurcation. We further show that SSD predicts the nonlinear equilibration and maintenance of the layers, and captures the phenomena of layer growth and mergers (Radko, 2007).
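    The negative-diffusion idea can be illustrated with a one-dimensional toy model (our own sketch, not the paper's SSD calculation): when the effective diffusivity implied by the flux-gradient relation is negative, a small perturbation to the stratification grows instead of decaying.

```python
# Toy 1-D sketch of the Phillips negative-diffusion mechanism.
# Stratification N(z) evolves as dN/dt = K * d2N/dz2 on a periodic grid;
# K > 0 is ordinary diffusion, K < 0 the layer-forming case.
import math

def evolve(K, steps=1000, nz=64, dt=1e-3):
    # uniform stratification plus a small sinusoidal perturbation
    N = [1.0 + 0.01 * math.sin(2 * math.pi * i / nz) for i in range(nz)]
    for _ in range(steps):
        lap = [N[(i + 1) % nz] - 2 * N[i] + N[(i - 1) % nz] for i in range(nz)]
        N = [N[i] + dt * K * lap[i] for i in range(nz)]
    mean = sum(N) / nz
    return max(abs(x - mean) for x in N)  # perturbation amplitude

amp_pos = evolve(K=+1.0)   # ordinary diffusion: perturbation decays
amp_neg = evolve(K=-1.0)   # negative diffusion: perturbation grows (layering)
```

    In practice the instability is arrested at small scales by physics this toy model omits; the short integration here merely contrasts growth with decay.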

  19. Competing Thermodynamic and Dynamic Factors Select Molecular Assemblies on a Gold Surface

    NASA Astrophysics Data System (ADS)

    Haxton, Thomas K.; Zhou, Hui; Tamblyn, Isaac; Eom, Daejin; Hu, Zonghai; Neaton, Jeffrey B.; Heinz, Tony F.; Whitelam, Stephen

    2013-12-01

    Controlling the self-assembly of surface-adsorbed molecules into nanostructures requires understanding physical mechanisms that act across multiple length and time scales. By combining scanning tunneling microscopy with hierarchical ab initio and statistical mechanical modeling of 1,4-substituted benzenediamine (BDA) molecules adsorbed on a gold (111) surface, we demonstrate that apparently simple nanostructures are selected by a subtle competition of thermodynamics and dynamics. Of the collection of possible BDA nanostructures mechanically stabilized by hydrogen bonding, the interplay of intermolecular forces, surface modulation, and assembly dynamics select at low temperature a particular subset: low free energy oriented linear chains of monomers and high free energy branched chains.

  20. Modeling the complexity of acoustic emission during intermittent plastic deformation: Power laws and multifractal spectra

    NASA Astrophysics Data System (ADS)

    Kumar, Jagadish; Ananthakrishna, G.

    2018-01-01

    Scale-invariant power-law distributions for acoustic emission signals are ubiquitous in several plastically deforming materials. However, power-law distributions for acoustic emission energies are reported in distinctly different plastically deforming situations, such as hcp and fcc single and polycrystalline samples exhibiting smooth stress-strain curves and dilute metallic alloys exhibiting discontinuous flow. This is surprising, since the underlying dislocation mechanisms in these two types of deformation are very different. So far, there have been no models that predict the power-law statistics for discontinuous flow. Furthermore, the statistics of the acoustic emission signals in jerky flow are even more complex, requiring multifractal measures for a proper characterization; no model explains this complex statistics either. Here we address the problem of statistical characterization of the acoustic emission signals associated with the three types of Portevin-Le Chatelier bands. Following our recently proposed general framework for calculating acoustic emission, we set up a wave equation for the elastic degrees of freedom with the plastic strain rate as a source term. The energy dissipated during acoustic emission is represented by the Rayleigh dissipation function. Using the plastic strain rate obtained from the Ananthakrishna model for the Portevin-Le Chatelier effect, we compute the acoustic emission signals associated with the three Portevin-Le Chatelier bands and the Lüders-like band. These calculated acoustic emission signals are then used for further statistical characterization. Our results show that the model predicts power-law statistics for the acoustic emission signals associated with all three types of Portevin-Le Chatelier bands, with the exponent values increasing with increasing strain rate. The calculated multifractal spectra of these acoustic emission signals have the largest spread for type C bands, with the spread decreasing for types B and A. We further show that the acoustic emission signals associated with the Lüders-like band also exhibit a power-law distribution and multifractality.
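    Power-law exponents like those reported here are commonly estimated by maximum likelihood (the continuous MLE of Clauset-style analyses). The following self-check on synthetic event energies is our own illustration, not the authors' analysis code:

```python
# MLE of a power-law exponent: P(E) ~ E^-alpha for E >= e_min.
import math, random

def powerlaw_mle(energies, e_min):
    tail = [e for e in energies if e >= e_min]
    return 1.0 + len(tail) / sum(math.log(e / e_min) for e in tail)

# synthetic AE energies with a known exponent, via inverse-CDF sampling:
# E = e_min * u^(-1/(alpha-1)) with u uniform on (0, 1]
random.seed(0)
alpha_true = 2.5
sample = [(1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0)) for _ in range(20000)]
alpha_hat = powerlaw_mle(sample, e_min=1.0)
```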

  1. Modeling the complexity of acoustic emission during intermittent plastic deformation: Power laws and multifractal spectra.

    PubMed

    Kumar, Jagadish; Ananthakrishna, G

    2018-01-01

    Scale-invariant power-law distributions for acoustic emission signals are ubiquitous in several plastically deforming materials. However, power-law distributions for acoustic emission energies are reported in distinctly different plastically deforming situations, such as hcp and fcc single and polycrystalline samples exhibiting smooth stress-strain curves and dilute metallic alloys exhibiting discontinuous flow. This is surprising, since the underlying dislocation mechanisms in these two types of deformation are very different. So far, there have been no models that predict the power-law statistics for discontinuous flow. Furthermore, the statistics of the acoustic emission signals in jerky flow are even more complex, requiring multifractal measures for a proper characterization; no model explains this complex statistics either. Here we address the problem of statistical characterization of the acoustic emission signals associated with the three types of Portevin-Le Chatelier bands. Following our recently proposed general framework for calculating acoustic emission, we set up a wave equation for the elastic degrees of freedom with the plastic strain rate as a source term. The energy dissipated during acoustic emission is represented by the Rayleigh dissipation function. Using the plastic strain rate obtained from the Ananthakrishna model for the Portevin-Le Chatelier effect, we compute the acoustic emission signals associated with the three Portevin-Le Chatelier bands and the Lüders-like band. These calculated acoustic emission signals are then used for further statistical characterization. Our results show that the model predicts power-law statistics for the acoustic emission signals associated with all three types of Portevin-Le Chatelier bands, with the exponent values increasing with increasing strain rate. The calculated multifractal spectra of these acoustic emission signals have the largest spread for type C bands, with the spread decreasing for types B and A. We further show that the acoustic emission signals associated with the Lüders-like band also exhibit a power-law distribution and multifractality.

  2. Mechanism of the reaction, CH4+O(1D2)→CH3+OH, studied by ultrafast and state-resolved photolysis/probe spectroscopy of the CH4·O3 van der Waals complex

    NASA Astrophysics Data System (ADS)

    Miller, C. Cameron; van Zee, Roger D.; Stephenson, John C.

    2001-01-01

    The mechanism of the reaction CH4+O(1D2)→CH3+OH was investigated by ultrafast, time-resolved and state-resolved experiments. In the ultrafast experiments, short ultraviolet pulses photolyzed ozone in the CH4·O3 van der Waals complex to produce O(1D2). The ensuing reaction with CH4 was monitored by measuring the appearance rate of OH(v=0,1;J,Ω,Λ) by laser-induced fluorescence, through the OH A←X transition, using short probe pulses. These spectrally broad pulses, centered between 307 and 316 nm, probe many different OH rovibrational states simultaneously. At each probe wavelength, both a fast and a slow rise time were evident in the fluorescence signal, and the ratio of the fast-to-slow signal varied with probe wavelength. The distribution of OH(v,J,Ω,Λ) states, Pobs(v,J,Ω,Λ), was determined by laser-induced fluorescence using a high-resolution, tunable dye laser. The Pobs(v,J,Ω,Λ) data and the time-resolved data were analyzed under the assumption that different formation times represent different reaction mechanisms and that each mechanism produces a characteristic rovibrational distribution. The state-resolved and the time-resolved data can be fit independently using a two-mechanism model: Pobs(v,J,Ω,Λ) can be decomposed into two components, and the appearance of OH can be fit by two exponential rise times. However, these independent analyses are not mutually consistent. The time-resolved and state-resolved data can be consistently fit using a three-mechanism model. The OH appearance signals, at all probe wavelengths, were fit with times τfast≈0.2 ps, τinter≈0.5 ps and τslow≈5.4 ps. The slowest of these three is the rate for dissociation of a vibrationally excited methanol intermediate (CH3OH*) predicted by statistical theory after complete intramolecular energy redistribution following insertion of O(1D2) into CH4.
The Pobs(v,J,Ω,Λ) was decomposed into three components, each with a linear surprisal, under the assumption that the mechanism producing OH at a statistical rate would be characterized by a statistical prior. Dissociation of a CH4O* intermediate before complete energy randomization was identified as producing OH at the intermediate rate and was associated with a population distribution with more rovibrational energy than the slow mechanism. The third mechanism produces OH promptly with a cold rovibrational distribution, indicative of a collinear abstraction mechanism. After these identifications were made, it was possible to predict the fraction of signal associated with each mechanism at different probe wavelengths in the ultrafast experiment, and the predictions proved consistent with measured appearance signals. This model also reconciles data from a variety of previous experiments. While this model is the simplest that is consistent with the data, it is not definitive for several reasons. First, the appearance signals measured in these experiments probe simultaneously many OH(v,J,Ω,Λ) states, which would tend to obfuscate differences in the appearance rate of specific rovibrational states. Second, only about half of the OH(v,J,Ω,Λ) states populated by this reaction could be probed by laser-induced fluorescence through the OH A←X band with our apparatus. Third, the cluster environment might influence the dynamics compared to the free bimolecular reaction.
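    The three-mechanism appearance model can be written as a sum of exponential rises with the reported time constants; a minimal sketch, in which the branching fractions are hypothetical placeholders (the abstract does not report them):

```python
# S(t) = sum_i a_i * (1 - exp(-t / tau_i)), with t in picoseconds
import math

TAUS = (0.2, 0.5, 5.4)   # ps: fast, intermediate, slow (from the abstract)
AMPS = (0.3, 0.3, 0.4)   # hypothetical branching fractions, summing to 1

def appearance(t):
    return sum(a * (1.0 - math.exp(-t / tau)) for a, tau in zip(AMPS, TAUS))

early = appearance(0.5)   # fast and intermediate channels dominate
late = appearance(30.0)   # all channels saturated: approaches sum of amplitudes
```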

  3. Stochastic Short-term High-resolution Prediction of Solar Irradiance and Photovoltaic Power Output

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melin, Alexander M.; Olama, Mohammed M.; Dong, Jin

    The increased penetration of solar photovoltaic (PV) energy sources into electric grids has increased the need for accurate modeling and prediction of solar irradiance and power production. Existing modeling and prediction techniques focus on long-term low-resolution prediction over minutes to years. This paper examines the stochastic modeling and short-term high-resolution prediction of solar irradiance and PV power output. We propose a stochastic state-space model to characterize the behaviors of solar irradiance and PV power output. This prediction model is suitable for the development of optimal power controllers for PV sources. A filter-based expectation-maximization and Kalman filtering mechanism is employed to estimate the parameters and states in the state-space model. The mechanism results in a finite dimensional filter which only uses the first and second order statistics. The structure of the scheme contributes to a direct prediction of the solar irradiance and PV power output without any linearization process or simplifying assumptions of the signal's model. This enables the system to accurately predict small as well as large fluctuations of the solar signals. The mechanism is recursive, allowing the solar irradiance and PV power to be predicted online from measurements. The mechanism is tested using solar irradiance and PV power measurement data collected locally in our lab.
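    A minimal scalar Kalman filter conveys the flavor of the prediction step; the paper's filter-based expectation-maximization mechanism is considerably richer, and the AR(1) model and noise levels here are our own toy choices:

```python
# One-step-ahead prediction of a noisy AR(1) "irradiance" signal.
import random

def kalman_predict(ys, a=0.95, q=0.1, r=0.5):
    x, p = 0.0, 1.0           # state estimate and its variance
    preds = []
    for y in ys:
        x, p = a * x, a * a * p + q       # predict
        preds.append(x)                   # one-step-ahead prediction
        k = p / (p + r)                   # Kalman gain
        x, p = x + k * (y - x), (1 - k) * p  # update with the measurement
    return preds

random.seed(1)
truth, ys, x = [], [], 5.0
for _ in range(500):
    x = 0.95 * x + random.gauss(0, 0.3)   # latent signal
    truth.append(x)
    ys.append(x + random.gauss(0, 0.7))   # noisy measurement

preds = kalman_predict(ys)
mse_kf = sum((p - t) ** 2 for p, t in zip(preds, truth)) / len(truth)
# naive persistence forecast: predict tomorrow's signal as today's measurement
mse_naive = sum((y - t) ** 2 for y, t in zip(ys[:-1], truth[1:])) / (len(truth) - 1)
```

    The filter's one-step predictions should beat naive persistence because the measurement noise is partially averaged out.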

  4. Failure Predictions for VHTR Core Components using a Probabilistic Continuum Damage Mechanics Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fok, Alex

    2013-10-30

    The proposed work addresses the key research need for the development of constitutive models and overall failure models for graphite and high temperature structural materials, with the long-term goal being to maximize the design life of the Next Generation Nuclear Plant (NGNP). To this end, the capability of a Continuum Damage Mechanics (CDM) model, which has been used successfully for modeling fracture of virgin graphite, will be extended as a predictive and design tool for the core components of the very high-temperature reactor (VHTR). Specifically, irradiation and environmental effects pertinent to the VHTR will be incorporated into the model to allow fracture of graphite and ceramic components under in-reactor conditions to be modeled explicitly using the finite element method. The model uses a combined stress-based and fracture mechanics-based failure criterion, so it can simulate both the initiation and propagation of cracks. Modern imaging techniques, such as x-ray computed tomography and digital image correlation, will be used during material testing to help define the baseline material damage parameters. Monte Carlo analysis will be performed to address inherent variations in material properties, the aim being to reduce the arbitrariness and uncertainties associated with the current statistical approach. The results can potentially contribute to the current development of American Society of Mechanical Engineers (ASME) codes for the design and construction of VHTR core components.
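    The Monte Carlo step can be sketched as propagating strength scatter through a simple failure criterion; the stress level and strength distribution below are hypothetical toy numbers, not graphite data:

```python
# Toy Monte Carlo failure probability: draw a strength for each trial
# and count how often it falls below the applied stress.
import random

random.seed(3)
applied_stress = 20.0    # MPa, hypothetical service stress
n_trials = 100_000
failures = 0
for _ in range(n_trials):
    strength = random.gauss(30.0, 4.0)   # MPa, hypothetical scatter
    if strength <= applied_stress:
        failures += 1
p_fail = failures / n_trials
```

    Here the analytic answer is the normal tail probability Φ((20−30)/4) ≈ 0.62%, which the simulation should approach.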

  5. Thermodynamic evolution far from equilibrium

    NASA Astrophysics Data System (ADS)

    Khantuleva, Tatiana A.

    2018-05-01

    The presented model of thermodynamic evolution of an open system far from equilibrium is based on modern results of nonequilibrium statistical mechanics, the nonlocal theory of nonequilibrium transport developed by the author, and the Speed Gradient principle introduced in the theory of adaptive control. The transition to a description of the system's internal structural evolution at the mesoscopic level offers new insight into the stability problem for non-equilibrium processes. The new model is applied to a number of specific tasks.

  6. Enhanced Deformation of Azobenzene-Modified Liquid Crystal Polymers under Dual Wavelength Exposure: A Photophysical Model

    NASA Astrophysics Data System (ADS)

    Liu, Ling; Onck, Patrick R.

    2017-08-01

    Azobenzene-embedded liquid crystal polymers can undergo mechanical deformation in response to ultraviolet (UV) light. The natural rodlike trans state azobenzene absorbs UV light and isomerizes to a bentlike cis state, which disturbs the order of the polymer network, leading to an anisotropic deformation. The current consensus is that the magnitude of the photoinduced deformation is related to the statistical building up of molecules in the cis state. However, a recent experimental study [Liu and Broer, Nat. Commun. 6, 8334 (2015); doi:10.1038/ncomms9334] shows that a drastic (fourfold) increase of the photoinduced deformation can be generated by exposing the samples simultaneously to 365 nm (UV) and 455 nm (visible) light. To elucidate the physical mechanism that drives this increase, we develop a two-light attenuation model and an optomechanical constitutive relation that accounts not only for the statistical accumulation of cis azobenzenes, but also for the dynamic trans-cis-trans oscillatory isomerization process. Our experimentally calibrated model predicts that the optimal single-wavelength exposure is 395 nm light, a pronounced shift towards the visible spectrum. In addition, we identify a range of optimal combinations of two-wavelength lights that generate a favorable response for a given amount of injected energy. Our model provides mechanistic insight into the different (multi)wavelength exposures used in experiments and, at the same time, opens new avenues towards enhanced, multiwavelength optomechanical behavior.
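    A two-state rate balance conveys the core of the dual-wavelength effect (our own toy model; all cross sections are hypothetical): visible light depopulates the cis state, which lowers the steady-state cis fraction but raises the trans-cis-trans cycling flux that drives the oscillatory deformation.

```python
# dc/dt = k_tc*(1-c) - k_ct*c; each rate sums the UV and visible
# contributions, so the steady state is c* = k_tc / (k_tc + k_ct).
def photostationary(i_uv, i_vis):
    k_tc = 1.0 * i_uv + 0.05 * i_vis           # trans->cis (hypothetical)
    k_ct = 0.1 * i_uv + 0.80 * i_vis + 0.02    # cis->trans + thermal relaxation
    c = k_tc / (k_tc + k_ct)                   # steady-state cis fraction
    return c, k_tc * (1.0 - c)                 # fraction and cycling flux

c_uv, flux_uv = photostationary(1.0, 0.0)      # UV only
c_dual, flux_dual = photostationary(1.0, 1.0)  # UV + visible
```

    With these toy rates, dual illumination lowers the cis fraction yet multiplies the cycling flux severalfold, mirroring the qualitative mechanism described above.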

  7. The statistical mechanics of complex signaling networks: nerve growth factor signaling

    NASA Astrophysics Data System (ADS)

    Brown, K. S.; Hill, C. C.; Calero, G. A.; Myers, C. R.; Lee, K. H.; Sethna, J. P.; Cerione, R. A.

    2004-10-01

    The inherent complexity of cellular signaling networks and their importance to a wide range of cellular functions necessitates the development of modeling methods that can be applied toward making predictions and highlighting the appropriate experiments to test our understanding of how these systems are designed and function. We use methods of statistical mechanics to extract useful predictions for complex cellular signaling networks. A key difficulty with signaling models is that, while significant effort is being made to experimentally measure the rate constants for individual steps in these networks, many of the parameters required to describe their behavior remain unknown or at best represent estimates. To establish the usefulness of our approach, we have applied our methods toward modeling the nerve growth factor (NGF)-induced differentiation of neuronal cells. In particular, we study the actions of NGF and mitogenic epidermal growth factor (EGF) in rat pheochromocytoma (PC12) cells. Through a network of intermediate signaling proteins, each of these growth factors stimulates extracellular regulated kinase (Erk) phosphorylation with distinct dynamical profiles. Using our modeling approach, we are able to predict the influence of specific signaling modules in determining the integrated cellular response to the two growth factors. Our methods also raise some interesting insights into the design and possible evolution of cellular systems, highlighting an inherent property of these systems that we call 'sloppiness.'
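    "Sloppiness" shows up as a least-squares cost Hessian whose eigenvalues span many orders of magnitude. A two-parameter toy fit of nearly degenerate exponentials (our own illustration, not the paper's signaling model) already exhibits the stiff/sloppy split:

```python
# Gauss-Newton Hessian H = 2 * sum_t g g^T for y(t) = exp(-k1 t) + exp(-k2 t),
# evaluated at the true parameters; closed-form 2x2 eigenvalues.
import math

def hessian_eigs(k1, k2, times):
    h11 = h12 = h22 = 0.0
    for t in times:
        g1 = -t * math.exp(-k1 * t)   # dy/dk1
        g2 = -t * math.exp(-k2 * t)   # dy/dk2
        h11 += 2 * g1 * g1
        h12 += 2 * g1 * g2
        h22 += 2 * g2 * g2
    tr, det = h11 + h22, h11 * h22 - h12 * h12
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    return (tr + disc) / 2, (tr - disc) / 2   # (stiff, sloppy) eigenvalues

times = [0.1 * i for i in range(1, 51)]
lam_max, lam_min = hessian_eigs(1.0, 1.1, times)   # nearly degenerate rates
```

    The stiff direction (k1 + k2) is pinned by the data while the sloppy one (k1 − k2) is hundreds of times softer, so predictions can be tight even when individual rate constants are poorly determined.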

  8. Biophysically inspired model for functionalized nanocarrier adhesion to cell surface: roles of protein expression and mechanical factors

    NASA Astrophysics Data System (ADS)

    Ramakrishnan, N.; Tourdot, Richard W.; Eckmann, David M.; Ayyaswamy, Portonovo S.; Muzykantov, Vladimir R.; Radhakrishnan, Ravi

    2016-06-01

    In order to achieve selective targeting of affinity-ligand coated nanoparticles to the target tissue, it is essential to understand the key mechanisms that govern their capture by the target cell. Next-generation pharmacokinetic (PK) models that systematically account for proteomic and mechanical factors can accelerate the design, validation and translation of targeted nanocarriers (NCs) in the clinic. Towards this objective, we have developed a computational model to delineate the roles played by target protein expression and mechanical factors of the target cell membrane in determining the avidity of functionalized NCs to live cells. Model results show quantitative agreement with in vivo experiments when specific and non-specific contributions to NC binding are taken into account. The specific contributions are accounted for through extensive simulations of multivalent receptor-ligand interactions, membrane mechanics and entropic factors such as membrane undulations and receptor translation. The computed NC avidity is strongly dependent on ligand density, receptor expression, bending mechanics of the target cell membrane, as well as entropic factors associated with the membrane and the receptor motion. Our computational model can predict the in vivo targeting levels of the intracellular adhesion molecule-1 (ICAM1)-coated NCs targeted to the lung, heart, kidney, liver and spleen of mouse, when the contributions due to endothelial capture are accounted for. Including the effects of other cells (such as monocytes) does not improve the model predictions at steady state. We demonstrate the predictive utility of our model by predicting partitioning coefficients of functionalized NCs in mice and human tissues and report the statistical accuracy of our model predictions under different scenarios.
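    The statistical-mechanics flavor of the avidity calculation can be sketched as a Boltzmann-weighted sum over multivalent bound states. The bond free energy and copy numbers below are hypothetical, and the real model adds membrane mechanics and entropic corrections this sketch omits:

```python
# Relative avidity as a partition-function sum over the number k of
# formed ligand-receptor bonds (unbound state normalized to Z = 1).
import math

def avidity(n_ligands, n_receptors, dg_bond=-4.0, kT=1.0):
    return sum(
        # ways to pair k ligands with k receptors, times the Boltzmann weight
        math.comb(n_ligands, k) * math.comb(n_receptors, k) * math.factorial(k)
        * math.exp(-k * dg_bond / kT)
        for k in range(1, min(n_ligands, n_receptors) + 1)
    )

low = avidity(5, 5)      # sparse ligands and receptors
high = avidity(10, 10)   # denser coating and expression
```

    Avidity grows far faster than linearly with ligand and receptor numbers, which is the multivalency effect underlying the strong dependence on ligand density and receptor expression noted above.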

  9. Challenges in developing methods for quantifying the effects of weather and climate on water-associated diseases: A systematic review

    PubMed Central

    Armstrong, Ben; Fleming, Lora E.; Elson, Richard; Kovats, Sari; Vardoulakis, Sotiris; Nichols, Gordon L.

    2017-01-01

    Infectious diseases attributable to unsafe water supply, sanitation and hygiene (e.g. cholera, leptospirosis, giardiasis) remain an important cause of morbidity and mortality, especially in low-income countries. Climate and weather factors are known to affect the transmission and distribution of infectious diseases, and statistical and mathematical modelling are continuously developing to investigate the impact of weather and climate on water-associated diseases. There has been little critical analysis of the methodological approaches. Our objective is to review and summarize statistical and modelling methods used to investigate the effects of weather and climate on infectious diseases associated with water, in order to identify limitations and knowledge gaps in the development of new methods. We conducted a systematic review of English-language papers published from 2000 to 2015. Search terms included concepts related to water-associated diseases, weather and climate, statistical, epidemiological and modelling methods. We found 102 full-text papers that met our criteria and were included in the analysis. The most commonly used methods were grouped in two clusters: process-based models (PBM) and time series and spatial epidemiology (TS-SE). In general, PBM methods were employed when the bio-physical mechanism of the pathogen under study was relatively well known (e.g. Vibrio cholerae); TS-SE tended to be used when the specific environmental mechanisms were unclear (e.g. Campylobacter). Important data and methodological challenges emerged, with implications for surveillance and control of water-associated infections. The most common limitations comprised: non-inclusion of key factors (e.g. biological mechanism, demographic heterogeneity, human behavior), reporting bias, poor data quality, and collinearity in exposures. Furthermore, the methods often did not distinguish among the multiple sources of time-lags (e.g. patient physiology, reporting bias, healthcare access) between environmental drivers/exposures and disease detection. Key areas of future research include: disentangling the complex effects of weather/climate on each exposure-health outcome pathway (e.g. person-to-person vs environment-to-person), and linking weather data to individual cases longitudinally. PMID:28604791
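    A common TS-SE step is identifying the exposure-to-onset lag. The synthetic example below (our own construction, not from the review) recovers a known two-week lag by cross-correlating weekly case counts with lagged temperature:

```python
# Synthetic weekly series: cases respond to temperature LAG weeks earlier.
import math, random

random.seed(2)
weeks = 300
temp = [10 + 8 * math.sin(2 * math.pi * w / 52) + random.gauss(0, 4) for w in range(weeks)]
LAG = 2
cases = [0.0] * weeks
for w in range(LAG, weeks):
    cases[w] = 20 + 1.5 * temp[w - LAG] + random.gauss(0, 3)

def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

window = range(6, weeks)   # common window where every candidate lag is defined
def corr_at(lag):
    return corr([cases[w] for w in window], [temp[w - lag] for w in window])

best_lag = max(range(6), key=corr_at)
```

    Real analyses face the complications listed above (multiple lag sources, reporting bias, seasonality shared by exposure and outcome), which is why distributed-lag regression models are usually preferred to raw cross-correlation.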

  10. Challenges in developing methods for quantifying the effects of weather and climate on water-associated diseases: A systematic review.

    PubMed

    Lo Iacono, Giovanni; Armstrong, Ben; Fleming, Lora E; Elson, Richard; Kovats, Sari; Vardoulakis, Sotiris; Nichols, Gordon L

    2017-06-01

    Infectious diseases attributable to unsafe water supply, sanitation and hygiene (e.g. cholera, leptospirosis, giardiasis) remain an important cause of morbidity and mortality, especially in low-income countries. Climate and weather factors are known to affect the transmission and distribution of infectious diseases, and statistical and mathematical modelling are continuously developing to investigate the impact of weather and climate on water-associated diseases. There has been little critical analysis of the methodological approaches. Our objective is to review and summarize statistical and modelling methods used to investigate the effects of weather and climate on infectious diseases associated with water, in order to identify limitations and knowledge gaps in the development of new methods. We conducted a systematic review of English-language papers published from 2000 to 2015. Search terms included concepts related to water-associated diseases, weather and climate, statistical, epidemiological and modelling methods. We found 102 full-text papers that met our criteria and were included in the analysis. The most commonly used methods were grouped in two clusters: process-based models (PBM) and time series and spatial epidemiology (TS-SE). In general, PBM methods were employed when the bio-physical mechanism of the pathogen under study was relatively well known (e.g. Vibrio cholerae); TS-SE tended to be used when the specific environmental mechanisms were unclear (e.g. Campylobacter). Important data and methodological challenges emerged, with implications for surveillance and control of water-associated infections. The most common limitations comprised: non-inclusion of key factors (e.g. biological mechanism, demographic heterogeneity, human behavior), reporting bias, poor data quality, and collinearity in exposures. Furthermore, the methods often did not distinguish among the multiple sources of time-lags (e.g. patient physiology, reporting bias, healthcare access) between environmental drivers/exposures and disease detection. Key areas of future research include: disentangling the complex effects of weather/climate on each exposure-health outcome pathway (e.g. person-to-person vs environment-to-person), and linking weather data to individual cases longitudinally.

  11. Numerical and Statistical Analysis of Fractures in Mechanically Dissimilar Rocks of Limestone Interbedded with Shale from Nash Point in Bristol Channel, South Wales, UK.

    NASA Astrophysics Data System (ADS)

    Adeoye-Akinde, K.; Gudmundsson, A.

    2017-12-01

    Heterogeneity and anisotropy, especially with layered strata within the same reservoir, make the geometry and permeability of an in-situ fracture network challenging to forecast. This study examines outcrops analogous to reservoir rocks for a better understanding of in-situ fracture networks and permeability, especially fracture formation, propagation, and arrest/deflection. Here, fracture geometry (e.g. length and aperture) from interbedded limestone and shale is combined with statistical and numerical modelling (using the Finite Element Method) to better forecast fracture network properties and permeability. The main aim is to bridge the gap between fracture data obtained at the core level (cm-scale) and at the seismic level (km-scale). We analysed the geometric properties of over 250 fractures from the Blue Lias at Nash Point, UK. As fractures propagate, energy is required to keep them growing, and according to the laws of thermodynamics, this energy can be linked to entropy. As fractures grow, entropy increases; accordingly, the results show a strong linear correlation between entropy and the scaling exponents of the fracture length and aperture-size distributions. Numerical modelling is used to simulate the stress/fracture behaviour in mechanically dissimilar rocks. Results show that the orientation of the maximum principal compressive stress changes in the host rock as the propagating fracture tip, with its induced stress field, approaches a more compliant (shale) layer. This behaviour can be related to the three mechanisms of fracture arrest/deflection at an interface, namely elastic mismatch, stress barrier, and Cook-Gordon debonding. Tensile stress concentrates at the contact between the stratigraphic layers, ahead of and around the propagating fracture. However, as shale stiffens with time, the stresses concentrated at the contact start to dissipate into it. This can happen in nature through diagenesis and with greater depth of burial. This study also investigates how induced fractures propagate and interact with existing discontinuities in layered rocks using analogue modelling. Further work will introduce the Maximum Entropy Method for more accurate statistical modelling. This method is particularly useful for forecasting likely fracture-size probability distributions from incomplete subsurface information.
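    The reported entropy-exponent link can be probed on synthetic fracture lengths (our own sketch, not the authors' data): a shallower power-law exponent spreads lengths over more logarithmic bins and so raises the Shannon entropy.

```python
# Compare the MLE scaling exponent and log-binned Shannon entropy of
# two synthetic power-law length samples, P(l) ~ l^-a for l >= 1.
import math, random

def scaling_exponent(lengths, l_min=1.0):
    return 1.0 + len(lengths) / sum(math.log(l / l_min) for l in lengths)

def log_bin_entropy(lengths, width=0.25):
    counts = {}
    for l in lengths:
        i = int(math.log(l) / width)   # fixed-width bins in log-length
        counts[i] = counts.get(i, 0) + 1
    n = len(lengths)
    return -sum(c / n * math.log(c / n) for c in counts.values())

random.seed(4)
def sample(a, n=5000):
    # inverse-CDF sampling: l = u^(-1/(a-1)), u uniform on (0, 1]
    return [(1.0 - random.random()) ** (-1.0 / (a - 1.0)) for _ in range(n)]

steep, shallow = sample(3.0), sample(1.8)
a_steep, a_shallow = scaling_exponent(steep), scaling_exponent(shallow)
h_steep, h_shallow = log_bin_entropy(steep), log_bin_entropy(shallow)
```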

  12. A practical approach for the scale-up of roller compaction process.

    PubMed

    Shi, Weixian; Sprockel, Omar L

    2016-09-01

    An alternative approach for the scale-up of ribbon formation during roller compaction was investigated, which required only one batch at the commercial scale to set the operational conditions. The scale-up of ribbon formation was based on a probability method, which was sufficient to describe the mechanism of ribbon formation at both scales. In this method, a statistical relationship between roller compaction parameters and ribbon attributes (thickness and density) was first defined with a DoE using a pilot Alexanderwerk WP120 roller compactor. While the milling speed was included in the design, it had no practical effect on granule properties within the study range despite its statistical significance. The statistical relationship was then adapted to a commercial Alexanderwerk WP200 roller compactor with one experimental run. The experimental run served as a calibration of the statistical model parameters. The proposed transfer method was then confirmed by conducting a mapping study on the Alexanderwerk WP200 using a factorial DoE, which showed a match between the predictions and the verification experiments. The study demonstrates the applicability of the roller compaction transfer method using the statistical model from the development scale calibrated with one experimental point at the commercial scale. Copyright © 2016 Elsevier B.V. All rights reserved.
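    The one-batch transfer amounts to re-calibrating a pilot-scale model with a single commercial-scale point. The coefficients and measurements below are hypothetical toy numbers, and the actual DoE model involves more factors than roll pressure alone:

```python
# Pilot-scale linear fit, re-calibrated with one commercial-scale run.
def pilot_model(pressure):
    # hypothetical pilot-scale DoE fit: density = 0.9 + 0.004 * pressure
    return 0.9 + 0.004 * pressure

# the single commercial-scale calibration batch
p_cal, rho_cal = 60.0, 1.18
offset = rho_cal - pilot_model(p_cal)   # scale-dependent shift

def commercial_model(pressure):
    return pilot_model(pressure) + offset

rho_pred = commercial_model(80.0)       # prediction at a new setting
```

    The mapping-study step then checks such predictions against verification batches across the factorial design space.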

  13. Direct computational approach to lattice supersymmetric quantum mechanics

    NASA Astrophysics Data System (ADS)

    Kadoh, Daisuke; Nakayama, Katsumasa

    2018-07-01

    We study lattice supersymmetric models numerically using the transfer matrix approach. This method consists only of deterministic processes and has no statistical uncertainties. We improve it by performing a scale transformation of variables such that the Witten index is correctly reproduced from the lattice model, and we describe the other prescriptions in detail. Compared to previous Monte Carlo results, we can estimate the effective masses, the SUSY Ward identity, and the cut-off dependence of the results with high precision. Such information is useful for improving lattice formulations of supersymmetric models.

  14. Stability estimation of autoregulated genes under Michaelis-Menten-type kinetics

    NASA Astrophysics Data System (ADS)

    Arani, Babak M. S.; Mahmoudi, Mahdi; Lahti, Leo; González, Javier; Wit, Ernst C.

    2018-06-01

    Feedback loops are typical motifs appearing in gene regulatory networks. In some well-studied model organisms, including Escherichia coli, autoregulated genes, i.e., genes that activate or repress themselves through their protein products, are the only feedback interactions. For these types of interactions, the Michaelis-Menten (MM) formulation is a suitable and widely used approach, which always leads to stable steady-state solutions representative of homeostatic regulation. However, in many other biological phenomena, such as cell differentiation, cancer progression, and catastrophes in ecosystems, one might expect to observe bistable switchlike dynamics in the case of strong positive autoregulation. To capture this complex behavior we use the generalized family of MM kinetic models. We give a full analysis regarding the stability of autoregulated genes. We show that the autoregulation mechanism has the capability to exhibit diverse cellular dynamics including hysteresis, a typical characteristic of bistable systems, as well as irreversible transitions between bistable states. We also introduce a statistical framework to estimate the kinetics parameters and probability of different stability regimes given observational data. Empirical data for the autoregulated gene SCO3217 in the SOS system in Streptomyces coelicolor are analyzed. The coupling of a statistical framework and the mathematical model can give further insight into understanding the evolutionary mechanisms toward different cell fates in various systems.
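    A minimal sketch of the bistability claim, using a Hill-type generalization of the MM term with illustrative parameters (not the estimated SCO3217 kinetics): the classical MM case (n = 1) always yields a single positive steady state, while a cooperative case (n = 2) can yield three fixed points, the hallmark of a bistable switch.

```python
import numpy as np

# Positive autoregulation with a generalized (Hill-type) MM production term:
#   dx/dt = a + b * x**n / (K**n + x**n) - g * x
# Parameters are illustrative only.
a, b, K, g = 0.1, 2.0, 1.0, 1.0

def steady_states(n):
    # Multiplying dx/dt = 0 through by (K**n + x**n) gives a polynomial:
    # (a - g*x) * (K**n + x**n) + b * x**n = 0
    if n == 1:
        coeffs = [-g, a + b - g * K, a * K]          # quadratic
    else:  # n == 2
        coeffs = [-g, a + b, -g * K**2, a * K**2]    # cubic
    roots = np.roots(coeffs)
    return sorted(r.real for r in roots if abs(r.imag) < 1e-9 and r.real > 0)

print(steady_states(1))  # classical MM: one steady state (homeostasis)
print(steady_states(2))  # cooperative case: three fixed points (bistability)
```

With these parameters the cubic factors exactly, giving steady states near 0.137, 0.5 and 1.463; the outer two are stable and the middle one unstable, which is what produces hysteresis.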

  15. ON MODEL SELECTION STRATEGIES TO IDENTIFY GENES UNDERLYING BINARY TRAITS USING GENOME-WIDE ASSOCIATION DATA.

    PubMed

    Wu, Zheyang; Zhao, Hongyu

    2012-01-01

    For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies.

  16. Statistical mechanics of shell models for two-dimensional turbulence

    NASA Astrophysics Data System (ADS)

    Aurell, E.; Boffetta, G.; Crisanti, A.; Frick, P.; Paladin, G.; Vulpiani, A.

    1994-12-01

    We study shell models that conserve the analogs of energy and enstrophy and hence are designed to mimic fluid turbulence in two-dimensions (2D). The main result is that the observed state is well described as a formal statistical equilibrium, closely analogous to the approach to two-dimensional ideal hydrodynamics of Onsager [Nuovo Cimento Suppl. 6, 279 (1949)], Hopf [J. Rat. Mech. Anal. 1, 87 (1952)], and Lee [Q. Appl. Math. 10, 69 (1952)]. In the presence of forcing and dissipation we observe a forward flux of enstrophy and a backward flux of energy. These fluxes can be understood as mean diffusive drifts from a source to two sinks in a system which is close to local equilibrium with Lagrange multipliers (``shell temperatures'') changing slowly with scale. This is clear evidence that the simplest shell models are not adequate to reproduce the main features of two-dimensional turbulence. The dimensional predictions on the power spectra from a supposed forward cascade of enstrophy and from one branch of the formal statistical equilibrium coincide in these shell models in contrast to the corresponding predictions for the Navier-Stokes and Euler equations in 2D. This coincidence has previously led to the mistaken conclusion that shell models exhibit a forward cascade of enstrophy. We also study the dynamical properties of the models and the growth of perturbations.

  17. ON MODEL SELECTION STRATEGIES TO IDENTIFY GENES UNDERLYING BINARY TRAITS USING GENOME-WIDE ASSOCIATION DATA

    PubMed Central

    Wu, Zheyang; Zhao, Hongyu

    2013-01-01

    For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies. PMID:23956610

  18. Independence polynomial and matching polynomial of the Koch network

    NASA Astrophysics Data System (ADS)

    Liao, Yunhua; Xie, Xiaoliang

    2015-11-01

    The lattice gas model and the monomer-dimer model are two classical models in statistical mechanics. It is well known that the partition functions of these two models are associated with the independence polynomial and the matching polynomial in graph theory, respectively. Both polynomials have been shown to belong to the “#P-complete” class, which indicates that these problems are computationally “intractable”. We consider these two polynomials of the Koch networks, which are scale-free with small-world effects. Explicit recurrences are derived, and explicit formulae are presented for the number of independent sets of a certain type.
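    A minimal sketch of the lattice-gas connection, assuming only the standard vertex-deletion recurrence i(G) = i(G − v) + i(G − N[v]): evaluating the independence polynomial at x = 1 counts all independent sets, i.e., the hard-core lattice-gas partition function at unit activity. The example runs on a small path graph, not the Koch network.

```python
def count_independent_sets(adj, verts):
    # Recurrence: either v is excluded (delete v) or v is included
    # (delete v together with its neighbours): i(G) = i(G-v) + i(G-N[v]).
    verts = set(verts)
    if not verts:
        return 1  # the empty set is an independent set
    v = next(iter(verts))
    without_v = count_independent_sets(adj, verts - {v})
    closed_nbhd = {v} | (adj[v] & verts)
    with_v = count_independent_sets(adj, verts - closed_nbhd)
    return without_v + with_v

# Path graph P5 (0-1-2-3-4): the count is the Fibonacci number F(7) = 13
n = 5
adj = {i: set() for i in range(n)}
for i in range(n - 1):
    adj[i].add(i + 1)
    adj[i + 1].add(i)
print(count_independent_sets(adj, range(n)))  # -> 13
```

This brute recurrence is exponential in general, which is consistent with the #P-completeness cited above; the point of the paper is that special structure (such as the Koch networks') admits explicit recurrences instead.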

  19. Colloquium: Statistical mechanics of money, wealth, and income

    NASA Astrophysics Data System (ADS)

    Yakovenko, Victor M.; Rosser, J. Barkley, Jr.

    2009-10-01

    This Colloquium reviews statistical models for money, wealth, and income distributions developed in the econophysics literature since the late 1990s. By analogy with the Boltzmann-Gibbs distribution of energy in physics, it is shown that the probability distribution of money is exponential for certain classes of models with interacting economic agents. Alternative scenarios are also reviewed. Data analysis of the empirical distributions of wealth and income reveals a two-class distribution. The majority of the population belongs to the lower class, characterized by the exponential (“thermal”) distribution, whereas a small fraction of the population in the upper class is characterized by the power-law (“superthermal”) distribution. The lower part is very stable, stationary in time, whereas the upper part is highly dynamical and out of equilibrium.
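    The exponential money distribution can be illustrated with the simplest exchange model of this literature: agents with conserved total money trade a fixed amount per encounter, with balances kept non-negative. The simulation below is a sketch with arbitrary parameters, not a reproduction of any figure in the Colloquium.

```python
import random

random.seed(0)
N, steps, dm = 1000, 200000, 0.1
money = [1.0] * N  # everyone starts with one unit; total money is conserved

for _ in range(steps):
    i, j = random.randrange(N), random.randrange(N)
    # transfer a fixed amount if the payer can afford it (balances stay >= 0)
    if i != j and money[i] >= dm:
        money[i] -= dm
        money[j] += dm

T = sum(money) / N  # the "temperature" is the average money per agent
frac_below_mean = sum(m < T for m in money) / N
print(T, frac_below_mean)  # exponential stationary state predicts ~1 - 1/e
```

The non-negativity constraint plays the role of the lower energy bound; without it the stationary distribution would be Gaussian rather than exponential.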

  20. An investigation into the causes of stratospheric ozone loss in the southern Australasian region

    NASA Astrophysics Data System (ADS)

    Lehmann, P.; Karoly, D. J.; Newmann, P. A.; Clarkson, T. S.; Matthews, W. A.

    1992-07-01

    Measurements of total ozone at Macquarie Island (55 deg S, 159 deg E) reveal statistically significant reductions of approximately twelve percent during July to September when comparing the mean levels for 1987-90 with those in the seventies. In order to investigate the possibility that these ozone changes may not be a result of dynamic variability of the stratosphere, a simple linear model of ozone was created from statistical analysis of tropopause height and isentropic transient eddy heat flux, which were assumed representative of the dominant dynamic influences. Comparison of measured and modeled ozone indicates that the recent downward trend in ozone at Macquarie Island is not related to stratospheric dynamic variability and therefore suggests another mechanism, possibly changes in photochemical destruction of ozone.

  1. Toughness and strength of nanocrystalline graphene

    DOE PAGES

    Shekhawat, Ashivni; Ritchie, Robert O.

    2016-01-28

    Pristine monocrystalline graphene is claimed to be the strongest material known with remarkable mechanical and electrical properties. However, graphene made with scalable fabrication techniques is polycrystalline and contains inherent nanoscale line and point defects—grain boundaries and grain-boundary triple junctions—that lead to significant statistical fluctuations in toughness and strength. These fluctuations become particularly pronounced for nanocrystalline graphene where the density of defects is high. Here we use large-scale simulation and continuum modelling to show that the statistical variation in toughness and strength can be understood with ‘weakest-link’ statistics. We develop the first statistical theory of toughness in polycrystalline graphene, and elucidate the nanoscale origins of the grain-size dependence of its strength and toughness. Lastly, our results should lead to more reliable graphene device design, and provide a framework to interpret experimental results in a broad class of two-dimensional materials.

  2. Ergodic theorem, ergodic theory, and statistical mechanics

    PubMed Central

    Moore, Calvin C.

    2015-01-01

    This perspective highlights the mean ergodic theorem established by John von Neumann and the pointwise ergodic theorem established by George Birkhoff, proofs of which were published nearly simultaneously in PNAS in 1931 and 1932. These theorems were of great significance both in mathematics and in statistical mechanics. In statistical mechanics they provided a key insight into a 60-y-old fundamental problem of the subject—namely, the rationale for the hypothesis that time averages can be set equal to phase averages. The evolution of this problem is traced from the origins of statistical mechanics and Boltzmann's ergodic hypothesis to the Ehrenfests' quasi-ergodic hypothesis, and then to the ergodic theorems. We discuss communications between von Neumann and Birkhoff in the Fall of 1931 leading up to the publication of these papers and related issues of priority. These ergodic theorems initiated a new field of mathematical research called ergodic theory that has thrived ever since, and we discuss some recent developments in ergodic theory that are relevant for statistical mechanics. PMID:25691697
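    The equality of time and phase averages can be demonstrated on the simplest ergodic system, an irrational rotation of the circle; the sketch below (an illustration, not from the paper) computes a Birkhoff time average and compares it with the phase average.

```python
import math

# Birkhoff time average of f(x) = cos(2*pi*x) along the irrational rotation
# x -> x + alpha (mod 1). The pointwise ergodic theorem says the time average
# converges to the phase average, integral_0^1 cos(2*pi*x) dx = 0.
alpha = (math.sqrt(5) - 1) / 2   # golden-ratio rotation: irrational, ergodic
x, total, N = 0.1, 0.0, 100000
for n in range(N):
    total += math.cos(2 * math.pi * x)
    x = (x + alpha) % 1.0
print(total / N)  # close to the phase average 0
```

A rational rotation angle would break ergodicity: the orbit would visit only finitely many points, and the time average would depend on the starting point.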

  3. Modeling stock price dynamics by continuum percolation system and relevant complex systems analysis

    NASA Astrophysics Data System (ADS)

    Xiao, Di; Wang, Jun

    2012-10-01

    The continuum percolation system is developed to model a random stock price process in this work. Recent empirical research has demonstrated various statistical features of stock price changes; a financial model aiming at understanding price fluctuations needs to define a mechanism for the formation of the price, in an attempt to reproduce and explain this set of empirical facts. The continuum percolation model is usually referred to as a random coverage process or a Boolean model; the local interaction or influence among traders is constructed by the continuum percolation, and a cluster of continuum percolation is applied to define the cluster of traders sharing the same opinion about the market. We investigate and analyze the statistical behaviors of normalized returns of the price model by several analysis methods, including power-law tail distribution analysis, chaotic behavior analysis and Zipf analysis. Moreover, we consider the daily returns of the Shanghai Stock Exchange Composite Index from January 1997 to July 2011, and comparisons of return behaviors between the actual data and the simulation data are exhibited.

  4. Statistical mechanics of two-dimensional shuffled foams: Geometry-topology correlation in small or large disorder limits

    NASA Astrophysics Data System (ADS)

    Durand, Marc; Kraynik, Andrew M.; van Swol, Frank; Käfer, Jos; Quilliet, Catherine; Cox, Simon; Ataei Talebi, Shirin; Graner, François

    2014-06-01

    Bubble monolayers are model systems for experiments and simulations of two-dimensional packing problems of deformable objects. We explore the relation between the distributions of the number of bubble sides (topology) and the bubble areas (geometry) in the low liquid fraction limit. We use a statistical model [M. Durand, Europhys. Lett. 90, 60002 (2010), 10.1209/0295-5075/90/60002] which takes into account Plateau laws. We predict the correlation between geometrical disorder (bubble size dispersity) and topological disorder (width of bubble side number distribution) over an extended range of bubble size dispersities. Extensive data sets arising from shuffled foam experiments, surface evolver simulations, and cellular Potts model simulations all collapse surprisingly well and coincide with the model predictions, even at extremely high size dispersity. At moderate size dispersity, we recover our earlier approximate predictions [M. Durand, J. Kafer, C. Quilliet, S. Cox, S. A. Talebi, and F. Graner, Phys. Rev. Lett. 107, 168304 (2011), 10.1103/PhysRevLett.107.168304]. At extremely low dispersity, when approaching the perfectly regular honeycomb pattern, we study how both geometrical and topological disorders vanish. We identify a crystallization mechanism and explore it quantitatively in the case of bidisperse foams. Due to the deformability of the bubbles, foams can crystallize over a larger range of size dispersities than hard disks. The model predicts that the crystallization transition occurs when the ratio of largest to smallest bubble radii is 1.4.

  5. Which Type of Risk Information to Use for Whom? Moderating Role of Outcome-Relevant Involvement in the Effects of Statistical and Exemplified Risk Information on Risk Perceptions.

    PubMed

    So, Jiyeon; Jeong, Se-Hoon; Hwang, Yoori

    2017-04-01

    The extant empirical research examining the effectiveness of statistical and exemplar-based health information is largely inconsistent. Under the premise that the inconsistency may be due to an unacknowledged moderator (O'Keefe, 2002), this study examined a moderating role of outcome-relevant involvement (Johnson & Eagly, 1989) in the effects of statistical and exemplified risk information on risk perception. Consistent with predictions based on elaboration likelihood model (Petty & Cacioppo, 1984), findings from an experiment (N = 237) concerning alcohol consumption risks showed that statistical risk information predicted risk perceptions of individuals with high, rather than low, involvement, while exemplified risk information predicted risk perceptions of those with low, rather than high, involvement. Moreover, statistical risk information contributed to negative attitude toward drinking via increased risk perception only for highly involved individuals, while exemplified risk information influenced the attitude through the same mechanism only for individuals with low involvement. Theoretical and practical implications for health risk communication are discussed.

  6. Extreme value statistics analysis of fracture strengths of a sintered silicon nitride failing from pores

    NASA Technical Reports Server (NTRS)

    Chao, Luen-Yuan; Shetty, Dinesh K.

    1992-01-01

    Statistical analysis and correlation between pore-size distribution and fracture strength distribution using the theory of extreme-value statistics is presented for a sintered silicon nitride. The pore-size distribution on a polished surface of this material was characterized, using an automatic optical image analyzer. The distribution measured on the two-dimensional plane surface was transformed to a population (volume) distribution, using the Schwartz-Saltykov diameter method. The population pore-size distribution and the distribution of the pore size at the fracture origin were correlated by extreme-value statistics. Fracture strength distribution was then predicted from the extreme-value pore-size distribution, using a linear elastic fracture mechanics model of an annular crack around a pore and the fracture toughness of the ceramic. The predicted strength distribution was in good agreement with strength measurements in bending. In particular, the extreme-value statistics analysis explained the nonlinear trend in the linearized Weibull plot of measured strengths without postulating a lower-bound strength.
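    A rough sketch of the strength-prediction step, with invented material constants and a generic pore-size distribution standing in for the measured one: each specimen's strength follows from the LEFM relation sigma = K_Ic / (Y * sqrt(pi * a)) applied to its critical pore, and a Weibull modulus is then read off a linearized plot.

```python
import math, random

random.seed(1)
K_Ic, Y = 5.0, 1.12  # MPa*sqrt(m) and geometry factor; illustrative values

# Critical pore radius per specimen (metres); a stand-in for the
# extreme-value pore population at the fracture origin.
pores = [random.weibullvariate(20e-6, 3.0) for _ in range(200)]

# LEFM strength for each specimen: sigma = K_Ic / (Y * sqrt(pi * a))
strengths = sorted(K_Ic / (Y * math.sqrt(math.pi * a)) for a in pores)

# Linearized Weibull plot: ln ln(1/(1-F)) vs ln(sigma); slope ~ modulus m.
n = len(strengths)
xs = [math.log(s) for s in strengths]
ys = [math.log(-math.log(1 - (i + 0.5) / n)) for i in range(n)]  # median ranks
xbar, ybar = sum(xs) / n, sum(ys) / n
m = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
print(m)  # fitted slope of the linearized plot
```

As the abstract notes, strengths derived from an extreme-value pore population are not exactly Weibull, so this plot is expected to show curvature; the single fitted slope is only a summary.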

  7. A statistical analogy between collapse of solids and death of living organisms: proposal for a 'law of life'.

    PubMed

    Pugno, Nicola M

    2007-01-01

    In this paper we present a statistical analogy between the collapse of solids and the death of living organisms; in particular we deduce a statistical law governing their probability of death. We have derived such a law by coupling the widely used Weibull statistics, developed for describing the distribution of the strength of solids, with a general model for ontogenetic growth recently proposed in the literature. The main idea presented in this paper is that cracks can propagate in solids and cause their failure just as sick cells in living organisms can cause their death. Making a rough analogy, living organisms are found to behave as "growing" mechanical components under cyclic (i.e., fatigue) loadings, composed of a dynamic evolutionary material that, as an ineluctable fate, deteriorates. The implications for biological scaling laws are discussed. As an example, we apply such a Dynamic Weibull Statistics to large data collections on human deaths due to cancer of various types recorded in Italy: a significant agreement is observed.

  8. Predictors of Mortality in the Critically Ill Cirrhotic Patient: Is the Model for End-Stage Liver Disease Enough?

    PubMed

    Annamalai, Alagappan; Harada, Megan Y; Chen, Melissa; Tran, Tram; Ko, Ara; Ley, Eric J; Nuno, Miriam; Klein, Andrew; Nissen, Nicholas; Noureddin, Mazen

    2017-03-01

    Critically ill cirrhotics require liver transplantation urgently, but are at high risk for perioperative mortality. The Model for End-stage Liver Disease (MELD) score, recently updated to incorporate serum sodium, estimates survival probability in patients with cirrhosis, but needs additional evaluation in the critically ill. The purpose of this study was to evaluate the predictive power of ICU admission MELD scores and identify clinical risk factors associated with increased mortality. This was a retrospective review of cirrhotic patients admitted to the ICU between January 2011 and December 2014. Patients who were discharged or underwent transplantation (survivors) were compared with those who died (nonsurvivors). Demographic characteristics, admission MELD scores, and clinical risk factors were recorded. Multivariate regression was used to identify independent predictors of mortality, and measures of model performance were assessed to determine predictive accuracy. Of 276 patients who met inclusion criteria, 153 were considered survivors and 123 were nonsurvivors. Survivor and nonsurvivor cohorts had similar demographic characteristics. Nonsurvivors had increased MELD, gastrointestinal bleeding, infection, mechanical ventilation, encephalopathy, vasopressors, dialysis, renal replacement therapy, requirement of blood products, and ICU length of stay. The MELD demonstrated low predictive power (c-statistic 0.73). Multivariate analysis identified MELD score (adjusted odds ratio [AOR] = 1.05), mechanical ventilation (AOR = 4.55), vasopressors (AOR = 3.87), and continuous renal replacement therapy (AOR = 2.43) as independent predictors of mortality, with stronger predictive accuracy (c-statistic 0.87). The MELD demonstrated relatively poor predictive accuracy in critically ill patients with cirrhosis and might not be the best indicator for prognosis in the ICU population. 
Prognostic accuracy is significantly improved when variables indicating organ support (mechanical ventilation, vasopressors, and continuous renal replacement therapy) are included in the model. Copyright © 2016. Published by Elsevier Inc.

  9. Generalized self-adjustment method for statistical mechanics of composite materials

    NASA Astrophysics Data System (ADS)

    Pan'kov, A. A.

    1997-03-01

    A new method is developed for the statistical mechanics of composite materials — the generalized self-adjustment method — which makes it possible to reduce the problem of predicting effective elastic properties of composites with random structures to the solution of two simpler "averaged" problems of an inclusion with transitional layers in a medium with the desired effective elastic properties. The inhomogeneous elastic properties and dimensions of the transitional layers take into account both the "approximate" order of mutual positioning, and also the variation in the dimensions and elastic properties of inclusions through appropriate special averaged indicator functions of the random structure of the composite. A numerical calculation of averaged indicator functions and effective elastic characteristics is performed by the generalized self-adjustment method for a unidirectional fiberglass on the basis of various models of actual random structures in the plane of isotropy.

  10. Fluorescent biopsy of biological tissues in differentiation of benign and malignant tumors of prostate

    NASA Astrophysics Data System (ADS)

    Trifoniuk, L. I.; Ushenko, Yu. A.; Sidor, M. I.; Minzer, O. P.; Gritsyuk, M. V.; Novakovskaya, O. Y.

    2014-08-01

    The work presents the results of an investigation of the diagnostic efficiency of a new azimuthally stable Mueller-matrix method of analysis of laser autofluorescence coordinate distributions of histological sections of biological tissues. A new model of generalized optical anisotropy of the protein networks of biological tissues is proposed in order to define the processes of laser autofluorescence. The influence of complex mechanisms of both phase anisotropy (linear birefringence and optical activity) and linear (circular) dichroism is taken into account. The interconnections between the azimuthally stable Mueller-matrix elements characterizing laser autofluorescence and different mechanisms of optical anisotropy are determined. A statistical analysis of coordinate distributions of such Mueller-matrix rotation invariants is proposed. Thereupon the quantitative criteria (statistical moments of the 1st to the 4th order) for differentiation of histological sections of uterus wall tumor - group 1 (dysplasia) and group 2 (adenocarcinoma) - are estimated.

  11. Information transport in classical statistical systems

    NASA Astrophysics Data System (ADS)

    Wetterich, C.

    2018-02-01

    For "static memory materials" the bulk properties depend on boundary conditions. Such materials can be realized by classical statistical systems which admit no unique equilibrium state. We describe the propagation of information from the boundary to the bulk by classical wave functions. The dependence of wave functions on the location of hypersurfaces in the bulk is governed by a linear evolution equation that can be viewed as a generalized Schrödinger equation. Classical wave functions obey the superposition principle, with local probabilities realized as bilinears of wave functions. For static memory materials the evolution within a subsector is unitary, as characteristic for the time evolution in quantum mechanics. The space-dependence in static memory materials can be used as an analogue representation of the time evolution in quantum mechanics - such materials are "quantum simulators". For example, an asymmetric Ising model on a Euclidean two-dimensional lattice represents the time evolution of free relativistic fermions in two-dimensional Minkowski space.

  12. System of polarization correlometry of polycrystalline layers of urine in the differentiation stage of diabetes

    NASA Astrophysics Data System (ADS)

    Ushenko, Yu. O.; Pashkovskaya, N. V.; Marchuk, Y. F.; Dubolazov, O. V.; Savich, V. O.

    2015-08-01

    The work presents the results of an investigation of the diagnostic efficiency of a new azimuthally stable Mueller-matrix method of analysis of laser autofluorescence coordinate distributions of layers of biological liquids. A new model of generalized optical anisotropy of the protein networks of biological tissues is proposed in order to define the processes of laser autofluorescence. The influence of complex mechanisms of both phase anisotropy (linear birefringence and optical activity) and linear (circular) dichroism is taken into account. The interconnections between the azimuthally stable Mueller-matrix elements characterizing laser autofluorescence and different mechanisms of optical anisotropy are determined. A statistical analysis of coordinate distributions of such Mueller-matrix rotation invariants is proposed. Thereupon the quantitative criteria (statistical moments of the 1st to the 4th order) for differentiation of human urine polycrystalline layers are estimated for the sake of diagnosing and differentiating cholelithiasis with underlying chronic cholecystitis (group 1) and diabetes mellitus of degree II (group 2).

  13. Unifying Complexity and Information

    NASA Astrophysics Data System (ADS)

    Ke, Da-Guan

    2013-04-01

    Complex systems, arising in many contexts in the computer, life, social, and physical sciences, have not shared a generally-accepted complexity measure playing a role as fundamental as that of the Shannon entropy H in statistical mechanics. Superficially-conflicting criteria of complexity measurement, i.e. complexity-randomness (C-R) relations, have given rise to a special measure intrinsically adaptable to more than one criterion. However, the deep causes of the conflict and of the adaptability remain unclear. Here I trace the root of each representative or adaptable measure to its particular universal data-generating or -regenerating model (UDGM or UDRM). A representative measure for deterministic dynamical systems is found as a counterpart of the H for random processes, clearly redefining the boundary of different criteria. And a specific UDRM achieving the intrinsic adaptability enables a general information measure that ultimately solves all major disputes. This work encourages a single framework covering deterministic systems, statistical mechanics and real-world living organisms.

  14. Noise and the statistical mechanics of distributed transport in a colony of interacting agents

    NASA Astrophysics Data System (ADS)

    Katifori, Eleni; Graewer, Johannes; Ronellenfitsch, Henrik; Mazza, Marco G.

    Inspired by the process of liquid food distribution between individuals in an ant colony, in this work we consider the statistical mechanics of resource dissemination between interacting agents with finite carrying capacity. The agents move inside a confined space (nest), pick up the food at the entrance of the nest and share it with other agents that they encounter. We calculate analytically and via a series of simulations the global food intake rate for the whole colony as well as observables describing how uniformly the food is distributed within the nest. Our model and predictions provide a useful benchmark to assess which strategies can lead to efficient food distribution within the nest and also to what level the observed food uptake rates and efficiency in food distribution are due to stochastic fluctuations or specific food exchange strategies by an actual ant colony.
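    A toy version of such a model, with invented rules (a single forager refilling at the entrance, random pairwise load equalization as a stand-in for trophallaxis) rather than the authors' actual dynamics:

```python
import random

random.seed(2)
N, steps = 50, 20000
load = [0.0] * N          # crop load of each agent; carrying capacity 1.0
intake = 0.0              # total food brought into the nest

for _ in range(steps):
    # agent 0 acts as the forager: refill to capacity at the nest entrance
    intake += 1.0 - load[0]
    load[0] = 1.0
    # a random encounter: the two agents share food by equalizing their loads
    i, j = random.sample(range(N), 2)
    mean = (load[i] + load[j]) / 2
    load[i] = load[j] = mean

fill = sum(load) / N
print(round(fill, 2))  # colony-wide fill fraction approaches capacity
```

Observables like the colony fill fraction and the cumulative intake are the kind of quantities the paper compares across exchange strategies; here the equalization rule is arbitrary and serves only as a benchmark.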

  15. Mueller-matrix of laser-induced autofluorescence of polycrystalline films of dried peritoneal fluid in diagnostics of endometriosis

    NASA Astrophysics Data System (ADS)

    Ushenko, Yuriy A.; Koval, Galina D.; Ushenko, Alexander G.; Dubolazov, Olexander V.; Ushenko, Vladimir A.; Novakovskaia, Olga Yu.

    2016-07-01

    This research presents investigation results of the diagnostic efficiency of an azimuthally stable Mueller-matrix method of analysis of laser autofluorescence of polycrystalline films of dried uterine cavity peritoneal fluid. A model of the generalized optical anisotropy of films of dried peritoneal fluid is proposed in order to define the processes of laser autofluorescence. The influence of complex mechanisms of both phase (linear and circular birefringence) and amplitude (linear and circular dichroism) anisotropies is taken into consideration. The interconnections between the azimuthally stable Mueller-matrix elements characterizing laser autofluorescence and different mechanisms of optical anisotropy are determined. The statistical analysis of coordinate distributions of such Mueller-matrix rotation invariants is proposed. Thereupon the quantitative criteria (statistical moments of the first to the fourth order) of differentiation of polycrystalline films of dried peritoneal fluid, group 1 (healthy donors) and group 2 (uterus endometriosis patients), are determined.

  16. Methods and means of Fourier-Stokes polarimetry and the spatial-frequency filtering of phase anisotropy manifestations in endometriosis diagnostics

    NASA Astrophysics Data System (ADS)

    Ushenko, A. G.; Dubolazov, O. V.; Ushenko, Vladimir A.; Ushenko, Yu. A.; Sakhnovskiy, M. Yu.; Prydiy, O. G.; Lakusta, I. I.; Novakovskaya, O. Yu.; Melenko, S. R.

    2016-12-01

    This research presents investigation results of the diagnostic efficiency of a new azimuthally stable Mueller-matrix method of analysis of laser autofluorescence coordinate distributions of dried polycrystalline films of uterine cavity peritoneal fluid. A new model of generalized optical anisotropy of biological tissues protein networks is proposed in order to define the processes of laser autofluorescence. The influence of complex mechanisms of both phase anisotropy (linear birefringence and optical activity) and linear (circular) dichroism is taken into account. The interconnections between the azimuthally stable Mueller-matrix elements characterizing laser autofluorescence and different mechanisms of optical anisotropy are determined. A statistical analysis of coordinate distributions of such Mueller-matrix rotation invariants is proposed. Thereupon the quantitative criteria (statistical moments of the 1st to the 4th order) of differentiation of dried polycrystalline films of peritoneal fluid - group 1 (healthy donors) and group 2 (uterus endometriosis patients) - are estimated.

  17. A statistical mechanical approach to restricted integer partition functions

    NASA Astrophysics Data System (ADS)

    Zhou, Chi-Chun; Dai, Wu-Sheng

    2018-05-01

    The main aim of this paper is twofold: (1) to suggest a statistical mechanical approach to calculating the generating function of restricted integer partition functions, which count the number of partitions, i.e., the ways of writing an integer as a sum of other integers under certain restrictions. In this approach, the generating function of restricted integer partition functions is constructed from the canonical partition functions of various quantum gases. (2) To introduce a new type of restricted integer partition function corresponding to general statistics, a generalization of Gentile statistics in statistical mechanics; many kinds of restricted integer partition functions are special cases of this one. Moreover, with statistical mechanics as a bridge, we reveal a mathematical fact: the generating function of a restricted integer partition function is just a symmetric function, i.e., a member of the class of functions invariant under the action of permutation groups. Using this approach, we provide expressions for several restricted integer partition functions as examples.
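As a minimal illustration of the counting problem the abstract describes (not the paper's quantum-gas construction), the coefficients of the generating function prod_{j=1}^{k} 1/(1 - x^j) count partitions of n into parts of size at most k, and can be accumulated with a standard dynamic-programming update:

```python
def restricted_partitions(n, k):
    """Number of partitions of n into parts of size <= k,
    read off as the coefficient of x^n in prod_{j=1}^{k} 1/(1 - x^j)."""
    coeff = [0] * (n + 1)
    coeff[0] = 1                      # the empty partition of 0
    for part in range(1, k + 1):      # multiply by the series 1/(1 - x^part)
        for total in range(part, n + 1):
            coeff[total] += coeff[total - part]
    return coeff[n]

# p(5) with unrestricted part sizes (k >= n) is 7:
# 5, 4+1, 3+2, 3+1+1, 2+2+1, 2+1+1+1, 1+1+1+1+1
print(restricted_partitions(5, 5))   # -> 7
# Restricting parts to size <= 2 leaves 2+2+1, 2+1+1+1, 1+1+1+1+1:
print(restricted_partitions(5, 2))   # -> 3
```

Other restrictions (distinct parts, bounded multiplicities as in Gentile statistics) change only which factors enter the product.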

  18. Visual aftereffects and sensory nonlinearities from a single statistical framework

    PubMed Central

    Laparra, Valero; Malo, Jesús

    2015-01-01

    When adapted to a particular scenery our senses may fool us: colors are misinterpreted, certain spatial patterns seem to fade out, and static objects appear to move in reverse. A mere empirical description of the mechanisms tuned to color, texture, and motion may tell us where these visual illusions come from. However, such empirical models of gain control do not explain why these mechanisms work in this apparently dysfunctional manner. Current normative explanations of aftereffects based on scene statistics derive gain changes by (1) invoking decorrelation and linear manifold matching/equalization, or (2) using nonlinear divisive normalization obtained from parametric scene models. These principled approaches have different drawbacks: the first is not compatible with the known saturation nonlinearities in the sensors and it cannot fully accomplish information maximization due to its linear nature. In the second, gain change is almost determined a priori by the assumed parametric image model linked to divisive normalization. In this study we show that both the response changes that lead to aftereffects and the nonlinear behavior can be simultaneously derived from a single statistical framework: the Sequential Principal Curves Analysis (SPCA). As opposed to mechanistic models, SPCA is not intended to describe how physiological sensors work, but it is focused on explaining why they behave as they do. Nonparametric SPCA has two key advantages as a normative model of adaptation: (i) it is better than linear techniques as it is a flexible equalization that can be tuned for more sensible criteria other than plain decorrelation (either full information maximization or error minimization); and (ii) it makes no a priori functional assumption regarding the nonlinearity, so the saturations emerge directly from the scene data and the goal (and not from the assumed function). It turns out that the optimal responses derived from these more sensible criteria and SPCA are consistent with dysfunctional behaviors such as aftereffects. PMID:26528165

  19. Observers Exploit Stochastic Models of Sensory Change to Help Judge the Passage of Time

    PubMed Central

    Ahrens, Misha B.; Sahani, Maneesh

    2011-01-01

    Summary Sensory stimulation can systematically bias the perceived passage of time [1–5], but why and how this happens is mysterious. In this report, we provide evidence that such biases may ultimately derive from an innate and adaptive use of stochastically evolving dynamic stimuli to help refine estimates derived from internal timekeeping mechanisms [6–15]. A simplified statistical model based on probabilistic expectations of stimulus change derived from the second-order temporal statistics of the natural environment [16, 17] makes three predictions. First, random noise-like stimuli whose statistics violate natural expectations should induce timing bias. Second, a previously unexplored obverse of this effect is that similar noise stimuli with natural statistics should reduce the variability of timing estimates. Finally, this reduction in variability should scale with the interval being timed, so as to preserve the overall Weber law of interval timing. All three predictions are borne out experimentally. Thus, in the context of our novel theoretical framework, these results suggest that observers routinely rely on sensory input to augment their sense of the passage of time, through a process of Bayesian inference based on expectations of change in the natural environment. PMID:21256018

  20. Nondestructive evaluation of hydrogel mechanical properties using ultrasound

    PubMed Central

    Walker, Jason M.; Myers, Ashley M.; Schluchter, Mark D.; Goldberg, Victor M.; Caplan, Arnold I.; Berilla, Jim A.; Mansour, Joseph M.; Welter, Jean F.

    2012-01-01

    The feasibility of using ultrasound technology as a noninvasive, nondestructive method for evaluating the mechanical properties of engineered weight-bearing tissues was evaluated. A fixture was designed to accurately and reproducibly position the ultrasound transducer normal to the test sample surface. Agarose hydrogels were used as phantoms for cartilage to explore the feasibility of establishing correlations between ultrasound measurements and commonly used mechanical tissue assessments. The hydrogels were fabricated in 1–10% concentrations with a 2–10 mm thickness. For each concentration and thickness, six samples were created, for a total of 216 gel samples. Speed of sound was determined from the time difference between peak reflections and the known height of each sample. Modulus was computed from the speed of sound using elastic and poroelastic models. All ultrasonic measurements were made using a 15 MHz ultrasound transducer. The elastic modulus was also determined for each sample from a mechanical unconfined compression test. Analytical comparison and statistical analysis of ultrasound and mechanical testing data was carried out. A correlation between estimates of compressive modulus from ultrasonic and mechanical measurements was found, but the correlation depended on the model used to estimate the modulus from ultrasonic measurements. A stronger correlation with mechanical measurements was found using the poroelastic rather than the elastic model. Results from this preliminary testing will be used to guide further studies of native and engineered cartilage. PMID:21773854
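The elastic-model step described above can be sketched as follows. This is an illustration, not the authors' code: speed of sound from the pulse-echo round trip between the surface and bottom reflections, then a constrained (longitudinal) modulus estimate M = rho * c^2. The sample height, time delay, and density below are hypothetical values.

```python
def speed_of_sound(height_m, delta_t_s):
    """Pulse-echo: the pulse travels 2*h between surface and bottom peaks."""
    return 2.0 * height_m / delta_t_s

def elastic_modulus(density_kg_m3, c_m_s):
    """Constrained (longitudinal-wave) modulus under a simple elastic model."""
    return density_kg_m3 * c_m_s ** 2

h = 0.004                  # 4 mm gel sample height (hypothetical)
dt = 5.22e-6               # s, delay between peak reflections (hypothetical)
rho = 1020.0               # kg/m^3, rough density of a dilute agarose gel

c = speed_of_sound(h, dt)      # close to the speed of sound in water
M = elastic_modulus(rho, c)    # constrained modulus, on the order of GPa
print(round(c, 1), M)
```

Note that this wave-derived constrained modulus is dominated by the water phase and is not the same quantity as the kPa-scale compressive modulus from an unconfined mechanical test; the abstract's observation that the correlation depends on the chosen model (elastic vs. poroelastic) reflects exactly this distinction.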

  1. Characterizing the lung tissue mechanical properties using a micromechanical model of alveolar sac

    NASA Astrophysics Data System (ADS)

    Karami, Elham; Seify, Behzad; Moghadas, Hadi; Sabsalinejad, Masoomeh; Lee, Ting-Yim; Samani, Abbas

    2017-03-01

    According to statistics, lung disease is among the leading causes of death worldwide. As such, many research groups are developing powerful tools for understanding, diagnosis and treatment of various lung diseases. Recently, biomechanical modeling has emerged as an effective tool for better understanding of human physiology, disease diagnosis and computer assisted medical intervention. Mechanical properties of lung tissue are important requirements for methods developed for lung disease diagnosis and medical intervention. As such, the main objective of this study is to develop an effective tool for estimating the mechanical properties of normal and pathological lung parenchyma tissue based on its microstructure. For this purpose, a micromechanical model of the lung tissue was developed using finite element (FE) method, and the model was demonstrated to have application in estimating the mechanical properties of lung alveolar wall. The proposed model was developed by assembling truncated octahedron tissue units resembling the alveoli. A compression test was simulated using finite element method on the created geometry and the hyper-elastic parameters of the alveoli wall were calculated using reported alveolar wall stress-strain data and an inverse optimization framework. Preliminary results indicate that the proposed model can be potentially used to reconstruct microstructural images of lung tissue using macro-scale tissue response for normal and different pathological conditions. Such images can be used for effective diagnosis of lung diseases such as Chronic Obstructive Pulmonary Disease (COPD).

  2. Crystal plasticity assisted prediction on the yield locus evolution and forming limit curves

    NASA Astrophysics Data System (ADS)

    Lian, Junhe; Liu, Wenqi; Shen, Fuhui; Münstermann, Sebastian

    2017-10-01

    The aim of this study is to predict the evolution of plastic anisotropy and the associated forming limit curves of bcc steels purely from their microstructural features, by establishing an integrated multiscale modelling approach. Crystal plasticity models are employed to describe the micro-deformation mechanisms and to correlate the microstructure with mechanical behaviour on the micro- and mesoscale. Virtual laboratory tests are performed that incorporate the statistical information of the microstructure, which serves as the input for the phenomenological plasticity model on the macroscale. For both scales, the evolving features induced by microstructure evolution, such as anisotropic hardening, the r-value and yield locus evolution, are seamlessly integrated. The plasticity behaviour predicted by the numerical simulations is compared with experiments. These evolutionary features of the material deformation behaviour are eventually considered in the prediction of formability.

  3. Traffic Flow of Interacting Self-Driven Particles: Rails and Trails, Vehicles and Vesicles

    NASA Astrophysics Data System (ADS)

    Chowdhury, Debashish

    One common feature of a vehicle, an ant and a kinesin motor is that they all convert chemical energy, derived from fuel or food, into the mechanical energy required for their forward movement; such objects have been modelled in recent years as self-driven particles. Cytoskeletal filaments, e.g., microtubules, form a rail network for the intra-cellular transport of vesicular cargo by molecular motors such as kinesins. Similarly, ants move along trails while vehicles move along lanes. Therefore, the traffic of vehicles and organisms, as well as that of molecular motors, can be modelled as systems of interacting self-driven particles; these are of current interest in non-equilibrium statistical mechanics. In this paper we point out the common features of these model systems and emphasize the crucial differences in their physical properties.

  4. Characterization of ABS specimens produced via the 3D printing technology for drone structural components

    NASA Astrophysics Data System (ADS)

    Ferro, Carlo Giovanni; Brischetto, Salvatore; Torre, Roberto; Maggiore, Paolo

    2016-07-01

    The Fused Deposition Modelling (FDM) technology is widely used in rapid prototyping. 3D printers for home desktop applications are usually employed to make non-structural objects. When the mechanical stresses are not excessive, this technology can also be successfully employed to produce structural objects, not only at the prototyping stage but also in the production of series pieces. The innovative idea of the present work is the application of this technology, implemented in a desktop 3D printer, to the realization of components for aeronautical use, especially for unmanned aerial systems. For this purpose, the paper is devoted to a statistical study of the performance of a desktop 3D printer, to understand how the process performs and where its limits of acceptance lie. Mechanical and geometrical properties of ABS (Acrylonitrile Butadiene Styrene) specimens, such as tensile strength and stiffness, have been evaluated. ASTM D638-type specimens have been used. A capability analysis has been applied to both mechanical and dimensional performance. Statistically stable limits have been determined from experimentally collected data.
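A capability analysis of the kind mentioned above is commonly summarized by the indices Cp and Cpk; the sketch below assumes those indices, and the specification limits and tensile-strength measurements are hypothetical values, not the paper's data.

```python
import statistics

def capability(samples, lsl, usl):
    """Process capability indices: Cp ignores centering,
    Cpk penalizes a mean that drifts toward either spec limit."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)        # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical tensile-strength measurements (MPa) for printed ABS specimens
strengths = [31.2, 30.8, 31.5, 30.9, 31.1, 31.4, 30.7, 31.3]
cp, cpk = capability(strengths, lsl=29.0, usl=33.0)
print(round(cp, 2), round(cpk, 2))
```

Cpk <= Cp always holds, with equality only when the process is perfectly centered between the limits.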

  5. KECSA-Movable Type Implicit Solvation Model (KMTISM)

    PubMed Central

    2015-01-01

    Computation of the solvation free energy for chemical and biological processes has long been of significant interest. The key challenges to effective solvation modeling center on the choice of potential function and configurational sampling. Herein, an energy sampling approach termed the “Movable Type” (MT) method, and a statistical energy function for solvation modeling, “Knowledge-based and Empirical Combined Scoring Algorithm” (KECSA) are developed and utilized to create an implicit solvation model: KECSA-Movable Type Implicit Solvation Model (KMTISM) suitable for the study of chemical and biological systems. KMTISM is an implicit solvation model, but the MT method performs energy sampling at the atom pairwise level. For a specific molecular system, the MT method collects energies from prebuilt databases for the requisite atom pairs at all relevant distance ranges, which by its very construction encodes all possible molecular configurations simultaneously. Unlike traditional statistical energy functions, KECSA converts structural statistical information into categorized atom pairwise interaction energies as a function of the radial distance instead of a mean force energy function. Within the implicit solvent model approximation, aqueous solvation free energies are then obtained from the NVT ensemble partition function generated by the MT method. Validation is performed against several subsets selected from the Minnesota Solvation Database v2012. Results are compared with several solvation free energy calculation methods, including a one-to-one comparison against two commonly used classical implicit solvation models: MM-GBSA and MM-PBSA. Comparison against a quantum mechanics based polarizable continuum model is also discussed (Cramer and Truhlar’s Solvation Model 12). PMID:25691832

  6. LIDT-DD: A New Self-Consistent Debris Disc Model Including Radiation Pressure and Coupling Dynamical and Collisional Evolution

    NASA Astrophysics Data System (ADS)

    Kral, Q.; Thebault, P.; Charnoz, S.

    2014-01-01

    The first attempt at developing a fully self-consistent code coupling dynamics and collisions to study debris discs (Kral et al. 2013) is presented. So far, these two crucial mechanisms were studied separately, with N-body and statistical collisional codes respectively, because of stringent computational constraints. We present a new model named LIDT-DD which is able to follow over long timescales the coupled evolution of dynamics (including radiation forces) and collisions in a self-consistent way.

  7. Knot Invariants and Cellular Automata

    DTIC Science & Technology

    1993-05-04

    behavior might be found in a naturally associated two-dimensional statistical mechanics model. The Boltzmann weight W_b := exp[-βE(a,b,c,d)] can be...the Yang-Baxter (star-triangle) equation [10,11]. The relevance of this result to our problem is the following. The two com... [12] R. J. Baxter and P. J. Forrester, "Eight-vertex SOS model and generalized Rogers-Ramanujan-type identities," J. Stat. Phys. 35 (1984) 193-266. [13] L. H

  8. A statistical mechanics approach to Granovetter theory

    NASA Astrophysics Data System (ADS)

    Barra, Adriano; Agliari, Elena

    2012-05-01

    In this paper we try to bridge breakthroughs in quantitative sociology/econometrics pioneered during the last decades by McFadden, Brock-Durlauf, Granovetter and Watts-Strogatz, by introducing a minimal model able to reproduce essentially all the features of social behavior highlighted by these authors. Our model relies on a pairwise Hamiltonian for decision-maker interactions which naturally extends the multi-population approaches by shifting and biasing the pattern definitions of a Hopfield model of neural networks. Once introduced, the model is investigated through graph theory (to recover the Granovetter and Watts-Strogatz results) and statistical mechanics (to recover the McFadden and Brock-Durlauf results). Due to the internal symmetries of our model, the latter is obtained as the relaxation of a proper Markov process, allowing us to study even its out-of-equilibrium properties. The method used to solve its equilibrium is an adaptation of the Hamilton-Jacobi technique recently introduced by Guerra in the spin-glass scenario, and the picture obtained is the following: shifting the patterns from [-1,+1] to [0,+1] implies that the larger the amount of similarities among decision makers, the stronger their relative influence, and this is enough to explain both the different roles of strong and weak ties in the social network and its small-world properties. As a result, imitative interaction strengths seem essentially a robust requirement (enough to break the gauge symmetry in the couplings); furthermore, this naturally leads to a discrete-choice modelization when dealing with external influences and to imitative behavior à la Curie-Weiss, as introduced by Brock and Durlauf.
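The Curie-Weiss behavior invoked at the end of the abstract reduces, in its simplest mean-field form, to the self-consistency equation m = tanh(beta * (J*m + h)) for the equilibrium mean choice level. The sketch below (illustrative parameters, not the authors' full Hamiltonian) solves it by fixed-point iteration:

```python
import math

def mean_choice(beta, J=1.0, h=0.0, m0=0.5, iters=200):
    """Equilibrium magnetization / mean choice level of the
    Curie-Weiss model, via fixed-point iteration of m = tanh(beta*(J*m + h))."""
    m = m0
    for _ in range(iters):
        m = math.tanh(beta * (J * m + h))
    return m

# Below the critical coupling (beta*J < 1) the only solution is m = 0;
# above it, a nonzero spontaneous 'consensus' appears.
print(round(mean_choice(beta=0.5), 4))   # -> 0.0 (disordered)
print(round(mean_choice(beta=2.0), 4))   # nonzero, polarized phase
```

The discontinuity in behavior at beta*J = 1 is the phase transition whose socio-economic reading (sudden collective shifts in binary choices) both this paper and the Brock-Durlauf line of work exploit.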

  9. The stability of cellulose: a statistical perspective from a coarse-grained model of hydrogen-bond networks.

    PubMed

    Shen, Tongye; Gnanakaran, S

    2009-04-22

    A critical roadblock to the production of biofuels from lignocellulosic biomass is the efficient degradation of crystalline microfibrils of cellulose to glucose. A microscopic understanding of how different physical conditions affect the overall stability of the crystalline structure of microfibrils could facilitate the design of more effective protocols for their degradation. One of the essential physical interactions that stabilizes microfibrils is a network of hydrogen (H) bonds: both intrachain H-bonds between neighboring monomers of a single cellulose polymer chain and interchain H-bonds between adjacent chains. We construct a statistical mechanical model of cellulose assembly at the resolution of explicit hydrogen-bond networks. Using the transfer matrix method, the partition function and the subsequent statistical properties are evaluated. With the help of this lattice-based model, we capture the plasticity of the H-bond network in cellulose due to frustration and redundancy in the placement of H-bonds. This plasticity is responsible for the stability of cellulose over a wide range of temperatures. Stable intrachain and interchain H-bonds are identified as a function of temperature that could possibly be manipulated toward rational destruction of crystalline cellulose.
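The transfer-matrix step the abstract relies on can be shown on a toy two-state chain (far simpler than the paper's explicit H-bond network): the partition function of an N-site chain is obtained by repeatedly applying the 2x2 transfer matrix to a boundary vector. All parameters below are illustrative.

```python
import math

def transfer_matrix(J, beta):
    """2x2 transfer matrix for a 1D two-state (Ising) chain, zero field."""
    e = math.exp(beta * J)
    return [[e, 1 / e], [1 / e, e]]

def partition_function(N, J, beta):
    """Z for an N-site chain with free boundaries: contract T over N-1 bonds."""
    T = transfer_matrix(J, beta)
    v = [1.0, 1.0]                       # free boundary conditions
    for _ in range(N - 1):               # apply T once per bond
        v = [T[0][0] * v[0] + T[0][1] * v[1],
             T[1][0] * v[0] + T[1][1] * v[1]]
    return v[0] + v[1]

# For a free 1D Ising chain, Z = 2 * (2*cosh(beta*J))**(N-1) exactly,
# so the numerical contraction can be checked against the closed form.
N, J, beta = 8, 1.0, 0.7
Z = partition_function(N, J, beta)
exact = 2 * (2 * math.cosh(beta * J)) ** (N - 1)
print(abs(Z - exact) < 1e-9 * exact)     # -> True
```

Once Z is available, statistical properties such as mean H-bond occupancies follow from derivatives of log Z with respect to the couplings, which is the route the authors take at the level of their lattice model.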

  10. Quantifying mechanical properties in a murine fracture healing system using inverse modeling: preliminary work

    NASA Astrophysics Data System (ADS)

    Miga, Michael I.; Weis, Jared A.; Granero-Molto, Froilan; Spagnoli, Anna

    2010-03-01

    Understanding bone remodeling and mechanical property characteristics is important for assessing treatments to accelerate healing or for developing diagnostics to evaluate successful return to function. The murine system, whereby mid-diaphyseal tibia fractures are imparted on the subject and fracture healing is assessed at different time points and under different therapeutic conditions, is a particularly useful model to study. In this work, a novel inverse geometric nonlinear elasticity modeling framework is proposed that can reconstruct multiple mechanical properties from uniaxial testing data. To test this framework, the Lamé constants were reconstructed within the context of a murine cohort (n=6) in which there were no differences in treatment post tibia fracture except that half of the mice were allowed to heal 4 days longer (10-day and 14-day healing time points, respectively). The reconstructed shear modulus was G = 511.2 ± 295.6 kPa and 833.3 ± 352.3 kPa for the 10-day and 14-day time points, respectively. The second Lamé constant reconstructed at λ = 1002.9 ± 42.9 kPa and 14893.7 ± 863.3 kPa for the 10-day and 14-day time points, respectively. An unpaired Student t-test was used to test for statistically significant differences between the groups. While the shear modulus did not meet our criteria for significance, the second Lamé constant did, at p < 0.0001. Traditional metrics that are commonly used within the bone fracture healing research community were not found to be statistically significant.

  11. Investigation of Pre-Earthquake Ionospheric Disturbances by 3D Tomographic Analysis

    NASA Astrophysics Data System (ADS)

    Yagmur, M.

    2016-12-01

    Ionospheric variations before earthquakes are widely discussed phenomena in ionospheric studies. Clarifying the source and mechanism of these phenomena is highly important for earthquake forecasting. To understand the mechanical and physical processes behind pre-seismic ionospheric anomalies, which might even be related to Lithosphere-Atmosphere-Ionosphere-Magnetosphere coupling, both statistical and 3D modeling analyses are needed. For this purpose, we first investigated the relation between ionospheric TEC anomalies and potential source mechanisms such as space-weather activity and lithospheric phenomena like positive surface electric charges. To distinguish their effects on ionospheric TEC, we focused on pre-seismically active days. We then analyzed statistical data for 54 earthquakes of magnitude M ≥ 6 between 2000 and 2013, as well as the 2011 Tohoku and the 2016 Kumamoto earthquakes in Japan. By comparing TEC anomalies with solar activity via the Dst index, we found 28 events that might be related to earthquake activity. Following the statistical analysis, we also investigated the lithospheric effect on TEC change on selected days. Among those days, we chose the 2011 Tohoku and the 2016 Kumamoto earthquakes as two case studies for which 3D reconstructed images were produced using a 3D tomography technique with neural networks. The results will be presented. Keywords: earthquake, 3D ionospheric tomography, positive and negative anomaly, geomagnetic storm, lithosphere

  12. Memory matters: influence from a cognitive map on animal space use.

    PubMed

    Gautestad, Arild O

    2011-10-21

    A vertebrate individual's cognitive map provides a capacity for site fidelity and long-distance returns to favorable patches. Fractal-geometrical analysis of individual space use, based on a collection of telemetry fixes, makes it possible to verify the influence of a cognitive map on the spatial scatter of habitat use and also to determine to what extent space use has been of a scale-specific versus a scale-free kind. This approach rests on a statistical mechanical level of system abstraction, where micro-scale details of behavioral interactions are coarse-grained to macro-scale observables like the fractal dimension of space use. In this manner, the magnitude of the fractal dimension becomes a proxy variable for distinguishing between the main classes of habitat exploration and site fidelity: memory-less (Markovian) Brownian motion and Lévy walk, and memory-enhanced space use like Multi-scaled Random Walk (MRW). In this paper, previous analyses are extended by exploring MRW simulations under three scenarios: (1) central-place foraging, (2) behavioral adaptation to resource depletion (avoidance of the most recently visited locations) and (3) transition from MRW towards Lévy walk by narrowing memory capacity to a trailing time window. A generalized statistical-mechanical theory with the power to model cognitive-map influence on individual space use will be important for statistical analyses of animal habitat preferences and the mechanics behind site fidelity and home ranges. Copyright © 2011 Elsevier Ltd. All rights reserved.
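A rough sketch of MRW-style memory (not Gautestad's exact formulation; the return probability and the Brownian step model are illustrative assumptions): at each step the walker either revisits a uniformly chosen earlier site, mimicking memory-driven site fidelity, or takes a fresh random step.

```python
import random

def mrw(steps, p_return, rng):
    """Memory-enhanced random walk: with probability p_return, jump back
    to a previously visited site; otherwise take a Brownian step."""
    x = y = 0.0
    visited = [(x, y)]
    for _ in range(steps):
        if rng.random() < p_return:
            x, y = rng.choice(visited)        # memory-driven return
        else:
            x += rng.gauss(0, 1)              # ordinary Brownian step
            y += rng.gauss(0, 1)
        visited.append((x, y))
    return visited

rng = random.Random(0)
path = mrw(1000, p_return=0.1, rng=rng)
distinct = len(set(path))
print(len(path), distinct)   # returns make distinct sites < total fixes
```

Setting p_return = 0 recovers memory-less Brownian motion, and shrinking the memory to a trailing window of recent fixes gives the MRW-to-Lévy-like transition explored in the paper's third scenario.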

  13. The determinants of bond angle variability in protein/peptide backbones: A comprehensive statistical/quantum mechanics analysis.

    PubMed

    Improta, Roberto; Vitagliano, Luigi; Esposito, Luciana

    2015-11-01

    The elucidation of the mutual influence between peptide bond geometry and local conformation has important implications for protein structure refinement, validation, and prediction. To gain insights into the structural determinants and the energetic contributions associated with protein/peptide backbone plasticity, we here report an extensive analysis of the variability of the peptide bond angles, combining statistical analyses of protein structures and quantum mechanics calculations on small model peptide systems. Our analyses demonstrate that all the backbone bond angles strongly depend on the peptide conformation and unveil the existence of regular trends as a function of ψ and/or φ. The excellent agreement of the quantum mechanics calculations with the statistical surveys of protein structures validates the computational scheme employed here and demonstrates that the valence geometry of the protein/peptide backbone is primarily dictated by local interactions. Notably, for the first time we show that the position of the H(α) hydrogen atom, which is an important parameter in NMR structural studies, is also dependent on the local conformation. Most of the observed trends may be satisfactorily explained by invoking steric repulsive interactions; in some specific cases the valence bond variability is also influenced by hydrogen-bond-like interactions. Moreover, we can provide a reliable estimate of the energies involved in the interplay between geometry and conformations. © 2015 Wiley Periodicals, Inc.

  14. The Standard Model in the history of the Natural Sciences, Econometrics, and the social sciences

    NASA Astrophysics Data System (ADS)

    Fisher, W. P., Jr.

    2010-07-01

    In the late 18th and early 19th centuries, scientists appropriated Newton's laws of motion as a model for the conduct of any other field of investigation that would purport to be a science. This early form of a Standard Model eventually informed the basis of analogies for the mathematical expression of phenomena previously studied qualitatively, such as cohesion, affinity, heat, light, electricity, and magnetism. James Clerk Maxwell is known for his repeated use of a formalized version of this method of analogy in lectures, teaching, and the design of experiments. Economists transferring skills learned in physics made use of the Standard Model, especially after Maxwell demonstrated the value of conceiving it in abstract mathematics instead of as a concrete and literal mechanical analogy. Haavelmo's probability approach in econometrics and R. Fisher's Statistical Methods for Research Workers brought a statistical approach to bear on the Standard Model, quietly reversing the perspective of economics and the social sciences relative to that of physics. Where physicists, and Maxwell in particular, intuited scientific method as imposing stringent demands on the quality and interrelations of data, instruments, and theory in the name of inferential and comparative stability, statistical models and methods disconnected theory from data by removing the instrument as an essential component. New possibilities for reconnecting economics and the social sciences to Maxwell's sense of the method of analogy are found in Rasch's probabilistic models for measurement.

  15. Estimating preferential flow in karstic aquifers using statistical mixed models.

    PubMed

    Anaya, Angel A; Padilla, Ingrid; Macchiavelli, Raul; Vesper, Dorothy J; Meeker, John D; Alshawabkeh, Akram N

    2014-01-01

    Karst aquifers are highly productive groundwater systems often associated with conduit flow. These systems can be highly vulnerable to contamination, resulting in a high potential for contaminant exposure to humans and ecosystems. This work develops statistical models to spatially characterize flow and transport patterns in karstified limestone and determines the effect of aquifer flow rates on these patterns. A laboratory-scale Geo-HydroBed model is used to simulate flow and transport processes in a karstic limestone unit. The model consists of stainless steel tanks containing a karstified limestone block collected from a karst aquifer formation in northern Puerto Rico. Experimental work involved a series of flow and tracer injections, with hydraulic and tracer responses monitored spatially and temporally. Statistical mixed models (SMMs) are applied to the hydraulic data to determine likely pathways of preferential flow in the limestone units. The models indicate a highly heterogeneous system with dominant, flow-dependent preferential flow regions. Results indicate that regions of preferential flow tend to expand at higher groundwater flow rates, suggesting that a greater volume of the system is flushed by flowing water at higher rates. The spatial and temporal distribution of tracer concentrations indicates the presence of both conduit-like and diffuse flow transport in the system, supporting the notion that both transport mechanisms operate in combination in the limestone unit. The temporal responses of tracer concentrations at different locations in the model coincide with, and confirm, the preferential flow distribution generated with the SMMs used in the study. © 2013, National Ground Water Association.

  16. Complex Sequencing Rules of Birdsong Can be Explained by Simple Hidden Markov Processes

    PubMed Central

    Katahira, Kentaro; Suzuki, Kenta; Okanoya, Kazuo; Okada, Masato

    2011-01-01

    Complex sequencing rules observed in birdsongs provide an opportunity to investigate the neural mechanisms for generating complex sequential behaviors. To relate findings from birdsong studies to other sequential behaviors such as human speech and musical performance, it is crucial to characterize the statistical properties of the sequencing rules in birdsongs. However, these properties have not yet been fully addressed. In this study, we investigate the statistical properties of the complex birdsong of the Bengalese finch (Lonchura striata var. domestica). Based on manually annotated syllable labels, we first show that there are significant higher-order context dependencies in Bengalese finch songs; that is, which syllable appears next depends on more than one previous syllable. We then analyze acoustic features of the song and show that the higher-order context dependencies can be explained using first-order hidden-state transition dynamics with redundant hidden states. This model corresponds to hidden Markov models (HMMs), well-known statistical models with a wide range of applications in time-series modeling. Song annotation with these first-order hidden-state models agreed well with manual annotation; the score was comparable to that of a second-order HMM and surpassed that of a zeroth-order model (a Gaussian mixture model, GMM), which does not use context information. Our results imply that a hierarchical representation with hidden-state dynamics may underlie the neural implementation for generating complex behavioral sequences with higher-order dependencies. PMID:21915345
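The first-order (Markov) baseline underlying the comparison above can be sketched on a toy syllable string (hypothetical annotation, not Bengalese finch data): estimate maximum-likelihood transition probabilities from adjacent syllable pairs.

```python
from collections import Counter, defaultdict

def transition_probs(sequence):
    """Maximum-likelihood first-order Markov transition probabilities
    estimated from counts of adjacent symbol pairs."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(sequence, sequence[1:]):
        counts[prev][nxt] += 1
    return {s: {t: n / sum(c.values()) for t, n in c.items()}
            for s, c in counts.items()}

song = "abcabcabdabcabd"          # hypothetical annotated syllable string
P = transition_probs(song)
print(P["a"])                     # -> {'b': 1.0}
print(P["b"])                     # 'b' is followed by 'c' or 'd'
```

A purely first-order observable model like this cannot capture, e.g., a rule where the choice after 'b' depends on what preceded the 'a'; the paper's point is that adding redundant hidden states to a first-order HMM recovers such higher-order dependencies without raising the order of the observable chain.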

  17. Decompression models: review, relevance and validation capabilities.

    PubMed

    Hugon, J

    2014-01-01

    For more than a century, several types of mathematical models have been proposed to describe tissue desaturation mechanisms in order to limit decompression sickness. These models are statistically assessed by DCS cases, and, over time, have gradually included bubble formation biophysics. This paper proposes to review this evolution and discuss its limitations. This review is organized around the comparison of decompression model biophysical criteria and theoretical foundations. Then, the DCS-predictive capability was analyzed to assess whether it could be improved by combining different approaches. Most of the operational decompression models have a neo-Haldanian form. Nevertheless, bubble modeling has been gaining popularity, and the circulating bubble amount has become a major output. By merging both views, it seems possible to build a relevant global decompression model that intends to simulate bubble production while predicting DCS risks for all types of exposures and decompression profiles. A statistical approach combining both DCS and bubble detection databases has to be developed to calibrate a global decompression model. Doppler ultrasound and DCS data are essential: i. to make correlation and validation phases reliable; ii. to adjust biophysical criteria to fit at best the observed bubble kinetics; and iii. to build a relevant risk function.
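The neo-Haldanian building block mentioned above is exponential uptake/washout of inert-gas tension in a tissue compartment, P(t) = P_alv + (P0 - P_alv) * exp(-k t) with k = ln(2) / half-time. The sketch below uses illustrative half-times and pressures, not any operational decompression table.

```python
import math

def tissue_tension(p0, p_alv, half_time_min, t_min):
    """Inert-gas tension in one Haldanian compartment after t minutes:
    exponential relaxation from p0 toward the alveolar tension p_alv."""
    k = math.log(2) / half_time_min
    return p_alv + (p0 - p_alv) * math.exp(-k * t_min)

# A '5-minute' compartment equilibrating from 0.79 bar N2 toward 2.37 bar
# (roughly the inspired N2 tension at 30 m of depth breathing air):
p = tissue_tension(p0=0.79, p_alv=2.37, half_time_min=5.0, t_min=5.0)
print(round(p, 3))    # after one half-time the gap is halved: -> 1.58
```

A full model runs a bank of such compartments with different half-times and limits the ascent so that no compartment's tension exceeds its permitted supersaturation; bubble models replace or supplement that criterion with an estimate of circulating bubble volume.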

  18. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    PubMed

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms acting on low-abundance spectral features. Recognizing that the peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival and accelerated failure time (AFT) models with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein-level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. The testing procedures discussed in this article can all be performed using readily available software such as R. The R code is provided as supplemental material.
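The censoring idea can be made concrete with a log-normal AFT-style likelihood: observed intensities contribute a log-normal density term, while intensities below the detection limit contribute the probability of falling below that limit. This is an illustrative sketch, not the paper's exact specification:

```python
import math

def _phi_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def lognormal_aft_loglik(mu, sigma, observed, censor_limit, n_censored):
    """Log-likelihood sketch for left-censored log-normal intensities:
    observed values use the log-normal density; each censored value
    contributes Phi((log L - mu) / sigma), where L is the detection limit."""
    ll = 0.0
    for y in observed:
        z = (math.log(y) - mu) / sigma
        # log of the log-normal density: phi(z) / (sigma * y)
        ll += -0.5 * z * z - math.log(sigma * y * math.sqrt(2 * math.pi))
    if n_censored:
        ll += n_censored * math.log(_phi_cdf((math.log(censor_limit) - mu) / sigma))
    return ll
```

Maximizing this over `mu` and `sigma` per protein and group (e.g. via a generic optimizer) yields the AFT fit; in R the same model is available through `survival::survreg` with `dist = "lognormal"`.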

  19. Statistical Learning of Phonetic Categories: Insights from a Computational Approach

    ERIC Educational Resources Information Center

    McMurray, Bob; Aslin, Richard N.; Toscano, Joseph C.

    2009-01-01

    Recent evidence (Maye, Werker & Gerken, 2002) suggests that statistical learning may be an important mechanism for the acquisition of phonetic categories in the infant's native language. We examined the sufficiency of this hypothesis and its implications for development by implementing a statistical learning mechanism in a computational model…
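A common computational instantiation of this kind of statistical learning is distributional learning of categories from a bimodal acoustic input, for example a mixture of Gaussians over a voice-onset-time (VOT) continuum. The sketch below uses a plain two-component EM fit with illustrative VOT values; it is a stand-in for the general mechanism, not the model implemented in the cited paper:

```python
import math
import random

def em_two_gaussians(data, iters=50):
    """Fit a two-component 1-D Gaussian mixture by EM: a toy stand-in
    for distributional learning of two phonetic categories."""
    mu = [min(data), max(data)]                        # spread initial means
    var = [(max(data) - min(data)) ** 2 / 4.0] * 2     # broad initial variance
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each data point
        resp = []
        for x in data:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk + 1e-6
            w[k] = nk / len(data)
    return mu, var, w

random.seed(0)
# Bimodal "VOT" input: a short-lag and a long-lag voicing category.
vot = [random.gauss(0, 5) for _ in range(200)] + [random.gauss(50, 5) for _ in range(200)]
mu, var, w = em_two_gaussians(vot)
print(sorted(round(m) for m in mu))  # two means, near the 0 ms and 50 ms modes
```

With bimodal input the learner recovers two categories; with unimodal input the two components collapse onto one mode, mirroring the infant findings of Maye et al. (2002).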

  20. Essays on the statistical mechanics of the labor market and implications for the distribution of earned income

    NASA Astrophysics Data System (ADS)

    Schneider, Markus P. A.

    This dissertation contributes to two areas of economics: the understanding of the distribution of earned income, and Bayesian analysis of distributional data. Recently, physicists claimed that the distribution of earned income is exponential (see Yakovenko, 2009). The first chapter explores the perspective that the economy is a statistical mechanical system, and the implications for labor market outcomes are considered critically. The robustness of the empirical results underlying the physicists' claims, the significance of the exponential distribution in statistical mechanics, and the case for a conservation law in economics are discussed. The conclusion reached is that the physicists' conception of the economy is too narrow even within their chosen framework, but that their overall approach is insightful. The dual labor market theory of segmented labor markets is invoked to explain why the observed distribution may be a mixture of distributional components corresponding to the different generating mechanisms described in Reich et al. (1973). The application of informational entropy in chapter II connects this work to Bayesian analysis and maximum entropy econometrics. The analysis follows E. T. Jaynes's treatment of Wolf's dice data, but is applied to the distribution of earned income based on CPS data. The results are calibrated to account for rounded survey responses using a simple simulation, and address the physicists' graphical analyses. The results indicate that neither the income distribution of all respondents nor that of the subpopulation used by the physicists appears to be exponential. The empirics do support the claim that a mixture with exponential and log-normal distributional components fits the data. In the final chapter, a log-linear model is used to fit the exponential to the earned income distribution.
Separating the CPS data by gender and marital status reveals that the exponential is an appropriate model only for a limited number of subpopulations, namely the never-married and women. The parameter estimated for never-married men's incomes is significantly different from that estimated for never-married women, implying either that the combined distribution is not exponential or that the individual distributions are not. However, it substantiates the existence of a persistent gender income gap among the never-married. References: Reich, M., D. M. Gordon, and R. C. Edwards (1973). A Theory of Labor Market Segmentation. American Economic Review 63(2), 359-365. Yakovenko, V. M. (2009). Econophysics, Statistical Mechanics Approach to. In R. A. Meyers (Ed.), Encyclopedia of Complexity and System Science. Springer.
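The log-linear fit mentioned in the final chapter exploits the fact that an exponential distribution has a linear log-survival function, log S(y) = -λy. A minimal sketch on simulated incomes (illustrative, not the dissertation's CPS analysis):

```python
import math
import random

def fit_exponential_loglinear(incomes):
    """Estimate the exponential rate by a least-squares fit through the
    origin of log empirical survival vs. income: log S(y) = -lambda * y."""
    ys = sorted(incomes)
    n = len(ys)
    # log of the empirical complementary CDF at each order statistic
    # (the largest point is dropped to avoid log(0))
    pts = [(y, math.log(1.0 - i / n)) for i, y in enumerate(ys[:-1])]
    # slope through the origin: lambda = -sum(y*z) / sum(y*y)
    num = sum(y * z for y, z in pts)
    den = sum(y * y for y, _ in pts)
    return -num / den

random.seed(1)
sample = [random.expovariate(0.02) for _ in range(5000)]  # mean "income" of 50
lam = fit_exponential_loglinear(sample)
print(round(lam, 3))  # recovered rate; the true rate here is 0.02
```

On genuinely exponential data the log-survival plot is a straight line through the origin; curvature in that plot for a subpopulation is evidence against the exponential model, which is the diagnostic logic behind the gender and marital-status comparisons above.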
