Sample records for statistical mechanical methods

  1. Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.

    DTIC Science & Technology

    1983-09-01

    Research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems has grown markedly...probability, statistics, soil mechanics, rock mechanics, and engineering geology. 2. The purpose of this report is to make available to the U. S...Deformation Dynamic Response Analysis Seepage, Soil Permeability and Piping Earthquake Engineering, Seismology, Settlement and Heave Seismic Risk Analysis

  2. Effect of Correlated Rotational Noise

    NASA Astrophysics Data System (ADS)

    Hancock, Benjamin; Wagner, Caleb; Baskaran, Aparna

    The traditional model of a self-propelled particle (SPP) is one in which the body axis along which the particle travels reorients itself through rotational diffusion. If the reorientation process is instead driven by colored noise rather than the standard Gaussian white noise, the resulting statistical mechanics cannot be accessed through conventional methods. In this talk we present results comparing three methods of deriving the statistical mechanics of an SPP whose reorientation process is driven by colored noise. We illustrate the differences and similarities in the resulting statistical mechanics through their ability to accurately capture the particle's response to external aligning fields.

  3. Capture approximations beyond a statistical quantum mechanical method for atom-diatom reactions

    NASA Astrophysics Data System (ADS)

    Barrios, Lizandra; Rubayo-Soneira, Jesús; González-Lezana, Tomás

    2016-03-01

    Statistical techniques constitute useful approaches to investigate atom-diatom reactions mediated by insertion dynamics which involves complex-forming mechanisms. Different capture schemes based on energy considerations regarding the specific diatom rovibrational states are suggested to evaluate the corresponding probabilities of formation of such collision species between reactants and products in an attempt to test reliable alternatives for computationally demanding processes. These approximations are tested in combination with a statistical quantum mechanical method for the S + H2(v = 0 ,j = 1) → SH + H and Si + O2(v = 0 ,j = 1) → SiO + O reactions, where this dynamical mechanism plays a significant role, in order to probe their validity.

  4. Analysis of Longitudinal Outcome Data with Missing Values in Total Knee Arthroplasty.

    PubMed

    Kang, Yeon Gwi; Lee, Jang Taek; Kang, Jong Yeal; Kim, Ga Hye; Kim, Tae Kyun

    2016-01-01

    We sought to determine the influence of missing data on statistical results, and to determine which statistical method is most appropriate for analyzing longitudinal outcome data of TKA with missing values among repeated measures ANOVA, the generalized estimating equation (GEE) and the mixed effects model repeated measures (MMRM). Data sets with missing values were generated with different proportions of missing data, sample sizes and missing-data generation mechanisms. Each data set was analyzed with the three statistical methods. The influence of missing data was greater with higher proportions of missing data and smaller sample sizes. MMRM tended to show the least change in the statistics. When missing values were generated by a 'missing not at random' mechanism, no statistical method could fully avoid deviations in the results. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. A REVIEW OF STATISTICAL METHODS FOR THE METEOROLOGICAL ADJUSTMENT OF TROPOSPHERIC OZONE

    EPA Science Inventory

    A variety of statistical methods for meteorological adjustment of ozone have been proposed in the literature over the last decade for purposes of forecasting, estimating ozone time trends, or investigating underlying mechanisms from an empirical perspective. The methods can be...

  6. Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering

    NASA Astrophysics Data System (ADS)

    Hynes-Griffin, M. E.; Buege, L. L.

    1983-09-01

    Contents: Applications of Probabilistic Methods in Geotechnical Engineering; Probabilistic Seismic and Geotechnical Evaluation at a Dam Site; Probabilistic Slope Stability Methodology; Probability of Liquefaction in a 3-D Soil Deposit; Probabilistic Design of Flood Levees; Probabilistic and Statistical Methods for Determining Rock Mass Deformability Beneath Foundations: An Overview; Simple Statistical Methodology for Evaluating Rock Mechanics Exploration Data; New Developments in Statistical Techniques for Analyzing Rock Slope Stability.

  7. [Applications of mathematical statistics methods on compatibility researches of traditional Chinese medicines formulae].

    PubMed

    Mai, Lan-Yin; Li, Yi-Xuan; Chen, Yong; Xie, Zhen; Li, Jie; Zhong, Ming-Yu

    2014-05-01

    The compatibility of traditional Chinese medicines (TCMs) formulae, which contain enormous amounts of information, is a complex system. Applying mathematical statistics methods to compatibility research on TCM formulae has great significance for promoting the modernization of TCMs and for improving clinical efficacy and the optimization of formulae. As a tool for quantitative analysis, data inference and exploring the inherent rules of substances, mathematical statistics methods can reveal the working mechanisms of the compatibility of TCM formulae both qualitatively and quantitatively. Reviewing studies that apply mathematical statistics methods, this paper summarizes the field from the perspectives of dosage optimization, efficacy, and changes of chemical components, as well as the rules of incompatibility and contraindication of formulae, and provides references for further studying and revealing the working mechanisms and connotations of TCMs.

  8. Inference Control Mechanism for Statistical Database: Frequency-Imposed Data Distortions.

    ERIC Educational Resources Information Center

    Liew, Chong K.; And Others

    1985-01-01

    Introduces two data distortion methods (Frequency-Imposed Distortion, Frequency-Imposed Probability Distortion) and uses a Monte Carlo study to compare their performance with that of other distortion methods (Point Distortion, Probability Distortion). Indications that data generated by these two methods produce accurate statistics and protect…

  9. Implicit Statistical Learning and Language Skills in Bilingual Children

    ERIC Educational Resources Information Center

    Yim, Dongsun; Rudoy, John

    2013-01-01

    Purpose: Implicit statistical learning in 2 nonlinguistic domains (visual and auditory) was used to investigate (a) whether linguistic experience influences the underlying learning mechanism and (b) whether there are modality constraints in predicting implicit statistical learning with age and language skills. Method: Implicit statistical learning…

  10. Confounding in statistical mediation analysis: What it is and how to address it.

    PubMed

    Valente, Matthew J; Pelham, William E; Smyth, Heather; MacKinnon, David P

    2017-11-01

    Psychology researchers are often interested in mechanisms underlying how randomized interventions affect outcomes such as substance use and mental health. Mediation analysis is a common statistical method for investigating psychological mechanisms that has benefited from exciting new methodological improvements over the last 2 decades. One of the most important new developments is methodology for estimating causal mediated effects using the potential outcomes framework for causal inference. Potential outcomes-based methods developed in epidemiology and statistics have important implications for understanding psychological mechanisms. We aim to provide a concise introduction to and illustration of these new methods and emphasize the importance of confounder adjustment. First, we review the traditional regression approach for estimating mediated effects. Second, we describe the potential outcomes framework. Third, we define what a confounder is and how the presence of a confounder can provide misleading evidence regarding mechanisms of interventions. Fourth, we describe experimental designs that can help rule out confounder bias. Fifth, we describe new statistical approaches to adjust for measured confounders of the mediator-outcome relation and sensitivity analyses to probe effects of unmeasured confounders on the mediated effect. All approaches are illustrated with application to a real counseling intervention dataset. Counseling psychologists interested in understanding the causal mechanisms of their interventions can benefit from incorporating the most up-to-date techniques into their mediation analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  11. The Shock and Vibration Digest. Volume 13. Number 7

    DTIC Science & Technology

    1981-07-01

    Richards, ISVR, University of Southampton Presidential Address "A Structural Dynamicist Looks at Statistical Energy Analysis" Professor B.L...excitation and for random and sine-sweep mechanical excitation. Test data were used to assess prediction methods, in particular a statistical energy analysis method

  12. Mechanical Impact Testing: A Statistical Measurement

    NASA Technical Reports Server (NTRS)

    Engel, Carl D.; Herald, Stephen D.; Davis, S. Eddie

    2005-01-01

    In the decades since the 1950s, when NASA first developed mechanical impact testing of materials, researchers have continued efforts to gain a better understanding of the chemical, mechanical, and thermodynamic nature of the phenomenon. The impact mechanism is a real combustion ignition mechanism that needs to be understood in the design of an oxygen system. The use of data from this test method has been questioned due to the lack of a clear method of applying the data and the variability found between tests, material batches, and facilities. This effort examines a large database that has accumulated over a number of years and characterizes its overall nature. Moreover, testing was performed to determine the statistical nature of the test procedure to help establish sample-size guidelines for material characterization. The current method of determining a pass/fail criterion, based on light emission, sound report, or material charring, is questioned.

  13. Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization

    NASA Astrophysics Data System (ADS)

    Eroglu, Sertac

    2014-10-01

    The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, is termed the statistical mechanical Menzerath-Altmann model. It allows the model parameters to be interpreted in terms of physical concepts. We also propose that many organizations exhibiting Menzerath-Altmann behavior, whether linguistic or not, can be examined methodically with the transformed distribution model through a properly defined structure-dependent parameter and the associated energy states.
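For reference, the classical Menzerath-Altmann form is y(x) = a·x^b·e^(−c·x) (constituent size as a function of construct size); the paper's transformed statistical mechanical model is not reproduced here, but the base law can be fit to data in a few lines (synthetic data, illustrative parameter values):

```python
import numpy as np
from scipy.optimize import curve_fit

def menzerath_altmann(x, a, b, c):
    # constituent size (e.g. syllables per word) vs. construct size
    return a * x**b * np.exp(-c * x)

rng = np.random.default_rng(0)
x = np.arange(1.0, 11.0)                        # construct size, e.g. words per clause
y = menzerath_altmann(x, 3.0, -0.3, 0.05)       # "observed" mean constituent sizes
y *= 1.0 + 0.01 * rng.standard_normal(x.size)   # 1% measurement noise

popt, _ = curve_fit(menzerath_altmann, x, y, p0=(1.0, -0.1, 0.01))
a, b, c = popt   # should land near the generating values (3.0, -0.3, 0.05)
```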

  14. Reliability approach to rotating-component design. [fatigue life and stress concentration

    NASA Technical Reports Server (NTRS)

    Kececioglu, D. B.; Lalli, V. R.

    1975-01-01

    A probabilistic methodology for designing rotating mechanical components using reliability to relate stress to strength is explained. The experimental test machines and data obtained for steel to verify this methodology are described. A sample mechanical rotating component design problem is solved by comparing a deterministic design method with the new design-by-reliability approach. The new method shows that a smaller size and weight can be obtained for a specified rotating shaft life and reliability, and uses the statistical distortion-energy theory with statistical fatigue diagrams for optimum shaft design. Statistical methods are presented for (1) determining strength distributions for steel experimentally, (2) determining a failure theory for stress variations in a rotating shaft subjected to reversed bending and steady torque, and (3) relating strength to stress by reliability.

  15. Statistical testing of association between menstruation and migraine.

    PubMed

    Barra, Mathias; Dahl, Fredrik A; Vetvik, Kjersti G

    2015-02-01

    To repair and refine a previously proposed method for statistical analysis of association between migraine and menstruation. Menstrually related migraine (MRM) affects about 20% of female migraineurs in the general population. The exact pathophysiological link from menstruation to migraine is hypothesized to be through fluctuations in female reproductive hormones, but the exact mechanisms remain unknown. Therefore, the main diagnostic criterion today is concurrency of migraine attacks with menstruation. Methods aiming to exclude spurious associations are needed, so that further research into these mechanisms can be performed on a population with a true association. The statistical method is based on a simple two-parameter null model of MRM (which allows for simulation modeling), and Fisher's exact test (with mid-p correction) applied to standard 2 × 2 contingency tables derived from the patients' headache diaries. Our method is a corrected version of a previously published flawed framework. To the best of our knowledge, no other published methods for establishing a menstruation-migraine association by statistical means exist today. The probabilistic methodology shows good performance when subjected to receiver operating characteristic curve analysis. Quick reference cutoff values for the clinical setting were tabulated for assessing association given a patient's headache history. In this paper, we correct a proposed method for establishing association between menstruation and migraine by statistical methods. We conclude that the proposed standard of 3-cycle observations prior to setting an MRM diagnosis should be extended with at least one perimenstrual window to obtain sufficient information for statistical processing. © 2014 American Headache Society.
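The core computation described here, Fisher's exact test with mid-p correction on a 2 × 2 contingency table, can be sketched directly from the hypergeometric distribution (the table values below are invented):

```python
from scipy.stats import hypergeom

def fisher_midp(table):
    """Two-sided Fisher's exact test with mid-p correction for a 2x2 table
    [[a, b], [c, d]] (e.g. attack / no attack vs. perimenstrual / not)."""
    (a, b), (c, d) = table
    n, K, N = a + b + c + d, a + b, a + c
    rv = hypergeom(n, K, N)                  # distribution of the (1,1) cell
    support = range(max(0, K + N - n), min(K, N) + 1)
    p_obs = rv.pmf(a)
    # two-sided p: all tables no more probable than the observed one ...
    p_two = sum(rv.pmf(k) for k in support if rv.pmf(k) <= p_obs * (1 + 1e-9))
    # ... minus half the observed table's probability (the mid-p correction)
    return p_two - 0.5 * p_obs

print(round(fisher_midp([[8, 2], [3, 7]]), 4))
```

The mid-p variant reduces the conservatism of the plain exact test, which matters at the small counts a 3-cycle headache diary produces.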

  16. A REVIEW OF STATISTICAL METHODS FOR THE METEOROLOGICAL ADJUSTMENT OF TROPOSPHERIC OZONE. (R825173)

    EPA Science Inventory

    Abstract

    A variety of statistical methods for meteorological adjustment of ozone have been proposed in the literature over the last decade for purposes of forecasting, estimating ozone time trends, or investigating underlying mechanisms from an empirical perspective. T...

  17. Using Artificial Neural Networks in Educational Research: Some Comparisons with Linear Statistical Models.

    ERIC Educational Resources Information Center

    Everson, Howard T.; And Others

    This paper explores the feasibility of neural computing methods such as artificial neural networks (ANNs) and abductory induction mechanisms (AIM) for use in educational measurement. ANNs and AIMS methods are contrasted with more traditional statistical techniques, such as multiple regression and discriminant function analyses, for making…

  18. Non-equilibrium statistical mechanics theory for the large scales of geophysical flows

    NASA Astrophysics Data System (ADS)

    Eric, S.; Bouchet, F.

    2010-12-01

    The aim of any theory of turbulence is to understand the statistical properties of the velocity field. As a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject; from the statistical mechanics basis of the theory up to applications to Jupiter’s troposphere and ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf-Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations will be discussed. We also present recent results for non-equilibrium situations, for which forces and dissipation are in a statistical balance. As an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed. F. Bouchet and E. Simonnet, Random Changes of Flow Topology in Two-Dimensional and Geophysical Turbulence, Physical Review Letters 102 (2009), no. 9, 094504-+. F. Bouchet and J. Sommeria, Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures, Journal of Fluid Mechanics 464 (2002), 165-207. A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO F. Bouchet and A. 
Venaille, Statistical mechanics of two-dimensional and geophysical flows, submitted to Physics Reports Non-equilibrium phase transitions for the 2D Navier-Stokes equations with stochastic forces (time series and probability density functions (PDFs) of the modulus of the largest scale Fourier component, showing bistability between dipole and unidirectional flows). This bistability is predicted by statistical mechanics.

  19. Granular statistical mechanics - a personal perspective

    NASA Astrophysics Data System (ADS)

    Blumenfeld, R.; Edwards, S. F.

    2014-10-01

    The science of granular matter has expanded from an activity for specialised engineering applications to a fundamental field in its own right. This has been accompanied by an explosion of research and literature, which cannot be reviewed in one paper. A key to progress in this field is the formulation of a statistical mechanical formalism that could help develop equations of state and constitutive relations. This paper aims at reviewing some milestones in this direction. An essential basic step toward the development of any static and quasi-static theory of granular matter is a systematic and useful method to quantify the grain-scale structure and we start with a review of such a method. We then review and discuss the ongoing attempt to construct a statistical mechanical theory of granular systems. Along the way, we will clarify a number of misconceptions in the field, as well as highlight several outstanding problems.

  20. A Mechanical Power Flow Capability for the Finite Element Code NASTRAN

    DTIC Science & Technology

    1989-07-01

    ...experimental methods, statistical energy analysis, the finite element method, and a finite element analogy using heat conduction equations. Experimental...weights and inertias of the transducers attached to an experimental structure may produce accuracy problems. Statistical energy analysis (SEA) is a...405-422 (1987). 8. Lyon, R.L., Statistical Energy Analysis of Dynamical Systems, The M.I.T. Press, (1975). 9. Mickol, J.D., and R.J. Bernhard, "An

  1. Statistical methods and neural network approaches for classification of data from multiple sources

    NASA Technical Reports Server (NTRS)

    Benediktsson, Jon Atli; Swain, Philip H.

    1990-01-01

    Statistical methods for classification of data from multiple data sources are investigated and compared to neural network models. One general problem with using conventional multivariate statistical approaches to classify multisource data is that a multivariate distribution cannot be assumed for the classes in the data sources. Another common problem with statistical classification methods is that the data sources are not equally reliable. This means that the data sources need to be weighted according to their reliability, but most statistical classification methods have no mechanism for doing so. This research focuses on statistical methods which can overcome these problems: a method of statistical multisource analysis and consensus theory. Reliability measures for weighting the data sources in these methods are suggested and investigated. Secondly, this research focuses on neural network models. The neural networks are distribution-free, since no prior knowledge of the statistical distribution of the data is needed. This is an obvious advantage over most statistical classification methods. The neural networks also automatically take care of how much weight each data source should have. On the other hand, their training process is iterative and can take a very long time. Methods to speed up the training procedure are introduced and investigated. Experimental results of classification using both neural network models and statistical methods are given, and the approaches are compared based on these results.
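A minimal sketch of the reliability-weighting idea: each source emits class probabilities, which a consensus rule combines; a linear opinion pool is the simplest such rule. The weights below are an illustrative assumption, not the paper's specific reliability measures:

```python
import numpy as np

def linear_opinion_pool(probs, weights):
    """Combine per-source class probabilities with reliability weights.

    probs: (n_sources, n_classes) rows of class probabilities."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()                           # normalize the reliabilities
    return w @ np.asarray(probs, dtype=float)

source_probs = [[0.7, 0.2, 0.1],           # reliable source
                [0.3, 0.4, 0.3]]           # noisier source, down-weighted
combined = linear_opinion_pool(source_probs, weights=[0.8, 0.2])
print(combined, combined.argmax())         # [0.62 0.24 0.14] -> class 0
```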

  2. Alternative Derivations of the Statistical Mechanical Distribution Laws

    PubMed Central

    Wall, Frederick T.

    1971-01-01

    A new approach is presented for the derivation of statistical mechanical distribution laws. The derivations are accomplished by minimizing the Helmholtz free energy under constant temperature and volume, instead of maximizing the entropy under constant energy and volume. An alternative method involves stipulating equality of chemical potential, or equality of activity, for particles in different energy levels. This approach leads to a general statement of distribution laws applicable to all systems for which thermodynamic probabilities can be written. The methods also avoid use of the calculus of variations, Lagrangian multipliers, and Stirling's approximation for the factorial. The results are applied specifically to Boltzmann, Fermi-Dirac, and Bose-Einstein statistics. The special significance of chemical potential and activity is discussed for microscopic systems. PMID:16578712

  3. Alternative derivations of the statistical mechanical distribution laws.

    PubMed

    Wall, F T

    1971-08-01

    A new approach is presented for the derivation of statistical mechanical distribution laws. The derivations are accomplished by minimizing the Helmholtz free energy under constant temperature and volume, instead of maximizing the entropy under constant energy and volume. An alternative method involves stipulating equality of chemical potential, or equality of activity, for particles in different energy levels. This approach leads to a general statement of distribution laws applicable to all systems for which thermodynamic probabilities can be written. The methods also avoid use of the calculus of variations, Lagrangian multipliers, and Stirling's approximation for the factorial. The results are applied specifically to Boltzmann, Fermi-Dirac, and Bose-Einstein statistics. The special significance of chemical potential and activity is discussed for microscopic systems.
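The three statistics to which the derivation is applied have standard closed-form mean occupation numbers, evaluated here for comparison (with x = (E − μ)/kT):

```python
import math

# mean occupation numbers as functions of x = (E - mu) / kT
def boltzmann(x):
    return math.exp(-x)

def fermi_dirac(x):
    return 1.0 / (math.exp(x) + 1.0)

def bose_einstein(x):
    return 1.0 / (math.exp(x) - 1.0)      # needs x > 0, i.e. E > mu

for x in (0.5, 1.0, 2.0, 5.0):
    # BE > Boltzmann > FD at every x; all three merge for x >> 1
    print(f"{x}: {bose_einstein(x):.4f} > {boltzmann(x):.4f} > {fermi_dirac(x):.4f}")
```

The chemical potential μ appearing in x is exactly the quantity whose equality across energy levels the paper uses as the alternative derivation route.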

  4. Towards the feasibility of using ultrasound to determine mechanical properties of tissues in a bioreactor

    PubMed Central

    Mansour, Joseph M.; Gu, Di-Win Marine; Chung, Chen-Yuan; Heebner, Joseph; Althans, Jake; Abdalian, Sarah; Schluchter, Mark D.; Liu, Yiying; Welter, Jean F.

    2016-01-01

    Introduction: Our ultimate goal is to non-destructively evaluate mechanical properties of tissue-engineered (TE) cartilage using ultrasound (US). We used agarose gels as surrogates for TE cartilage. Previously, we showed that mechanical properties measured using conventional methods were related to those measured using US, which suggested a way to non-destructively predict mechanical properties of samples with known volume fractions. In this study, we sought to determine whether the mechanical properties of samples with unknown volume fractions could be predicted by US. Methods: Aggregate moduli were calculated for hydrogels as a function of speed of sound (SOS), based on concentration and density, using a poroelastic model. The data were used to train a statistical model, which we then used to predict volume fractions and mechanical properties of unknown samples. Young's and storage moduli were measured mechanically. Results: The statistical model generally predicted the Young's moduli in compression to within 10% of their mechanically measured values. We found positive linear correlations between the aggregate modulus predicted from US and both the storage and Young's moduli determined from mechanical tests. Conclusions: Mechanical properties of hydrogels with unknown volume fractions can be predicted successfully from US measurements. This method has the potential to predict mechanical properties of TE cartilage non-destructively in a bioreactor. PMID:25092421

  5. Quantum theory of multiscale coarse-graining.

    PubMed

    Han, Yining; Jin, Jaehyeok; Wagner, Jacob W; Voth, Gregory A

    2018-03-14

    Coarse-grained (CG) models serve as a powerful tool to simulate molecular systems at much longer temporal and spatial scales. Previously, CG models and methods have been built upon classical statistical mechanics. The present paper develops a theory and numerical methodology for coarse-graining in quantum statistical mechanics, by generalizing the multiscale coarse-graining (MS-CG) method to quantum Boltzmann statistics. A rigorous derivation of the sufficient thermodynamic consistency condition is first presented via imaginary time Feynman path integrals. It identifies the optimal choice of CG action functional and effective quantum CG (qCG) force field to generate a quantum MS-CG (qMS-CG) description of the equilibrium system that is consistent with the quantum fine-grained model projected onto the CG variables. A variational principle then provides a class of algorithms for optimally approximating the qMS-CG force fields. Specifically, a variational method based on force matching, which was also adopted in the classical MS-CG theory, is generalized to quantum Boltzmann statistics. The qMS-CG numerical algorithms and practical issues in implementing this variational minimization procedure are also discussed. Then, two numerical examples are presented to demonstrate the method. Finally, as an alternative strategy, a quasi-classical approximation for the thermal density matrix expressed in the CG variables is derived. This approach provides an interesting physical picture for coarse-graining in quantum Boltzmann statistical mechanics in which the consistency with the quantum particle delocalization is obviously manifest, and it opens up an avenue for using path integral centroid-based effective classical force fields in a coarse-graining methodology.
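The classical force-matching step that the paper generalizes to quantum Boltzmann statistics is, in its simplest form, a linear least-squares fit of CG force-field parameters to mapped fine-grained forces. A toy 1D sketch with invented Lennard-Jones data (all numbers illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
r = np.linspace(0.9, 2.0, 200)          # sampled CG pair distances
# "fine-grained" forces: Lennard-Jones with eps = sigma = 1,
# F(r) = -dU/dr = 48 r^-13 - 24 r^-7
f_ref = 48 * r**-13 - 24 * r**-7
f_ref = f_ref + 0.05 * rng.standard_normal(r.size)   # thermal sampling noise

# linear basis for the CG force field; least squares = force matching
basis = np.column_stack([r**-13, r**-7])
theta, *_ = np.linalg.lstsq(basis, f_ref, rcond=None)
print(theta)   # recovers roughly [48, -24]
```

In the real MS-CG (and qMS-CG) method the basis functions are splines over many-body CG coordinates and the reference forces come from mapped atomistic (or path-integral) trajectories, but the variational structure is the same.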

  6. Bayesian approach to inverse statistical mechanics.

    PubMed

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.
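A toy version of the simplest inverse problem mentioned ("the estimation of a temperature"): infer an inverse temperature from observed states of a small discrete system. Here the partition function is tractable, so the posterior is evaluated on a grid rather than by the paper's sequential Monte Carlo; everything below is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
E = np.array([0.0, 1.0, 2.0, 3.0])            # known energy levels
beta_true = 1.5                               # inverse temperature to recover

p = np.exp(-beta_true * E)
p /= p.sum()                                  # Boltzmann distribution
states = rng.choice(len(E), size=500, p=p)    # observed ensemble

betas = np.linspace(0.1, 3.0, 300)            # grid over beta, flat prior
logZ = np.log(np.exp(-np.outer(betas, E)).sum(axis=1))
loglik = -betas * E[states].sum() - len(states) * logZ
post = np.exp(loglik - loglik.max())
post /= post.sum()                            # normalized posterior on the grid

beta_map = betas[post.argmax()]
print(beta_map)                               # close to beta_true = 1.5
```

When Z(β) cannot be evaluated, as in the Ising and potential-reconstruction examples, it must be estimated jointly with the interactions, which is exactly what the article's sequential Monte Carlo scheme addresses.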

  7. Bayesian approach to inverse statistical mechanics

    NASA Astrophysics Data System (ADS)

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.

  8. Investigation of thermodynamic and mechanical properties of AlyIn1-yP alloys by statistical moment method

    NASA Astrophysics Data System (ADS)

    Ha, Vu Thi Thanh; Hung, Vu Van; Hanh, Pham Thi Minh; Tuyen, Nguyen Viet; Hai, Tran Thi; Hieu, Ho Khac

    2018-03-01

    The thermodynamic and mechanical properties of III-V zinc-blende AlP and InP semiconductors and their alloys have been studied in detail with the statistical moment method, taking into account the anharmonicity effects of the lattice vibrations. The nearest-neighbor distance, thermal expansion coefficient, bulk moduli, and specific heats at constant volume and constant pressure of the zinc-blende AlP, InP and AlyIn1-yP alloys are calculated as functions of the temperature. The statistical moment method calculations are performed using the many-body Stillinger-Weber potential. The concentration dependences of the thermodynamic quantities of zinc-blende AlyIn1-yP crystals have also been discussed and compared with the experimental results. Our results are in reasonable agreement with earlier density functional theory calculations and can provide useful qualitative information for future experiments. The moment method can then be developed extensively for studying the atomistic structure and thermodynamic properties of nanoscale materials as well.

  9. Conditional maximum-entropy method for selecting prior distributions in Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Abe, Sumiyoshi

    2014-11-01

    The conditional maximum-entropy method (abbreviated here as C-MaxEnt) is formulated for selecting prior probability distributions in Bayesian statistics for parameter estimation. This method is inspired by a statistical-mechanical approach to systems governed by dynamics with largely separated time scales and is based on three key concepts: conjugate pairs of variables, dimensionless integration measures with coarse-graining factors and partial maximization of the joint entropy. The method enables one to calculate a prior purely from a likelihood in a simple way. It is shown, in particular, how it not only yields Jeffreys's rules but also reveals new structures hidden behind them.
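As a point of reference for "Jeffreys's rules": the classical Jeffreys prior is proportional to the square root of the Fisher information, which for a Bernoulli likelihood yields the Beta(1/2, 1/2) shape. This snippet shows only the classical rule, not the C-MaxEnt derivation:

```python
import numpy as np

p = np.linspace(0.01, 0.99, 99)        # Bernoulli parameter grid (step 0.01)
fisher_info = 1.0 / (p * (1.0 - p))    # I(p) for a single Bernoulli trial
jeffreys = np.sqrt(fisher_info)        # Jeffreys prior is proportional to sqrt(I(p))
jeffreys /= jeffreys.sum() * 0.01      # normalize on the uniform grid

# symmetric about p = 0.5 and peaked at the endpoints: the Beta(1/2, 1/2) shape
print(jeffreys[0] > jeffreys[49], abs(jeffreys[0] - jeffreys[-1]) < 1e-9)
```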

  10. Statistical Analysis of Big Data on Pharmacogenomics

    PubMed Central

    Fan, Jianqing; Liu, Han

    2013-01-01

    This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
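One of the regularization ideas such reviews cover for large covariance matrices is entry-wise thresholding; the sketch below uses an illustrative threshold at the usual sqrt(log p / n) rate and is not any estimator specific to this paper:

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 200, 50
X = rng.standard_normal((n, p))        # independent features: true covariance = I

S = np.cov(X, rowvar=False)            # sample covariance: noisy off-diagonals
tau = 2.0 * np.sqrt(np.log(p) / n)     # threshold at the sqrt(log p / n) rate
S_hat = np.where(np.abs(S) >= tau, S, 0.0)
np.fill_diagonal(S_hat, np.diag(S))    # never threshold the variances

# zeroing small off-diagonals cannot increase the error when the truth is sparse
err_raw = np.linalg.norm(S - np.eye(p))
err_thr = np.linalg.norm(S_hat - np.eye(p))
print(err_thr <= err_raw)
```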

  11. The contribution of statistical physics to evolutionary biology.

    PubMed

    de Vladar, Harold P; Barton, Nicholas H

    2011-08-01

    Evolutionary biology shares many concepts with statistical physics: both deal with populations, whether of molecules or organisms, and both seek to simplify evolution in very many dimensions. Often, methodologies have undergone parallel and independent development, as with stochastic methods in population genetics. Here, we discuss aspects of population genetics that have embraced methods from physics: non-equilibrium statistical mechanics, travelling waves and Monte-Carlo methods, among others, have been used to study polygenic evolution, rates of adaptation and range expansions. These applications indicate that evolutionary biology can further benefit from interactions with other areas of statistical physics; for example, by following the distribution of paths taken by a population through time. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Mechanism-based Pharmacovigilance over the Life Sciences Linked Open Data Cloud.

    PubMed

    Kamdar, Maulik R; Musen, Mark A

    2017-01-01

    Adverse drug reactions (ADR) result in significant morbidity and mortality in patients, and a substantial proportion of these ADRs are caused by drug-drug interactions (DDIs). Pharmacovigilance methods are used to detect unanticipated DDIs and ADRs by mining Spontaneous Reporting Systems, such as the US FDA Adverse Event Reporting System (FAERS). However, these methods do not provide mechanistic explanations for the discovered drug-ADR associations in a systematic manner. In this paper, we present a systems pharmacology-based approach to perform mechanism-based pharmacovigilance. We integrate data and knowledge from four different sources using Semantic Web Technologies and Linked Data principles to generate a systems network. We present a network-based Apriori algorithm for association mining in FAERS reports. We evaluate our method against existing pharmacovigilance methods for three different validation sets. Our method has AUROC statistics of 0.7-0.8, similar to current methods, and event-specific thresholds generate AUROC statistics greater than 0.75 for certain ADRs. Finally, we discuss the benefits of using Semantic Web technologies to attain the objectives for mechanism-based pharmacovigilance.
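    The network-based Apriori algorithm builds on frequent-itemset counting. The toy sketch below shows only the underlying support computation over hypothetical co-reported drug/ADR sets; the drug and event names are invented, and the paper's actual method additionally exploits the integrated systems network:

```python
from itertools import combinations
from collections import Counter

# Hypothetical spontaneous reports: each is a set of co-reported drugs and events
reports = [
    {"drugA", "drugB", "nausea"},
    {"drugA", "nausea"},
    {"drugB", "rash"},
    {"drugA", "drugB", "nausea"},
]

def frequent_pairs(reports, min_support):
    """Support of every co-occurring item pair, keeping those above threshold
    (the pair-counting core of an Apriori-style first pass)."""
    counts = Counter()
    for r in reports:
        for pair in combinations(sorted(r), 2):
            counts[pair] += 1
    n = len(reports)
    return {pair: c / n for pair, c in counts.items() if c / n >= min_support}

freq = frequent_pairs(reports, min_support=0.5)
```

    A full Apriori implementation extends frequent pairs to larger itemsets, pruning any candidate whose subsets are not themselves frequent.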

  13. Using Bayes' theorem for free energy calculations

    NASA Astrophysics Data System (ADS)

    Rogers, David M.

    Statistical mechanics is fundamentally based on calculating the probabilities of molecular-scale events. Although Bayes' theorem has generally been recognized as providing key guiding principles for the setup and analysis of statistical experiments [83], classical frequentist models still predominate in the world of computational experimentation. As a starting point for widespread application of Bayesian methods in statistical mechanics, we investigate the central quantity of free energies from this perspective. This dissertation thus reviews the basics of Bayes' view of probability theory and the maximum entropy formulation of statistical mechanics before providing examples of its application to several advanced research areas. We first apply Bayes' theorem to a multinomial counting problem in order to determine inner shell and hard sphere solvation free energy components of Quasi-Chemical Theory [140]. We proceed to consider the general problem of free energy calculations from samples of interaction energy distributions. From there, we turn to spline-based estimation of the potential of mean force [142], and empirical modeling of observed dynamics using integrator matching. The results of this research are expected to advance the state of the art in coarse-graining methods, as they allow a systematic connection from high-resolution (atomic) to low-resolution (coarse) structure and dynamics. In total, our work on these problems constitutes a critical starting point for further application of Bayes' theorem in all areas of statistical mechanics. It is hoped that the understanding so gained will allow for improvements in comparisons between theory and experiment.

  14. Exploring the Connection Between Sampling Problems in Bayesian Inference and Statistical Mechanics

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew

    2006-01-01

    The Bayesian and statistical mechanical communities often share the same objective in their work - estimating and integrating probability distribution functions (pdfs) describing stochastic systems, models or processes. Frequently, these pdfs are complex functions of random variables exhibiting multiple, well separated local minima. Conventional strategies for sampling such pdfs are inefficient, sometimes leading to an apparent non-ergodic behavior. Several recently developed techniques for handling this problem have been successfully applied in statistical mechanics. In the multicanonical and Wang-Landau Monte Carlo (MC) methods, the correct pdfs are recovered from uniform sampling of the parameter space by iteratively establishing proper weighting factors connecting these distributions. Trivial generalizations allow for sampling from any chosen pdf. The closely related transition matrix method relies on estimating transition probabilities between different states. All these methods proved to generate estimates of pdfs with high statistical accuracy. In another MC technique, parallel tempering, several random walks, each corresponding to a different value of a parameter (e.g. "temperature"), are generated and occasionally exchanged using the Metropolis criterion. This method can be considered as a statistically correct version of simulated annealing. An alternative approach is to represent the set of independent variables as a Hamiltonian system. Considerable progress has been made in understanding how to ensure that the system obeys the equipartition theorem or, equivalently, that coupling between the variables is correctly described. Then a host of techniques developed for dynamical systems can be used. Among them, probably the most powerful is the Adaptive Biasing Force method, in which thermodynamic integration and biased sampling are combined to yield very efficient estimates of pdfs. The third class of methods deals with transitions between states described by rate constants. These problems are isomorphic with chemical kinetics problems. Recently, several efficient techniques for this purpose have been developed based on the approach originally proposed by Gillespie. Although the utility of the techniques mentioned above for Bayesian problems has not been determined, further research along these lines is warranted.
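    The parallel tempering scheme described above can be sketched in a few lines. The double-well potential, temperature ladder, and step size below are illustrative choices, not taken from the source; the key ingredients are the within-replica Metropolis move and the Metropolis-accepted swap between adjacent temperatures:

```python
import math
import random

random.seed(1)

def energy(x):
    # Double-well potential: two well-separated minima, the kind of landscape
    # on which a single low-temperature walker gets trapped
    return (x * x - 1.0) ** 2

def parallel_tempering(temps, n_steps=2000, step=0.5):
    walkers = [0.0 for _ in temps]
    for _ in range(n_steps):
        # Ordinary Metropolis move within each replica
        for i, T in enumerate(temps):
            prop = walkers[i] + random.uniform(-step, step)
            dE = energy(prop) - energy(walkers[i])
            if dE <= 0 or random.random() < math.exp(-dE / T):
                walkers[i] = prop
        # Attempt a swap between a random adjacent pair of replicas,
        # accepted with probability min(1, exp[(b_i - b_j)(E_i - E_j)])
        i = random.randrange(len(temps) - 1)
        d = (1.0 / temps[i] - 1.0 / temps[i + 1]) * (energy(walkers[i + 1]) - energy(walkers[i]))
        if d <= 0 or random.random() < math.exp(-d):
            walkers[i], walkers[i + 1] = walkers[i + 1], walkers[i]
    return walkers

final = parallel_tempering([0.1, 0.5, 2.0])
```

    The hot replicas cross the barrier freely, and swaps pass those barrier crossings down to the cold replica whose samples are the ones of interest.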

  15. Guidelines for the Investigation of Mediating Variables in Business Research.

    PubMed

    MacKinnon, David P; Coxe, Stefany; Baraldi, Amanda N

    2012-03-01

    Business theories often specify the mediating mechanisms by which a predictor variable affects an outcome variable. In the last 30 years, investigations of mediating processes have become more widespread with corresponding developments in statistical methods to conduct these tests. The purpose of this article is to provide guidelines for mediation studies by focusing on decisions made prior to the research study that affect the clarity of conclusions from a mediation study, the statistical models for mediation analysis, and methods to improve interpretation of mediation results after the research study. Throughout this article, the importance of a program of experimental and observational research for investigating mediating mechanisms is emphasized.

  16. Image compression system and method having optimized quantization tables

    NASA Technical Reports Server (NTRS)

    Ratnakar, Viresh (Inventor); Livny, Miron (Inventor)

    1998-01-01

    A digital image compression preprocessor for use in a discrete cosine transform-based digital image compression device is provided. The preprocessor includes a gathering mechanism for determining discrete cosine transform statistics from input digital image data. A computing mechanism is operatively coupled to the gathering mechanism to calculate an image distortion array and a rate of image compression array based upon the discrete cosine transform statistics for each possible quantization value. A dynamic programming mechanism is operatively coupled to the computing mechanism to optimize the rate of image compression array against the image distortion array such that a rate-distortion-optimal quantization table is derived. In addition, a discrete cosine transform-based digital image compression device and a discrete cosine transform-based digital image compression and decompression system are provided. Also, methods for generating a rate-distortion-optimal quantization table, using discrete cosine transform-based digital image compression, and operating a discrete cosine transform-based digital image compression and decompression system are provided.

  17. Comparisons of non-Gaussian statistical models in DNA methylation analysis.

    PubMed

    Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-06-16

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded-support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
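    The bounded-support, non-Gaussian idea can be illustrated with a beta distribution, one natural model for data confined to (0, 1) such as methylation fractions. The method-of-moments fit below is a generic sketch under that assumption, not the specific models compared in the paper:

```python
def beta_mom(xs):
    """Method-of-moments fit of a Beta(a, b) distribution to data on (0, 1):
    match the sample mean m and (population-style) variance v to the beta's
    moments, giving a = m*c and b = (1-m)*c with c = m*(1-m)/v - 1."""
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m) ** 2 for x in xs) / n
    c = m * (1.0 - m) / v - 1.0
    return m * c, (1.0 - m) * c

# Hypothetical methylation fractions for one CpG site across samples
a, b = beta_mom([0.1, 0.2, 0.15, 0.3, 0.25])
```

    Unlike a Gaussian fit, the fitted beta density automatically assigns zero probability outside [0, 1], which is the property the paper finds decisive for clustering performance.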

  18. Comparisons of Non-Gaussian Statistical Models in DNA Methylation Analysis

    PubMed Central

    Ma, Zhanyu; Teschendorff, Andrew E.; Yu, Hong; Taghia, Jalil; Guo, Jun

    2014-01-01

    As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance. PMID:24937687

  19. The impact of loss to follow-up on hypothesis tests of the treatment effect for several statistical methods in substance abuse clinical trials.

    PubMed

    Hedden, Sarra L; Woolson, Robert F; Carter, Rickey E; Palesch, Yuko; Upadhyaya, Himanshu P; Malcolm, Robert J

    2009-07-01

    "Loss to follow-up" can be substantial in substance abuse clinical trials. When extensive losses to follow-up occur, one must cautiously analyze and interpret the findings of a research study. Aims of this project were to introduce the types of missing data mechanisms and describe several methods for analyzing data with loss to follow-up. Furthermore, a simulation study compared Type I error and power of several methods when missing data amount and mechanism varies. Methods compared were the following: Last observation carried forward (LOCF), multiple imputation (MI), modified stratified summary statistics (SSS), and mixed effects models. Results demonstrated nominal Type I error for all methods; power was high for all methods except LOCF. Mixed effect model, modified SSS, and MI are generally recommended for use; however, many methods require that the data are missing at random or missing completely at random (i.e., "ignorable"). If the missing data are presumed to be nonignorable, a sensitivity analysis is recommended.

  20. Hidden Statistics of Schroedinger Equation

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2011-01-01

    Work was carried out to determine the mathematical origin of randomness in quantum mechanics and to create a hidden statistics of the Schrödinger equation; i.e., to expose the transitional stochastic process as a "bridge" to the quantum world. The governing equations of hidden statistics would preserve such properties of quantum physics as superposition, entanglement, and direct-product decomposability while allowing one to measure its state variables using classical methods.

  1. Simulating biochemical physics with computers: 1. Enzyme catalysis by phosphotriesterase and phosphodiesterase; 2. Integration-free path-integral method for quantum-statistical calculations

    NASA Astrophysics Data System (ADS)

    Wong, Kin-Yiu

    We have simulated two enzymatic reactions with molecular dynamics (MD) and combined quantum mechanical/molecular mechanical (QM/MM) techniques. One reaction is the hydrolysis of the insecticide paraoxon catalyzed by phosphotriesterase (PTE). PTE is a bioremediation candidate for environments contaminated by toxic nerve gases (e.g., sarin) or pesticides. Based on the potential of mean force (PMF) and the structural changes of the active site during the catalysis, we propose a revised reaction mechanism for PTE. Another reaction is the hydrolysis of the second-messenger cyclic adenosine 3'-5'-monophosphate (cAMP) catalyzed by phosphodiesterase (PDE). Cyclic-nucleotide PDE is a vital protein in signal-transduction pathways and thus a popular target for inhibition by drugs (e.g., Viagra™). A two-dimensional (2-D) free-energy profile has been generated showing that the catalysis by PDE proceeds in a two-step SN2-type mechanism. Furthermore, to characterize a chemical reaction mechanism in experiment, a direct probe is measuring kinetic isotope effects (KIEs). KIEs primarily arise from internuclear quantum-statistical effects, e.g., quantum tunneling and quantization of vibration. To systematically incorporate the quantum-statistical effects during MD simulations, we have developed an automated integration-free path-integral (AIF-PI) method based on Kleinert's variational perturbation theory for the centroid density of Feynman's path integral. Using this analytic method, we have performed ab initio path-integral calculations to study the origin of KIEs on several series of proton-transfer reactions from carboxylic acids to aryl-substituted alpha-methoxystyrenes in water. In addition, we also demonstrate that the AIF-PI method can be used to systematically compute the exact value of zero-point energy (beyond the harmonic approximation) by simply minimizing the centroid effective potential.

  2. Conquering the Physics GRE

    NASA Astrophysics Data System (ADS)

    Kahn, Yoni; Anderson, Adam

    2018-03-01

    Preface; How to use this book; Resources; 1. Classical mechanics; 2. Electricity and magnetism; 3. Optics and waves; 4. Thermodynamics and statistical mechanics; 5. Quantum mechanics and atomic physics; 6. Special relativity; 7. Laboratory methods; 8. Specialized topics; 9. Special tips and tricks for the Physics GRE; Sample exams and solutions; References; Equation index; Subject index; Problems index.

  3. 10 CFR 431.17 - Determination of efficiency.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... method or methods used; the mathematical model, the engineering or statistical analysis, computer... accordance with § 431.16 of this subpart, or by application of an alternative efficiency determination method... must be: (i) Derived from a mathematical model that represents the mechanical and electrical...

  4. Towards the feasibility of using ultrasound to determine mechanical properties of tissues in a bioreactor.

    PubMed

    Mansour, Joseph M; Gu, Di-Win Marine; Chung, Chen-Yuan; Heebner, Joseph; Althans, Jake; Abdalian, Sarah; Schluchter, Mark D; Liu, Yiying; Welter, Jean F

    2014-10-01

    Our ultimate goal is to non-destructively evaluate mechanical properties of tissue-engineered (TE) cartilage using ultrasound (US). We used agarose gels as surrogates for TE cartilage. Previously, we showed that mechanical properties measured using conventional methods were related to those measured using US, which suggested a way to non-destructively predict mechanical properties of samples with known volume fractions. In this study, we sought to determine whether the mechanical properties of samples with unknown volume fractions could be predicted by US. Aggregate moduli were calculated for hydrogels as a function of speed of sound (SOS), based on concentration and density using a poroelastic model. The data were used to train a statistical model, which we then used to predict volume fractions and mechanical properties of unknown samples. Young's and storage moduli were measured mechanically. The statistical model generally predicted the Young's moduli in compression to within <10% of their mechanically measured value. We defined positive linear correlations between the aggregate modulus predicted from US and both the storage and Young's moduli determined from mechanical tests. Mechanical properties of hydrogels with unknown volume fractions can be predicted successfully from US measurements. This method has the potential to predict mechanical properties of TE cartilage non-destructively in a bioreactor.

  5. Interrelationship of mechanical and corrosion-mechanical characteristics of type 12KhN4MF steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voronin, V.P.; Goncharov, A.F.; Maslov, V.A.

    1985-11-01

    Investigations presented include a comparative evaluation of the corrosion-mechanical characteristics of specimens of high-strength chrome-nickel-molybdenum steel, taking into consideration the different methods of melting the original metal. A comparison of the corrosion-mechanical test results with the results of acceptance tests is presented. A study of the fracture surfaces and the specimen material using fractographic, macroscopic, and microscopic analyses is given. The systematization of the corrosion-mechanical test results using methods of mathematical statistics is presented.

  6. Stochastical modeling for Viral Disease: Statistical Mechanics and Network Theory

    NASA Astrophysics Data System (ADS)

    Zhou, Hao; Deem, Michael

    2007-04-01

    Theoretical methods of statistical mechanics are developed and applied to study the immunological response against viral disease, such as dengue. We use this theory to show how the immune response to four different dengue serotypes may be sculpted. It is the ability of avian influenza to change and to mix that has given rise to the fear of a new human flu pandemic. Here we propose to utilize a scale-free-network-based stochastic model to investigate mitigation strategies and analyze the risk.

  7. From inverse problems to learning: a Statistical Mechanics approach

    NASA Astrophysics Data System (ADS)

    Baldassi, Carlo; Gerace, Federica; Saglietti, Luca; Zecchina, Riccardo

    2018-01-01

    We present a brief introduction to the statistical mechanics approaches for the study of inverse problems in data science. We then provide concrete new results on inferring couplings from sampled configurations in systems characterized by an extensive number of stable attractors in the low temperature regime. We also show how these results are connected to the problem of learning with realistic weak signals in computational neuroscience. Our techniques and algorithms rely on advanced mean-field methods developed in the context of disordered systems.

  8. irGPU.proton.Net: Irregular strong charge interaction networks of protonatable groups in protein molecules--a GPU solver using the fast multipole method and statistical thermodynamics.

    PubMed

    Kantardjiev, Alexander A

    2015-04-05

    A cluster of strongly interacting ionization groups in protein molecules with irregular ionization behavior is suggestive of a specific structure-function relationship. However, their computational treatment is unconventional (e.g., lack of convergence in a naive self-consistent iterative algorithm). A stringent evaluation requires computing Boltzmann-averaged statistical mechanics sums and estimating the electrostatic energy of each microstate. irGPU: Irregular strong interactions in proteins--a GPU solver is a novel solution to a versatile problem in protein biophysics--atypical protonation behavior of coupled groups. The computational severity of the problem is alleviated by parallelization (via GPU kernels), which is applied to the electrostatic interaction evaluation (including explicit electrostatics via the fast multipole method) as well as to estimation of the statistical mechanics sums (partition function). Special attention is given to ease of use and encapsulation of theoretical details without sacrificing the rigor of the computational procedures. irGPU is not just a solution-in-principle but a promising practical application with the potential to entice the community into a deeper understanding of the principles governing biomolecular mechanisms. © 2015 Wiley Periodicals, Inc.
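    The Boltzmann-averaged sums in question can be illustrated by brute-force enumeration of protonation microstates. The toy energy function, site count, and kT value below are assumptions for illustration; irGPU exists precisely because fast-multipole electrostatics and GPU parallelism are needed when such direct enumeration does not scale:

```python
import math
from itertools import product

def protonation_probabilities(energy_fn, n_sites, kT=0.593):
    """Boltzmann-averaged protonation probability of each site, by direct
    enumeration of all 2^n microstates (kT ~ 0.593 kcal/mol at 298 K)."""
    Z = 0.0
    occ = [0.0] * n_sites
    for state in product((0, 1), repeat=n_sites):
        w = math.exp(-energy_fn(state) / kT)  # Boltzmann weight of microstate
        Z += w                                 # partition function
        for i, s in enumerate(state):
            occ[i] += s * w
    return [o / Z for o in occ]

def toy_energy(state):
    # Hypothetical energies (kcal/mol): per-site self terms plus one strong
    # pairwise coupling between sites 0 and 1 -- the kind of coupled cluster
    # that defeats naive self-consistent iteration
    self_e = [0.5, -0.3, 0.1]
    pair = 0.8 * state[0] * state[1]
    return sum(e * s for e, s in zip(self_e, state)) + pair

probs = protonation_probabilities(toy_energy, 3)
```

    Because the exact sum couples all sites at once, the irregular titration behavior of the coupled pair emerges without any iteration.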

  9. Guidelines for the Investigation of Mediating Variables in Business Research

    PubMed Central

    Coxe, Stefany; Baraldi, Amanda N.

    2013-01-01

    Business theories often specify the mediating mechanisms by which a predictor variable affects an outcome variable. In the last 30 years, investigations of mediating processes have become more widespread with corresponding developments in statistical methods to conduct these tests. The purpose of this article is to provide guidelines for mediation studies by focusing on decisions made prior to the research study that affect the clarity of conclusions from a mediation study, the statistical models for mediation analysis, and methods to improve interpretation of mediation results after the research study. Throughout this article, the importance of a program of experimental and observational research for investigating mediating mechanisms is emphasized. PMID:25237213

  10. Quantitative Skills, Critical Thinking, and Writing Mechanics in Blended versus Face-to-Face Versions of a Research Methods and Statistics Course

    ERIC Educational Resources Information Center

    Goode, Christopher T.; Lamoreaux, Marika; Atchison, Kristin J.; Jeffress, Elizabeth C.; Lynch, Heather L.; Sheehan, Elizabeth

    2018-01-01

    Hybrid or blended learning (BL) has been shown to be equivalent to or better than face-to-face (FTF) instruction in a broad variety of contexts. We randomly assigned students to either 50/50 BL or 100% FTF versions of a research methods and statistics in psychology course. Students who took the BL version of the course scored significantly lower…

  11. Balanced mechanical resonator for powder handling device

    NASA Technical Reports Server (NTRS)

    Sarrazin, Philippe C. (Inventor); Brunner, Will M. (Inventor)

    2012-01-01

    A system incorporating a balanced mechanical resonator and a method for vibration of a sample composed of granular material to generate motion of a powder sample inside the sample holder for obtaining improved analysis statistics, without imparting vibration to the sample holder support.

  12. The energetic cost of walking: a comparison of predictive methods.

    PubMed

    Kramer, Patricia Ann; Sylvester, Adam D

    2011-01-01

    The energy that animals devote to locomotion has been of intense interest to biologists for decades and two basic methodologies have emerged to predict locomotor energy expenditure: those based on metabolic and those based on mechanical energy. Metabolic energy approaches share the perspective that prediction of locomotor energy expenditure should be based on statistically significant proxies of metabolic function, while mechanical energy approaches, which derive from many different perspectives, focus on quantifying the energy of movement. Some controversy exists as to which mechanical perspective is "best", but from first principles all mechanical methods should be equivalent if the inputs to the simulation are of similar quality. Our goals in this paper are 1) to establish the degree to which the various methods of calculating mechanical energy are correlated, and 2) to investigate to what degree the prediction methods explain the variation in energy expenditure. We use modern humans as the model organism in this experiment because their data are readily attainable, but the methodology is appropriate for use in other species. Volumetric oxygen consumption and kinematic and kinetic data were collected on 8 adults while walking at their self-selected slow, normal and fast velocities. Using hierarchical statistical modeling via ordinary least squares and maximum likelihood techniques, the predictive ability of several metabolic and mechanical approaches were assessed. We found that all approaches are correlated and that the mechanical approaches explain similar amounts of the variation in metabolic energy expenditure. Most methods predict the variation within an individual well, but are poor at accounting for variation between individuals. Our results indicate that the choice of predictive method is dependent on the question(s) of interest and the data available for use as inputs. 
Although we used modern humans as our model organism, these results can be extended to other species.

  13. A Lattice Boltzmann Method for Turbomachinery Simulations

    NASA Technical Reports Server (NTRS)

    Hsu, A. T.; Lopez, I.

    2003-01-01

    The Lattice Boltzmann (LB) method is a relatively new method for flow simulations. Its starting point is statistical mechanics and the Boltzmann equation. The LB method sets up its model at the molecular scale and simulates the flow at the macroscopic scale. LBM has so far been applied mostly to incompressible flows and simple geometries.
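    The molecular-to-macroscopic idea can be shown with a minimal sketch: a 1-D, three-velocity (D1Q3) lattice with BGK collision, relaxing toward a rest-weighted equilibrium so that a density pulse diffuses. This is a toy setup chosen for brevity, far simpler than any turbomachinery application:

```python
import numpy as np

W = np.array([2 / 3, 1 / 6, 1 / 6])  # D1Q3 lattice weights (rest, +1, -1)
C = np.array([0, 1, -1])             # lattice velocities
tau = 1.0                            # BGK relaxation time

n = 64
rho = np.ones(n)
rho[n // 2] += 1.0                   # density pulse that should diffuse
f = W[:, None] * rho[None, :]        # initialize populations at equilibrium

for _ in range(100):
    rho = f.sum(axis=0)                    # macroscopic density (moment of f)
    feq = W[:, None] * rho[None, :]        # local equilibrium distribution
    f += (feq - f) / tau                   # collision: relax toward equilibrium
    for i in range(3):                     # streaming step (periodic boundary)
        f[i] = np.roll(f[i], C[i])

rho = f.sum(axis=0)
```

    Collision and streaming each conserve mass exactly, and the macroscopic field recovered from the populations obeys a diffusion equation, illustrating how macroscopic behavior emerges from the molecular-scale model.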

  14. Generalized self-adjustment method for statistical mechanics of composite materials

    NASA Astrophysics Data System (ADS)

    Pan'kov, A. A.

    1997-03-01

    A new method is developed for the statistical mechanics of composite materials — the generalized self-adjustment method — which makes it possible to reduce the problem of predicting the effective elastic properties of composites with random structures to the solution of two simpler "averaged" problems of an inclusion with transitional layers in a medium with the desired effective elastic properties. The inhomogeneous elastic properties and dimensions of the transitional layers take into account both the "approximate" order of mutual positioning and the variation in the dimensions and elastic properties of inclusions through appropriate special averaged indicator functions of the random structure of the composite. A numerical calculation of averaged indicator functions and effective elastic characteristics is performed by the generalized self-adjustment method for a unidirectional fiberglass on the basis of various models of actual random structures in the plane of isotropy.

  15. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show that PFEM is a very powerful tool for determining second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties, and crack length, orientation, and location.

  16. A κ-generalized statistical mechanics approach to income analysis

    NASA Astrophysics Data System (ADS)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2009-02-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
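    The κ-generalized exponential at the core of such distributions is easy to state: it behaves like an ordinary exponential for small arguments and crosses over to a power law for large ones. The survival-function form and parameter names (alpha, beta) in the sketch below are assumptions for illustration, not taken verbatim from the paper:

```python
import math

def exp_kappa(x, kappa):
    """kappa-generalized exponential: (sqrt(1 + k^2 x^2) + k x)^(1/k).
    Reduces to exp(x) in the limit kappa -> 0."""
    if kappa == 0:
        return math.exp(x)
    return (math.sqrt(1.0 + kappa ** 2 * x ** 2) + kappa * x) ** (1.0 / kappa)

def survival(x, alpha=2.0, beta=1.0, kappa=0.5):
    """Assumed survival function P(X > x) = exp_kappa(-(x/beta)**alpha):
    exponential-like in the low-income body, Pareto power law in the tail."""
    return exp_kappa(-((x / beta) ** alpha), kappa)
```

    The single shape parameter κ thus interpolates between the exponential-type bulk and the Pareto tail that the paper fits to U.S. personal income data.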

  17. Uniform quantized electron gas

    NASA Astrophysics Data System (ADS)

    Høye, Johan S.; Lomba, Enrique

    2016-10-01

    In this work we study the correlation energy of the quantized electron gas of uniform density at temperature T  =  0. To do so we utilize methods from classical statistical mechanics. The basis for this is the Feynman path integral for the partition function of quantized systems. With this representation the quantum mechanical problem can be interpreted as, and is equivalent to, a classical polymer problem in four dimensions where the fourth dimension is imaginary time. Thus methods, results, and properties obtained in the statistical mechanics of classical fluids can be utilized. From this viewpoint we recover the well known RPA (random phase approximation). Then to improve it we modify the RPA by requiring the corresponding correlation function to be such that electrons with equal spins cannot be at the same position. Numerical evaluations are compared with well known results of a standard parameterization of Monte Carlo correlation energies.

  18. Linguistic Alternatives to Quantitative Research Strategies. Part One: How Linguistic Mechanisms Advance Research Outcomes

    ERIC Educational Resources Information Center

    Yeager, Joseph; Sommer, Linda

    2007-01-01

    Combining psycholinguistic technologies and systems analysis created advances in motivational profiling and numerous new behavioral engineering applications. These advances leapfrog many mainstream statistical research methods, producing superior research results via cause-effect language mechanisms. Entire industries explore motives ranging from…

  19. Statistical mechanics of competitive resource allocation using agent-based models

    NASA Astrophysics Data System (ADS)

    Chakraborti, Anirban; Challet, Damien; Chatterjee, Arnab; Marsili, Matteo; Zhang, Yi-Cheng; Chakrabarti, Bikas K.

    2015-01-01

    Demand outstrips available resources in most situations, which gives rise to competition, interaction and learning. In this article, we review a broad spectrum of multi-agent models of competition (El Farol Bar problem, Minority Game, Kolkata Paise Restaurant problem, Stable marriage problem, Parking space problem and others) and the methods used to understand them analytically. We emphasize the power of concepts and tools from statistical mechanics to understand and explain fully collective phenomena such as phase transitions and long memory, and the mapping between agent heterogeneity and physical disorder. As these methods can be applied to any large-scale model of competitive resource allocation made up of heterogeneous adaptive agents with non-linear interactions, they provide a prospective unifying paradigm for many scientific disciplines.
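    A bare-bones Minority Game can be simulated in a few lines. The "switch side if you lost" rule below is a deliberately naive stand-in for the strategy tables and finite memory of the full model the review discusses; it shows only the basic setup in which agents on the minority side win each round:

```python
import random

random.seed(0)

def minority_game(n_agents=101, n_rounds=500):
    """Minimal Minority Game: each round every agent picks side 0 or 1, and
    agents on the (strict, since n_agents is odd) minority side score a point.
    Naive adaptation: winners repeat their choice, losers switch sides."""
    choices = [random.randint(0, 1) for _ in range(n_agents)]
    scores = [0] * n_agents
    attendance = []                 # number choosing side 1 each round
    for _ in range(n_rounds):
        ones = sum(choices)
        minority = 1 if ones < n_agents - ones else 0
        attendance.append(ones)
        for i in range(n_agents):
            if choices[i] == minority:
                scores[i] += 1
            else:
                choices[i] = 1 - choices[i]
    return scores, attendance

scores, attendance = minority_game()
```

    The statistical-mechanics analyses surveyed in the article quantify how the fluctuations of this attendance series depend on agent heterogeneity, which is where the mapping to physical disorder enters.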

  20. Portfolio optimization problem with nonidentical variances of asset returns using statistical mechanical informatics.

    PubMed

    Shinzato, Takashi

    2016-12-01

    The portfolio optimization problem in which the variances of the return rates of assets are not identical is analyzed in this paper using the methodology of statistical mechanical informatics, specifically, replica analysis. We defined two characteristic quantities of an optimal portfolio, namely, minimal investment risk and investment concentration, in order to solve the portfolio optimization problem and analytically determined their asymptotical behaviors using replica analysis. Numerical experiments were also performed, and a comparison between the results of our simulation and those obtained via replica analysis validated our proposed method.

  1. Portfolio optimization problem with nonidentical variances of asset returns using statistical mechanical informatics

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2016-12-01

    The portfolio optimization problem in which the variances of the return rates of assets are not identical is analyzed in this paper using the methodology of statistical mechanical informatics, specifically, replica analysis. We defined two characteristic quantities of an optimal portfolio, namely, minimal investment risk and investment concentration, in order to solve the portfolio optimization problem and analytically determined their asymptotical behaviors using replica analysis. Numerical experiments were also performed, and a comparison between the results of our simulation and those obtained via replica analysis validated our proposed method.

  2. Fundamentals of poly(lactic acid) microstructure, crystallization behavior, and properties

    NASA Astrophysics Data System (ADS)

    Kang, Shuhui

    Poly(lactic acid) is an environmentally-benign biodegradable and sustainable thermoplastic material, which has found broad applications as food packaging films and as non-woven fibers. The crystallization and deformation mechanisms of the polymer are largely determined by the distribution of conformation and configuration. Knowledge of these mechanisms is needed to understand the mechanical and thermal properties on which processing conditions mainly depend. In conjunction with laser light scattering, Raman spectroscopy and normal coordinate analysis are used in this thesis to elucidate these properties. Vibrational spectroscopic theory, Flory's rotational isomeric state (RIS) theory, Gaussian chain statistics and statistical mechanics are used to relate experimental data to molecular chain structure. A refined RIS model is proposed, chain rigidity recalculated and chain statistics discussed. A Raman spectroscopic characterization method for crystalline and amorphous phase orientation has been developed. A shrinkage model is also proposed to interpret the dimensional stability for fibers and uni- or biaxially stretched films. A study of stereocomplexation formed by poly(l-lactic acid) and poly(d-lactic acid) is also presented.

  3. Phase Transitions in Combinatorial Optimization Problems: Basics, Algorithms and Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Hartmann, Alexander K.; Weigt, Martin

    2005-10-01

    A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.

  4. The effects of multiple repairs on Inconel 718 weld mechanical properties

    NASA Technical Reports Server (NTRS)

    Russell, C. K.; Nunes, A. C., Jr.; Moore, D.

    1991-01-01

    Inconel 718 weldments were repaired 3, 6, 9, and 13 times using the gas tungsten arc welding process. The welded panels were machined into mechanical test specimens, postweld heat treated, and nondestructively tested. Tensile properties and high cycle fatigue life were evaluated and the results compared to unrepaired weld properties. Mechanical property data were analyzed using the statistical methods of difference in means for tensile properties and difference in log means and Weibull analysis for high cycle fatigue properties. Statistical analysis performed on the data did not show a significant decrease in tensile or high cycle fatigue properties due to the repeated repairs. Some degradation was observed in all properties, however, it was minimal.

  5. Multi-fidelity machine learning models for accurate bandgap predictions of solids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilania, Ghanshyam; Gubernatis, James E.; Lookman, Turab

    Here, we present a multi-fidelity co-kriging statistical learning framework that combines variable-fidelity quantum mechanical calculations of bandgaps to generate a machine-learned model that enables low-cost accurate predictions of the bandgaps at the highest fidelity level. Additionally, the adopted Gaussian process regression formulation allows us to predict the underlying uncertainties as a measure of our confidence in the predictions. In using a set of 600 elpasolite compounds as an example dataset and using semi-local and hybrid exchange correlation functionals within density functional theory as two levels of fidelities, we demonstrate the excellent learning performance of the method against actual high fidelitymore » quantum mechanical calculations of the bandgaps. The presented statistical learning method is not restricted to bandgaps or electronic structure methods and extends the utility of high throughput property predictions in a significant way.« less

  6. Multi-fidelity machine learning models for accurate bandgap predictions of solids

    DOE PAGES

    Pilania, Ghanshyam; Gubernatis, James E.; Lookman, Turab

    2016-12-28

    Here, we present a multi-fidelity co-kriging statistical learning framework that combines variable-fidelity quantum mechanical calculations of bandgaps to generate a machine-learned model that enables low-cost accurate predictions of the bandgaps at the highest fidelity level. Additionally, the adopted Gaussian process regression formulation allows us to predict the underlying uncertainties as a measure of our confidence in the predictions. In using a set of 600 elpasolite compounds as an example dataset and using semi-local and hybrid exchange correlation functionals within density functional theory as two levels of fidelities, we demonstrate the excellent learning performance of the method against actual high fidelitymore » quantum mechanical calculations of the bandgaps. The presented statistical learning method is not restricted to bandgaps or electronic structure methods and extends the utility of high throughput property predictions in a significant way.« less

  7. Linguistic Analysis of the Human Heartbeat Using Frequency and Rank Order Statistics

    NASA Astrophysics Data System (ADS)

    Yang, Albert C.-C.; Hseu, Shu-Shya; Yien, Huey-Wen; Goldberger, Ary L.; Peng, C.-K.

    2003-03-01

    Complex physiologic signals may carry unique dynamical signatures that are related to their underlying mechanisms. We present a method based on rank order statistics of symbolic sequences to investigate the profile of different types of physiologic dynamics. We apply this method to heart rate fluctuations, the output of a central physiologic control system. The method robustly discriminates patterns generated from healthy and pathologic states, as well as aging. Furthermore, we observe increased randomness in the heartbeat time series with physiologic aging and pathologic states and also uncover nonrandom patterns in the ventricular response to atrial fibrillation.

  8. Forecasting runout of rock and debris avalanches

    USGS Publications Warehouse

    Iverson, Richard M.; Evans, S.G.; Mugnozza, G.S.; Strom, A.; Hermanns, R.L.

    2006-01-01

    Physically based mathematical models and statistically based empirical equations each may provide useful means of forecasting runout of rock and debris avalanches. This paper compares the foundations, strengths, and limitations of a physically based model and a statistically based forecasting method, both of which were developed to predict runout across three-dimensional topography. The chief advantage of the physically based model results from its ties to physical conservation laws and well-tested axioms of soil and rock mechanics, such as the Coulomb friction rule and effective-stress principle. The output of this model provides detailed information about the dynamics of avalanche runout, at the expense of high demands for accurate input data, numerical computation, and experimental testing. In comparison, the statistical method requires relatively modest computation and no input data except identification of prospective avalanche source areas and a range of postulated avalanche volumes. Like the physically based model, the statistical method yields maps of predicted runout, but it provides no information on runout dynamics. Although the two methods differ significantly in their structure and objectives, insights gained from one method can aid refinement of the other.

  9. Probabilistic finite elements for fatigue and fracture analysis

    NASA Astrophysics Data System (ADS)

    Belytschko, Ted; Liu, Wing Kam

    Attenuation is focused on the development of Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and its application to linear, nonlinear structural mechanics problems and fracture mechanics problems. The computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of a curvilinear fatigue crack growth. The existing PFEM's have been applied to solve for two types of problems: (1) determination of the response uncertainty in terms of the means, variance and correlation coefficients; and (2) determination the probability of failure associated with prescribed limit states.

  10. Probabilistic finite elements for fatigue and fracture analysis

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Liu, Wing Kam

    1992-01-01

    Attenuation is focused on the development of Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and its application to linear, nonlinear structural mechanics problems and fracture mechanics problems. The computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of a curvilinear fatigue crack growth. The existing PFEM's have been applied to solve for two types of problems: (1) determination of the response uncertainty in terms of the means, variance and correlation coefficients; and (2) determination the probability of failure associated with prescribed limit states.

  11. The Energetic Cost of Walking: A Comparison of Predictive Methods

    PubMed Central

    Kramer, Patricia Ann; Sylvester, Adam D.

    2011-01-01

    Background The energy that animals devote to locomotion has been of intense interest to biologists for decades and two basic methodologies have emerged to predict locomotor energy expenditure: those based on metabolic and those based on mechanical energy. Metabolic energy approaches share the perspective that prediction of locomotor energy expenditure should be based on statistically significant proxies of metabolic function, while mechanical energy approaches, which derive from many different perspectives, focus on quantifying the energy of movement. Some controversy exists as to which mechanical perspective is “best”, but from first principles all mechanical methods should be equivalent if the inputs to the simulation are of similar quality. Our goals in this paper are 1) to establish the degree to which the various methods of calculating mechanical energy are correlated, and 2) to investigate to what degree the prediction methods explain the variation in energy expenditure. Methodology/Principal Findings We use modern humans as the model organism in this experiment because their data are readily attainable, but the methodology is appropriate for use in other species. Volumetric oxygen consumption and kinematic and kinetic data were collected on 8 adults while walking at their self-selected slow, normal and fast velocities. Using hierarchical statistical modeling via ordinary least squares and maximum likelihood techniques, the predictive ability of several metabolic and mechanical approaches were assessed. We found that all approaches are correlated and that the mechanical approaches explain similar amounts of the variation in metabolic energy expenditure. Most methods predict the variation within an individual well, but are poor at accounting for variation between individuals. Conclusion Our results indicate that the choice of predictive method is dependent on the question(s) of interest and the data available for use as inputs. 
Although we used modern humans as our model organism, these results can be extended to other species. PMID:21731693

  12. Statistical mechanics of neocortical interactions. Derivation of short-term-memory capacity

    NASA Astrophysics Data System (ADS)

    Ingber, Lester

    1984-06-01

    A theory developed by the author to describe macroscopic neocortical interactions demonstrates that empirical values of chemical and electrical parameters of synaptic interactions establish several minima of the path-integral Lagrangian as a function of excitatory and inhibitory columnar firings. The number of possible minima, their time scales of hysteresis and probable reverberations, and their nearest-neighbor columnar interactions are all consistent with well-established empirical rules of human short-term memory. Thus, aspects of conscious experience are derived from neuronal firing patterns, using modern methods of nonlinear nonequilibrium statistical mechanics to develop realistic explicit synaptic interactions.

  13. Modern Prediction Methods for Turbomachine Performance

    DTIC Science & Technology

    1976-01-01

    easier on the basis of C factor correlation. Generally, correlation has to be carefully established; statistical methods may be of good help when a , large...FLOW TURBOMACHINE BLADES Walker, G. J. Instn of Engrs,, Australia-3rd Australiasian Conference on Hydraulics 8 Fluid Mechanics-Proc, Nov 25-29 1968

  14. Nonclassical acoustics

    NASA Technical Reports Server (NTRS)

    Kentzer, C. P.

    1976-01-01

    A statistical approach to sound propagation is considered in situations where, due to the presence of large gradients of properties of the medium, the classical (deterministic) treatment of wave motion is inadequate. Mathematical methods for wave motions not restricted to small wavelengths (analogous to known methods of quantum mechanics) are used to formulate a wave theory of sound in nonuniform flows. Nonlinear transport equations for field probabilities are derived for the limiting case of noninteracting sound waves and it is postulated that such transport equations, appropriately generalized, may be used to predict the statistical behavior of sound in arbitrary flows.

  15. Statistical methods to detect novel genetic variants using publicly available GWAS summary data.

    PubMed

    Guo, Bin; Wu, Baolin

    2018-03-01

    We propose statistical methods to detect novel genetic variants using only genome-wide association studies (GWAS) summary data without access to raw genotype and phenotype data. With more and more summary data being posted for public access in the post GWAS era, the proposed methods are practically very useful to identify additional interesting genetic variants and shed lights on the underlying disease mechanism. We illustrate the utility of our proposed methods with application to GWAS meta-analysis results of fasting glucose from the international MAGIC consortium. We found several novel genome-wide significant loci that are worth further study. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. On modelling the interaction between two rotating bodies with statistically distributed features: an application to dressing of grinding wheels

    NASA Astrophysics Data System (ADS)

    Spampinato, A.; Axinte, D. A.

    2017-12-01

    The mechanisms of interaction between bodies with statistically arranged features present characteristics common to different abrasive processes, such as dressing of abrasive tools. In contrast with the current empirical approach used to estimate the results of operations based on attritive interactions, the method we present in this paper allows us to predict the output forces and the topography of a simulated grinding wheel for a set of specific operational parameters (speed ratio and radial feed-rate), providing a thorough understanding of the complex mechanisms regulating these processes. In modelling the dressing mechanisms, the abrasive characteristics of both bodies (grain size, geometry, inter-space and protrusion) are first simulated; thus, their interaction is simulated in terms of grain collisions. Exploiting a specifically designed contact/impact evaluation algorithm, the model simulates the collisional effects of the dresser abrasives on the grinding wheel topography (grain fracture/break-out). The method has been tested for the case of a diamond rotary dresser, predicting output forces within less than 10% error and obtaining experimentally validated grinding wheel topographies. The study provides a fundamental understanding of the dressing operation, enabling the improvement of its performance in an industrial scenario, while being of general interest in modelling collision-based processes involving statistically distributed elements.

  17. Evolving Learning Paradigms: Re-Setting Baselines and Collection Methods of Information and Communication Technology in Education Statistics

    ERIC Educational Resources Information Center

    Gibson, David; Broadley, Tania; Downie, Jill; Wallet, Peter

    2018-01-01

    The UNESCO Institute for Statistics (UIS) has been measuring ICT in education since 2009, but with such rapid change in technology and its use in education, it is important now to revise the collection mechanisms to focus on how technology is being used to enhance learning and teaching. Sustainable development goal (SDG) 4, for example, moves…

  18. Statistical Mechanics of Coherent Ising Machine — The Case of Ferromagnetic and Finite-Loading Hopfield Models —

    NASA Astrophysics Data System (ADS)

    Aonishi, Toru; Mimura, Kazushi; Utsunomiya, Shoko; Okada, Masato; Yamamoto, Yoshihisa

    2017-10-01

    The coherent Ising machine (CIM) has attracted attention as one of the most effective Ising computing architectures for solving large scale optimization problems because of its scalability and high-speed computational ability. However, it is difficult to implement the Ising computation in the CIM because the theories and techniques of classical thermodynamic equilibrium Ising spin systems cannot be directly applied to the CIM. This means we have to adapt these theories and techniques to the CIM. Here we focus on a ferromagnetic model and a finite loading Hopfield model, which are canonical models sharing a common mathematical structure with almost all other Ising models. We derive macroscopic equations to capture nonequilibrium phase transitions in these models. The statistical mechanical methods developed here constitute a basis for constructing evaluation methods for other Ising computation models.

  19. Continuum radiation from active galactic nuclei: A statistical study

    NASA Technical Reports Server (NTRS)

    Isobe, T.; Feigelson, E. D.; Singh, K. P.; Kembhavi, A.

    1986-01-01

    The physics of the continuum spectrum of active galactic nuclei (AGNs) was examined using a large data set and rigorous statistical methods. A data base was constructed for 469 objects which include radio selected quasars, optically selected quasars, X-ray selected AGNs, BL Lac objects, and optically unidentified compact radio sources. Each object has measurements of its radio, optical, X-ray core continuum luminosity, though many of them are upper limits. Since many radio sources have extended components, the core component were carefully selected out from the total radio luminosity. With survival analysis statistical methods, which can treat upper limits correctly, these data can yield better statistical results than those previously obtained. A variety of statistical tests are performed, such as the comparison of the luminosity functions in different subsamples, and linear regressions of luminosities in different bands. Interpretation of the results leads to the following tentative conclusions: the main emission mechanism of optically selected quasars and X-ray selected AGNs is thermal, while that of BL Lac objects is synchrotron; radio selected quasars may have two different emission mechanisms in the X-ray band; BL Lac objects appear to be special cases of the radio selected quasars; some compact radio sources show the possibility of synchrotron self-Compton (SSC) in the optical band; and the spectral index between the optical and the X-ray bands depends on the optical luminosity.

  20. A procedure for combining acoustically induced and mechanically induced loads (first passage failure design criterion)

    NASA Technical Reports Server (NTRS)

    Crowe, D. R.; Henricks, W.

    1983-01-01

    The combined load statistics are developed by taking the acoustically induced load to be a random population, assumed to be stationary. Each element of this ensemble of acoustically induced loads is assumed to have the same power spectral density (PSD), obtained previously from a random response analysis employing the given acoustic field in the STS cargo bay as a stationary random excitation. The mechanically induced load is treated as either (1) a known deterministic transient, or (2) a nonstationary random variable of known first and second statistical moments which vary with time. A method is then shown for determining the probability that the combined load would, at any time, have a value equal to or less than a certain level. Having obtained a statistical representation of how the acoustic and mechanical loads are expected to combine, an analytical approximation for defining design levels for these loads is presented using the First Passage failure criterion.

  1. Parametric Analysis to Study the Influence of Aerogel-Based Renders' Components on Thermal and Mechanical Performance.

    PubMed

    Ximenes, Sofia; Silva, Ana; Soares, António; Flores-Colen, Inês; de Brito, Jorge

    2016-05-04

    Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study's objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect.

  2. Parametric Analysis to Study the Influence of Aerogel-Based Renders’ Components on Thermal and Mechanical Performance

    PubMed Central

    Ximenes, Sofia; Silva, Ana; Soares, António; Flores-Colen, Inês; de Brito, Jorge

    2016-01-01

    Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study’s objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect. PMID:28773460

  3. Electrical Conductivity of Charged Particle Systems and Zubarev's Nonequilibrium Statistical Operator Method

    NASA Astrophysics Data System (ADS)

    Röpke, G.

    2018-01-01

    One of the fundamental problems in physics that are not yet rigorously solved is the statistical mechanics of nonequilibrium processes. An important contribution to describing irreversible behavior starting from reversible Hamiltonian dynamics was given by D. N. Zubarev, who invented the method of the nonequilibrium statistical operator. We discuss this approach, in particular, the extended von Neumann equation, and as an example consider the electrical conductivity of a system of charged particles. We consider the selection of the set of relevant observables. We show the relation between kinetic theory and linear response theory. Using thermodynamic Green's functions, we present a systematic treatment of correlation functions, but the convergence needs investigation. We compare different expressions for the conductivity and list open questions.

  4. A first principles calculation and statistical mechanics modeling of defects in Al-H system

    NASA Astrophysics Data System (ADS)

    Ji, Min; Wang, Cai-Zhuang; Ho, Kai-Ming

    2007-03-01

    The behavior of defects and hydrogen in Al was investigated by first principles calculations and statistical mechanics modeling. The formation energy of different defects in Al+H system such as Al vacancy, H in institution and multiple H in Al vacancy were calculated by first principles method. Defect concentration in thermodynamical equilibrium was studied by total free energy calculation including configuration entropy and defect-defect interaction from low concentration limit to hydride limit. In our grand canonical ensemble model, hydrogen chemical potential under different environment plays an important role in determing the defect concentration and properties in Al-H system.

  5. Performance evaluation of the machine learning algorithms used in inference mechanism of a medical decision support system.

    PubMed

    Bal, Mert; Amasyali, M Fatih; Sever, Hayri; Kose, Guven; Demirhan, Ayse

    2014-01-01

    The importance of the decision support systems is increasingly supporting the decision making process in cases of uncertainty and the lack of information and they are widely used in various fields like engineering, finance, medicine, and so forth, Medical decision support systems help the healthcare personnel to select optimal method during the treatment of the patients. Decision support systems are intelligent software systems that support decision makers on their decisions. The design of decision support systems consists of four main subjects called inference mechanism, knowledge-base, explanation module, and active memory. Inference mechanism constitutes the basis of decision support systems. There are various methods that can be used in these mechanisms approaches. Some of these methods are decision trees, artificial neural networks, statistical methods, rule-based methods, and so forth. In decision support systems, those methods can be used separately or a hybrid system, and also combination of those methods. In this study, synthetic data with 10, 100, 1000, and 2000 records have been produced to reflect the probabilities on the ALARM network. The accuracy of 11 machine learning methods for the inference mechanism of medical decision support system is compared on various data sets.

  6. Mechanical properties of silicate glasses exposed to a low-Earth orbit

    NASA Technical Reports Server (NTRS)

    Wiedlocher, David E.; Tucker, Dennis S.; Nichols, Ron; Kinser, Donald L.

    1992-01-01

    The effects of a 5.8 year exposure to low earth orbit environment upon the mechanical properties of commercial optical fused silica, low iron soda-lime-silica, Pyrex 7740, Vycor 7913, BK-7, and the glass ceramic Zerodur were examined. Mechanical testing employed the ASTM-F-394 piston on 3-ball method in a liquid nitrogen environment. Samples were exposed on the Long Duration Exposure Facility (LDEF) in two locations. Impacts were observed on all specimens except Vycor. Weibull analysis as well as a standard statistical evaluation were conducted. The Weibull analysis revealed no differences between control samples and the two exposed samples. We thus concluded that radiation components of the Earth orbital environment did not degrade the mechanical strength of the samples examined within the limits of experimental error. The upper bound of strength degradation for meteorite impacted samples based upon statistical analysis and observation was 50 percent.

  7. Physics of Electronic Materials

    NASA Astrophysics Data System (ADS)

    Rammer, Jørgen

    2017-03-01

    1. Quantum mechanics; 2. Quantum tunneling; 3. Standard metal model; 4. Standard conductor model; 5. Electric circuit theory; 6. Quantum wells; 7. Particle in a periodic potential; 8. Bloch currents; 9. Crystalline solids; 10. Semiconductor doping; 11. Transistors; 12. Heterostructures; 13. Mesoscopic physics; 14. Arithmetic, logic and machines; Appendix A. Principles of quantum mechanics; Appendix B. Dirac's delta function; Appendix C. Fourier analysis; Appendix D. Classical mechanics; Appendix E. Wave function properties; Appendix F. Transfer matrix properties; Appendix G. Momentum; Appendix H. Confined particles; Appendix I. Spin and quantum statistics; Appendix J. Statistical mechanics; Appendix K. The Fermi-Dirac distribution; Appendix L. Thermal current fluctuations; Appendix M. Gaussian wave packets; Appendix N. Wave packet dynamics; Appendix O. Screening by symmetry method; Appendix P. Commutation and common eigenfunctions; Appendix Q. Interband coupling; Appendix R. Common crystal structures; Appendix S. Effective mass approximation; Appendix T. Integral doubling formula; Bibliography; Index.

  8. ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prigogine, I.; Balescu, R.; Henin, F.

    1960-12-01

    Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)

  9. Development of failure model for nickel cadmium cells

    NASA Technical Reports Server (NTRS)

    Gupta, A.

    1980-01-01

    The development of a method for the life prediction of nickel cadmium cells is discussed. The approach described involves acquiring an understanding of the mechanisms of degradation and failure and at the same time developing nondestructive evaluation techniques for the nickel cadmium cells. The development of a statistical failure model which will describe the mechanisms of degradation and failure is outlined.

  10. Sources and characteristics of acoustic emissions from mechanically stressed geologic granular media — A review

    NASA Astrophysics Data System (ADS)

    Michlmayr, Gernot; Cohen, Denis; Or, Dani

    2012-05-01

    The formation of cracks and emergence of shearing planes and other modes of rapid macroscopic failure in geologic granular media involve numerous grain scale mechanical interactions often generating high frequency (kHz) elastic waves, referred to as acoustic emissions (AE). These acoustic signals have been used primarily for monitoring and characterizing fatigue and progressive failure in engineered systems, with only a few applications concerning geologic granular media reported in the literature. Similar to the monitoring of seismic events preceding an earthquake, AE may offer a means for non-invasive, in-situ assessment of mechanical precursors associated with imminent landslides or other types of rapid mass movements (debris flows, rock falls, snow avalanches, glacier stick-slip events). Despite diverse applications and potential usefulness, a systematic description of the AE method and its relevance to mechanical processes in Earth sciences is lacking. This review is aimed at providing a sound foundation for linking observed AE with various micro-mechanical failure events in geologic granular materials, not only for monitoring of triggering events preceding mass mobilization, but also as a non-invasive tool in its own right for probing the rich spectrum of mechanical processes at scales ranging from a single grain to a hillslope. We first review studies reporting the use of AE for monitoring failure in various geologic materials, and describe AE-generating source mechanisms in mechanically stressed geologic media (e.g., frictional sliding, micro-cracking, particle collisions, rupture of water bridges), including AE statistical features such as frequency content and occurrence probabilities. We summarize available AE sensors and measurement principles. 
The high sampling rates of advanced AE systems enable detection of numerous discrete failure events within a volume and thus provide access to statistical descriptions of progressive collapse of systems with many interacting mechanical elements such as the fiber bundle model (FBM). We highlight intrinsic links between AE characteristics and established statistical models often used in structural engineering and material sciences, and outline potential applications for failure prediction and early-warning using the AE method in combination with the FBM. The biggest challenge for field application of the AE method is strong signal attenuation. We provide an outlook for overcoming such limitations considering the emergence of a class of fiber-optic based distributed AE sensors and the deployment of acoustic waveguides as part of monitoring networks.
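    The fiber bundle model mentioned above can be simulated in a few lines. This equal-load-sharing sketch (a standard textbook variant, not code from the review) records burst sizes as the load is raised quasi-statically:

```python
import random

def fbm_bursts(n_fibers, seed=0):
    """Equal-load-sharing fiber bundle model: fibers have random strength
    thresholds; the external load is raised just enough to break the weakest
    intact fiber, and the size of the resulting cascade (burst) is recorded."""
    rng = random.Random(seed)
    t = sorted(rng.random() for _ in range(n_fibers))  # failure thresholds
    bursts, k = [], 0                                  # k = broken-fiber count
    while k < n_fibers:
        force = t[k] * (n_fibers - k)   # external load that just breaks fiber k
        size = 0
        # load is shared equally by the intact fibers; each failure raises the
        # per-fiber stress and may trigger further failures in a cascade
        while k < n_fibers and force / (n_fibers - k) >= t[k]:
            k += 1
            size += 1
        bursts.append(size)
    return bursts
```

    In this classical model the burst-size distribution follows a power law (exponent 5/2), the kind of statistical signature against which AE burst statistics are compared.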

  11. Discrete-element modeling of nacre-like materials: Effects of random microstructures on strain localization and mechanical performance

    NASA Astrophysics Data System (ADS)

    Abid, Najmul; Mirkhalaf, Mohammad; Barthelat, Francois

    2018-03-01

    Natural materials such as nacre, collagen, and spider silk are composed of staggered stiff and strong inclusions in a softer matrix. This type of hybrid microstructure results in remarkable combinations of stiffness, strength, and toughness and it now inspires novel classes of high-performance composites. However, the analytical and numerical approaches used to predict and optimize the mechanics of staggered composites often neglect statistical variations and inhomogeneities, which may have significant impacts on modulus, strength, and toughness. Here we present an analysis of localization using small representative volume elements (RVEs) and large scale statistical volume elements (SVEs) based on the discrete element method (DEM). DEM is an efficient numerical method which enabled the evaluation of more than 10,000 microstructures in this study, each including about 5,000 inclusions. The models explore the combined effects of statistics, inclusion arrangement, and interface properties. We find that statistical variations have a negative effect on all properties, in particular on the ductility and energy absorption because randomness precipitates the localization of deformations. However, the results also show that the negative effects of random microstructures can be offset by interfaces with large strain at failure accompanied by strain hardening. More specifically, this quantitative study reveals an optimal range of interface properties where the interfaces are the most effective at delaying localization. These findings show how carefully designed interfaces in bioinspired staggered composites can offset the negative effects of microstructural randomness, which is inherent to most current fabrication methods.

  12. Statistical mechanical estimation of the free energy of formation of E. coli biomass for use with macroscopic bioreactor balances.

    PubMed

    Grosz, R; Stephanopoulos, G

    1983-09-01

    The need for the determination of the free energy of formation of biomass in bioreactor second law balances is well established. A statistical mechanical method for the calculation of the free energy of formation of E. coli biomass is introduced. In this method, biomass is modelled to consist of a system of biopolymer networks. The partition function of this system is proposed to consist of acoustic and optical modes of vibration. Acoustic modes are described by Tarasov's model, the parameters of which are evaluated with the aid of low-temperature calorimetric data for the crystalline protein bovine chymotrypsinogen A. The optical modes are described by considering the low-temperature thermodynamic properties of biological monomer crystals such as amino acid crystals. Upper and lower bounds are placed on the entropy to establish the maximum error associated with the statistical method. The upper bound is determined by endowing the monomers in biomass with ideal gas properties. The lower bound is obtained by limiting the monomers to complete immobility. On this basis, the free energy of formation is fixed to within 10%. Proposals are made with regard to experimental verification of the calculated value and extension of the calculation to other types of biomass.

  13. Complex patterns of abnormal heartbeats

    NASA Technical Reports Server (NTRS)

    Schulte-Frohlinde, Verena; Ashkenazy, Yosef; Goldberger, Ary L.; Ivanov, Plamen Ch; Costa, Madalena; Morley-Davies, Adrian; Stanley, H. Eugene; Glass, Leon

    2002-01-01

    Individuals having frequent abnormal heartbeats interspersed with normal heartbeats may be at an increased risk of sudden cardiac death. However, mechanistic understanding of such cardiac arrhythmias is limited. We present a visual and qualitative method to display statistical properties of abnormal heartbeats. We introduce dynamical "heartprints" which reveal characteristic patterns in long clinical records encompassing approximately 10^5 heartbeats and may provide information about underlying mechanisms. We test if these dynamics can be reproduced by model simulations in which abnormal heartbeats are generated (i) randomly, (ii) at a fixed time interval following a preceding normal heartbeat, or (iii) by an independent oscillator that may or may not interact with the normal heartbeat. We compare the results of these three models and test their limitations to comprehensively simulate the statistical features of selected clinical records. This work introduces methods that can be used to test mathematical models of arrhythmogenesis and to develop a new understanding of underlying electrophysiologic mechanisms of cardiac arrhythmia.
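    The three generation rules can be prototyped directly. Below is a sketch (hypothetical parameters, my own simplification) of model (i), random occurrence, together with the NIB statistic (number of intervening normal beats) commonly used to characterize such records:

```python
import random

def random_model(n_beats, p_abnormal, seed=0):
    """Model (i): each beat is abnormal ('V') independently with fixed probability."""
    rng = random.Random(seed)
    return ['V' if rng.random() < p_abnormal else 'N' for _ in range(n_beats)]

def nib_counts(beats):
    """Numbers of intervening normal beats between consecutive abnormal beats."""
    counts, run, seen_v = [], 0, False
    for b in beats:
        if b == 'V':
            if seen_v:
                counts.append(run)
            seen_v, run = True, 0
        else:
            run += 1
    return counts
```

    Under the random model the NIB values are geometrically distributed with mean (1-p)/p; fixed-coupling and independent-oscillator (parasystole) models produce characteristically different NIB histograms, which is what the heartprint display makes visible.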

  14. Methods and means of Fourier-Stokes polarimetry and the spatial-frequency filtering of phase anisotropy manifestations in endometriosis diagnostics

    NASA Astrophysics Data System (ADS)

    Ushenko, A. G.; Dubolazov, O. V.; Ushenko, Vladimir A.; Ushenko, Yu. A.; Sakhnovskiy, M. Yu.; Prydiy, O. G.; Lakusta, I. I.; Novakovskaya, O. Yu.; Melenko, S. R.

    2016-12-01

    This research presents investigation results of diagnostic efficiency of a new azimuthally stable Mueller-matrix method of laser autofluorescence coordinate distributions analysis of dried polycrystalline films of uterine cavity peritoneal fluid. A new model of generalized optical anisotropy of biological tissues protein networks is proposed in order to define the processes of laser autofluorescence. The influence of complex mechanisms of both phase anisotropy (linear birefringence and optical activity) and linear (circular) dichroism is taken into account. The interconnections between the azimuthally stable Mueller-matrix elements characterizing laser autofluorescence and different mechanisms of optical anisotropy are determined. The statistic analysis of coordinate distributions of such Mueller-matrix rotation invariants is proposed. Thereupon the quantitative criteria (statistic moments of the 1st to the 4th order) of differentiation of dried polycrystalline films of peritoneal fluid - group 1 (healthy donors) and group 2 (uterus endometriosis patients) are estimated.

  15. Can the behavioral sciences self-correct? A social epistemic study.

    PubMed

    Romero, Felipe

    2016-12-01

    Advocates of the self-corrective thesis argue that scientific method will refute false theories and find closer approximations to the truth in the long run. I discuss a contemporary interpretation of this thesis in terms of frequentist statistics in the context of the behavioral sciences. First, I identify experimental replications and systematic aggregation of evidence (meta-analysis) as the self-corrective mechanism. Then, I present a computer simulation study of scientific communities that implement this mechanism, arguing that frequentist statistics may or may not converge upon a correct estimate depending on the social structure of the community that uses it. Based on this study, I argue that methodological explanations of the "replicability crisis" in psychology are limited, and I propose an alternative explanation in terms of biases. Finally, I suggest that scientific self-correction should be understood as an interaction effect between inference methods and social structures. Copyright © 2016 Elsevier Ltd. All rights reserved.
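    A toy version of such a community simulation (my own illustrative sketch, not the paper's model) shows how a publication filter biases the meta-analytic estimate even as evidence accumulates:

```python
import random, statistics

def community_estimate(true_effect, n_labs, n_per_study, publish_all, seed=0):
    """Each lab runs one study estimating `true_effect` from noisy data; if
    publish_all is False, only studies whose effect passes |t| > 2 enter the
    aggregate (a crude publication-bias filter)."""
    rng = random.Random(seed)
    published = []
    for _ in range(n_labs):
        sample = [rng.gauss(true_effect, 1.0) for _ in range(n_per_study)]
        m = statistics.fmean(sample)
        se = statistics.pstdev(sample) / len(sample) ** 0.5
        if publish_all or abs(m / se) > 2.0:
            published.append(m)
    # naive meta-analysis: average of published effect sizes
    return statistics.fmean(published)
```

    With a small true effect and small studies, the filtered community settles on a substantially inflated estimate, while the publish-everything community converges on the truth; the social structure, not the statistics alone, determines self-correction.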

  16. K-nearest neighbors based methods for identification of different gear crack levels under different motor speeds and loads: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong

    2016-03-01

    Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent unexpected gear failure, because gear cracks lead to gear tooth breakage. Signal-processing-based methods mainly require expertise to interpret gear fault signatures, which ordinary users usually cannot easily achieve. In order to identify different gear crack levels automatically, intelligent gear crack identification methods should be developed. Previous case studies experimentally proved that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance the prediction accuracies of existing K-nearest neighbors based methods and to extend identification from 3 to 5 different gear crack levels, redundant statistical features are constructed by using the Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of the redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction is conducted to obtain new significant statistical features. Finally, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads. 
Based on the new significant statistical features, some other popular statistical models, including linear discriminant analysis, quadratic discriminant analysis, classification and regression trees and the naive Bayes classifier, are compared with the developed method. The results show that the developed method has the highest prediction accuracies among these statistical models. Additionally, selection of the number of new significant features and parameter selection for K-nearest neighbors are thoroughly investigated.
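    The pipeline of statistical features followed by a K-nearest neighbors vote can be sketched as below. This is a minimal illustration with plain time-domain features on synthetic signals; the paper itself builds 620 features from db44 wavelet packet subbands, which is not reproduced here:

```python
import math, random, statistics

def features(signal):
    """A few statistical fault signatures of a vibration signal (standard
    deviation, RMS, kurtosis, crest factor)."""
    n = len(signal)
    mu = statistics.fmean(signal)
    sd = statistics.pstdev(signal)
    rms = math.sqrt(sum(x * x for x in signal) / n)
    kurtosis = sum(((x - mu) / sd) ** 4 for x in signal) / n
    crest = max(abs(x) for x in signal) / rms
    return (sd, rms, kurtosis, crest)

def knn_predict(train_x, train_y, x, k=5):
    """Plain K-nearest-neighbors majority vote with Euclidean distance."""
    dists = sorted((sum((a - b) ** 2 for a, b in zip(t, x)), y)
                   for t, y in zip(train_x, train_y))
    votes = [y for _, y in dists[:k]]
    return max(set(votes), key=votes.count)
```

    Cracked gears produce periodic impulses that raise kurtosis and crest factor, which is why such features separate crack levels well before any classifier is applied.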

  17. A practical approach for the scale-up of roller compaction process.

    PubMed

    Shi, Weixian; Sprockel, Omar L

    2016-09-01

    An alternative approach for the scale-up of ribbon formation during roller compaction was investigated, which required only one batch at the commercial scale to set the operational conditions. The scale-up of ribbon formation was based on a probability method, which was sufficient to describe the mechanism of ribbon formation at both scales. In this method, a statistical relationship between roller compaction parameters and ribbon attributes (thickness and density) was first defined with a DoE using a pilot Alexanderwerk WP120 roller compactor. While the milling speed was included in the design, it had no practical effect on granule properties within the study range despite its statistical significance. The statistical relationship was then adapted to a commercial Alexanderwerk WP200 roller compactor with one experimental run, which served as a calibration of the statistical model parameters. The proposed transfer method was then confirmed by conducting a mapping study on the Alexanderwerk WP200 using a factorial DoE, which showed a match between the predictions and the verification experiments. The study demonstrates the applicability of the roller compaction transfer method using the statistical model from the development scale calibrated with one experimental point at the commercial scale. Copyright © 2016 Elsevier B.V. All rights reserved.
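    One simple form such a one-run transfer can take is re-estimating only the model's intercept at the commercial scale while keeping the pilot-scale slopes. This is a hypothetical sketch (invented coefficients and units, not the paper's fitted model):

```python
def calibrate_intercept(pilot_model, commercial_run, measured_density):
    """One-point transfer: keep the pilot-scale slopes (roll force, gap) and
    shift the intercept so the model exactly reproduces the single
    commercial-scale calibration run."""
    b0, b_force, b_gap = pilot_model
    force, gap = commercial_run
    predicted = b0 + b_force * force + b_gap * gap
    return (b0 + (measured_density - predicted), b_force, b_gap)
```

    After calibration the model passes through the commercial run exactly, and the pilot-scale factor effects are assumed to carry over; the mapping DoE described above is what validates that assumption.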

  18. Non-equilibrium Statistical Mechanics and the Sea Ice Thickness Distribution

    NASA Astrophysics Data System (ADS)

    Wettlaufer, John; Toppaladoddi, Srikanth

    We use concepts from non-equilibrium statistical physics to transform the original evolution equation for the sea ice thickness distribution g(h) due to Thorndike et al. (1975) into a Fokker-Planck-like conservation law. The steady solution is g(h) = N(q) h^q e^(-h/H), where q and H are expressible in terms of moments over the transition probabilities between thickness categories. The solution exhibits the functional form used in observational fits and shows that for h << 1, g(h) is controlled by both thermodynamics and mechanics, whereas for h >> 1 only mechanics controls g(h). Finally, we derive the underlying Langevin equation governing the dynamics of the ice thickness h, from which we predict the observed g(h). This allows us to demonstrate that the ice thickness field is ergodic. The genericity of our approach provides a framework for studying the geophysical-scale structure of the ice pack using methods of broad relevance in statistical mechanics. Supported by Swedish Research Council Grant No. 638-2013-9243, NASA Grant NNH13ZDA001N-CRYO, and the National Science Foundation and the Office of Naval Research under OCE-1332750.
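    The steady solution is a gamma distribution: with N(q) = 1/(Γ(q+1) H^(q+1)) it integrates to one and has mean (q+1)H. A quick numerical check (illustrative values of q and H, chosen here for the example):

```python
import math

def g(h, q, H):
    """Steady ice-thickness distribution g(h) = N(q) h^q exp(-h/H),
    with N(q) = 1 / (Gamma(q+1) H^(q+1))."""
    return h ** q * math.exp(-h / H) / (math.gamma(q + 1.0) * H ** (q + 1.0))

def trapz(f, a, b, n):
    """Trapezoidal rule on [a, b] with n panels."""
    dx = (b - a) / n
    return dx * (sum(f(a + i * dx) for i in range(1, n)) + 0.5 * (f(a) + f(b)))
```

    The two limits quoted above are visible in the formula: for h << 1 the power-law factor h^q dominates, while for h >> 1 the exponential tail e^(-h/H) takes over.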

  19. Statistical State Dynamics Based Study of the Role of Nonlinearity in the Maintenance of Turbulence in Couette Flow

    NASA Astrophysics Data System (ADS)

    Farrell, Brian; Ioannou, Petros; Nikolaidis, Marios-Andreas

    2017-11-01

    While linear non-normality underlies the mechanism of energy transfer from the externally driven flow to the perturbation field, nonlinearity is also known to play an essential role in sustaining turbulence. We report a study based on the statistical state dynamics of Couette flow turbulence with the goal of better understanding the role of nonlinearity in sustaining turbulence. The statistical state dynamics implementations used are ensemble closures at second order in a cumulant expansion of the Navier-Stokes equations in which the averaging operator is the streamwise mean. Two fundamentally non-normal mechanisms potentially contributing to maintaining the second cumulant are identified: essentially parametric perturbation growth arising from interaction of the perturbations with the fluctuating mean flow, and transient growth of perturbations arising from nonlinear interaction between components of the perturbation field. By selectively including these mechanisms, parametric growth is found to maintain the perturbation field in the turbulent state, while the more commonly invoked mechanism, transient growth of perturbations arising from scattering by nonlinear interaction, is found to suppress perturbation variance. Funded by ERC Coturb Madrid Summer Program and NSF AGS-1246929.

  20. Statistical Mechanics of the Delayed Reward-Based Learning with Node Perturbation

    NASA Astrophysics Data System (ADS)

    Saito, Hiroshi; Katahira, Kentaro; Okanoya, Kazuo; Okada, Masato

    2010-06-01

    In reward-based learning, reward is typically given with some delay after the behavior that causes it. In the machine learning literature, the framework of the eligibility trace has been used as one of the solutions for handling delayed reward in reinforcement learning. In recent studies, the eligibility trace has been suggested to be important for the difficult neuroscience problem known as the "distal reward problem". Node perturbation is one of the stochastic gradient methods among the many kinds of reinforcement learning implementations; it estimates an approximate gradient by introducing perturbations into a network. Since the stochastic gradient method does not require the derivative of an objective function, it is expected to be able to account for the learning mechanism of a complex system, like a brain. We study node perturbation with the eligibility trace as a specific example of delayed reward-based learning, and analyze it using a statistical mechanics approach. As a result, we show the optimal time constant of the eligibility trace with respect to the reward delay and the existence of unlearnable parameter configurations.
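    The node-perturbation-with-eligibility-trace scheme can be illustrated on a scalar toy problem (my own sketch with arbitrary parameters, far simpler than the networks analyzed in the paper): a perturbed output receives its reward several steps late, and a decaying trace of past perturbations carries the credit back to the weight update.

```python
import random

def node_perturbation(w_star=2.0, steps=20000, delay=5, kappa=0.7,
                      eta=0.02, sigma=0.5, seed=0):
    """Scalar node perturbation with delayed reward: reward for the perturbed
    output arrives `delay` steps late; an eligibility trace of perturbations
    (decay kappa) correlates the late reward with the right perturbation."""
    rng = random.Random(seed)
    w, trace, baseline = 0.0, 0.0, 0.0
    pending, history = [], []            # rewards still in flight; weight history
    for _ in range(steps):
        xi = rng.gauss(0.0, sigma)       # perturbation of the output
        pending.append(-((w + xi) - w_star) ** 2)  # reward = -squared error
        trace = kappa * trace + xi       # eligibility trace
        if len(pending) > delay:
            r = pending.pop(0)           # reward from `delay` steps ago
            baseline = 0.99 * baseline + 0.01 * r   # running reward baseline
            w += eta * (r - baseline) * trace
        history.append(w)
    return sum(history[-5000:]) / 5000.0  # time-averaged final weight
```

    The trace weights the delayed reward by kappa^delay, so kappa must be matched to the delay — too short a trace forgets the responsible perturbation, too long a trace blurs credit across many perturbations, which is the trade-off behind the optimal time constant found in the paper.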

  1. Accelerated battery-life testing - A concept

    NASA Technical Reports Server (NTRS)

    Mccallum, J.; Thomas, R. E.

    1971-01-01

    Test program, employing empirical, statistical and physical methods, determines service life and failure probabilities of electrochemical cells and batteries, and is applicable to testing mechanical, electrical, and chemical devices. Data obtained aids long-term performance prediction of battery or cell.

  2. Statistical mechanics of budget-constrained auctions

    NASA Astrophysics Data System (ADS)

    Altarelli, F.; Braunstein, A.; Realpe-Gomez, J.; Zecchina, R.

    2009-07-01

    Finding the optimal assignment in budget-constrained auctions is a combinatorial optimization problem with many important applications, a notable example being in the sale of advertisement space by search engines (in this context the problem is often referred to as the off-line AdWords problem). On the basis of the cavity method of statistical mechanics, we introduce a message-passing algorithm that is capable of solving efficiently random instances of the problem extracted from a natural distribution, and we derive from its properties the phase diagram of the problem. As the control parameter (average value of the budgets) is varied, we find two phase transitions delimiting a region in which long-range correlations arise.

  3. A novel conductivity mechanism of highly disordered carbon systems based on an investigation of graph zeta function

    NASA Astrophysics Data System (ADS)

    Matsutani, Shigeki; Sato, Iwao

    2017-09-01

    In a previous report (Matsutani and Suzuki, 2000 [21]), by proposing a mechanism in which electric conductivity arises from activational hopping conduction with the Wigner surmise of level statistics, the temperature dependence of the electronic conductivity of a highly disordered carbon system was evaluated, including an apparent metal-insulator transition. Since the system consists of small pieces of graphite, it was assumed that the level statistics appear because of quantum-chaotic behavior in each granular graphite. In this article, we revise that assumption and show another origin of the Wigner surmise, one more natural for the carbon system, based on a recent investigation of the graph zeta function in graph theory. Our method can be applied to the statistical treatment of the electronic properties of randomized molecular systems in general.

  4. Inferring causal relationships between phenotypes using summary statistics from genome-wide association studies.

    PubMed

    Meng, Xiang-He; Shen, Hui; Chen, Xiang-Ding; Xiao, Hong-Mei; Deng, Hong-Wen

    2018-03-01

    Genome-wide association studies (GWAS) have successfully identified numerous genetic variants associated with diverse complex phenotypes and diseases, and have provided tremendous opportunities for further analyses using summary association statistics. Recently, Pickrell et al. developed a robust method for causal inference using independent putative causal SNPs. However, this method may fail to infer the causal relationship between two phenotypes when only a limited number of independent putative causal SNPs are identified. Here, we extended Pickrell's method to make it applicable to more general situations, replacing the putative causal SNPs with the lead SNPs (the set of the most significant SNPs in each independent locus), and tested the performance of our extended method using both simulated and empirical data. Simulations suggested that when the same number of genetic variants is used, our extended method has a distribution of the test statistic under the null model similar to that of the original method by Pickrell et al., as well as comparable power under the causal model. In practice, however, our extended method will generally be more powerful, because the number of independent lead SNPs is often larger than the number of independent putative causal SNPs; including more SNPs, on the other hand, does not cause more false positives. By applying our extended method to summary statistics from GWAS for blood metabolites and femoral neck bone mineral density (FN-BMD), we identified ten blood metabolites that may causally influence FN-BMD. In summary, we extended a causal inference method for inferring putative causal relationships between two phenotypes using summary statistics from GWAS, and identified a number of potential causal metabolites for FN-BMD, which may provide novel insights into the pathophysiological mechanisms underlying osteoporosis.

  5. Distinguishing synchronous and time-varying synergies using point process interval statistics: motor primitives in frog and rat

    PubMed Central

    Hart, Corey B.; Giszter, Simon F.

    2013-01-01

    We present and apply a method that uses point process statistics to discriminate the forms of synergies in motor pattern data, prior to explicit synergy extraction. The method uses electromyogram (EMG) pulse peak timing or onset timing. Peak timing is preferable in complex patterns where pulse onsets may be overlapping. An interval statistic derived from the point processes of EMG peak timings distinguishes time-varying synergies from synchronous synergies (SS). Model data show that the statistic is robust under most conditions. Its application to both frog hindlimb EMG and rat locomotion hindlimb EMG shows that data from these preparations are clearly most consistent with synchronous synergy models (p < 0.001). Additional direct tests of pulse and interval relations in frog data further bolster the support for synchronous synergy mechanisms in these data. Our method and analyses support separated control of rhythm and pattern of motor primitives, with the low level execution primitives comprising pulsed SS in both frog and rat, and both episodic and rhythmic behaviors. PMID:23675341
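    A generic peak-timing statistic of the kind described can be computed in a few lines. This is a hypothetical nearest-peak-lag sketch, not the paper's exact interval statistic:

```python
def nearest_peak_lags(peaks_a, peaks_b):
    """For each EMG pulse peak time in muscle A, the signed lag to the nearest
    peak in muscle B. Synchronous synergies cluster lags near zero; time-varying
    synergies show broad or systematically offset lag distributions."""
    return [min(peaks_b, key=lambda t: abs(t - p)) - p for p in peaks_a]
```

    Comparing the lag histogram against the near-zero cluster expected under synchrony is one way such interval statistics discriminate synergy forms before any synergy-extraction step.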

  6. Statistical mechanics of the vertex-cover problem

    NASA Astrophysics Data System (ADS)

    Hartmann, Alexander K.; Weigt, Martin

    2003-10-01

    We review recent progress in the study of the vertex-cover problem (VC). The VC belongs to the class of NP-complete graph theoretical problems, which plays a central role in theoretical computer science. On ensembles of random graphs, VC exhibits a coverable-uncoverable phase transition. Very close to this transition, depending on the solution algorithm, easy-hard transitions in the typical running time of the algorithms occur. We explain a statistical mechanics approach, which works by mapping the VC to a hard-core lattice gas, and then applying techniques such as the replica trick or the cavity approach. Using these methods, the phase diagram of the VC could be obtained exactly for connectivities c < e, where the VC is replica symmetric. Recently, this result could be confirmed using traditional mathematical techniques. For c > e, the solution of the VC exhibits full replica symmetry breaking. The statistical mechanics approach can also be used to study analytically the typical running time of simple complete and incomplete algorithms for the VC. Finally, we describe recent results for the VC when studied on other ensembles of finite- and infinite-dimensional graphs.

  7. Brownian motion or Lévy walk? Stepping towards an extended statistical mechanics for animal locomotion.

    PubMed

    Gautestad, Arild O

    2012-09-07

    Animals moving under the influence of spatio-temporal scaling and long-term memory generate a kind of space-use pattern that has proved difficult to model within a coherent theoretical framework. An extended kind of statistical mechanics is needed, accounting for both the effects of spatial memory and scale-free space use, and put into a context of ecological conditions. Simulations illustrating the distinction between scale-specific and scale-free locomotion are presented. The results show how observational scale (time lag between relocations of an individual) may critically influence the interpretation of the underlying process. In this respect, a novel protocol is proposed as a method to distinguish between some main movement classes. For example, the 'power law in disguise' paradox-from a composite Brownian motion consisting of a superposition of independent movement processes at different scales-may be resolved by shifting the focus from pattern analysis at one particular temporal resolution towards a more process-oriented approach involving several scales of observation. A more explicit consideration of system complexity within a statistical mechanical framework, supplementing the more traditional mechanistic modelling approach, is advocated.
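    The 'power law in disguise' effect from a composite Brownian motion can be demonstrated with a minimal 1-D sketch (my own illustration, with arbitrary mode parameters): at fine observation scales the mixed step distribution looks heavy-tailed, but coarsening the time lag between relocations washes it out toward Gaussian.

```python
import random, statistics

def composite_brownian(n_steps, seed=0):
    """1-D composite Brownian motion: at each step the mover draws from a slow
    or a fast diffusive mode, mimicking a superposition of movement processes."""
    rng = random.Random(seed)
    pos, path = 0.0, [0.0]
    for _ in range(n_steps):
        sigma = 0.1 if rng.random() < 0.9 else 3.0   # two movement modes
        pos += rng.gauss(0.0, sigma)
        path.append(pos)
    return path

def lag_displacements(path, lag):
    """Displacements observed when relocations are sampled every `lag` steps."""
    return [path[i + lag] - path[i] for i in range(0, len(path) - lag, lag)]

def excess_kurtosis(xs):
    """Excess kurtosis: 0 for Gaussian, large for heavy-tailed step mixtures."""
    mu = statistics.fmean(xs)
    sd = statistics.pstdev(xs)
    return sum(((x - mu) / sd) ** 4 for x in xs) / len(xs) - 3.0
```

    A true Lévy walk would keep its heavy tails across observational scales, which is why varying the sampling lag, as the proposed protocol suggests, helps separate the movement classes.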

  8. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iliopoulos, AS; Sun, X; Floros, D

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. 
Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial signal/noise variations. An efficient multi-scale computational mechanism is developed to curtail processing latency. Spatially adaptive filtering may impact subsequent processing tasks such as reconstruction and numerical gradient computations for deformable registration. NIH Grant No. R01-184173.
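    The idea of turning local statistics into spatial adaptivity can be sketched with a Lee-style filter (a classical adaptive filter used here for illustration; not the OSTI authors' multi-scale algorithm):

```python
import statistics

def local_stats(img, r):
    """Per-pixel mean and standard deviation over a (2r+1)^2 window, clamped at edges."""
    h, w = len(img), len(img[0])
    means = [[0.0] * w for _ in range(h)]
    sds = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = [img[a][b]
                    for a in range(max(0, i - r), min(h, i + r + 1))
                    for b in range(max(0, j - r), min(w, j + r + 1))]
            means[i][j] = statistics.fmean(vals)
            sds[i][j] = statistics.pstdev(vals)
    return means, sds

def adaptive_filter(img, r, noise_sd):
    """Lee-style adaptive filter: smooth fully where local variance looks like
    pure noise, keep the original value where local variance indicates structure."""
    means, sds = local_stats(img, r)
    out = [row[:] for row in img]
    for i in range(len(img)):
        for j in range(len(img[0])):
            var = sds[i][j] ** 2
            gain = max(0.0, 1.0 - (noise_sd ** 2) / var) if var > 0 else 0.0
            out[i][j] = means[i][j] + gain * (img[i][j] - means[i][j])
    return out
```

    Replacing the single global noise_sd with a per-region noise estimate from the multi-scale statistics pyramid is what makes the filter adapt to spatially variant noise.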

  9. JOURNAL SCOPE GUIDELINES: Paper classification scheme

    NASA Astrophysics Data System (ADS)

    2005-06-01

    This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org). 1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory. 2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models. 3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods. 4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics. 5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials. 6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.

  10. THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES.

    PubMed

    Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil

    2016-10-01

    In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors to bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on Micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50 μm). Ground truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and the biomechanical test results was studied using regression analysis. The results showed that Cluster-Shade was strongly correlated with the microarchitecture of the trabecular bone and related to its mechanical properties. Once the principal thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors.
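    The second-order statistics described above can be sketched in a few lines of NumPy. This is a minimal illustration only: the pixel offset, the number of gray levels, and the function names are assumptions for the example, not details from the paper.

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Normalized Gray Level Co-occurrence Matrix for one pixel offset."""
    P = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()

def cluster_shade(P):
    """Third-order GLCM moment; measures the skewness of the texture."""
    i, j = np.indices(P.shape)
    mu_i = (i * P).sum()
    mu_j = (j * P).sum()
    return (((i + j - mu_i - mu_j) ** 3) * P).sum()

# toy 8-level "texture" standing in for a quantized projection image
rng = np.random.default_rng(0)
img = rng.integers(0, 8, size=(64, 64))
P = glcm(img, dx=1, dy=0)
print(cluster_shade(P))
```

In a study like this one, such a scalar would be computed per image and regressed against the mechanical ground truth (strength, modulus, BV/TV).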

  11. THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES

    PubMed Central

    Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil

    2016-01-01

    In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors to bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on Micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50 μm). Ground truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and the biomechanical test results was studied using regression analysis. The results showed that Cluster-Shade was strongly correlated with the microarchitecture of the trabecular bone and related to its mechanical properties. Once the principal thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors. PMID:28042512

  12. Statistical Analysis of Crystallization Database Links Protein Physico-Chemical Features with Crystallization Mechanisms

    PubMed Central

    Fusco, Diana; Barnum, Timothy J.; Bruno, Andrew E.; Luft, Joseph R.; Snell, Edward H.; Mukherjee, Sayan; Charbonneau, Patrick

    2014-01-01

    X-ray crystallography is the predominant method for obtaining atomic-scale information about biological macromolecules. Despite the success of the technique, obtaining well-diffracting crystals remains the critical bottleneck in going from protein to structure. In practice, the crystallization process proceeds through knowledge-informed empiricism. A better physico-chemical understanding remains elusive because of the large number of variables involved, and hence little guidance is available to systematically identify solution conditions that promote crystallization. To help determine relationships between macromolecular properties and their crystallization propensity, we have trained statistical models on samples for 182 proteins supplied by the Northeast Structural Genomics consortium. Gaussian processes, which capture trends beyond the reach of linear statistical models, distinguish between two main physico-chemical mechanisms driving crystallization. One is characterized by low levels of side chain entropy and has been extensively reported in the literature. The other identifies specific electrostatic interactions not previously described in the crystallization context. Because evidence for two distinct mechanisms can be gleaned both from crystal contacts and from solution conditions leading to successful crystallization, the model offers future avenues for optimizing crystallization screens based on partial structural information. The availability of crystallization data coupled with structural outcomes analyzed through state-of-the-art statistical models may thus guide macromolecular crystallization toward a more rational basis. PMID:24988076

  13. Statistical analysis of crystallization database links protein physico-chemical features with crystallization mechanisms.

    PubMed

    Fusco, Diana; Barnum, Timothy J; Bruno, Andrew E; Luft, Joseph R; Snell, Edward H; Mukherjee, Sayan; Charbonneau, Patrick

    2014-01-01

    X-ray crystallography is the predominant method for obtaining atomic-scale information about biological macromolecules. Despite the success of the technique, obtaining well-diffracting crystals remains the critical bottleneck in going from protein to structure. In practice, the crystallization process proceeds through knowledge-informed empiricism. A better physico-chemical understanding remains elusive because of the large number of variables involved, and hence little guidance is available to systematically identify solution conditions that promote crystallization. To help determine relationships between macromolecular properties and their crystallization propensity, we have trained statistical models on samples for 182 proteins supplied by the Northeast Structural Genomics consortium. Gaussian processes, which capture trends beyond the reach of linear statistical models, distinguish between two main physico-chemical mechanisms driving crystallization. One is characterized by low levels of side chain entropy and has been extensively reported in the literature. The other identifies specific electrostatic interactions not previously described in the crystallization context. Because evidence for two distinct mechanisms can be gleaned both from crystal contacts and from solution conditions leading to successful crystallization, the model offers future avenues for optimizing crystallization screens based on partial structural information. The availability of crystallization data coupled with structural outcomes analyzed through state-of-the-art statistical models may thus guide macromolecular crystallization toward a more rational basis.

  14. Text mining by Tsallis entropy

    NASA Astrophysics Data System (ADS)

    Jamaati, Maryam; Mehri, Ali

    2018-01-01

    Long-range correlations between the elements of natural languages enable them to convey very complex information. The complex structure of human language, as a manifestation of natural languages, motivates us to apply nonextensive statistical mechanics to text mining. Tsallis entropy appropriately ranks terms' relevance to the document subject by taking advantage of their spatial correlation length. We apply this statistical concept as a new, powerful word-ranking metric for extracting the keywords of a single document. We carry out an experimental evaluation, which shows the capability of the presented method in keyword extraction. We find that Tsallis entropy has reliable word-ranking performance, on a par with the best previous ranking methods.
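    As a rough sketch of the idea, the Tsallis entropy and a toy spatial scoring of words might look like the following. The four-segment split, the choice q = 2, and the scoring scheme are illustrative assumptions, not the paper's actual procedure; the point is only that spatially clustered words get a lower (more keyword-like) entropy than uniformly spread ones.

```python
def tsallis_entropy(probs, q=2.0):
    """Nonextensive (Tsallis) entropy; reduces to Shannon entropy as q -> 1."""
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

def rank_words(text, q=2.0):
    """Score each word by the Tsallis entropy of its distribution over
    equal text segments; clustered (low-entropy) words rank first."""
    words = text.lower().split()
    n_seg = 4
    seg_len = max(1, len(words) // n_seg)
    segments = [words[i * seg_len:(i + 1) * seg_len] for i in range(n_seg - 1)]
    segments.append(words[(n_seg - 1) * seg_len:])  # last segment takes the tail
    scores = {}
    for w in set(words):
        counts = [seg.count(w) for seg in segments]
        total = sum(counts)
        probs = [c / total for c in counts if c]
        scores[w] = tsallis_entropy(probs, q)
    return sorted(scores, key=scores.get)

text = "alpha beta alpha beta alpha beta alpha beta keyword keyword keyword keyword"
print(rank_words(text))  # the clustered word ranks first
```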

  15. Radiation detection method and system using the sequential probability ratio test

    DOEpatents

    Nelson, Karl E [Livermore, CA]; Valentine, John D [Redwood City, CA]; Beauchamp, Brock R [San Ramon, CA]

    2007-07-17

    A method and system using the Sequential Probability Ratio Test to enhance the detection of an elevated level of radiation, by determining whether a set of observations is consistent with a specified model within given bounds of statistical significance. In particular, the SPRT is used in the present invention to maximize the range of detection by providing processing mechanisms for estimating the dynamic background radiation, adjusting the models to reflect the amount of background knowledge at the current point in time, analyzing the current sample using the models to determine statistical significance, and determining when the sample has returned to the expected background conditions.
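    The underlying decision rule can be sketched as Wald's SPRT applied to Poisson counting statistics. This is a generic textbook form, not the patented system itself; the rates and error levels below are illustrative assumptions.

```python
import math

def sprt_poisson(counts, bg_rate, src_rate, alpha=0.01, beta=0.01):
    """Wald's SPRT for Poisson count data: decide between background-only
    (mean bg_rate) and background-plus-source (mean src_rate) hypotheses,
    with false-alarm rate alpha and miss rate beta."""
    A = math.log((1 - beta) / alpha)   # upper (signal) threshold
    B = math.log(beta / (1 - alpha))   # lower (background) threshold
    llr = 0.0
    for n, c in enumerate(counts, 1):
        # log-likelihood ratio increment for one Poisson observation
        llr += c * math.log(src_rate / bg_rate) - (src_rate - bg_rate)
        if llr >= A:
            return "signal", n
        if llr <= B:
            return "background", n
    return "undecided", len(counts)

print(sprt_poisson([12, 15, 14, 16], bg_rate=5.0, src_rate=12.0))
print(sprt_poisson([4, 5, 4, 5], bg_rate=5.0, src_rate=12.0))
```

The appeal for detection range is visible in the structure: the test terminates as soon as the accumulated evidence crosses either threshold, so strong sources are flagged after very few samples while the background hypothesis is retained cheaply.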

  16. Statistical mechanics of broadcast channels using low-density parity-check codes.

    PubMed

    Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David

    2003-03-01

    We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.

  17. Mechanical characterization of TiO{sub 2} nanofibers produced by different electrospinning techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vahtrus, Mikk; Šutka, Andris; Institute of Silicate Materials, Riga Technical University, P. Valdena 3/7, Riga LV-1048

    2015-02-15

    In this work TiO{sub 2} nanofibers produced by needle and needleless electrospinning processes from the same precursor were characterized and compared using Raman spectroscopy, transmission electron microscopy (TEM), scanning electron microscopy (SEM) and in situ SEM nanomechanical testing. Phase composition, morphology, Young's modulus and bending strength values were determined. Weibull statistics were used to evaluate and compare the uniformity of the mechanical properties of nanofibers produced by the two different methods. It is shown that both methods yield nanofibers with very similar properties. - Highlights: • TiO{sub 2} nanofibers were produced by needle and needleless electrospinning processes. • Structure was studied by Raman spectroscopy and electron microscopy methods. • Mechanical properties were measured using an advanced in situ SEM cantilevered beam bending technique. • Both methods yield nanofibers with very similar properties.
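    Weibull analysis of strength data of this kind can be sketched with a standard median-rank fit to the linearized Weibull CDF. This is a generic illustration, assuming nothing about the paper's actual measurements; the sample strengths are invented.

```python
import math

def weibull_fit(strengths):
    """Estimate the Weibull modulus m and scale s0 from strength data via
    median-rank probability estimates and least squares on the linearized
    CDF: ln(-ln(1 - F)) = m*ln(s) - m*ln(s0)."""
    s = sorted(strengths)
    n = len(s)
    x, y = [], []
    for i, si in enumerate(s, 1):
        F = (i - 0.3) / (n + 0.4)          # median-rank estimator
        x.append(math.log(si))
        y.append(math.log(-math.log(1.0 - F)))
    xm = sum(x) / n
    ym = sum(y) / n
    m = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y)) / \
        sum((xi - xm) ** 2 for xi in x)
    s0 = math.exp(xm - ym / m)             # intercept = -m*ln(s0)
    return m, s0

strengths = [312, 341, 355, 368, 379, 390, 402, 415, 431, 460]  # MPa, illustrative
m, s0 = weibull_fit(strengths)
print(f"Weibull modulus m = {m:.1f}, characteristic strength s0 = {s0:.0f} MPa")
```

A higher Weibull modulus m means a narrower strength distribution, i.e., more uniform mechanical properties, which is the comparison criterion used between the two electrospinning routes.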

  18. Dynamically biased statistical model for the ortho/para conversion in the H2 + H3+ → H3+ + H2 reaction.

    PubMed

    Gómez-Carrasco, Susana; González-Sánchez, Lola; Aguado, Alfredo; Sanz-Sanz, Cristina; Zanchet, Alexandre; Roncero, Octavio

    2012-09-07

    In this work we present a dynamically biased statistical model to describe the evolution of the title reaction from a statistical to a more direct mechanism, using quasi-classical trajectories (QCT). The method is based on the one previously proposed by Park and Light [J. Chem. Phys. 126, 044305 (2007)]. A recent global potential energy surface is used here to calculate the capture probabilities, instead of the long-range ion-induced dipole interactions. The dynamical constraints are introduced by considering a scrambling matrix which depends on energy and determines the probability of the identity/hop/exchange mechanisms. These probabilities are calculated using QCT. It is found that the high zero-point energy of the fragments is transferred to the rest of the degrees of freedom, which shortens the lifetime of H5(+) complexes and, as a consequence, the exchange mechanism occurs in a lower proportion. The zero-point energy (ZPE) is not properly described in quasi-classical trajectory calculations, so an approximation is made in which the initial ZPE of the reactants is reduced in the QCT calculations to obtain a new ZPE-biased scrambling matrix. This reduction of the ZPE is explained by the need to correct the purely classical level number of the H5(+) complex, as done in classical simulations of unimolecular processes and to obtain equivalent quantum and classical rate constants using Rice-Ramsperger-Kassel-Marcus theory. This matrix makes it possible to obtain a ratio of hop/exchange mechanisms, α(T), in rather good agreement with recent experimental results by Crabtree et al. [J. Chem. Phys. 134, 194311 (2011)] at room temperature. At lower temperatures, however, the present simulations predict ratios that are too high because the biased scrambling matrix is not statistical enough. This demonstrates the importance of applying quantum methods to simulate this reaction at the low temperatures of astrophysical interest.

  19. Dynamically biased statistical model for the ortho/para conversion in the H2 + H3+ → H3+ + H2 reaction

    NASA Astrophysics Data System (ADS)

    Gómez-Carrasco, Susana; González-Sánchez, Lola; Aguado, Alfredo; Sanz-Sanz, Cristina; Zanchet, Alexandre; Roncero, Octavio

    2012-09-01

    In this work we present a dynamically biased statistical model to describe the evolution of the title reaction from a statistical to a more direct mechanism, using quasi-classical trajectories (QCT). The method is based on the one previously proposed by Park and Light [J. Chem. Phys. 126, 044305 (2007), 10.1063/1.2430711]. A recent global potential energy surface is used here to calculate the capture probabilities, instead of the long-range ion-induced dipole interactions. The dynamical constraints are introduced by considering a scrambling matrix which depends on energy and determines the probability of the identity/hop/exchange mechanisms. These probabilities are calculated using QCT. It is found that the high zero-point energy of the fragments is transferred to the rest of the degrees of freedom, which shortens the lifetime of H5+ complexes and, as a consequence, the exchange mechanism occurs in a lower proportion. The zero-point energy (ZPE) is not properly described in quasi-classical trajectory calculations, so an approximation is made in which the initial ZPE of the reactants is reduced in the QCT calculations to obtain a new ZPE-biased scrambling matrix. This reduction of the ZPE is explained by the need to correct the purely classical level number of the H5+ complex, as done in classical simulations of unimolecular processes and to obtain equivalent quantum and classical rate constants using Rice-Ramsperger-Kassel-Marcus theory. This matrix makes it possible to obtain a ratio of hop/exchange mechanisms, α(T), in rather good agreement with recent experimental results by Crabtree et al. [J. Chem. Phys. 134, 194311 (2011), 10.1063/1.3587246] at room temperature. At lower temperatures, however, the present simulations predict ratios that are too high because the biased scrambling matrix is not statistical enough. This demonstrates the importance of applying quantum methods to simulate this reaction at the low temperatures of astrophysical interest.

  20. Test particle propagation in magnetostatic turbulence. 2: The local approximation method

    NASA Technical Reports Server (NTRS)

    Klimas, A. J.; Sandri, G.; Scudder, J. D.; Howell, D. R.

    1976-01-01

    An approximation method for statistical mechanics is presented and applied to a class of problems which contains a test particle propagation problem. All of the available basic equations used in statistical mechanics are cast in the form of a single equation which is integrodifferential in time and which is then used as the starting point for the construction of the local approximation method. Simplification of the integrodifferential equation is achieved through approximation to the Laplace transform of its kernel. The approximation is valid near the origin in the Laplace space and is based on the assumption of a small Laplace variable. No other small parameter is necessary for the construction of this approximation method. The nth level of approximation is constructed formally, and the first five levels of approximation are calculated explicitly. It is shown that each level of approximation is governed by an inhomogeneous partial differential equation in time with time-independent operator coefficients. The order in time of these partial differential equations is found to increase as n does. At n = 0 the most local first-order partial differential equation, which governs the Markovian limit, is regained.

  1. Challenges in developing methods for quantifying the effects of weather and climate on water-associated diseases: A systematic review

    PubMed Central

    Lo Iacono, Giovanni; Armstrong, Ben; Fleming, Lora E.; Elson, Richard; Kovats, Sari; Vardoulakis, Sotiris; Nichols, Gordon L.

    2017-01-01

    Infectious diseases attributable to unsafe water supply, sanitation and hygiene (e.g. Cholera, Leptospirosis, Giardiasis) remain an important cause of morbidity and mortality, especially in low-income countries. Climate and weather factors are known to affect the transmission and distribution of infectious diseases, and statistical and mathematical modelling approaches to investigate the impact of weather and climate on water-associated diseases are continuously being developed. There has been little critical analysis of these methodological approaches. Our objective is to review and summarize the statistical and modelling methods used to investigate the effects of weather and climate on infectious diseases associated with water, in order to identify limitations and knowledge gaps in the development of new methods. We conducted a systematic review of English-language papers published from 2000 to 2015. Search terms included concepts related to water-associated diseases, weather and climate, and statistical, epidemiological and modelling methods. We found 102 full-text papers that met our criteria and were included in the analysis. The most commonly used methods fell into two clusters: process-based models (PBM) and time series and spatial epidemiology (TS-SE). In general, PBM methods were employed when the bio-physical mechanism of the pathogen under study was relatively well known (e.g. Vibrio cholerae); TS-SE tended to be used when the specific environmental mechanisms were unclear (e.g. Campylobacter). Important data and methodological challenges emerged, with implications for surveillance and control of water-associated infections. The most common limitations comprised: non-inclusion of key factors (e.g. biological mechanism, demographic heterogeneity, human behavior), reporting bias, poor data quality, and collinearity in exposures. Furthermore, the methods often did not distinguish among the multiple sources of time-lags (e.g. patient physiology, reporting bias, healthcare access) between environmental drivers/exposures and disease detection. Key areas of future research include: disentangling the complex effects of weather/climate on each exposure-health outcome pathway (e.g. person-to-person vs environment-to-person), and linking weather data to individual cases longitudinally. PMID:28604791

  2. Challenges in developing methods for quantifying the effects of weather and climate on water-associated diseases: A systematic review.

    PubMed

    Lo Iacono, Giovanni; Armstrong, Ben; Fleming, Lora E; Elson, Richard; Kovats, Sari; Vardoulakis, Sotiris; Nichols, Gordon L

    2017-06-01

    Infectious diseases attributable to unsafe water supply, sanitation and hygiene (e.g. Cholera, Leptospirosis, Giardiasis) remain an important cause of morbidity and mortality, especially in low-income countries. Climate and weather factors are known to affect the transmission and distribution of infectious diseases, and statistical and mathematical modelling approaches to investigate the impact of weather and climate on water-associated diseases are continuously being developed. There has been little critical analysis of these methodological approaches. Our objective is to review and summarize the statistical and modelling methods used to investigate the effects of weather and climate on infectious diseases associated with water, in order to identify limitations and knowledge gaps in the development of new methods. We conducted a systematic review of English-language papers published from 2000 to 2015. Search terms included concepts related to water-associated diseases, weather and climate, and statistical, epidemiological and modelling methods. We found 102 full-text papers that met our criteria and were included in the analysis. The most commonly used methods fell into two clusters: process-based models (PBM) and time series and spatial epidemiology (TS-SE). In general, PBM methods were employed when the bio-physical mechanism of the pathogen under study was relatively well known (e.g. Vibrio cholerae); TS-SE tended to be used when the specific environmental mechanisms were unclear (e.g. Campylobacter). Important data and methodological challenges emerged, with implications for surveillance and control of water-associated infections. The most common limitations comprised: non-inclusion of key factors (e.g. biological mechanism, demographic heterogeneity, human behavior), reporting bias, poor data quality, and collinearity in exposures. Furthermore, the methods often did not distinguish among the multiple sources of time-lags (e.g. patient physiology, reporting bias, healthcare access) between environmental drivers/exposures and disease detection. Key areas of future research include: disentangling the complex effects of weather/climate on each exposure-health outcome pathway (e.g. person-to-person vs environment-to-person), and linking weather data to individual cases longitudinally.

  3. The Statistical Basis of Chemical Equilibria.

    ERIC Educational Resources Information Center

    Hauptmann, Siegfried; Menger, Eva

    1978-01-01

    Describes a machine which demonstrates the statistical bases of chemical equilibrium, and in doing so conveys insight into the connections among statistical mechanics, quantum mechanics, Maxwell Boltzmann statistics, statistical thermodynamics, and transition state theory. (GA)

  4. Improved silicon carbide for advanced heat engines. I - Process development for injection molding

    NASA Technical Reports Server (NTRS)

    Whalen, Thomas J.; Trela, Walter

    1989-01-01

    Alternate processing methods have been investigated as a means of improving the mechanical properties of injection-molded SiC. Various mixing processes (dry, high-shear, and fluid) were evaluated along with the morphology and particle size of the starting beta-SiC powder. Statistically designed experiments were used to determine significant effects and interactions of variables in the mixing, injection molding, and binder removal process steps. Improvements in mechanical strength can be correlated with the reduction in flaw size observed in the injection-molded green bodies obtained with improved processing methods.

  5. Sharpening method of satellite thermal image based on the geographical statistical model

    NASA Astrophysics Data System (ADS)

    Qi, Pengcheng; Hu, Shixiong; Zhang, Haijun; Guo, Guangmeng

    2016-04-01

    To improve the effectiveness of thermal sharpening in mountainous regions, while paying closer attention to the laws of land surface energy balance, a thermal sharpening method based on a geographical statistical model (GSM) is proposed. Explanatory variables were selected from the processes of the land surface energy budget and thermal infrared electromagnetic radiation transmission, and high-spatial-resolution (57 m) raster layers were generated for these variables through spatial simulation or by using other raster data as proxies. On this basis, the locally adaptive statistical relationship between brightness temperature (BT) and the explanatory variables, i.e., the GSM, was built at 1026-m resolution using the method of multivariate adaptive regression splines. Finally, the GSM was applied to the high-resolution (57-m) explanatory variables; thus, the high-resolution (57-m) BT image was obtained. This method produced a sharpening result with low error and good visual quality. The method avoids a blind choice of explanatory variables and removes the dependence on synchronous imagery in the visible and near-infrared bands. The influences of the explanatory variable combination, the sampling method, and the residual error correction on the sharpening results were analyzed in detail, and their mechanisms of influence are reported herein.

  6. Theory of the Sea Ice Thickness Distribution

    NASA Astrophysics Data System (ADS)

    Toppaladoddi, Srikanth; Wettlaufer, J. S.

    2015-10-01

    We use concepts from statistical physics to transform the original evolution equation for the sea ice thickness distribution g(h) from Thorndike et al. into a Fokker-Planck-like conservation law. The steady solution is g(h) = N(q) h^q e^(-h/H), where q and H are expressible in terms of moments over the transition probabilities between thickness categories. The solution exhibits the functional form used in observational fits and shows that for h ≪ 1, g(h) is controlled by both thermodynamics and mechanics, whereas for h ≫ 1 only mechanics controls g(h). Finally, we derive the underlying Langevin equation governing the dynamics of the ice thickness h, from which we predict the observed g(h). The genericity of our approach provides a framework for studying the geophysical-scale structure of the ice pack using methods of broad relevance in statistical mechanics.

  7. Theory of the Sea Ice Thickness Distribution.

    PubMed

    Toppaladoddi, Srikanth; Wettlaufer, J S

    2015-10-02

    We use concepts from statistical physics to transform the original evolution equation for the sea ice thickness distribution g(h) from Thorndike et al. into a Fokker-Planck-like conservation law. The steady solution is g(h) = N(q)h^q e^(-h/H), where q and H are expressible in terms of moments over the transition probabilities between thickness categories. The solution exhibits the functional form used in observational fits and shows that for h ≪ 1, g(h) is controlled by both thermodynamics and mechanics, whereas for h ≫ 1 only mechanics controls g(h). Finally, we derive the underlying Langevin equation governing the dynamics of the ice thickness h, from which we predict the observed g(h). The genericity of our approach provides a framework for studying the geophysical-scale structure of the ice pack using methods of broad relevance in statistical mechanics.
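    The steady solution quoted in the two records above can be checked numerically. The sketch below uses illustrative values q = 1 and H = 0.5 (not fitted to any data) and the closed-form normalization N(q) = 1/(H^(q+1) Γ(q+1)), which follows from the gamma-function integral of h^q e^(-h/H).

```python
import math

def g_steady(h, q=1.0, H=0.5):
    """Steady sea ice thickness distribution g(h) = N(q) h^q exp(-h/H),
    normalized so that its integral over h in [0, inf) equals 1."""
    N = 1.0 / (H ** (q + 1) * math.gamma(q + 1))
    return N * h ** q * math.exp(-h / H)

# numerical check of the normalization by a simple Riemann sum
dh = 1e-3
total = sum(g_steady(i * dh) * dh for i in range(1, 20000))
print(round(total, 2))  # ≈ 1.0
```

The same closed form makes the two regimes in the abstract visible: for small h the power-law factor h^q dominates, while for large h the exponential tail e^(-h/H) takes over.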

  8. A first-order statistical smoothing approximation for the coherent wave field in random porous media

    NASA Astrophysics Data System (ADS)

    Müller, Tobias M.; Gurevich, Boris

    2005-04-01

    An important dissipation mechanism for waves in randomly inhomogeneous poroelastic media is the effect of wave-induced fluid flow. In the framework of Biot's theory of poroelasticity, this mechanism can be understood as scattering from fast into slow compressional waves. To describe this conversion scattering effect in poroelastic random media, the dynamic characteristics of the coherent wavefield are analyzed using the theory of statistical wave propagation. In particular, the method of statistical smoothing is applied to Biot's equations of poroelasticity. Within the accuracy of first-order statistical smoothing, an effective wave number of the coherent field is derived that accounts for the effect of wave-induced flow. This wave number is complex and involves an integral over the correlation function of the medium's fluctuations. It is shown that the known one-dimensional (1-D) result can be obtained as a special case of the present 3-D theory. The expression for the effective wave number allows one to derive a model for elastic attenuation and dispersion due to wave-induced fluid flow. These wavefield attributes are analyzed in a companion paper.

  9. Organizational downsizing and age discrimination litigation: the influence of personnel practices and statistical evidence on litigation outcomes.

    PubMed

    Wingate, Peter H; Thornton, George C; McIntyre, Kelly S; Frame, Jennifer H

    2003-02-01

    The present study examined relationships between reduction-in-force (RIF) personnel practices, presentation of statistical evidence, and litigation outcomes. Policy capturing methods were utilized to analyze the components of 115 federal district court opinions involving age discrimination disparate treatment allegations and organizational downsizing. Univariate analyses revealed meaningful links between RIF personnel practices, use of statistical evidence, and judicial verdict. The defendant organization was awarded summary judgment in 73% of the claims included in the study. Judicial decisions in favor of the defendant organization were found to be significantly related to such variables as formal performance appraisal systems, termination decision review within the organization, methods of employee assessment and selection for termination, and the presence of a concrete layoff policy. The use of statistical evidence in ADEA disparate treatment litigation was investigated and found to be a potentially persuasive type of indirect evidence. Legal, personnel, and evidentiary ramifications are reviewed, and a framework of downsizing mechanics emphasizing legal defensibility is presented.

  10. Origin of the spike-timing-dependent plasticity rule

    NASA Astrophysics Data System (ADS)

    Cho, Myoung Won; Choi, M. Y.

    2016-08-01

    A biological synapse changes its efficacy depending on the difference between pre- and post-synaptic spike timings. Formulating spike-timing-dependent interactions in terms of the path integral, we establish a neural-network model, which makes it possible to predict relevant quantities rigorously by means of standard methods in statistical mechanics and field theory. In particular, the biological synaptic plasticity rule is shown to emerge as the optimal form for minimizing the free energy. It is further revealed that maximization of the entropy of neural activities gives rise to the competitive behavior of biological learning. This demonstrates that statistical mechanics helps to understand rigorously key characteristic behaviors of a neural network, thus providing the possibility of physics serving as a useful and relevant framework for probing life.

  11. Statistical analysis of loopy belief propagation in random fields

    NASA Astrophysics Data System (ADS)

    Yasuda, Muneki; Kataoka, Shun; Tanaka, Kazuyuki

    2015-10-01

    Loopy belief propagation (LBP), which is equivalent to the Bethe approximation in statistical mechanics, is a message-passing-type inference method that is widely used to analyze systems based on Markov random fields (MRFs). In this paper, we propose a message-passing-type method to analytically evaluate the quenched average of LBP in random fields by using the replica cluster variation method. The proposed analytical method is applicable to general pairwise MRFs with random fields whose distributions differ from each other and can give the quenched averages of the Bethe free energies over random fields, which are consistent with numerical results. The order of its computational cost is equivalent to that of standard LBP. In the latter part of this paper, we describe the application of the proposed method to Bayesian image restoration, in which we observed that our theoretical results are in good agreement with the numerical results for natural images.
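
For orientation, entry 11's starting point, ordinary belief propagation on a pairwise MRF, can be sketched on a toy three-node chain (where BP is exact rather than "loopy"); the potentials below are invented for illustration, and the paper's replica cluster variation analysis is well beyond this sketch:

```python
import itertools
import numpy as np

# Unary potentials for three binary variables and a shared symmetric edge
# potential; all numeric values are illustrative.
unary = {0: np.array([1.0, 2.0]), 1: np.array([1.0, 1.0]), 2: np.array([2.0, 1.0])}
pair = np.array([[2.0, 1.0], [1.0, 2.0]])
edges = [(0, 1), (1, 2)]

# Directed messages m[(i, j)](x_j), initialized uniform.
msgs = {(i, j): np.ones(2) for (i, j) in edges + [(j, i) for (i, j) in edges]}
for _ in range(20):  # parallel message updates; converges quickly on a tree
    new = {}
    for (i, j) in msgs:
        incoming = unary[i].copy()
        for (k, l) in msgs:
            if l == i and k != j:
                incoming = incoming * msgs[(k, i)]
        m = pair.T @ incoming  # sum over x_i of pair[x_i, x_j] * incoming[x_i]
        new[(i, j)] = m / m.sum()
    msgs = new

def belief(i):
    """Approximate marginal of node i from converged messages."""
    b = unary[i].copy()
    for (k, l) in msgs:
        if l == i:
            b = b * msgs[(k, i)]
    return b / b.sum()

def exact(i):
    """Exact marginal by brute-force enumeration, for comparison."""
    p = np.zeros(2)
    for x in itertools.product([0, 1], repeat=3):
        w = (unary[0][x[0]] * unary[1][x[1]] * unary[2][x[2]]
             * pair[x[0], x[1]] * pair[x[1], x[2]])
        p[x[i]] += w
    return p / p.sum()

assert np.allclose(belief(0), exact(0))  # BP is exact on this tree
```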

  12. A study of environmental characterization of conventional and advanced aluminum alloys for selection and design. Phase 2: The breaking load test method

    NASA Technical Reports Server (NTRS)

    Sprowls, D. O.; Bucci, R. J.; Ponchel, B. M.; Brazill, R. L.; Bretz, P. E.

    1984-01-01

A technique is demonstrated for accelerated stress corrosion testing of high strength aluminum alloys. The method offers better precision and shorter exposure times than traditional pass/fail procedures. The approach uses data from tension tests performed on replicate groups of smooth specimens after various lengths of exposure to static stress. The breaking strength measures degradation in the test specimen's load-carrying ability due to environmental attack. Analysis of breaking load data by extreme value statistics enables the calculation of survival probabilities and a statistically defined threshold stress applicable to the specific test conditions. A fracture mechanics model is given which quantifies depth of attack in the stress corroded specimen by an effective flaw size calculated from the breaking stress and the material strength and fracture toughness properties. Comparisons are made with experimental results from three tempers of 7075 alloy plate tested by the breaking load method and by traditional tests of statically loaded smooth tension bars and conventional precracked specimens.

  13. Markov Logic Networks in the Analysis of Genetic Data

    PubMed Central

    Sakhanenko, Nikita A.

    2010-01-01

Complex, non-additive genetic interactions are common and can be critical in determining phenotypes. Genome-wide association studies (GWAS) and similar statistical studies of linkage data, however, assume additive models of gene interactions in looking for genotype-phenotype associations. These statistical methods view the compound effects of multiple genes on a phenotype as a sum of influences of each gene and often miss a substantial part of the heritable effect. Such methods do not use any biological knowledge about underlying mechanisms. Modeling approaches from the artificial intelligence (AI) field that incorporate deterministic knowledge into models to perform statistical analysis can be applied to include prior knowledge in genetic analysis. We chose to use the most general such approach, Markov Logic Networks (MLNs), for combining deterministic knowledge with statistical analysis. Using simple, logistic regression-type MLNs we can replicate the results of traditional statistical methods, but we also show that we are able to go beyond finding independent markers linked to a phenotype by using joint inference without an independence assumption. The method is applied to genetic data on yeast sporulation, a complex phenotype with gene interactions. In addition to detecting all of the previously identified loci associated with sporulation, our method identifies four loci with smaller effects. Since their effect on sporulation is small, these four loci were not detected with methods that do not account for dependence between markers due to gene interactions. We show how gene interactions can be detected using more complex models, which can be used as a general framework for incorporating systems biology with genetics. PMID:20958249

  14. Tsallis non-extensive statistical mechanics in the ionospheric detrended total electron content during quiet and storm periods

    NASA Astrophysics Data System (ADS)

    Ogunsua, B. O.; Laoye, J. A.

    2018-05-01

In this paper, Tsallis non-extensive q-statistics in ionospheric dynamics was investigated using the total electron content (TEC) obtained from two Global Positioning System (GPS) receiver stations. The investigation covered both geomagnetically quiet and storm periods. The micro-density variation of the ionospheric total electron content was extracted from the TEC data by detrending. The detrended total electron content, which represents the variation in the internal dynamics of the system, was further analyzed within non-extensive statistical mechanics using q-Gaussian methods. Our results reveal that for all the analyzed data sets a Tsallis Gaussian probability distribution (q-Gaussian) with q > 1 was obtained. No distinct difference in pattern was observed between the values of qquiet and qstorm. However, the values of q vary with geophysical conditions and possibly with local dynamics at the two stations. Also observed were the asymmetric pattern of the q-Gaussian and a highly significant correlation between the q-index values obtained at the two GPS receiver stations for the storm periods, compared to the quiet periods. The factors responsible for this variation can mostly be attributed to the varying mechanisms driving the self-reorganization of the system dynamics during storm periods. The result shows the existence of long-range correlation for both quiet and storm periods at the two stations.
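
The q-Gaussian form referred to above, in its standard Tsallis parametrization (an assumption; the record itself does not spell out the formula), can be sketched as:

```python
import numpy as np

def q_gaussian(x, q, beta):
    """Unnormalized Tsallis q-Gaussian; tends to exp(-beta*x**2) as q -> 1."""
    if abs(q - 1.0) < 1e-8:
        return np.exp(-beta * x**2)
    # q-exponential with the standard cutoff where the base goes non-positive
    base = np.maximum(1.0 - (1.0 - q) * beta * x**2, 0.0)
    return base ** (1.0 / (1.0 - q))

x = np.linspace(-3, 3, 7)
heavy = q_gaussian(x, q=1.5, beta=1.0)  # q > 1: heavier tails, as in the record
gauss = np.exp(-x**2)
assert np.all(heavy >= gauss)  # q > 1 lies pointwise above the Gaussian here
```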

  15. Estimation of trends

    NASA Technical Reports Server (NTRS)

    1981-01-01

The application of statistical methods to recorded ozone measurements is discussed. A long-term depletion of ozone at the magnitudes predicted by the NAS would be harmful to most forms of life. Empirical prewhitening filters, whose derivation is independent of the underlying physical mechanisms, were analyzed. Statistical analysis performs a checks-and-balances role: time series filtering separates variations into systematic and random parts, errors are uncorrelated, and significant phase-lag dependencies are identified. The use of time series modeling to enhance the capability of detecting trends is discussed.

  16. Statistical Mechanics of Combinatorial Auctions

    NASA Astrophysics Data System (ADS)

    Galla, Tobias; Leone, Michele; Marsili, Matteo; Sellitto, Mauro; Weigt, Martin; Zecchina, Riccardo

    2006-09-01

    Combinatorial auctions are formulated as frustrated lattice gases on sparse random graphs, allowing the determination of the optimal revenue by methods of statistical physics. Transitions between computationally easy and hard regimes are found and interpreted in terms of the geometric structure of the space of solutions. We introduce an iterative algorithm to solve intermediate and large instances, and discuss competing states of optimal revenue and maximal number of satisfied bidders. The algorithm can be generalized to the hard phase and to more sophisticated auction protocols.

  17. Valid statistical inference methods for a case-control study with missing data.

    PubMed

    Tian, Guo-Liang; Zhang, Chi; Jiang, Xuejun

    2018-04-01

The main objective of this paper is to derive the valid sampling distribution of the observed counts in a case-control study with missing data under the assumption of missing at random by employing the conditional sampling method and the mechanism augmentation method. The proposed sampling distribution, called the case-control sampling distribution, can be used to calculate the standard errors of the maximum likelihood estimates of parameters via the Fisher information matrix and to generate independent samples for constructing small-sample bootstrap confidence intervals. Theoretical comparisons of the new case-control sampling distribution with two existing sampling distributions exhibit a large difference. Simulations are conducted to investigate the influence of the three different sampling distributions on statistical inferences. One finding is that the conclusion by the Wald test for testing independence under the two existing sampling distributions could be completely different (even contradictory) from the Wald test for testing the equality of the success probabilities in control/case groups under the proposed distribution. A real cervical cancer data set is used to illustrate the proposed statistical methods.

  18. Electric Conductivity in a Beam, Plasma System.

    DTIC Science & Technology

    1977-09-15

Green's function solution to the Boltzmann equation and arrived at a stationary state. However Balescu has accounted for the potential energy of...R. Balescu, Statistical Mechanics of Charged Particles, (Interscience Publishers, New York, 1963) 21. P.M. Morse and H. Feshbach, Methods of

  19. Treatment of Chemical Equilibrium without Using Thermodynamics or Statistical Mechanics.

    ERIC Educational Resources Information Center

    Nelson, P. G.

    1986-01-01

    Discusses the conventional approaches to teaching about chemical equilibrium in advanced physical chemistry courses. Presents an alternative approach to the treatment of this concept by using Boltzmann's distribution law. Lists five advantages to using this method as compared with the other approaches. (TW)

  20. A primer of statistical methods for correlating parameters and properties of electrospun poly(L-lactide) scaffolds for tissue engineering--PART 1: design of experiments.

    PubMed

    Seyedmahmoud, Rasoul; Rainer, Alberto; Mozetic, Pamela; Maria Giannitelli, Sara; Trombetta, Marcella; Traversa, Enrico; Licoccia, Silvia; Rinaldi, Antonio

    2015-01-01

    Tissue engineering scaffolds produced by electrospinning are of enormous interest, but still lack a true understanding about the fundamental connection between the outstanding functional properties, the architecture, the mechanical properties, and the process parameters. Fragmentary results from several parametric studies only render some partial insights that are hard to compare and generally miss the role of parameters interactions. To bridge this gap, this article (Part-1 of 2) features a case study on poly-L-lactide scaffolds to demonstrate how statistical methods such as design of experiments can quantitatively identify the correlations existing between key scaffold properties and control parameters, in a systematic, consistent, and comprehensive manner disentangling main effects from interactions. The morphological properties (i.e., fiber distribution and porosity) and mechanical properties (Young's modulus) are "charted" as a function of molecular weight (MW) and other electrospinning process parameters (the Xs), considering the single effect as well as interactions between Xs. For the first time, the major role of the MW emerges clearly in controlling all scaffold properties. The correlation between mechanical and morphological properties is also addressed. © 2014 Wiley Periodicals, Inc.

  1. Mechanisms of post-supply contamination of drinking water in Bagamoyo, Tanzania.

    PubMed

    Harris, Angela R; Davis, Jennifer; Boehm, Alexandria B

    2013-09-01

    Access to household water connections remains low in sub-Saharan Africa, representing a public health concern. Previous studies have shown water stored in the home to be more contaminated than water at the source; however, the mechanisms of post-supply contamination remain unclear. Using water quality measurements and structured observations of households in Bagamoyo, Tanzania, this study elucidates the causal mechanisms of the microbial contamination of drinking water after collection from a communal water source. The study identifies statistically significant loadings of fecal indicator bacteria (FIB) occurring immediately after filling the storage container at the source and after extraction of the water from the container in the home. Statistically significant loadings of FIB also occur with various water extraction methods, including decanting from the container and use of a cup or ladle. Additionally, pathogenic genes of Escherichia coli were detected in stored drinking water but not in the source from which it was collected, highlighting the potential health risks of post-supply contamination. The results of the study confirm that storage containers and extraction utensils introduce microbial contamination into stored drinking water, and suggest that further research is needed to identify methods of water extraction that prevent microbial contamination of drinking water.

  2. Cylindrical dust acoustic solitary waves with transverse perturbations in quantum dusty plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mushtaq, A.

    2007-11-15

    The nonlinear quantum dust acoustic waves with effects of nonplanar cylindrical geometry, quantum corrections, and transverse perturbations are studied. By using the perturbation method, a cylindrical Kadomtsev-Petviashvili equation for dust acoustic waves is derived by incorporating quantum-mechanical effects. The quantum-mechanical effects via quantum diffraction and quantum statistics, and the role of transverse perturbations in cylindrical geometry on the dynamics of this wave, are studied both analytically and numerically.

  3. Sex differences in mechanical allodynia: how can it be preclinically quantified and analyzed?

    PubMed Central

    Nicotra, Lauren; Tuke, Jonathan; Grace, Peter M.; Rolan, Paul E.; Hutchinson, Mark R.

    2014-01-01

    Translating promising preclinical drug discoveries to successful clinical trials remains a significant hurdle in pain research. Although animal models have significantly contributed to understanding chronic pain pathophysiology, the majority of research has focused on male rodents using testing procedures that produce sex difference data that do not align well with comparable clinical experiences. Additionally, the use of animal pain models presents ongoing ethical challenges demanding continuing refinement of preclinical methods. To this end, this study sought to test a quantitative allodynia assessment technique and associated statistical analysis in a modified graded nerve injury pain model with the aim to further examine sex differences in allodynia. Graded allodynia was established in male and female Sprague Dawley rats by altering the number of sutures placed around the sciatic nerve and quantified by the von Frey test. Linear mixed effects modeling regressed response on each fixed effect (sex, oestrus cycle, pain treatment). On comparison with other common von Frey assessment techniques, utilizing lower threshold filaments than those ordinarily tested, at 1 s intervals, appropriately and successfully investigated female mechanical allodynia, revealing significant sex and oestrus cycle difference across the graded allodynia that other common behavioral methods were unable to detect. Utilizing this different von Frey approach and graded allodynia model, a single suture inflicting less allodynia was sufficient to demonstrate exaggerated female mechanical allodynia throughout the phases of dioestrus and pro-oestrus. Refining the von Frey testing method, statistical analysis technique and the use of a graded model of chronic pain, allowed for examination of the influences on female mechanical nociception that other von Frey methods cannot provide. PMID:24592221

  4. A Comparison of EPI Sampling, Probability Sampling, and Compact Segment Sampling Methods for Micro and Small Enterprises

    PubMed Central

    Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B.; Pereira, Nuno Sousa; Behrman, Jere

    2011-01-01

    Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization’s Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested, the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples’ statistical properties. PMID:22582004

  5. Determining significant material properties: A discovery approach

    NASA Technical Reports Server (NTRS)

    Karplus, Alan K.

    1992-01-01

The following is a laboratory experiment designed to further understanding of materials science. The experiment itself can be informative for persons of any age past elementary school, and even for some in elementary school. The preparation of the plastic samples is readily accomplished by persons with reasonable dexterity in the cutting of paper designs. The completion of the statistical Design of Experiments, which uses Yates' Method, requires basic math (addition and subtraction). Interpretive work requires plotting of data and making observations. Knowledge of statistical methods would be helpful. The purpose of this experiment is to acquaint students with the seven classes of recyclable plastics, and provide hands-on learning about the response of these plastics to mechanical tensile loading.
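
Yates' Method mentioned above reduces a 2^k full-factorial design to repeated pairwise additions and subtractions; it can be sketched as follows (the response values are invented for illustration):

```python
def yates(responses):
    """Yates' algorithm for a 2^k full-factorial experiment.

    `responses` lists run results in standard (Yates) order; returns the
    contrast column after k passes of pairwise sums then differences.
    """
    n = len(responses)
    assert n > 0 and n & (n - 1) == 0, "length must be a power of two"
    col = list(responses)
    k = n.bit_length() - 1
    for _ in range(k):
        sums = [col[i] + col[i + 1] for i in range(0, n, 2)]
        diffs = [col[i + 1] - col[i] for i in range(0, n, 2)]
        col = sums + diffs
    return col

# 2^2 example with runs in standard order (1), a, b, ab:
contrasts = yates([10, 14, 12, 18])
# contrasts[0] is the grand total; dividing the rest by 2^(k-1) gives the
# effects A, B, AB.
print(contrasts)  # [54, 10, 6, 2]
```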

  6. Association analysis of multiple traits by an approach of combining P values.

    PubMed

    Chen, Lili; Wang, Yong; Zhou, Yajing

    2018-03-01

    Increasing evidence shows that one variant can affect multiple traits, which is a widespread phenomenon in complex diseases. Joint analysis of multiple traits can increase statistical power of association analysis and uncover the underlying genetic mechanism. Although there are many statistical methods to analyse multiple traits, most of these methods are usually suitable for detecting common variants associated with multiple traits. However, because of low minor allele frequency of rare variant, these methods are not optimal for rare variant association analysis. In this paper, we extend an adaptive combination of P values method (termed ADA) for single trait to test association between multiple traits and rare variants in the given region. For a given region, we use reverse regression model to test each rare variant associated with multiple traits and obtain the P value of single-variant test. Further, we take the weighted combination of these P values as the test statistic. Extensive simulation studies show that our approach is more powerful than several other comparison methods in most cases and is robust to the inclusion of a high proportion of neutral variants and the different directions of effects of causal variants.
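
The "weighted combination of P values" statistic described above can be sketched generically; the record does not specify the ADA weighting or its adaptive truncation, so a plain weighted Fisher-type combination is shown under that assumption:

```python
import math

def weighted_fisher(p_values, weights=None):
    """Combine per-variant P values into a single test statistic.

    A generic weighted Fisher-type combination: larger values indicate
    stronger joint evidence. (The ADA method additionally chooses which
    P values enter adaptively; that refinement is omitted here.)
    """
    if weights is None:
        weights = [1.0] * len(p_values)
    return -2.0 * sum(w * math.log(p) for w, p in zip(weights, p_values))

# A region with two strongly associated variants scores higher than a
# region of null variants.
stat_signal = weighted_fisher([0.001, 0.02, 0.5])
stat_null = weighted_fisher([0.4, 0.6, 0.5])
assert stat_signal > stat_null
```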

  7. Decrease of Fisher information and the information geometry of evolution equations for quantum mechanical probability amplitudes.

    PubMed

    Cafaro, Carlo; Alsing, Paul M

    2018-04-01

    The relevance of the concept of Fisher information is increasing in both statistical physics and quantum computing. From a statistical mechanical standpoint, the application of Fisher information in the kinetic theory of gases is characterized by its decrease along the solutions of the Boltzmann equation for Maxwellian molecules in the two-dimensional case. From a quantum mechanical standpoint, the output state in Grover's quantum search algorithm follows a geodesic path obtained from the Fubini-Study metric on the manifold of Hilbert-space rays. Additionally, Grover's algorithm is specified by constant Fisher information. In this paper, we present an information geometric characterization of the oscillatory or monotonic behavior of statistically parametrized squared probability amplitudes originating from special functional forms of the Fisher information function: constant, exponential decay, and power-law decay. Furthermore, for each case, we compute both the computational speed and the availability loss of the corresponding physical processes by exploiting a convenient Riemannian geometrization of useful thermodynamical concepts. Finally, we briefly comment on the possibility of using the proposed methods of information geometry to help identify a suitable trade-off between speed and thermodynamic efficiency in quantum search algorithms.

  8. Decrease of Fisher information and the information geometry of evolution equations for quantum mechanical probability amplitudes

    NASA Astrophysics Data System (ADS)

    Cafaro, Carlo; Alsing, Paul M.

    2018-04-01

    The relevance of the concept of Fisher information is increasing in both statistical physics and quantum computing. From a statistical mechanical standpoint, the application of Fisher information in the kinetic theory of gases is characterized by its decrease along the solutions of the Boltzmann equation for Maxwellian molecules in the two-dimensional case. From a quantum mechanical standpoint, the output state in Grover's quantum search algorithm follows a geodesic path obtained from the Fubini-Study metric on the manifold of Hilbert-space rays. Additionally, Grover's algorithm is specified by constant Fisher information. In this paper, we present an information geometric characterization of the oscillatory or monotonic behavior of statistically parametrized squared probability amplitudes originating from special functional forms of the Fisher information function: constant, exponential decay, and power-law decay. Furthermore, for each case, we compute both the computational speed and the availability loss of the corresponding physical processes by exploiting a convenient Riemannian geometrization of useful thermodynamical concepts. Finally, we briefly comment on the possibility of using the proposed methods of information geometry to help identify a suitable trade-off between speed and thermodynamic efficiency in quantum search algorithms.

  9. Statistical Analysis of Categorical Time Series of Atmospheric Elementary Circulation Mechanisms - Dzerdzeevski Classification for the Northern Hemisphere

    PubMed Central

    Brenčič, Mihael

    2016-01-01

Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899–2012, are analysed with statistical methods as continuous categorical time series. Classification consists of 41 elementary circulation mechanisms (ECM), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and the periodicity effect were investigated with moving dispersion filters and randomisation procedure on the ECM categories as well as with the time analyses of the ECM mode. The time series were determined as being non-stationary with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECM has significantly changed. This change is a result of a change in the frequency of ECM categories; before 1986, the appearance of ECM was more diverse, and afterwards fewer ECMs appear. The statistical approach applied to the categorical climatic time series opens up new potential insights into climate variability and change studies that have to be performed in the future. PMID:27116375
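
The empirical marginal probabilities mentioned above amount to simple frequency counts over the daily category sequence; the ECM labels below are illustrative stand-ins for the 41 Dzerdzeevski classes:

```python
from collections import Counter

def marginal_probabilities(daily_categories):
    """Empirical marginal probability of each circulation category.

    `daily_categories` holds one class label per calendar day.
    """
    counts = Counter(daily_categories)
    total = len(daily_categories)
    return {cat: n / total for cat, n in counts.items()}

probs = marginal_probabilities(["ECM1", "ECM1", "ECM2", "ECM3"])
assert abs(sum(probs.values()) - 1.0) < 1e-12  # probabilities sum to one
print(probs)  # {'ECM1': 0.5, 'ECM2': 0.25, 'ECM3': 0.25}
```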

  10. Brownian motion or Lévy walk? Stepping towards an extended statistical mechanics for animal locomotion

    PubMed Central

    Gautestad, Arild O.

    2012-01-01

    Animals moving under the influence of spatio-temporal scaling and long-term memory generate a kind of space-use pattern that has proved difficult to model within a coherent theoretical framework. An extended kind of statistical mechanics is needed, accounting for both the effects of spatial memory and scale-free space use, and put into a context of ecological conditions. Simulations illustrating the distinction between scale-specific and scale-free locomotion are presented. The results show how observational scale (time lag between relocations of an individual) may critically influence the interpretation of the underlying process. In this respect, a novel protocol is proposed as a method to distinguish between some main movement classes. For example, the ‘power law in disguise’ paradox—from a composite Brownian motion consisting of a superposition of independent movement processes at different scales—may be resolved by shifting the focus from pattern analysis at one particular temporal resolution towards a more process-oriented approach involving several scales of observation. A more explicit consideration of system complexity within a statistical mechanical framework, supplementing the more traditional mechanistic modelling approach, is advocated. PMID:22456456

  11. Statistical Analysis of Categorical Time Series of Atmospheric Elementary Circulation Mechanisms - Dzerdzeevski Classification for the Northern Hemisphere.

    PubMed

    Brenčič, Mihael

    2016-01-01

Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899-2012, are analysed with statistical methods as continuous categorical time series. Classification consists of 41 elementary circulation mechanisms (ECM), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and the periodicity effect were investigated with moving dispersion filters and randomisation procedure on the ECM categories as well as with the time analyses of the ECM mode. The time series were determined as being non-stationary with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECM has significantly changed. This change is a result of a change in the frequency of ECM categories; before 1986, the appearance of ECM was more diverse, and afterwards fewer ECMs appear. The statistical approach applied to the categorical climatic time series opens up new potential insights into climate variability and change studies that have to be performed in the future.

  12. When mechanism matters: Bayesian forecasting using models of ecological diffusion

    USGS Publications Warehouse

    Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.

    2017-01-01

    Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.

  13. A statistical mechanics model for free-for-all airplane passenger boarding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steffen, Jason H.; /Fermilab

    2008-08-01

I discuss a model for free-for-all passenger boarding which is employed by some discount air carriers. The model is based on the principles of statistical mechanics, where each seat in the aircraft has an associated energy which reflects the preferences of travelers. As each passenger enters the airplane they select their seats using Boltzmann statistics, proceed to that location, load their luggage, sit down, and the partition function seen by remaining passengers is modified to reflect this fact. I discuss the various model parameters and make qualitative comparisons of this passenger boarding model with those that involve assigned seats. The model can be used to predict the probability that certain seats will be occupied at different times during the boarding process. These results might provide a useful description of this boarding method. The model is a relatively unusual application of undergraduate level physics and describes a situation familiar to many students and faculty.
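
The seat-selection rule described above, Boltzmann weights over the remaining seats with the partition function shrinking as seats fill, can be sketched as a toy simulation (the seat energies below are illustrative, not taken from the paper):

```python
import math
import random

def board(seat_energy, beta=1.0, rng=random):
    """Sequentially assign passengers to seats with Boltzmann weights.

    Each passenger picks an open seat with probability proportional to
    exp(-beta * E_seat); the chosen seat is removed from the partition
    function seen by later passengers.
    """
    open_seats = dict(seat_energy)
    order = []
    while open_seats:
        seats = list(open_seats)
        weights = [math.exp(-beta * open_seats[s]) for s in seats]
        pick = rng.choices(seats, weights=weights)[0]
        order.append(pick)
        del open_seats[pick]
    return order

# Lower-energy (more desirable) seats tend to fill earlier in the sequence.
energies = {"1A": 0.0, "1B": 1.0, "30E": 5.0}
random.seed(0)
sequence = board(energies)
assert sorted(sequence) == sorted(energies)  # every seat filled exactly once
```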

  14. In vivo serial MRI-based models and statistical methods to quantify sensitivity and specificity of mechanical predictors for carotid plaque rupture: location and beyond.

    PubMed

    Wu, Zheyang; Yang, Chun; Tang, Dalin

    2011-06-01

    It has been hypothesized that mechanical risk factors may be used to predict future atherosclerotic plaque rupture. Truly predictive methods for plaque rupture and methods to identify the best predictor(s) from all the candidates are lacking in the literature. A novel combination of computational and statistical models based on serial magnetic resonance imaging (MRI) was introduced to quantify sensitivity and specificity of mechanical predictors to identify the best candidate for plaque rupture site prediction. Serial in vivo MRI data of carotid plaque from one patient was acquired with follow-up scan showing ulceration. 3D computational fluid-structure interaction (FSI) models using both baseline and follow-up data were constructed and plaque wall stress (PWS) and strain (PWSn) and flow maximum shear stress (FSS) were extracted from all 600 matched nodal points (100 points per matched slice, baseline matching follow-up) on the lumen surface for analysis. Each of the 600 points was marked "ulcer" or "nonulcer" using follow-up scan. Predictive statistical models for each of the seven combinations of PWS, PWSn, and FSS were trained using the follow-up data and applied to the baseline data to assess their sensitivity and specificity using the 600 data points for ulcer predictions. Sensitivity of prediction is defined as the proportion of the true positive outcomes that are predicted to be positive. Specificity of prediction is defined as the proportion of the true negative outcomes that are correctly predicted to be negative. Using probability 0.3 as a threshold to infer ulcer occurrence at the prediction stage, the combination of PWS and PWSn provided the best predictive accuracy with (sensitivity, specificity) = (0.97, 0.958). Sensitivity and specificity given by PWS, PWSn, and FSS individually were (0.788, 0.968), (0.515, 0.968), and (0.758, 0.928), respectively. 
The proposed computational-statistical process provides a novel method and framework to assess the sensitivity and specificity of various risk indicators and offers the potential to identify the optimal predictor for plaque rupture using serial MRI, with a follow-up scan showing ulceration serving as the gold standard for method validation. While serial MRI data with actual rupture are hard to acquire, this single-case study suggests that a combination of multiple predictors may offer a potential improvement over existing plaque assessment schemes. With large-scale patient studies, this predictive modeling process may provide more solid ground for rupture predictor selection strategies and for image-based plaque vulnerability assessment methods.
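
    The sensitivity and specificity definitions used above can be made concrete in a short sketch. This is an illustrative calculation only; the data points and the simple threshold classifier below are invented, not the study's actual FSI model output.

```python
def sensitivity_specificity(y_true, y_prob, threshold=0.3):
    """y_true: 1 = ulcer, 0 = non-ulcer; y_prob: predicted rupture probabilities."""
    y_pred = [1 if p >= threshold else 0 for p in y_prob]
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    # Sensitivity: proportion of true positives predicted positive;
    # specificity: proportion of true negatives predicted negative.
    return tp / (tp + fn), tn / (tn + fp)

# Eight hypothetical lumen points: three true ulcer sites, five non-ulcer.
y_true = [1, 1, 1, 0, 0, 0, 0, 0]
y_prob = [0.9, 0.4, 0.2, 0.1, 0.05, 0.35, 0.1, 0.2]
sens, spec = sensitivity_specificity(y_true, y_prob, threshold=0.3)
```

    In the study this calculation would be applied to the 600 matched nodal points, with the probability threshold of 0.3 playing the same role as here.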

  15. Effects of different mechanized soil fertilization methods on corn soil fertility under continuous cropping

    NASA Astrophysics Data System (ADS)

    Shi, Qingwen; Wang, Huixin; Bai, Chunming; Wu, Di; Song, Qiaobo; Gao, Depeng; Dong, Zengqi; Cheng, Xin; Dong, Qiping; Zhang, Yahao; Mu, Jiahui; Chen, Qinghong; Liao, Wenqing; Qu, Tianru; Zhang, Chunling; Zhang, Xinyu; Liu, Yifei; Han, Xiaori

    2017-05-01

Experiments on mechanized soil fertilization for corn were conducted in the Faku demonstration zone. On this basis, we studied the effects of different mechanized soil fertilization methods on corn soil fertility under continuous cropping. Our study serves as a theoretical basis for further improving mechanized soil fertilization and soil quality in the brown soil area. Based on a survey of soil physical characteristics during different corn growth periods, we collected soil samples at each growth period for measurement and statistical analysis. Stalk returning to the field with deep tillage proved the most effective at improving available nutrients in arable soil in the demonstration zone. The different mechanized soil fertilization methods were markedly effective at improving total phosphorus in arable soil, less effective for total nitrogen and total potassium, and had little effect on the soil C/N ratio. Stalk returning with deep tillage was more favorable for raising the organic matter content of the soil than surface application, and organic granular fertilizer was more favorable than decomposed cow dung for the same purpose.

  16. Coupling functions: Universal insights into dynamical interaction mechanisms

    NASA Astrophysics Data System (ADS)

    Stankovski, Tomislav; Pereira, Tiago; McClintock, Peter V. E.; Stefanovska, Aneta

    2017-10-01

    The dynamical systems found in nature are rarely isolated. Instead they interact and influence each other. The coupling functions that connect them contain detailed information about the functional mechanisms underlying the interactions and prescribe the physical rule specifying how an interaction occurs. A coherent and comprehensive review is presented encompassing the rapid progress made recently in the analysis, understanding, and applications of coupling functions. The basic concepts and characteristics of coupling functions are presented through demonstrative examples of different domains, revealing the mechanisms and emphasizing their multivariate nature. The theory of coupling functions is discussed through gradually increasing complexity from strong and weak interactions to globally coupled systems and networks. A variety of methods that have been developed for the detection and reconstruction of coupling functions from measured data is described. These methods are based on different statistical techniques for dynamical inference. Stemming from physics, such methods are being applied in diverse areas of science and technology, including chemistry, biology, physiology, neuroscience, social sciences, mechanics, and secure communications. This breadth of application illustrates the universality of coupling functions for studying the interaction mechanisms of coupled dynamical systems.

  17. Sensitivity to imputation models and assumptions in receiver operating characteristic analysis with incomplete data

    PubMed Central

    Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M.

    2015-01-01

Modern statistical methods using incomplete data have been increasingly applied in a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used to evaluate diagnostic tests or biomarkers in medical research, has also grown increasingly popular in both its development and application. While missing-data methods have been applied in ROC analysis, the impact of model mis-specification and/or assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. In particular, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Depending on the coherency of the imputation model with the underlying data generation mechanism, our results show that MI generally leads to well-calibrated inferences under ignorable missingness mechanisms. PMID:26379316
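
    The quantity such ROC analyses estimate, the area under the curve (AUC), has a simple non-parametric form: it equals the Mann-Whitney probability that a randomly chosen diseased score exceeds a randomly chosen healthy one. A minimal sketch with invented marker values:

```python
def empirical_auc(diseased, healthy):
    """Fraction of (diseased, healthy) pairs where the diseased score is
    higher (ties count one half); equals the area under the empirical ROC curve."""
    pairs = [(d, h) for d in diseased for h in healthy]
    wins = sum(1.0 if d > h else 0.5 if d == h else 0.0 for d, h in pairs)
    return wins / len(pairs)

# Hypothetical biomarker readings for diseased and healthy subjects.
auc = empirical_auc([3.2, 2.8, 3.9], [1.5, 2.9, 2.1])
```

    Under multiple imputation, an estimate like this would be computed on each completed dataset and the results pooled.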

  18. Identifying reprioritization response shift in a stroke caregiver population: a comparison of missing data methods.

    PubMed

    Sajobi, Tolulope T; Lix, Lisa M; Singh, Gurbakhshash; Lowerison, Mark; Engbers, Jordan; Mayo, Nancy E

    2015-03-01

Response shift (RS) is an important phenomenon that influences the assessment of longitudinal changes in health-related quality of life (HRQOL) studies. Given that RS effects are often small, missing data due to attrition or item non-response can contribute to failure to detect RS effects. Since missing data are often encountered in longitudinal HRQOL data, effective strategies to deal with missing data are important to consider. This study aims to compare different imputation methods on the detection of reprioritization RS in the HRQOL of caregivers of stroke survivors. Data were from a Canadian multi-center longitudinal study of caregivers of stroke survivors over a one-year period. The Stroke Impact Scale physical function score at baseline, with a cutoff of 75, was used to measure patient stroke severity for the reprioritization RS analysis. Mean imputation, likelihood-based expectation-maximization (EM) imputation, and multiple imputation methods were compared in test procedures based on changes in relative importance weights to detect RS in SF-36 domains over a 6-month period. Monte Carlo simulation methods were used to compare the statistical powers of relative importance test procedures for detecting RS in incomplete longitudinal data under different missing data mechanisms and imputation methods. Of the 409 caregivers, 15.9% and 31.3% had missing data at baseline and 6 months, respectively. There were no statistically significant changes in relative importance weights on any of the domains when complete-case analysis was adopted. However, statistically significant changes were detected on the physical functioning and/or vitality domains when mean imputation or EM imputation was adopted. There were also statistically significant changes in relative importance weights for the physical functioning, mental health, and vitality domains when the multiple imputation method was adopted. 
Our simulations revealed that relative importance test procedures were least powerful under the complete-case analysis method and most powerful when a mean imputation or multiple imputation method was adopted for missing data, regardless of the missing data mechanism and the proportion of missing data. Test procedures based on relative importance measures are sensitive to the type and amount of missing data and to the imputation method. Relative importance test procedures based on mean imputation and multiple imputation are recommended for detecting RS in incomplete data.

  19. Estimation of body mass index from the metrics of the first metatarsal

    NASA Astrophysics Data System (ADS)

    Dunn, Tyler E.

Estimation of the biological profile from as many skeletal elements as possible is a necessity in both forensic and bioarchaeological contexts; this includes non-standard aspects of the biological profile, such as body mass index (BMI). BMI is a measure that allows for understanding of the composition of an individual and is traditionally divided into four groups: underweight, normal weight, overweight, and obese. BMI estimation incorporates both estimation of stature and body mass. The estimation of stature from skeletal elements is commonly included in the standard biological profile, but the estimation of body mass needs to be further statistically validated to be consistently included. The bones of the foot, specifically the first metatarsal, may have the ability to estimate BMI given an allometric relationship to stature and the mechanical relationship to body mass. There are two commonly used methods for stature estimation, the anatomical method and the regression method. The anatomical method takes into account all of the skeletal elements that contribute to stature, while the regression method relies on the allometric relationship between a skeletal element and living stature. A correlation between the metrics of the first metatarsal and living stature has been observed and proposed as a basis for valid stature estimation from the bony foot (Byers et al., 1989). Body mass estimation from skeletal elements relies on two theoretical frameworks: the morphometric and the mechanical approaches. The morphometric approach relies on the size relationship of the individual to body mass; the basic relationship between volume, density, and weight allows for body mass estimation. The body is modeled as a cylinder, and in order to estimate the volume of this cylinder its diameter is needed. A commonly used proxy for this diameter in the human body is skeletal bi-iliac breadth from the rearticulated pelvic girdle. 
The mechanical method of body mass estimation relies on the principles of biomechanical bone remodeling: the elements of the skeleton that are under higher forces, including weight, will remodel to minimize stress. A commonly used metric for the mechanical method of body mass estimation is the diameter of the head of the femur. The foot experiences nearly the entire weight force of the individual at some point in the gait cycle and is subject to the biomechanical remodeling that this force would induce. Therefore, the mechanical framework for body mass estimation should also apply to the elements of the foot. The morphometric and mechanical approaches have been validated against one another on a large, geographically disparate population (Auerbach and Ruff, 2004), but have yet to be validated on a sample of known body mass. DeGroote and Humphrey (2011) tested the ability of the first metatarsal to estimate femoral head diameter, body mass, and femoral length. The estimated femoral head diameter from the first metatarsal is used to estimate body mass via the morphometric approach, and the femoral length is used to estimate living stature. The authors found that body mass and stature estimation methods from more commonly used skeletal elements compared well with the methods developed from the first metatarsal. This study examines 388 `White' individuals from the William M. Bass donated skeletal collection to test the reliability of body mass estimates from femoral head diameter and bi-iliac breadth, stature from maximum femoral length, and body mass and stature from the metrics of the first metatarsal. The sample included individuals from all four of the BMI classes. This study finds that all of the skeletal indicators compare well with one another: there is no statistical difference between the stature estimates from the first metatarsal and those from the maximum length of the femur, and there is no statistical difference among the three body mass estimation methods. 
When compared with forensic estimates of stature, neither of the tested methods showed a statistical difference. Conversely, when the body mass estimates were compared with forensic body mass there was a statistical difference, and on further investigation the largest discrepancies in the body mass estimates occurred at the extremes of body mass (the underweight and obese categories). These findings indicate that stature estimation from both the maximum femoral length and the metrics of the metatarsal is accurate. Furthermore, body mass estimation is accurate when the individual falls in the middle range of the BMI spectrum, while these methods are inaccurate for outlying individuals. These findings have implications for the application of stature and body mass estimation in the fields of bioarchaeology, forensic anthropology, and paleoanthropology.

  20. Structural and thermomechanical properties of the zinc-blende AlX (X = P, As, Sb) compounds

    NASA Astrophysics Data System (ADS)

    Ha, Vu Thi Thanh; Hung, Vu Van; Hanh, Pham Thi Minh; Nguyen, Viet Tuyen; Hieu, Ho Khac

    2017-08-01

The structural and thermomechanical properties of the zinc-blende aluminum III-V compounds have been studied based on the statistical moment method (SMM) in quantum statistical mechanics. Within the SMM scheme, we derived analytical expressions for the nearest-neighbor distance, thermal expansion coefficient, atomic mean-square displacement and elastic moduli (Young’s modulus, bulk modulus and shear modulus). Numerical calculations have been performed for zinc-blende AlX (X = As, P, Sb) from ambient conditions up to a temperature of 1000 K. Our results are in good agreement with earlier measurements and can provide useful references for future experimental and theoretical work. This research presents a systematic approach to investigating the thermodynamic and mechanical properties of materials.

  1. Protein Condensation

    NASA Astrophysics Data System (ADS)

    Gunton, James D.; Shiryayev, Andrey; Pagan, Daniel L.

    2007-09-01

Preface; 1. Introduction; 2. Globular protein structure; 3. Experimental methods; 4. Thermodynamics and statistical mechanics; 5. Protein-protein interactions; 6. Theoretical studies of equilibrium; 7. Nucleation theory; 8. Experimental studies of nucleation; 9. Lysozyme; 10. Some other globular proteins; 11. Membrane proteins; 12. Crystallins and cataracts; 13. Sickle hemoglobin and sickle cell anemia; 14. Alzheimer's disease; Index.

  2. Protein Condensation

    NASA Astrophysics Data System (ADS)

    Gunton, James D.; Shiryayev, Andrey; Pagan, Daniel L.

    2014-07-01

Preface; 1. Introduction; 2. Globular protein structure; 3. Experimental methods; 4. Thermodynamics and statistical mechanics; 5. Protein-protein interactions; 6. Theoretical studies of equilibrium; 7. Nucleation theory; 8. Experimental studies of nucleation; 9. Lysozyme; 10. Some other globular proteins; 11. Membrane proteins; 12. Crystallins and cataracts; 13. Sickle hemoglobin and sickle cell anemia; 14. Alzheimer's disease; Index.

  3. Experimental Analysis of Cell Function Using Cytoplasmic Streaming

    ERIC Educational Resources Information Center

    Janssens, Peter; Waldhuber, Megan

    2012-01-01

    This laboratory exercise investigates the phenomenon of cytoplasmic streaming in the fresh water alga "Nitella". Students use the fungal toxin cytochalasin D, an inhibitor of actin polymerization, to investigate the mechanism of streaming. Students use simple statistical methods to analyze their data. Typical student data are provided. (Contains 3…

  4. Statistical mechanics based on fractional classical and quantum mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com

    2014-03-15

The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. In the first stage, we have presented the thermodynamic properties of the classical ideal gas and of a system of N classical oscillators. In both cases, the Hamiltonian contains fractional exponents of the phase space variables (position and momentum). In the second stage, in the context of fractional quantum mechanics, we have calculated the thermodynamic properties of black-body radiation, studied Bose-Einstein statistics with the related problem of condensation, and treated Fermi-Dirac statistics.

  5. graph-GPA: A graphical model for prioritizing GWAS results and investigating pleiotropic architecture.

    PubMed

    Chung, Dongjun; Kim, Hang J; Zhao, Hongyu

    2017-02-01

Genome-wide association studies (GWAS) have identified tens of thousands of genetic variants associated with hundreds of phenotypes and diseases, which have provided clinical and medical benefits to patients through novel biomarkers and therapeutic targets. However, identification of risk variants associated with complex diseases remains challenging, as these diseases are often affected by many genetic variants with small or moderate effects. There has been accumulating evidence suggesting that different complex traits share a common risk basis, namely pleiotropy. Recently, several statistical methods have been developed to improve statistical power to identify risk variants for complex traits through a joint analysis of multiple GWAS datasets by leveraging pleiotropy. While these methods were shown to improve statistical power for association mapping compared to separate analyses, they are still limited in the number of phenotypes that can be integrated. In order to address this challenge, in this paper we propose a novel statistical framework, graph-GPA, to integrate a large number of GWAS datasets for multiple phenotypes using a hidden Markov random field approach. Application of graph-GPA to a joint analysis of GWAS datasets for 12 phenotypes shows that graph-GPA improves statistical power to identify risk variants compared to statistical methods based on a smaller number of GWAS datasets. In addition, graph-GPA also promotes better understanding of genetic mechanisms shared among phenotypes, which can potentially be useful for the development of improved diagnosis and therapeutics. The R implementation of graph-GPA is currently available at https://dongjunchung.github.io/GGPA/.

  6. Many-Body Localization and Thermalization in Quantum Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Nandkishore, Rahul; Huse, David A.

    2015-03-01

    We review some recent developments in the statistical mechanics of isolated quantum systems. We provide a brief introduction to quantum thermalization, paying particular attention to the eigenstate thermalization hypothesis (ETH) and the resulting single-eigenstate statistical mechanics. We then focus on a class of systems that fail to quantum thermalize and whose eigenstates violate the ETH: These are the many-body Anderson-localized systems; their long-time properties are not captured by the conventional ensembles of quantum statistical mechanics. These systems can forever locally remember information about their local initial conditions and are thus of interest for possibilities of storing quantum information. We discuss key features of many-body localization (MBL) and review a phenomenology of the MBL phase. Single-eigenstate statistical mechanics within the MBL phase reveal dynamically stable ordered phases, and phase transitions among them, that are invisible to equilibrium statistical mechanics and can occur at high energy and low spatial dimensionality, where equilibrium ordering is forbidden.

  7. Do the methods used to analyse missing data really matter? An examination of data from an observational study of Intermediate Care patients.

    PubMed

    Kaambwa, Billingsley; Bryan, Stirling; Billingham, Lucinda

    2012-06-27

Missing data is a common statistical problem in healthcare datasets from populations of older people. Some argue that arbitrarily assuming the mechanism responsible for the missingness, and therefore the method for dealing with it, is not the best option. But is this always true? This paper explores what happens when extra information suggesting that a particular mechanism is responsible for missing data is disregarded and methods for dealing with the missing data are chosen arbitrarily. Regression models based on 2,533 intermediate care (IC) patients from the largest evaluation of IC done and published in the UK to date were used to explain variation in costs, EQ-5D and Barthel index. Three methods for dealing with missingness were utilised, each assuming a different mechanism to be responsible for the missing data: complete case analysis (assuming missing completely at random, MCAR), multiple imputation (assuming missing at random, MAR) and a Heckman selection model (assuming missing not at random, MNAR). Differences in results were gauged by examining the signs of coefficients as well as the sizes of both coefficients and associated standard errors. Extra information strongly suggested that missing cost data were MCAR. The results show that MCAR- and MAR-based methods yielded similar results, with the sizes of most coefficients and standard errors differing by less than 3.4%, while those based on MNAR methods were statistically different (up to 730% bigger). Significant variables in all regression models also had the same direction of influence on costs. All three mechanisms of missingness were shown to be potential causes of the missing EQ-5D and Barthel data. The method chosen to deal with missing data did not seem to have any significant effect on the results for these data, as all methods led to broadly similar conclusions, with sizes of coefficients and standard errors differing by less than 54% and 322%, respectively. 
Arbitrary selection of methods to deal with missing data should be avoided. Using extra information gathered during the data collection exercise about the cause of missingness to guide this selection would be more appropriate.
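
    The contrast between these strategies can be sketched on a toy incomplete variable. The figures below are invented for illustration; they only show how complete-case analysis (appropriate under MCAR) and simple mean imputation treat the same gaps, not the study's actual cost data.

```python
import statistics

# Six hypothetical cost observations, two missing.
costs = [100.0, 120.0, None, 95.0, None, 110.0]

# Complete-case analysis (justified when data are MCAR): drop missing records.
complete = [c for c in costs if c is not None]
cc_mean = statistics.mean(complete)

# Single mean imputation: fill each gap with the observed mean. The point
# estimate is unchanged, but the spread of the data is artificially reduced,
# which is one reason multiple imputation is generally preferred under MAR.
imputed = [c if c is not None else cc_mean for c in costs]
imp_mean = statistics.mean(imputed)
```

    Multiple imputation and selection models such as Heckman's go further, replacing each gap with draws from a model and propagating the extra uncertainty into standard errors.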

  8. Statistical physics approach to earthquake occurrence and forecasting

    NASA Astrophysics Data System (ADS)

    de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio

    2016-04-01

There is striking evidence that the dynamics of the Earth's crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth's crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in the size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimension and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, which are able to account for universality and provide a unifying description of the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and to discriminate among different physical mechanisms responsible for earthquake triggering. 
In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different levels of prediction. In this review we also briefly discuss how the statistical mechanics approach can be applied to non-tectonic earthquakes and to other natural stochastic processes, such as volcanic eruptions and solar flares.

  9. Quantitative naturalistic methods for detecting change points in psychotherapy research: an illustration with alliance ruptures.

    PubMed

    Eubanks-Carter, Catherine; Gorman, Bernard S; Muran, J Christopher

    2012-01-01

    Analysis of change points in psychotherapy process could increase our understanding of mechanisms of change. In particular, naturalistic change point detection methods that identify turning points or breakpoints in time series data could enhance our ability to identify and study alliance ruptures and resolutions. This paper presents four categories of statistical methods for detecting change points in psychotherapy process: criterion-based methods, control chart methods, partitioning methods, and regression methods. Each method's utility for identifying shifts in the alliance is illustrated using a case example from the Beth Israel Psychotherapy Research program. Advantages and disadvantages of the various methods are discussed.
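
    One of the partitioning methods mentioned above can be sketched in a few lines: choose the split point that minimizes the pooled within-segment sum of squared deviations. The session-by-session alliance ratings below are invented for illustration, not data from the Beth Israel program.

```python
def sse(xs):
    """Sum of squared deviations of a segment from its own mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def best_change_point(series):
    """Index k splitting the series into series[:k] and series[k:] with the
    minimal pooled within-segment sum of squares (a binary partitioning step)."""
    return min(range(1, len(series)),
               key=lambda k: sse(series[:k]) + sse(series[k:]))

# Hypothetical alliance ratings with an apparent rupture after session 4.
scores = [4.1, 4.0, 4.2, 3.9, 2.0, 1.9, 2.1, 2.2]
cp = best_change_point(scores)
```

    Applied recursively to each resulting segment, this yields multiple change points; criterion-based, control chart, and regression methods formalize when a detected shift is large enough to count.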

  10. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Sergis, Antonis; Hardalupas, Yannis

    2011-05-01

This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular mechanisms proposed in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers regarding the controlling mechanisms and influential parameters of heat transfer in nanofluids.

  11. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis.

    PubMed

    Sergis, Antonis; Hardalupas, Yannis

    2011-05-19

This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular mechanisms proposed in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers regarding the controlling mechanisms and influential parameters of heat transfer in nanofluids.

  12. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    PubMed Central

    2011-01-01

This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular mechanisms proposed in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers regarding the controlling mechanisms and influential parameters of heat transfer in nanofluids. PMID:21711932

  13. Non-Markovian generalization of the Lindblad theory of open quantum systems

    NASA Astrophysics Data System (ADS)

    Breuer, Heinz-Peter

    2007-02-01

    A systematic approach to the non-Markovian quantum dynamics of open systems is given by the projection operator techniques of nonequilibrium statistical mechanics. Combining these methods with concepts from quantum information theory and from the theory of positive maps, we derive a class of correlated projection superoperators that take into account in an efficient way statistical correlations between the open system and its environment. The result is used to develop a generalization of the Lindblad theory to the regime of highly non-Markovian quantum processes in structured environments.

  14. Free energy surfaces from nonequilibrium processes without work measurement

    NASA Astrophysics Data System (ADS)

    Adib, Artur B.

    2006-04-01

    Recent developments in statistical mechanics have allowed the estimation of equilibrium free energies from the statistics of work measurements during processes that drive the system out of equilibrium. Here a different class of processes is considered, wherein the system is prepared and released from a nonequilibrium state, and no external work is involved during its observation. For such "clamp-and-release" processes, a simple strategy for the estimation of equilibrium free energies is offered. The method is illustrated with numerical simulations and analyzed in the context of tethered single-molecule experiments.

  15. Variational method for nonconservative field theories: Formulation and two PT-symmetric case examples

    NASA Astrophysics Data System (ADS)

    Restrepo, Juan; Ciuti, Cristiano; Favero, Ivan

    2014-01-01

    This Letter investigates a hybrid quantum system combining cavity quantum electrodynamics and optomechanics. The Hamiltonian problem of a photon mode coupled to a two-level atom via a Jaynes-Cummings coupling and to a mechanical mode via radiation pressure coupling is solved analytically. The atom-cavity polariton number operator commutes with the total Hamiltonian leading to an exact description in terms of tripartite atom-cavity-mechanics polarons. We demonstrate the possibility to obtain cooling of mechanical motion at the single-polariton level and describe the peculiar quantum statistics of phonons in such an unconventional regime.

  16. Pharmaceutical counselling about different types of tablet-splitting methods based on the results of weighing tests and mechanical development of splitting devices.

    PubMed

    Somogyi, O; Meskó, A; Csorba, L; Szabó, P; Zelkó, R

    2017-08-30

    The division of tablets and adequate methods of splitting them pose a complex problem in all sectors of health care. Although tablet-splitting is often required, the procedure can be difficult for patients. Four tablets with different external features (shape, score-line, film-coat and size) were investigated. The influence of these features and of the splitting methods was investigated with respect to the precision and "weight loss" of the splitting techniques. All four types of tablets were halved by four methods: by hand, with a kitchen knife, with an original manufactured splitting device and with a modified tablet splitter based on a self-developed mechanical model. The mechanical parameters (hardness and friability) of the products were measured during the study. The "weight loss" and precision of the splitting methods were determined and compared by statistical analysis. On the basis of the results, the external features (geometry) and mechanical parameters of the tablets and the mechanical structure of the splitting devices can influence the "weight loss" and precision of tablet-splitting. Accordingly, a new decision-making scheme was developed for the selection of splitting methods. In addition, the skills of patients and the specifics of the therapy should be considered so that pharmaceutical counselling regarding tablet-splitting can be more effective. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. History, rare, and multiple events of mechanical unfolding of repeat proteins

    NASA Astrophysics Data System (ADS)

    Sumbul, Fidan; Marchesi, Arin; Rico, Felix

    2018-03-01

    Mechanical unfolding of proteins consisting of repeat domains is an excellent tool to obtain large statistics. Force spectroscopy experiments using atomic force microscopy on proteins presenting multiple domains have revealed that unfolding forces depend on the number of folded domains (history) and have reported intermediate states and rare events. However, the common use of unspecific attachment approaches to pull the protein of interest holds important limitations to study unfolding history and may lead to discarding rare and multiple probing events due to the presence of unspecific adhesion and uncertainty on the pulling site. Site-specific methods that have recently emerged minimize this uncertainty and would be excellent tools to probe unfolding history and rare events. However, detailed characterization of these approaches is required to identify their advantages and limitations. Here, we characterize a site-specific binding approach based on the ultrastable complex dockerin/cohesin III revealing its advantages and limitations to assess the unfolding history and to investigate rare and multiple events during the unfolding of repeated domains. We show that this approach is more robust, reproducible, and provides larger statistics than conventional unspecific methods. We show that the method is optimal to reveal the history of unfolding from the very first domain and to detect rare events, while being more limited to assess intermediate states. Finally, we quantify the forces required to unfold two molecules pulled in parallel, difficult when using unspecific approaches. The proposed method represents a step forward toward more reproducible measurements to probe protein unfolding history and opens the door to systematic probing of rare and multiple molecule unfolding mechanisms.

  18. Teaching Classical Statistical Mechanics: A Simulation Approach.

    ERIC Educational Resources Information Center

    Sauer, G.

    1981-01-01

    Describes a one-dimensional model for an ideal gas to study development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches for a pseudoexperimental foundation of statistical mechanics. (Author/JN)
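
    The Monte Carlo construction described, sampling a fixed-total-energy (microcanonical) ensemble of a one-dimensional ideal gas, can be sketched as follows (assumptions: unit particle mass and arbitrary toy values of N and E):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def sample_shell(n_particles, total_energy, n_samples):
        """Velocities distributed uniformly on the shell sum(v_i^2/2) = E (m = 1)."""
        radius = np.sqrt(2.0 * total_energy)         # |v| on the energy shell
        g = rng.normal(size=(n_samples, n_particles))
        # normalizing a Gaussian vector gives a uniform point on the sphere
        return radius * g / np.linalg.norm(g, axis=1, keepdims=True)

    v = sample_shell(n_particles=1000, total_energy=500.0, n_samples=2000)
    # for large N the marginal distribution of a single velocity approaches
    # the Maxwell-Boltzmann (Gaussian) form with variance 2E/N = 1
    ```

    This is the pseudoexperimental point of the article: the single-particle statistics of the fixed-energy ensemble reproduce the canonical distribution for large N.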

  19. Introduction of statistical information in a syntactic analyzer for document image recognition

    NASA Astrophysics Data System (ADS)

    Maroneze, André O.; Coüasnon, Bertrand; Lemaitre, Aurélie

    2011-01-01

    This paper presents an improvement to document layout analysis systems, offering a possible solution to Sayre's paradox (which states that an element "must be recognized before it can be segmented; and it must be segmented before it can be recognized"). This improvement, based on stochastic parsing, allows integration of statistical information, obtained from recognizers, during syntactic layout analysis. We present how this fusion of numeric and symbolic information in a feedback loop can be applied to syntactic methods to improve document description expressiveness. To limit combinatorial explosion during exploration of solutions, we devised an operator that allows optional activation of the stochastic parsing mechanism. Our evaluation on 1250 handwritten business letters shows this method allows the improvement of global recognition scores.

  20. An experimental bioactive dental ceramic for metal-ceramic restorations: Textural characteristics and investigation of the mechanical properties.

    PubMed

    Goudouri, Ourania-Menti; Kontonasaki, Eleana; Papadopoulou, Lambrini; Manda, Marianthi; Kavouras, Panagiotis; Triantafyllidis, Konstantinos S; Stefanidou, Maria; Koidis, Petros; Paraskevopoulos, Konstantinos M

    2017-02-01

    The aim of this study was the evaluation of the textural characteristics of an experimental sol-gel derived feldspathic dental ceramic, which has already been proven bioactive, and the investigation of its flexural strength through Weibull statistical analysis. The null hypothesis was that the flexural strength of the experimental and the commercial dental ceramic would be of the same order, resulting in a dental ceramic with apatite-forming ability and adequate mechanical integrity. Although the flexural strength of the experimental ceramic was not statistically significantly different from that of the commercial one, the amount of blind pores due to processing was greater. The textural characteristics of the experimental ceramic were in accordance with the standard low porosity levels reported for dental ceramics used for fixed prosthetic restorations. Feldspathic dental ceramics with typical textural characteristics and advanced mechanical properties as well as enhanced apatite-forming ability can be synthesized through the sol-gel method. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Variable Selection in the Presence of Missing Data: Imputation-based Methods.

    PubMed

    Zhao, Yize; Long, Qi

    2017-01-01

    Variable selection plays an essential role in regression analysis as it identifies important variables that are associated with outcomes and is known to improve the predictive accuracy of the resulting models. Variable selection methods have been widely investigated for fully observed data. However, in the presence of missing data, methods for variable selection need to be carefully designed to account for missing data mechanisms and the statistical techniques used for handling missing data. Since imputation is arguably the most popular method for handling missing data due to its ease of use, statistical methods for variable selection that are combined with imputation are of particular interest. These methods, valid under the assumptions of missing at random (MAR) and missing completely at random (MCAR), largely fall into three general strategies. The first strategy applies existing variable selection methods to each imputed dataset and then combines the variable selection results across all imputed datasets. The second strategy applies existing variable selection methods to stacked imputed datasets. The third strategy combines resampling techniques such as the bootstrap with imputation. Despite recent advances, this area remains under-developed and offers fertile ground for further research.
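
    The first strategy can be sketched on synthetic data (a toy illustration; a simple correlation screen stands in here for a full selection method such as the lasso, and stochastic hot-deck imputation is one simple choice among many):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # y depends on the first two of five predictors; X has MCAR missingness
    n, p = 400, 5
    X = rng.normal(size=(n, p))
    y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.5, size=n)
    X_miss = X.copy()
    X_miss[rng.random((n, p)) < 0.2] = np.nan        # 20% missing completely at random

    selected_sets = []
    for _ in range(10):                               # 10 imputed datasets
        X_imp = X_miss.copy()
        for j in range(p):
            miss = np.isnan(X_imp[:, j])
            obs = X_imp[~miss, j]
            # stochastic imputation: draw from the observed values of column j
            X_imp[miss, j] = rng.choice(obs, size=miss.sum())
        corr = np.abs([np.corrcoef(X_imp[:, j], y)[0, 1] for j in range(p)])
        selected_sets.append(corr > 0.3)              # per-dataset selection

    # combine across imputations: keep variables selected in a majority of sets
    votes = np.mean(selected_sets, axis=0)
    selected = votes > 0.5
    ```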

  2. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    PubMed

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival and accelerated failure time (AFT) models with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
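
    The benefit of modeling left-censoring explicitly can be sketched on a toy one-sample problem (assumptions: log-normal intensities with a known detection limit; this is not the paper's AFT regression, only the underlying censored-likelihood idea):

    ```python
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(3)

    # log-intensities are normal; values below a detection limit go unobserved
    mu_true, sigma_true = 2.0, 0.8
    log_x = rng.normal(mu_true, sigma_true, 1000)
    limit = 1.5                                   # detection limit on the log scale
    observed = log_x[log_x >= limit]
    n_cens = int(np.sum(log_x < limit))           # only the count is known

    def neg_loglik(theta):
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)                 # keep sigma positive
        ll_obs = stats.norm.logpdf(observed, mu, sigma).sum()
        ll_cens = n_cens * stats.norm.logcdf(limit, mu, sigma)  # P(below limit)
        return -(ll_obs + ll_cens)

    res = optimize.minimize(neg_loglik, x0=[observed.mean(), 0.0],
                            method="Nelder-Mead")
    mu_hat = res.x[0]
    naive = observed.mean()   # the naive mean of detected values is biased upward
    ```

    The censored-likelihood estimate recovers the true location, while ignoring the censoring mechanism inflates it; the same logic underlies the AFT models compared in the paper.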

  3. A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components

    NASA Technical Reports Server (NTRS)

    Abernethy, K.

    1986-01-01

    The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo simulation study to evaluate the usefulness of the Weibull methods using samples with a very small number of failures and extensive censoring are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs that are used in the SSME Weibull analysis are described. The previously documented methods were supplemented by adding computer calculations of approximate (using iterative methods) confidence intervals for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described, along with the simulation program and the techniques used in it. Simulation results are tabulated for various combinations of Weibull shape parameters and the numbers of failures in the samples.
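
    The censored Weibull fitting at the core of such a simulation study can be sketched as follows (a toy sketch with assumed parameter values, using direct maximum-likelihood optimization and uniform random censoring, not the original program's methods):

    ```python
    import numpy as np
    from scipy import optimize

    rng = np.random.default_rng(4)
    shape_true, scale_true = 2.0, 1.0

    def fit_weibull_censored(t, event):
        """MLE for (shape, scale); event=1 failure, event=0 right-censored."""
        def nll(theta):
            k, lam = np.exp(theta)                   # keep parameters positive
            z = t / lam
            log_pdf = np.log(k / lam) + (k - 1) * np.log(z) - z**k
            log_surv = -z**k                         # log of the survival function
            return -(event * log_pdf + (1 - event) * log_surv).sum()
        res = optimize.minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
        return np.exp(res.x)

    shapes = []
    for _ in range(50):                              # 50 simulated samples
        t_fail = rng.weibull(shape_true, 100) * scale_true
        t_cens = rng.uniform(0.0, 2.0, 100)          # random censoring times
        t = np.minimum(t_fail, t_cens)
        event = (t_fail <= t_cens).astype(float)
        k_hat, _ = fit_weibull_censored(t, event)
        shapes.append(k_hat)

    mean_shape = float(np.mean(shapes))
    ```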

  4. The closure approximation in the hierarchy equations.

    NASA Technical Reports Server (NTRS)

    Adomian, G.

    1971-01-01

    The expectation of the solution process in a stochastic operator equation can be obtained from averaged equations only under very special circumstances. Conditions for validity are given, and the significance and validity of the approximation in widely used hierarchy methods and the 'self-consistent field' approximation in nonequilibrium statistical mechanics are clarified. The error at any level of the hierarchy can be given and can be avoided by the use of the iterative method.

  5. Scalable privacy-preserving data sharing methodology for genome-wide association studies.

    PubMed

    Yu, Fei; Fienberg, Stephen E; Slavković, Aleksandra B; Uhler, Caroline

    2014-08-01

    The protection of privacy of individual-level information in genome-wide association study (GWAS) databases has been a major concern of researchers following the publication of "an attack" on GWAS data by Homer et al. (2008). Traditional statistical methods for confidentiality and privacy protection of statistical databases do not scale well to deal with GWAS data, especially in terms of guarantees regarding protection from linkage to external information. The more recent concept of differential privacy, introduced by the cryptographic community, is an approach that provides a rigorous definition of privacy with meaningful privacy guarantees in the presence of arbitrary external information, although the guarantees may come at a serious price in terms of data utility. Building on such notions, Uhler et al. (2013) proposed new methods to release aggregate GWAS data without compromising an individual's privacy. We extend the methods developed in Uhler et al. (2013) for releasing differentially-private χ²-statistics by allowing for an arbitrary number of cases and controls, and for releasing differentially-private allelic test statistics. We also provide a new interpretation by assuming the controls' data are known, which is a realistic assumption because some GWAS use publicly available data as controls. We assess the performance of the proposed methods through a risk-utility analysis on a real data set consisting of DNA samples collected by the Wellcome Trust Case Control Consortium and compare the methods with the differentially-private release mechanism proposed by Johnson and Shmatikov (2013). Copyright © 2014 Elsevier Inc. All rights reserved.
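
    The basic building block of such releases, the Laplace mechanism, can be sketched as follows (the statistic and sensitivity values below are placeholder assumptions; the cited papers derive the correct sensitivities for GWAS test statistics):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def laplace_release(statistic, sensitivity, epsilon):
        """epsilon-differentially-private release via Laplace noise."""
        noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
        return statistic + noise

    chi2 = 7.3   # toy value of a chi-squared test statistic (assumption)
    releases = np.array([laplace_release(chi2, sensitivity=4.0, epsilon=1.0)
                         for _ in range(20_000)])
    # releases are centered on the true statistic; the spread, governed by
    # sensitivity/epsilon, is the price paid in data utility
    ```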

  6. Index of mechanical work in gait of children with cerebral palsy.

    PubMed

    Dziuba, Alicja Katarzyna; Tylkowska, Małgorzata; Jaroszczuk, Sebastian

    2014-01-01

    The pathological gait of children with cerebral palsy involves higher mechanical work, which limits their ability to function properly in society. Mechanical work is directly related to walking speed and, although a number of studies have been carried out in this field, few of them analysed the effect of the speed. The study aimed to develop standards for mechanical work during gait of children with cerebral palsy depending on the walking speed. The study covered 18 children with cerebral palsy and 14 healthy children. The BTS Smart software and the authors' own software were used to evaluate the mechanical work and the kinetic, potential and rotational energy connected with the motion of the children's bodies during walking. Compared to healthy subjects, mechanical work in children with cerebral palsy increases with the degree of disability. It can be expressed as a linear function of walking speed and shows a strong and statistically significant correlation with walking speed. A statistically significant negative correlation between the degree of disability and walking speed can also be observed. The highest contribution to the total mechanical energy during gait comes from the mechanical energy of the feet. The instantaneous value of the rotational energy is 700 times lower than the instantaneous mechanical energy. An increase in walking speed increases the contribution of the kinetic energy index to total mechanical work. The method described can provide an objective supplement that allows doctors and physical therapists to perform a simple and immediate diagnosis without much technical knowledge.
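
    The energy decomposition used in such gait analyses can be sketched directly (the segment values below are illustrative assumptions, not the study's data):

    ```python
    # instantaneous mechanical energy of one body segment:
    # translational kinetic + gravitational potential + rotational
    def segment_energy(m, v, h, inertia, omega, g=9.81):
        e_kin = 0.5 * m * v**2            # translational kinetic energy [J]
        e_pot = m * g * h                 # gravitational potential energy [J]
        e_rot = 0.5 * inertia * omega**2  # rotational energy [J]
        return e_kin, e_pot, e_rot

    # e.g. a hypothetical 1.0 kg foot segment at 1.2 m/s, 0.1 m height
    e_kin, e_pot, e_rot = segment_energy(m=1.0, v=1.2, h=0.1,
                                         inertia=0.005, omega=3.0)
    total = e_kin + e_pot + e_rot
    # rotational energy is orders of magnitude smaller than the other terms,
    # consistent with the study's observation
    ```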

  7. Bootstrap calculation of ultimate strength temperature maxima for neutron irradiated ferritic/martensitic steels

    NASA Astrophysics Data System (ADS)

    Obraztsov, S. M.; Konobeev, Yu. V.; Birzhevoy, G. A.; Rachkov, V. I.

    2006-12-01

    The dependence of the mechanical properties of ferritic/martensitic (F/M) steels on irradiation temperature is of interest because these steels are used as structural materials for fast reactors, fusion reactors and accelerator-driven systems. Experimental data demonstrating temperature peaks in the physical and mechanical properties of neutron-irradiated pure iron, nickel, vanadium, and austenitic stainless steels are available in the literature. The lack of such information for F/M steels forces one to apply computational mathematical-statistical modeling methods. The bootstrap procedure is one such method, allowing the necessary statistical characteristics to be obtained using only a sample of limited size. In the present work this procedure is used for modeling the frequency distribution histograms of ultimate strength temperature peaks in pure iron and the Russian F/M steels EP-450 and EP-823. Results of fitting sums of Lorentz or Gauss functions to the calculated distributions are presented. It is concluded that there are two temperature peaks (at 360 and 390 °C) of the ultimate strength in EP-450 steel and a single peak at 390 °C in EP-823.
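
    The bootstrap procedure for locating a temperature peak from a small sample can be sketched as follows (all numbers are synthetic assumptions; a quadratic fit stands in for the Lorentz/Gauss fitting used in the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # synthetic strength-vs-temperature data with a peak near 390
    temps = np.linspace(300, 450, 16)
    strength = -0.01 * (temps - 390.0)**2 + 600.0 + rng.normal(0, 3, temps.size)

    peaks = []
    for _ in range(2000):
        idx = rng.integers(0, temps.size, temps.size)  # resample with replacement
        coeffs = np.polyfit(temps[idx], strength[idx], 2)
        peaks.append(-coeffs[1] / (2 * coeffs[0]))     # vertex of the parabola
    peaks = np.array(peaks)

    # the histogram of `peaks` approximates the sampling distribution of the
    # peak temperature; its spread gives a confidence interval
    ```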

  8. Analysis of swarm behaviors based on an inversion of the fluctuation theorem.

    PubMed

    Hamann, Heiko; Schmickl, Thomas; Crailsheim, Karl

    2014-01-01

    A grand challenge in the field of artificial life is to find a general theory of emergent self-organizing systems. In swarm systems most of the observed complexity is based on motion of simple entities. Similarly, statistical mechanics focuses on collective properties induced by the motion of many interacting particles. In this article we apply methods from statistical mechanics to swarm systems. We try to explain the emergent behavior of a simulated swarm by applying methods based on the fluctuation theorem. Empirical results indicate that swarms are able to produce negative entropy within an isolated subsystem due to frozen accidents. Individuals of a swarm are able to locally detect fluctuations of the global entropy measure and store them, if they are negative entropy productions. By accumulating these stored fluctuations over time the swarm as a whole is producing negative entropy and the system ends up in an ordered state. We claim that this indicates the existence of an inverted fluctuation theorem for emergent self-organizing dissipative systems. This approach bears the potential of general applicability.
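
    The fluctuation theorem underlying this analysis, P(σ)/P(−σ) = e^σ for the entropy production σ, can be checked numerically on a toy model (assumption: a Gaussian distribution with variance equal to twice its mean, which satisfies the theorem exactly):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Gaussian entropy production with variance = 2 * mean satisfies
    # the detailed fluctuation theorem P(s)/P(-s) = exp(s) exactly
    mean_s = 1.0
    s = rng.normal(mean_s, np.sqrt(2 * mean_s), 1_000_000)

    bins = np.linspace(0.25, 2.25, 9)
    centers = 0.5 * (bins[:-1] + bins[1:])
    p_pos, _ = np.histogram(s, bins=bins, density=True)   # P(+s)
    p_neg, _ = np.histogram(-s, bins=bins, density=True)  # P(-s)
    log_ratio = np.log(p_pos / p_neg)
    # ln[P(s)/P(-s)] should reproduce the bin centers: the theorem's signature
    ```

    An "inverted" theorem of the kind the article claims would show systematic deviations from this straight-line signature, with negative-entropy fluctuations accumulated rather than balanced.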

  9. Sieve analysis in HIV-1 vaccine efficacy trials

    PubMed Central

    Edlefsen, Paul T.; Gilbert, Peter B.; Rolland, Morgane

    2013-01-01

    Purpose of review: The genetic characterization of HIV-1 breakthrough infections in vaccine and placebo recipients offers new ways to assess vaccine efficacy trials. Statistical and sequence analysis methods provide opportunities to mine the mechanisms behind the effect of an HIV vaccine. Recent findings: The release of results from two HIV-1 vaccine efficacy trials, Step/HVTN-502 and RV144, led to numerous studies in the last five years, including efforts to sequence HIV-1 breakthrough infections and compare viral characteristics between the vaccine and placebo groups. Novel genetic and statistical analysis methods uncovered features that distinguished founder viruses isolated from vaccinees from those isolated from placebo recipients, and identified HIV-1 genetic targets of vaccine-induced immune responses. Summary: Studies of HIV-1 breakthrough infections in vaccine efficacy trials can provide an independent confirmation to correlates of risk studies, as they take advantage of vaccine/placebo comparisons while correlates of risk analyses are limited to vaccine recipients. Through the identification of viral determinants impacted by vaccine-mediated host immune responses, sieve analyses can shed light on potential mechanisms of vaccine protection. PMID:23719202

  10. Sieve analysis in HIV-1 vaccine efficacy trials.

    PubMed

    Edlefsen, Paul T; Gilbert, Peter B; Rolland, Morgane

    2013-09-01

    The genetic characterization of HIV-1 breakthrough infections in vaccine and placebo recipients offers new ways to assess vaccine efficacy trials. Statistical and sequence analysis methods provide opportunities to mine the mechanisms behind the effect of an HIV vaccine. The release of results from two HIV-1 vaccine efficacy trials, Step/HVTN-502 (HIV Vaccine Trials Network-502) and RV144, led to numerous studies in the last 5 years, including efforts to sequence HIV-1 breakthrough infections and compare viral characteristics between the vaccine and placebo groups. Novel genetic and statistical analysis methods uncovered features that distinguished founder viruses isolated from vaccinees from those isolated from placebo recipients, and identified HIV-1 genetic targets of vaccine-induced immune responses. Studies of HIV-1 breakthrough infections in vaccine efficacy trials can provide an independent confirmation to correlates of risk studies, as they take advantage of vaccine/placebo comparisons, whereas correlates of risk analyses are limited to vaccine recipients. Through the identification of viral determinants impacted by vaccine-mediated host immune responses, sieve analyses can shed light on potential mechanisms of vaccine protection.

  11. Statistical inference approach to structural reconstruction of complex networks from binary time series

    NASA Astrophysics Data System (ADS)

    Ma, Chuang; Chen, Han-Shuang; Lai, Ying-Cheng; Zhang, Hai-Feng

    2018-02-01

    Complex networks hosting binary-state dynamics arise in a variety of contexts. Despite previous work, fully reconstructing the network structure from observed binary data remains challenging. We articulate a statistical inference based approach to this problem. In particular, exploiting the expectation-maximization (EM) algorithm, we develop a method to ascertain the neighbors of any node in the network based solely on binary data, thereby recovering the full topology of the network. A key ingredient of our method is the maximum-likelihood estimation of the probabilities associated with actual or nonexistent links, and we show that the EM algorithm can distinguish the two kinds of probability values without any ambiguity, insofar as the length of the available binary time series is reasonably long. Our method does not require any a priori knowledge of the detailed dynamical processes, is parameter-free, and is capable of accurate reconstruction even in the presence of noise. We demonstrate the method using combinations of distinct types of binary dynamical processes and network topologies, and provide a physical understanding of the underlying reconstruction mechanism. Our statistical inference based reconstruction method contributes an additional piece to the rapidly expanding "toolbox" of data based reverse engineering of complex networked systems.
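
    The role of the EM algorithm in separating actual-link from nonexistent-link probabilities can be sketched on a toy problem (assumption: a two-component binomial mixture stands in for the paper's likelihood; true links fire with one probability, non-links with another):

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    # each candidate neighbor is either a true link (activation prob p1 = 0.6)
    # or not (p0 = 0.1); only the binary activation counts are observed
    n_cand, T = 60, 200
    is_link = rng.random(n_cand) < 0.3
    p_true = np.where(is_link, 0.6, 0.1)
    counts = rng.binomial(T, p_true)

    # EM for a two-component binomial mixture
    w, p = 0.5, np.array([0.2, 0.5])                  # initial guesses
    for _ in range(200):
        # E-step: responsibility of the "link" component for each candidate
        log_l1 = counts * np.log(p[1]) + (T - counts) * np.log(1 - p[1])
        log_l0 = counts * np.log(p[0]) + (T - counts) * np.log(1 - p[0])
        r = 1.0 / (1.0 + (1 - w) / w
                   * np.exp(np.clip(log_l0 - log_l1, -500, 500)))
        # M-step: update the mixing weight and the two probabilities
        w = r.mean()
        p = np.array([np.sum((1 - r) * counts) / np.sum(1 - r) / T,
                      np.sum(r * counts) / np.sum(r) / T])

    # with a sufficiently long time series the two probability values
    # separate without ambiguity, as the paper describes
    ```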

  12. Statistical inference approach to structural reconstruction of complex networks from binary time series.

    PubMed

    Ma, Chuang; Chen, Han-Shuang; Lai, Ying-Cheng; Zhang, Hai-Feng

    2018-02-01

    Complex networks hosting binary-state dynamics arise in a variety of contexts. Despite previous work, fully reconstructing the network structure from observed binary data remains challenging. We articulate a statistical inference based approach to this problem. In particular, exploiting the expectation-maximization (EM) algorithm, we develop a method to ascertain the neighbors of any node in the network based solely on binary data, thereby recovering the full topology of the network. A key ingredient of our method is the maximum-likelihood estimation of the probabilities associated with actual or nonexistent links, and we show that the EM algorithm can distinguish the two kinds of probability values without any ambiguity, insofar as the length of the available binary time series is reasonably long. Our method does not require any a priori knowledge of the detailed dynamical processes, is parameter-free, and is capable of accurate reconstruction even in the presence of noise. We demonstrate the method using combinations of distinct types of binary dynamical processes and network topologies, and provide a physical understanding of the underlying reconstruction mechanism. Our statistical inference based reconstruction method contributes an additional piece to the rapidly expanding "toolbox" of data based reverse engineering of complex networked systems.

  13. Towards a statistical mechanical theory of active fluids.

    PubMed

    Marini Bettolo Marconi, Umberto; Maggi, Claudio

    2015-12-07

    We present a stochastic description of a model of N mutually interacting active particles in the presence of external fields and characterize its steady state behavior in the absence of currents. To reproduce the effects of the experimentally observed persistence of the trajectories of the active particles we consider a Gaussian force having a non-vanishing correlation time τ, whose finiteness is a measure of the activity of the system. With these ingredients we show that it is possible to develop a statistical mechanical approach similar to the one employed in the study of equilibrium liquids and to obtain the explicit form of the many-particle distribution function by means of the multidimensional unified colored noise approximation. Such a distribution plays a role analogous to the Gibbs distribution in equilibrium statistical mechanics and provides complete information about the microscopic state of the system. From here we develop a method to determine the one- and two-particle distribution functions in the spirit of the Born-Green-Yvon (BGY) equations of equilibrium statistical mechanics. The resulting equations, which contain extra correlations induced by the activity, allow us to determine the stationary density profiles in the presence of external fields, the pair correlations and the pressure of active fluids. In the low density regime we obtain the effective pair potential ϕ(r) acting between two isolated particles separated by a distance r, showing the existence of an effective attraction between them induced by activity. Based on these results, in the second half of the paper we propose a mean field theory as an approach simpler than the BGY hierarchy and use it to derive a van der Waals expression of the equation of state.
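
    The Gaussian colored noise with correlation time τ described here is commonly modeled as an Ornstein-Uhlenbeck process; a minimal sketch (parameter values are illustrative assumptions, not the paper's):

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # Ornstein-Uhlenbeck noise: d(eta) = -eta/tau dt + sqrt(2D)/tau dW,
    # whose stationary autocorrelation is (D/tau) * exp(-|s|/tau)
    tau, D, dt, n = 0.5, 1.0, 0.01, 200_000
    eta = np.zeros(n)
    noise = np.sqrt(2 * D * dt) / tau * rng.normal(size=n)
    for i in range(1, n):
        eta[i] = eta[i - 1] * (1 - dt / tau) + noise[i]

    # estimate the correlation time from the empirical autocorrelation
    lags = np.arange(0, 100)
    ac = np.array([np.mean(eta[:-lag] * eta[lag:]) if lag else np.mean(eta**2)
                   for lag in lags])
    tau_est = -dt / np.log(ac[1] / ac[0])   # one-step exponential decay rate
    ```

    Setting τ → 0 recovers white noise and equilibrium behavior; the unified colored noise approximation used in the paper exploits the finite τ of exactly this kind of process.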

  14. Textural Analysis and Substrate Classification in the Nearshore Region of Lake Superior Using High-Resolution Multibeam Bathymetry

    NASA Astrophysics Data System (ADS)

    Dennison, Andrew G.

    Classification of the seafloor substrate can be done with a variety of methods, including visual (dives, drop cameras), mechanical (cores, grab samples), and acoustic (statistical analysis of echosounder returns) techniques. Acoustic methods offer a more powerful and efficient means of collecting useful information about the bottom type. Because of the nature of an acoustic survey, larger areas can be sampled, and combining the collected data with visual and mechanical survey methods provides greater confidence in the classification of a mapped region. During a multibeam sonar survey, both bathymetric and backscatter data are collected. It is well documented that the statistical character of a sonar backscatter mosaic depends on bottom type. While classifying the bottom type on the basis of backscatter alone can accurately predict and map bottom type, i.e., distinguish a muddy area from a rocky area, it lacks the ability to resolve and capture fine textural details, an important factor in many habitat mapping studies. Statistical processing of high-resolution multibeam data can capture the pertinent details about the bottom type that are rich in textural information. Further multivariate statistical processing can then isolate characteristic features and provide the basis for an accurate classification scheme. The development of a new classification method is described here. It is based upon the analysis of textural features in conjunction with ground truth sampling. The processing and classification results for two geologically distinct nearshore regions of Lake Superior, off the Lester River, MN, and the Amnicon River, WI, are presented here, using the Minnesota Supercomputer Institute's Mesabi computing cluster for initial processing. The processed data are then calibrated using ground truth samples to conduct an accuracy assessment of the surveyed areas.
    From analysis of the high-resolution bathymetry data collected at both survey sites it was possible to calculate a series of measures that describe textural information about the lake floor. Further processing suggests that the calculated features also capture a significant amount of statistical information about the lake-floor terrain. Two sources of error, an anomalous heave and a refraction error, significantly degraded the quality of the processed data and the resulting validation results. Ground truth samples used to validate the classification methods at the two survey sites yielded accuracy values ranging from 5-30 percent at the Amnicon River and between 60-70 percent at the Lester River. The final results suggest that this new processing methodology adequately captures textural information about the lake floor and provides an acceptable classification in the absence of significant data quality issues.
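
    The core idea, that window-level texture statistics separate substrate types, can be sketched on synthetic terrain (a toy illustration; the local standard deviation stands in for the full set of textural features):

    ```python
    import numpy as np

    rng = np.random.default_rng(10)

    # synthetic bathymetry: smooth "muddy" terrain beside rough "rocky" terrain
    smooth = rng.normal(0.0, 0.05, (64, 64))   # low-relief lake floor
    rough = rng.normal(0.0, 0.50, (64, 64))    # high-relief lake floor
    grid = np.hstack([smooth, rough])

    win = 8                                    # texture-analysis window size
    rows, cols = grid.shape
    labels = np.zeros((rows // win, cols // win), dtype=int)
    for i in range(rows // win):
        for j in range(cols // win):
            patch = grid[i * win:(i + 1) * win, j * win:(j + 1) * win]
            labels[i, j] = int(patch.std() > 0.2)   # 1 = "rocky" texture
    ```

    In practice the classifier would use many such features and calibrate the decision rule against ground-truth samples, as the thesis describes.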

  15. Incorporating an Interactive Statistics Workshop into an Introductory Biology Course-Based Undergraduate Research Experience (CURE) Enhances Students’ Statistical Reasoning and Quantitative Literacy Skills †

    PubMed Central

    Olimpo, Jeffrey T.; Pevey, Ryan S.; McCabe, Thomas M.

    2018-01-01

    Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students’ reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce. PMID:29904549

  16. Incorporating an Interactive Statistics Workshop into an Introductory Biology Course-Based Undergraduate Research Experience (CURE) Enhances Students' Statistical Reasoning and Quantitative Literacy Skills.

    PubMed

    Olimpo, Jeffrey T; Pevey, Ryan S; McCabe, Thomas M

    2018-01-01

    Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students' reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce.

  17. Effects of different mechanized soil fertilization methods on corn nutrient accumulation and yield

    NASA Astrophysics Data System (ADS)

    Shi, Qingwen; Bai, Chunming; Wang, Huixin; Wu, Di; Song, Qiaobo; Dong, Zengqi; Gao, Depeng; Dong, Qiping; Cheng, Xin; Zhang, Yahao; Mu, Jiahui; Chen, Qinghong; Liao, Wenqing; Qu, Tianru; Zhang, Chunling; Zhang, Xinyu; Liu, Yifei; Han, Xiaori

    2017-05-01

Aim: Experiments on mechanized corn soil fertilization were conducted in the Faku demonstration zone. On this basis, we studied the effects of different mechanized soil fertilization measures on corn nutrient accumulation and yield traits in brown soil regions. We also evaluated and optimized the regulation effects of mechanized soil fertilization for the purpose of increasing crop yield and improving production efficiency. Method: Based on a survey of soil background values in the demonstration zone, we collected plant samples during different corn growth periods for measurement and statistical analysis. Conclusions: Decomposed cow dung, under mechanical broadcasting, remarkably increased the nitrogen and potassium accumulation of corn at the ripe stage. Crushed stalk returning combined with deep tillage remarkably increased the phosphorus accumulation of corn plants. Compared with top application, crushed stalk returning combined with deep tillage remarkably increased corn thousand kernel weight (TKW). Mechanized broadcasting of granular organic fertilizer and crushed stalk returning combined with deep tillage, compared with surface application, boosted corn yield in the demonstration zone.

  18. Cation solvation with quantum chemical effects modeled by a size-consistent multi-partitioning quantum mechanics/molecular mechanics method.

    PubMed

    Watanabe, Hiroshi C; Kubillus, Maximilian; Kubař, Tomáš; Stach, Robert; Mizaikoff, Boris; Ishikita, Hiroshi

    2017-07-21

In the condensed phase, quantum chemical properties such as many-body effects and intermolecular charge fluctuations are critical determinants of the solvation structure and dynamics. Thus, a quantum mechanical (QM) molecular description is required for both solute and solvent to incorporate these properties. However, it is challenging to conduct molecular dynamics (MD) simulations for condensed systems of sufficient scale when adapting QM potentials. To overcome this problem, we recently developed the size-consistent multi-partitioning (SCMP) quantum mechanics/molecular mechanics (QM/MM) method and realized stable and accurate MD simulations, applying the QM potential to a benchmark system. In the present study, as the first application of the SCMP method, we have investigated the structures and dynamics of Na+, K+, and Ca2+ solutions based on nanosecond-scale sampling, 100 times longer than conventional QM-based samplings. Furthermore, we have evaluated two dynamic properties, the diffusion coefficient and difference spectra, with high statistical certainty; the calculation of these properties had not previously been possible within the conventional QM/MM framework. Based on our analysis, we have quantitatively evaluated the quantum chemical solvation effects, which show distinct differences between the cations.
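The diffusion coefficient mentioned above is commonly extracted from the mean-squared displacement via the Einstein relation, MSD(t) = 6Dt in three dimensions. A minimal sketch of this estimator on synthetic Brownian data (all parameters here are illustrative, not taken from the paper):

```python
import numpy as np

def diffusion_coefficient(positions, dt):
    """Estimate D from the mean-squared displacement (MSD).

    positions: array of shape (n_frames, n_particles, 3)
    dt: time between frames
    Einstein relation in 3D: MSD(t) = 6 * D * t.
    """
    disp = positions - positions[0]              # displacement from frame 0
    msd = (disp ** 2).sum(axis=2).mean(axis=1)   # average over particles
    t = np.arange(len(positions)) * dt
    slope = (t @ msd) / (t @ t)                  # least-squares fit through origin
    return slope / 6.0

# Synthetic Brownian trajectory with a known diffusion coefficient.
rng = np.random.default_rng(0)
D_true, dt, n_steps, n_part = 0.5, 0.01, 2000, 200
steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(n_steps, n_part, 3))
positions = np.concatenate([np.zeros((1, n_part, 3)), np.cumsum(steps, axis=0)])
D_est = diffusion_coefficient(positions, dt)
```

On trajectories from an actual QM/MM run, `positions` would instead hold the unwrapped ion coordinates over the nanosecond-scale sampling.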

  19. Automatic stage identification of Drosophila egg chamber based on DAPI images

    PubMed Central

    Jia, Dongyu; Xu, Qiuping; Xie, Qian; Mio, Washington; Deng, Wu-Min

    2016-01-01

The Drosophila egg chamber, whose development is divided into 14 stages, is a well-established model for developmental biology. However, visual stage determination can be a tedious, subjective and time-consuming task prone to errors. Our study presents an objective, reliable and repeatable automated method for quantifying cell features and classifying egg chamber stages based on DAPI images. The proposed approach is composed of two steps: 1) a feature extraction step and 2) a statistical modeling step. The egg chamber features used are egg chamber size, oocyte size, egg chamber ratio and distribution of follicle cells. Methods for determining the onset of the polytene stage and of centripetal migration are also discussed. The statistical model uses linear and ordinal regression to explore the stage-feature relationships and classify egg chamber stages. Combined with machine learning, our method has great potential to enable discovery of hidden developmental mechanisms. PMID:26732176
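The statistical-modeling step can be illustrated with a toy version of a regression-based stage classifier. The data below are synthetic (egg chamber area is assumed to grow roughly exponentially with stage); the paper's actual features and fitted coefficients are not reproduced here:

```python
import numpy as np

# Synthetic training data (illustrative): egg chamber area grows roughly
# exponentially with stage, so log(area) is approximately linear in stage.
rng = np.random.default_rng(1)
stages = rng.integers(1, 15, size=300)                    # stages 1..14
log_area = 0.45 * stages + 2.0 + rng.normal(0, 0.1, 300)  # hypothetical feature

# Ordinary least squares: stage ~ a * log_area + b.
A = np.column_stack([log_area, np.ones_like(log_area)])
coef, *_ = np.linalg.lstsq(A, stages.astype(float), rcond=None)

def predict_stage(la):
    """Round the regression output to the nearest valid stage (1..14)."""
    return int(np.clip(np.rint(coef[0] * la + coef[1]), 1, 14))

pred = np.array([predict_stage(la) for la in log_area])
accuracy = (pred == stages).mean()
```

A proper ordinal regression would model the stage thresholds explicitly; rounding a linear fit is the simplest stand-in for that idea.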

  20. [Cases and duration of mechanical ventilation in German hospitals : An analysis of DRG incentives and developments in respiratory medicine].

    PubMed

    Biermann, A; Geissler, A

    2016-09-01

Diagnosis-related groups (DRGs) have been used to reimburse hospital services in Germany since 2003/04. Like any other reimbursement system, DRGs offer specific incentives for hospitals that may lead to unintended consequences for patients. In the German context, specific procedures and their documentation are suspected to be primarily performed to increase hospital revenues. Mechanical ventilation of patients, and particularly the duration of ventilation, which is an important variable for DRG classification, are often discussed as being among these procedures. The aim of this study was to examine incentives created by the German DRG-based payment system with regard to mechanical ventilation and to identify factors that explain the considerable increase of mechanically ventilated patients in recent years. Moreover, the assumption that hospitals perform mechanical ventilation in order to gain economic benefits was examined. In order to gain insight into the development of the number of mechanically ventilated patients, patient-level data provided by the German Federal Statistical Office and the German Institute for the Hospital Remuneration System were analyzed. The type of performed ventilation, the total number of ventilation hours, the age distribution, mortality and the DRG distribution for mechanical ventilation were calculated, using methods of descriptive and inferential statistics. Furthermore, changes in DRG definitions and changes in respiratory medicine were compared for the years 2005-2012. Since the introduction of the DRG-based payment system in Germany, the hours of ventilation and the number of mechanically ventilated patients have substantially increased, while mortality has decreased. During the same period there has been a switch to less invasive ventilation methods. The age distribution has shifted to higher age-groups. A ventilation duration determined by DRG definitions could not be found.
Due to advances in respiratory medicine, new ventilation methods have been introduced that are less prone to complications. This development has simultaneously improved survival rates. There was no evidence supporting the assumption that the duration of mechanical ventilation is influenced by the time intervals relevant for DRG grouping. However, operational routines, such as staff availability during early and late hospital shifts, presumably have a significant impact on the termination of mechanical ventilation.

  1. Experimental Determination of Dynamical Lee-Yang Zeros

    NASA Astrophysics Data System (ADS)

    Brandner, Kay; Maisi, Ville F.; Pekola, Jukka P.; Garrahan, Juan P.; Flindt, Christian

    2017-05-01

    Statistical physics provides the concepts and methods to explain the phase behavior of interacting many-body systems. Investigations of Lee-Yang zeros—complex singularities of the free energy in systems of finite size—have led to a unified understanding of equilibrium phase transitions. The ideas of Lee and Yang, however, are not restricted to equilibrium phenomena. Recently, Lee-Yang zeros have been used to characterize nonequilibrium processes such as dynamical phase transitions in quantum systems after a quench or dynamic order-disorder transitions in glasses. Here, we experimentally realize a scheme for determining Lee-Yang zeros in such nonequilibrium settings. We extract the dynamical Lee-Yang zeros of a stochastic process involving Andreev tunneling between a normal-state island and two superconducting leads from measurements of the dynamical activity along a trajectory. From the short-time behavior of the Lee-Yang zeros, we predict the large-deviation statistics of the activity which is typically difficult to measure. Our method paves the way for further experiments on the statistical mechanics of many-body systems out of equilibrium.

  2. [Study of beta-turns in globular proteins].

    PubMed

    Amirova, S R; Milchevskiĭ, Iu V; Filatov, I V; Esipova, N G; Tumanian, V G

    2005-01-01

The formation of beta-turns in globular proteins has been studied by the method of molecular mechanics. The statistical method of discriminant analysis was applied to the calculated energy components and the sequences of oligopeptide segments, and type I beta-turns were then predicted. The accuracy of true-positive prediction is 65%. Components of conformational energy that considerably affect beta-turn formation were delineated: torsional energy, hydrogen-bond energy, and van der Waals energy.
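A discriminant analysis over energy components, in the spirit described above, can be sketched with a Fisher linear discriminant on synthetic data (the class means and features below are invented for illustration and are not the paper's values):

```python
import numpy as np

# Two synthetic classes of energy-component vectors (torsional, H-bond,
# van der Waals) for turn-forming vs. non-turn segments; the means are
# invented for illustration.
rng = np.random.default_rng(2)
n = 500
turns = rng.normal([-3.0, -2.0, -1.0], 1.0, size=(n, 3))
non_turns = rng.normal([-1.0, -0.5, 0.0], 1.0, size=(n, 3))

# Fisher linear discriminant: w = Sw^{-1} (mu_turn - mu_non).
mu1, mu0 = turns.mean(axis=0), non_turns.mean(axis=0)
Sw = np.cov(turns.T) + np.cov(non_turns.T)     # pooled within-class scatter
w = np.linalg.solve(Sw, mu1 - mu0)
threshold = 0.5 * (mu1 + mu0) @ w              # midpoint of projected means

correct_turns = (turns @ w > threshold).mean()       # true-positive rate
correct_non = (non_turns @ w <= threshold).mean()    # true-negative rate
accuracy = 0.5 * (correct_turns + correct_non)
```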

  3. The statistical mechanics of complex signaling networks: nerve growth factor signaling

    NASA Astrophysics Data System (ADS)

    Brown, K. S.; Hill, C. C.; Calero, G. A.; Myers, C. R.; Lee, K. H.; Sethna, J. P.; Cerione, R. A.

    2004-10-01

    The inherent complexity of cellular signaling networks and their importance to a wide range of cellular functions necessitates the development of modeling methods that can be applied toward making predictions and highlighting the appropriate experiments to test our understanding of how these systems are designed and function. We use methods of statistical mechanics to extract useful predictions for complex cellular signaling networks. A key difficulty with signaling models is that, while significant effort is being made to experimentally measure the rate constants for individual steps in these networks, many of the parameters required to describe their behavior remain unknown or at best represent estimates. To establish the usefulness of our approach, we have applied our methods toward modeling the nerve growth factor (NGF)-induced differentiation of neuronal cells. In particular, we study the actions of NGF and mitogenic epidermal growth factor (EGF) in rat pheochromocytoma (PC12) cells. Through a network of intermediate signaling proteins, each of these growth factors stimulates extracellular regulated kinase (Erk) phosphorylation with distinct dynamical profiles. Using our modeling approach, we are able to predict the influence of specific signaling modules in determining the integrated cellular response to the two growth factors. Our methods also raise some interesting insights into the design and possible evolution of cellular systems, highlighting an inherent property of these systems that we call 'sloppiness.'

  4. The non-equilibrium statistical mechanics of a simple geophysical fluid dynamics model

    NASA Astrophysics Data System (ADS)

    Verkley, Wim; Severijns, Camiel

    2014-05-01

Lorenz [1] has devised a dynamical system that has proved to be very useful as a benchmark system in geophysical fluid dynamics. The system in its simplest form consists of a periodic array of variables that can be associated with an atmospheric field on a latitude circle. The system is driven by a constant forcing, is damped by linear friction and has a simple advection term that causes the model to behave chaotically if the forcing is large enough. Our aim is to predict the statistics of Lorenz' model on the basis of a given average value of its total energy - obtained from a numerical integration - and the assumption of statistical stationarity. Our method is the principle of maximum entropy [2], which in this case reads: the information entropy of the system's probability density function shall be maximal under the constraints of normalization, a given value of the average total energy and statistical stationarity. Statistical stationarity is incorporated approximately by using 'stationarity constraints', i.e., by requiring that the average first and possibly higher-order time-derivatives of the energy are zero in the maximization of entropy. The analysis [3] reveals that, if the first stationarity constraint is used, the resulting probability density function rather accurately reproduces the statistics of the individual variables. If the second stationarity constraint is used as well, the correlations between the variables are also reproduced quite adequately. The method can be generalized straightforwardly and holds the promise of a viable non-equilibrium statistical mechanics of the forced-dissipative systems of geophysical fluid dynamics. [1] E.N. Lorenz, 1996: Predictability - A problem partly solved, in Proc. Seminar on Predictability (ECMWF, Reading, Berkshire, UK), Vol. 1, pp. 1-18. [2] E.T. Jaynes, 2003: Probability Theory - The Logic of Science (Cambridge University Press, Cambridge). [3] W.T.M. Verkley and C.A. Severijns, 2014: The maximum entropy principle applied to a dynamical system proposed by Lorenz, Eur. Phys. J. B, 87:7, http://dx.doi.org/10.1140/epjb/e2013-40681-2 (open access).
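For concreteness, the Lorenz model in question (often called Lorenz-96) and the average total energy used as a constraint can be computed as follows; the integration parameters are typical choices, not those of the paper:

```python
import numpy as np

def lorenz96_rhs(x, F):
    """dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, periodic in i."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def integrate(x, F, dt, n_steps):
    """Classical fourth-order Runge-Kutta time stepping."""
    traj = np.empty((n_steps, x.size))
    for k in range(n_steps):
        k1 = lorenz96_rhs(x, F)
        k2 = lorenz96_rhs(x + 0.5 * dt * k1, F)
        k3 = lorenz96_rhs(x + 0.5 * dt * k2, F)
        k4 = lorenz96_rhs(x + dt * k3, F)
        x = x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[k] = x
    return traj

N, F, dt = 40, 8.0, 0.01            # typical chaotic regime
x0 = F * np.ones(N)
x0[0] += 0.01                       # small perturbation triggers chaos
traj = integrate(x0, F, dt, 20000)[5000:]           # discard the transient
avg_energy = 0.5 * (traj ** 2).sum(axis=1).mean()   # the maxent constraint
```

In the maximum-entropy analysis, `avg_energy` is the single number fed in as the energy constraint; the stationarity constraints additionally involve time-derivatives of the energy along the trajectory.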

  5. Statistics of dislocation pinning at localized obstacles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dutta, A.; Bhattacharya, M., E-mail: mishreyee@vecc.gov.in; Barat, P.

    2014-10-14

Pinning of dislocations at nanosized obstacles like precipitates, voids, and bubbles is a crucial mechanism in the context of phenomena like hardening and creep. The interaction between such an obstacle and a dislocation is often studied at fundamental level by means of analytical tools, atomistic simulations, and finite element methods. Nevertheless, the information extracted from such studies cannot be utilized to its maximum extent on account of insufficient information about the underlying statistics of this process comprising a large number of dislocations and obstacles in a system. Here, we propose a new statistical approach, where the statistics of pinning of dislocations by idealized spherical obstacles is explored by taking into account the generalized size-distribution of the obstacles along with the dislocation density within a three-dimensional framework. Starting with a minimal set of material parameters, the framework employs the method of geometrical statistics with a few simple assumptions compatible with the real physical scenario. The application of this approach, in combination with the knowledge of fundamental dislocation-obstacle interactions, has successfully been demonstrated for dislocation pinning at nanovoids in neutron irradiated type 316-stainless steel in regard to the non-conservative motion of dislocations. An interesting phenomenon of transition from rare pinning to multiple pinning regimes with increasing irradiation temperature is revealed.
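A minimal example of the geometrical-statistics viewpoint: for randomly placed spherical obstacles, the number that can pin a dislocation gliding over a plane is Poisson-distributed, which a direct Monte Carlo check confirms. This toy model is far simpler than the framework in the paper, and all numbers are illustrative:

```python
import numpy as np

# Toy geometrical statistics: spherical obstacles of radius r, randomly
# placed with number density n_v, can pin a dislocation gliding on a
# plane if their centers lie within r of that plane.  The pinning count
# is then Poisson with mean lambda = n_v * A * 2r (A = glide area).
rng = np.random.default_rng(3)
box = 100.0                       # box edge, arbitrary units
n_obs = 200                       # obstacles per box
r = 1.0                           # obstacle radius
n_v = n_obs / box**3              # number density

# Only the coordinate normal to the glide plane matters here.
z = rng.uniform(0.0, box, size=(10000, n_obs))     # 10000 realizations
pinned = (np.abs(z - box / 2) < r).sum(axis=1)     # obstacles within reach

lam_theory = n_v * box**2 * 2 * r                  # Poisson mean (= 4 here)
frac_unpinned_mc = (pinned == 0).mean()
frac_unpinned_theory = np.exp(-lam_theory)
```

The "rare pinning" regime mentioned in the abstract corresponds to a small Poisson mean, where most glide events encounter no obstacle at all.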

  6. Random matrices and condensation into multiple states

    NASA Astrophysics Data System (ADS)

    Sadeghi, Sina; Engel, Andreas

    2018-03-01

In the present work, we employ methods from the statistical mechanics of disordered systems to investigate static properties of condensation into multiple states in a general framework. We aim to show how typical properties of random interaction matrices play a vital role in shaping the statistics of condensate states. In particular, an analytical expression for the fraction of condensate states in the thermodynamic limit is provided that confirms the result for the mean number of coexisting species in a random tournament game. We also study the interplay between the condensation problem and zero-sum games with correlated random payoff matrices.

  7. A statistical method for determining the dimensions, tolerances and specification of optics for the Laser Megajoule facility (LMJ)

    NASA Astrophysics Data System (ADS)

    Denis, Vincent

    2008-09-01

This paper presents a statistical method for determining the dimensions, tolerances and specifications of components for the Laser MegaJoule (LMJ). Numerous constraints inherent to a large facility require specific tolerances: the huge number of optical components; the interdependence of these components between the beams of the same bundle; angular multiplexing for the amplifier section; distinct operating modes between the alignment and firing phases; the definition and use of alignment software in the place of classic optimization. This method provides greater flexibility to determine the positioning and manufacturing specifications of the optical components. Given the enormous power of the Laser MegaJoule (over 18 kJ in the infrared and 9 kJ in the ultraviolet), one of the major risks is damage to the optical mounts and pollution of the installation by mechanical ablation. This method enables estimation of the beam occultation probabilities and quantification of the risks for the facility. All the simulations were run using the ZEMAX-EE optical design software.
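The kind of statistical tolerancing described can be sketched as a Monte Carlo roll-up of component centering errors against a clear-aperture margin. The tolerances and margin below are invented for illustration and are not LMJ specifications:

```python
import numpy as np

# Monte Carlo tolerance roll-up (all numbers hypothetical): each optical
# component contributes an independent Gaussian centering error; the beam
# is occulted when the summed offset exceeds the clear-aperture margin.
rng = np.random.default_rng(4)
n_components = 12
sigma_each = 0.1        # mm, 1-sigma centering error per component
margin = 1.0            # mm, clear-aperture margin

offsets = rng.normal(0.0, sigma_each, size=(100000, n_components))
total = offsets.sum(axis=1)

sigma_rss = sigma_each * np.sqrt(n_components)  # root-sum-square budget
p_occult = (np.abs(total) > margin).mean()      # occultation probability
```

The root-sum-square (statistical) budget is what makes such tolerancing less conservative than a worst-case sum, which here would be 1.2 mm.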

  8. The development of ensemble theory. A new glimpse at the history of statistical mechanics

    NASA Astrophysics Data System (ADS)

    Inaba, Hajime

    2015-12-01

    This paper investigates the history of statistical mechanics from the viewpoint of the development of the ensemble theory from 1871 to 1902. In 1871, Ludwig Boltzmann introduced a prototype model of an ensemble that represents a polyatomic gas. In 1879, James Clerk Maxwell defined an ensemble as copies of systems of the same energy. Inspired by H.W. Watson, he called his approach "statistical". Boltzmann and Maxwell regarded the ensemble theory as a much more general approach than the kinetic theory. In the 1880s, influenced by Hermann von Helmholtz, Boltzmann made use of ensembles to establish thermodynamic relations. In Elementary Principles in Statistical Mechanics of 1902, Josiah Willard Gibbs tried to get his ensemble theory to mirror thermodynamics, including thermodynamic operations in its scope. Thermodynamics played the role of a "blind guide". His theory of ensembles can be characterized as more mathematically oriented than Einstein's theory proposed in the same year. Mechanical, empirical, and statistical approaches to foundations of statistical mechanics are presented. Although it was formulated in classical terms, the ensemble theory provided an infrastructure still valuable in quantum statistics because of its generality.

  9. Decompression Mechanisms and Decompression Schedule Calculations.

    DTIC Science & Technology

    1984-01-20

physiology - The effects of altitude. Handbook of Physiology, Section 3: Respiration, Vol. II. W.O. Fenn and H. Rahn, eds. Washington, D.C.: Am. Physiol. Soc. ...decompression studies from other laboratories. METHODS: Ten experienced and physically qualified divers (ages 22-42) were compressed at a rate of 60... SUBJECT STATISTICS: experiment, N, age (yr), height (cm), weight (kg), body fat.

  10. Statistics for Time-Series Spatial Data: Applying Survival Analysis to Study Land-Use Change

    ERIC Educational Resources Information Center

    Wang, Ninghua Nathan

    2013-01-01

Traditional spatial analysis and data mining methods fall short of extracting temporal information from data. This shortcoming makes it difficult to use them to study the changes, and the associated mechanisms, of many geographic phenomena of interest, for example, land use. On the other hand, the growing availability of land-change data over multiple time…

  11. Exploring the Use of Statistical Process Control Methods to Assess Course Changes

    ERIC Educational Resources Information Center

    Vollstedt, Ann-Marie

    2010-01-01

    This dissertation pertains to the field of Engineering Education. The Department of Mechanical Engineering at the University of Nevada, Reno (UNR) is hosting this dissertation under a special agreement. This study was motivated by the desire to find an improved, quantitative measure of student quality that is both convenient to use and easy to…

  12. Failure Mechanisms of GaAs Transistors - A Literature Survey

    DTIC Science & Technology

    1990-03-01

doping profile cannot be as sharp as with epitaxial methods. This is the result of the statistics of the implantation and the general diffusion that... Speed GaAs Logic Gates. 5.1 GaAs PLANAR TRANSISTOR STRUCTURES USED IN ICs: some planar transistor structures used in ICs, with examples of the...

  13. Predicting Flood Hazards in Systems with Multiple Flooding Mechanisms

    NASA Astrophysics Data System (ADS)

    Luke, A.; Schubert, J.; Cheng, L.; AghaKouchak, A.; Sanders, B. F.

    2014-12-01

Delineating flood zones in systems that are susceptible to flooding from a single mechanism (riverine flooding) is a relatively well-defined procedure with specific guidance from agencies such as FEMA and USACE. However, there is little guidance on delineating flood zones in systems that are susceptible to flooding from multiple mechanisms such as storm surge, waves, tidal influence, and riverine flooding. In this study, a new flood mapping method which accounts for multiple extremes occurring simultaneously is developed and exemplified. The study site in which the method is employed is the Tijuana River Estuary (TRE) located in Southern California adjacent to the U.S./Mexico border. TRE is an intertidal coastal estuary that receives freshwater flows from the Tijuana River. Extreme discharge from the Tijuana River is the primary driver of flooding within TRE; however, tide level and storm surge also play a significant role in flooding extent and depth. A comparison between measured flows at the Tijuana River and ocean levels revealed a correlation between extreme discharge and ocean height. Using a novel statistical method based upon extreme value theory, ocean heights were predicted conditioned upon extreme discharge occurring within the Tijuana River. This statistical technique could also be applied to other systems in which different factors are identified as the primary drivers of flooding, such as significant wave height conditioned upon tide level. Using the predicted ocean levels conditioned upon varying return levels of discharge as forcing parameters for the 2D hydraulic model BreZo, the 100, 50, 20, and 10 year floodplains were delineated. The results will then be compared to floodplains delineated using the standard methods recommended by FEMA for riverine zones with a downstream ocean boundary.
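The effect of conditioning one flood driver on the extremes of another can be illustrated with a Gaussian-copula toy model. This is a sketch of the general idea, not the paper's extreme-value estimator; the correlation and sample sizes are arbitrary:

```python
import numpy as np

# Gaussian-copula toy model of correlated flood drivers: discharge and
# ocean level drawn from a bivariate normal with correlation rho.
rng = np.random.default_rng(5)
rho, n = 0.5, 200000
cov = [[1.0, rho], [rho, 1.0]]
discharge, ocean = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

q95 = np.quantile(discharge, 0.95)        # extreme-discharge threshold
ocean_cond = ocean[discharge > q95]       # ocean levels during high flow

o95 = np.quantile(ocean, 0.95)
p_joint = (ocean_cond > o95).mean()       # exceedance given extreme flow
p_indep = 0.05                            # exceedance if drivers were independent
```

When the drivers are correlated, the conditional exceedance probability `p_joint` is several times the independent value, which is exactly why treating the mechanisms separately underestimates compound flood hazard.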

  14. Inverse tissue mechanics of cell monolayer expansion.

    PubMed

    Kondo, Yohei; Aoki, Kazuhiro; Ishii, Shin

    2018-03-01

    Living tissues undergo deformation during morphogenesis. In this process, cells generate mechanical forces that drive the coordinated cell motion and shape changes. Recent advances in experimental and theoretical techniques have enabled in situ measurement of the mechanical forces, but the characterization of mechanical properties that determine how these forces quantitatively affect tissue deformation remains challenging, and this represents a major obstacle for the complete understanding of morphogenesis. Here, we proposed a non-invasive reverse-engineering approach for the estimation of the mechanical properties, by combining tissue mechanics modeling and statistical machine learning. Our strategy is to model the tissue as a continuum mechanical system and to use passive observations of spontaneous tissue deformation and force fields to statistically estimate the model parameters. This method was applied to the analysis of the collective migration of Madin-Darby canine kidney cells, and the tissue flow and force were simultaneously observed by the phase contrast imaging and traction force microscopy. We found that our monolayer elastic model, whose elastic moduli were reverse-engineered, enabled a long-term forecast of the traction force fields when given the tissue flow fields, indicating that the elasticity contributes to the evolution of the tissue stress. Furthermore, we investigated the tissues in which myosin was inhibited by blebbistatin treatment, and observed a several-fold reduction in the elastic moduli. The obtained results validate our framework, which paves the way to the estimation of mechanical properties of living tissues during morphogenesis.

  15. Extreme value statistics analysis of fracture strengths of a sintered silicon nitride failing from pores

    NASA Technical Reports Server (NTRS)

    Chao, Luen-Yuan; Shetty, Dinesh K.

    1992-01-01

Statistical analysis and correlation between pore-size distribution and fracture strength distribution using the theory of extreme-value statistics is presented for a sintered silicon nitride. The pore-size distribution on a polished surface of this material was characterized using an automatic optical image analyzer. The distribution measured on the two-dimensional plane surface was transformed to a population (volume) distribution using the Schwartz-Saltykov diameter method. The population pore-size distribution and the distribution of the pore size at the fracture origin were correlated by extreme-value statistics. Fracture strength distribution was then predicted from the extreme-value pore-size distribution, using a linear elastic fracture mechanics model of an annular crack around a pore and the fracture toughness of the ceramic. The predicted strength distribution was in good agreement with strength measurements in bending. In particular, the extreme-value statistics analysis explained the nonlinear trend in the linearized Weibull plot of measured strengths without postulating a lower-bound strength.
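The extreme-value argument can be sketched directly: if strength is set by the largest pore in a specimen through a fracture-mechanics law sigma_f = K_Ic / (Y * sqrt(pi * a)), then specimens sampling more pores are weaker on average. The toughness, geometry factor, and pore-size distribution below are illustrative, not the measured values from the paper:

```python
import numpy as np

# Strength controlled by the largest pore per specimen via LEFM:
# sigma_f = K_Ic / (Y * sqrt(pi * a)).  Toughness, geometry factor and
# pore-size distribution are illustrative, not the measured values.
rng = np.random.default_rng(6)
K_Ic = 5.0e6      # Pa*sqrt(m), fracture toughness
Y = 0.64          # assumed geometry factor for a crack around a pore

def median_strength(pores_per_specimen, n_specimens=5000):
    """Median fracture strength when each specimen samples m pores."""
    radii = rng.exponential(10e-6, size=(n_specimens, pores_per_specimen))
    a = radii.max(axis=1)                  # the critical (largest) pore
    return np.median(K_Ic / (Y * np.sqrt(np.pi * a)))

small = median_strength(10)     # small specimens: few pores each
large = median_strength(1000)   # large specimens: many pores each
```

This size effect, with strength governed by the distribution of the sample maximum rather than the bulk pore population, is what the extreme-value analysis in the paper formalizes.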

  16. Conformational energy calculations on polypeptides and proteins: use of a statistical mechanical procedure for evaluating structure and properties.

    PubMed

    Scheraga, H A; Paine, G H

    1986-01-01

    We are using a variety of theoretical and computational techniques to study protein structure, protein folding, and higher-order structures. Our earlier work involved treatments of liquid water and aqueous solutions of nonpolar and polar solutes, computations of the stabilities of the fundamental structures of proteins and their packing arrangements, conformations of small cyclic and open-chain peptides, structures of fibrous proteins (collagen), structures of homologous globular proteins, introduction of special procedures as constraints during energy minimization of globular proteins, and structures of enzyme-substrate complexes. Recently, we presented a new methodology for predicting polypeptide structure (described here); the method is based on the calculation of the probable and average conformation of a polypeptide chain by the application of equilibrium statistical mechanics in conjunction with an adaptive, importance sampling Monte Carlo algorithm. As a test, it was applied to Met-enkephalin.

  17. Fluorescent biopsy of biological tissues in differentiation of benign and malignant tumors of prostate

    NASA Astrophysics Data System (ADS)

    Trifoniuk, L. I.; Ushenko, Yu. A.; Sidor, M. I.; Minzer, O. P.; Gritsyuk, M. V.; Novakovskaya, O. Y.

    2014-08-01

The work presents the results of an investigation of the diagnostic efficiency of a new azimuthally stable Mueller-matrix method for analyzing the coordinate distributions of laser autofluorescence of histological sections of biological tissues. A new model of the generalized optical anisotropy of the protein networks of biological tissues is proposed in order to describe the processes of laser autofluorescence. The influence of the complex mechanisms of both phase anisotropy (linear birefringence and optical activity) and linear (circular) dichroism is taken into account. The interconnections between the azimuthally stable Mueller-matrix elements characterizing laser autofluorescence and the different mechanisms of optical anisotropy are determined. A statistical analysis of the coordinate distributions of these Mueller-matrix rotation invariants is proposed. On this basis, quantitative criteria (statistical moments of the first to fourth orders) for differentiating histological sections of uterus wall tumors - group 1 (dysplasia) and group 2 (adenocarcinoma) - are estimated.
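The first- to fourth-order statistical moments used as differentiation criteria can be computed as follows (a generic implementation; the actual Mueller-matrix invariant maps are not reproduced here):

```python
import numpy as np

def first_four_moments(x):
    """Mean, variance, skewness, and kurtosis of a sample, i.e. the four
    criteria used to differentiate the coordinate distributions."""
    x = np.asarray(x, dtype=float).ravel()
    m = x.mean()
    c = x - m
    var = (c ** 2).mean()
    skew = (c ** 3).mean() / var ** 1.5
    kurt = (c ** 4).mean() / var ** 2      # equals 3 for a Gaussian
    return m, var, skew, kurt

# Sanity check on a Gaussian sample: skewness ~ 0, kurtosis ~ 3.
rng = np.random.default_rng(7)
m, v, s, k = first_four_moments(rng.normal(2.0, 0.5, 100000))
```

In the papers' setting, `x` would be the 2D map of a Mueller-matrix rotation invariant, and the two diagnostic groups would be separated by comparing these four numbers.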

  18. Statistical Mechanics Model of Solids with Defects

    NASA Astrophysics Data System (ADS)

    Kaufman, M.; Walters, P. A.; Ferrante, J.

    1997-03-01

Previously (M. Kaufman, J. Ferrante, NASA Tech. Memor., 1996), we examined the phase diagram for the failure of a solid under isotropic expansion and compression as a function of stress and temperature, with the "springs" modelled by the universal binding energy relation (UBER) (J.H. Rose, J.R. Smith, F. Guinea, J. Ferrante, Phys. Rev. B 29, 2963 (1984)). In the previous calculation we assumed that the "springs" failed independently and that the strain is uniform. In the present work, we have extended this statistical model of mechanical failure by allowing for correlations between "springs" and for thermal fluctuations in strains. The springs are now modelled in the harmonic approximation with a failure threshold energy E0, as an intermediate step in future studies to reinclude the full non-linear dependence of the UBER for modelling the interactions. We use the Migdal-Kadanoff renormalization-group method to determine the phase diagram of the model and to compute the free energy.

  19. System of polarization correlometry of polycrystalline layers of urine in the differentiation stage of diabetes

    NASA Astrophysics Data System (ADS)

    Ushenko, Yu. O.; Pashkovskaya, N. V.; Marchuk, Y. F.; Dubolazov, O. V.; Savich, V. O.

    2015-08-01

The work presents the results of an investigation of the diagnostic efficiency of a new azimuthally stable Mueller-matrix method for analyzing the coordinate distributions of laser autofluorescence of layers of biological liquids. A new model of the generalized optical anisotropy of the protein networks of biological tissues is proposed in order to describe the processes of laser autofluorescence. The influence of the complex mechanisms of both phase anisotropy (linear birefringence and optical activity) and linear (circular) dichroism is taken into account. The interconnections between the azimuthally stable Mueller-matrix elements characterizing laser autofluorescence and the different mechanisms of optical anisotropy are determined. A statistical analysis of the coordinate distributions of these Mueller-matrix rotation invariants is proposed. On this basis, quantitative criteria (statistical moments of the first to fourth orders) for differentiating polycrystalline layers of human urine are estimated, for the sake of diagnosing and differentiating cholelithiasis with underlying chronic cholecystitis (group 1) and diabetes mellitus of degree II (group 2).

  20. Mueller-matrix of laser-induced autofluorescence of polycrystalline films of dried peritoneal fluid in diagnostics of endometriosis

    NASA Astrophysics Data System (ADS)

    Ushenko, Yuriy A.; Koval, Galina D.; Ushenko, Alexander G.; Dubolazov, Olexander V.; Ushenko, Vladimir A.; Novakovskaia, Olga Yu.

    2016-07-01

This research presents the results of an investigation of the diagnostic efficiency of an azimuthally stable Mueller-matrix method of analysis of laser autofluorescence of polycrystalline films of dried uterine cavity peritoneal fluid. A model of the generalized optical anisotropy of films of dried peritoneal fluid is proposed in order to define the processes of laser autofluorescence. The influence of complex mechanisms of both phase (linear and circular birefringence) and amplitude (linear and circular dichroism) anisotropies is taken into consideration. The interconnections between the azimuthally stable Mueller-matrix elements characterizing laser autofluorescence and different mechanisms of optical anisotropy are determined. The statistical analysis of coordinate distributions of such Mueller-matrix rotation invariants is proposed. On this basis, quantitative criteria (statistical moments of the first to fourth orders) for differentiating polycrystalline films of dried peritoneal fluid, group 1 (healthy donors) and group 2 (uterus endometriosis patients), are determined.

  1. Conceptual developments of non-equilibrium statistical mechanics in the early days of Japan

    NASA Astrophysics Data System (ADS)

    Ichiyanagi, Masakazu

    1995-11-01

    This paper reviews the research in nonequilibrium statistical mechanics carried out in Japan in the period between 1930 and 1960. Nearly thirty years have passed since the discovery of the exact formula for the electrical conductivity. With the rise of the linear response theory, whose methods and results are quickly grasped by anyone, its rationale was pushed aside, and even at the stage where the formulation was still incomplete some authors hurried to make physical applications. Such an attitude robbed the theory of most of its interest for the average physicist, who would approach an understanding of a basic concept not through abstract and logical analysis but by simply increasing his technical experience with the concept. The purpose of this review is to rescue the linear response theory from being labeled a mere mathematical tool and to show that it has considerable physical content. Many key papers, originally written in Japanese, are reproduced.

  2. Statistical mechanics explanation for the structure of ocean eddies and currents

    NASA Astrophysics Data System (ADS)

    Venaille, A.; Bouchet, F.

    2010-12-01

    The equilibrium statistical mechanics of two-dimensional and geostrophic flows predicts the outcome for the large scales of the flow resulting from turbulent mixing. This theory has been successfully applied to describe detailed properties of Jupiter's Great Red Spot. We discuss the range of applicability of this theory to ocean dynamics. It is able to reproduce mesoscale structures like ocean rings. It explains, from statistical mechanics, the westward drift of rings at the speed of non-dispersive baroclinic waves, and the recently observed (Chelton et al.) slower northward drift of cyclonic eddies and southward drift of anticyclonic eddies. We also uncover relations between strong eastward mid-basin inertial jets, like the Kuroshio extension and the Gulf Stream, and statistical equilibria. We explain under which conditions such strong mid-basin jets can be understood as statistical equilibria. We claim that these results are complementary to the classical Sverdrup-Munk theory: they explain the inertial part of the basin dynamics, and the structure and location of the jets, using very simple theoretical arguments. References: A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO; F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, arxiv ...., submitted to Physics Reports; P. Berloff, A. M. Hogg, W. Dewar, The Turbulent Oscillator: A Mechanism of Low-Frequency Variability of the Wind-Driven Ocean Gyres, Journal of Physical Oceanography 37 (2007) 2363; D. B. Chelton, M. G. Schlax, R. M. Samelson, R. A. de Szoeke, Global observations of large oceanic eddies, Geophys. Res. Lett. 34 (2007) 15606. Figure caption: a) streamfunction predicted by statistical mechanics; b) and c) snapshots of streamfunction and potential vorticity (red: positive values; blue: negative values) in the upper layer of a three-layer quasi-geostrophic model of a mid-latitude ocean basin (from Berloff et al.). Even in an out-of-equilibrium situation like this one, equilibrium statistical mechanics predicts the overall qualitative flow structure remarkably well. The observed westward drift of ocean eddies and the slower northward drift of cyclones and southward drift of anticyclones (Chelton et al.) are explained from statistical mechanics.

  3. Comparison of molecular mechanics-Poisson-Boltzmann surface area (MM-PBSA) and molecular mechanics-three-dimensional reference interaction site model (MM-3D-RISM) method to calculate the binding free energy of protein-ligand complexes: Effect of metal ion and advance statistical test

    NASA Astrophysics Data System (ADS)

    Pandey, Preeti; Srivastava, Rakesh; Bandyopadhyay, Pradipta

    2018-03-01

    The relative performance of the MM-PBSA and MM-3D-RISM methods for estimating the binding free energy of protein-ligand complexes is investigated by applying them to three proteins (Dihydrofolate Reductase, Catechol-O-methyltransferase, and Stromelysin-1) that differ in the number of metal ions they contain. Neither computational method could distinguish all the ligands based on their calculated binding free energies (as compared to experimental values). The difference between the two comes from both the polar and non-polar parts of solvation. For charged ligands, MM-PBSA and MM-3D-RISM give qualitatively different results for the polar part of solvation.

  4. Wave packet and statistical quantum calculations for the He + NeH⁺ → HeH⁺ + Ne reaction on the ground electronic state.

    PubMed

    Koner, Debasish; Barrios, Lizandra; González-Lezana, Tomás; Panda, Aditya N

    2014-09-21

    A real-wave-packet-based time-dependent method and a statistical quantum method have been used to study the He + NeH(+) (v, j) reaction, with the reactant in various ro-vibrational states, on a recently calculated ab initio ground state potential energy surface. Both the wave packet and statistical quantum calculations were carried out within the centrifugal sudden approximation as well as using the exact Hamiltonian. Quantum reaction probabilities exhibit a dense oscillatory pattern for smaller total angular momentum values, a signature of resonances in a complex-forming mechanism for the title reaction. Significant differences found between exact and approximate quantum reaction cross sections highlight the importance of including Coriolis coupling in the calculations. Statistical results are in fairly good agreement with the exact quantum results for the ground ro-vibrational state of the reactant. Vibrational excitation greatly enhances the reaction cross sections, whereas rotational excitation has a relatively small effect on the reaction. The shape of the reaction cross section curves depends on the initial vibrational state of the reactant and is typical of a late-barrier-type potential energy profile.

  5. The difference between a dynamic and mechanical approach to stroke treatment.

    PubMed

    Helgason, Cathy M

    2007-06-01

    The current classification of stroke is based on causation, also called pathogenesis, and relies on binary logic faithful to the Aristotelian tradition. Accordingly, a pathology is or is not the cause of the stroke, is considered independent of others, and is the target for treatment. It is the subject for large double-blind randomized clinical therapeutic trials. The scientific view behind clinical trials is the fundamental concept that information is statistical, and causation is determined by probabilities. Therefore, the cause and effect relation will be determined by probability-theory-based statistics. This is the basis of evidence-based medicine, which calls for the results of such trials to be the basis for physician decisions regarding diagnosis and treatment. However, there are problems with the methodology behind evidence-based medicine. Calculations using probability-theory-based statistics regarding cause and effect are performed within an automatic system where there are known inputs and outputs. This method of research provides a framework of certainty with no surprise elements or outcomes. However, it is not a system or method that will come up with previously unknown variables, concepts, or universal principles; it is not a method that will give a new outcome; and it is not a method that allows for creativity, expertise, or new insight for problem solving.

  6. Heliox Improves Carbon Dioxide Removal during Lung Protective Mechanical Ventilation.

    PubMed

    Beurskens, Charlotte J; Brevoord, Daniel; Lagrand, Wim K; van den Bergh, Walter M; Vroom, Margreeth B; Preckel, Benedikt; Horn, Janneke; Juffermans, Nicole P

    2014-01-01

    Introduction. Helium is a noble gas with low density and increased carbon dioxide (CO2) diffusion capacity. This allows for lower driving pressures in mechanical ventilation and increased CO2 diffusion. We hypothesized that heliox facilitates ventilation in patients during lung-protective mechanical ventilation using low tidal volumes. Methods. This is an observational cohort substudy of a single-arm intervention study. Twenty-four ICU patients were included who were admitted after a cardiac arrest and mechanically ventilated for 3 hours with heliox (50% helium; 50% oxygen). A fixed protective ventilation protocol (6 mL/kg) was used, with prospective observation for changes in lung mechanics and gas exchange. Statistical analysis used Bonferroni post-hoc correction, with statistical significance set at P < 0.017. Results. During heliox ventilation, respiratory rate decreased (25 ± 4 versus 23 ± 5 breaths min(-1), P = 0.010). Minute volume ventilation showed a trend to decrease compared to baseline (11.1 ± 1.9 versus 9.9 ± 2.1 L min(-1), P = 0.026), while reducing PaCO2 levels (5.0 ± 0.6 versus 4.5 ± 0.6 kPa, P = 0.011) and peak pressures (21.1 ± 3.3 versus 19.8 ± 3.2 cm H2O, P = 0.024). Conclusions. Heliox improved CO2 elimination while allowing reduced minute volume ventilation in adult patients during protective mechanical ventilation.

  7. Detection of Bursts and Pauses in Spike Trains

    PubMed Central

    Ko, D.; Wilson, C. J.; Lobb, C. J.; Paladini, C. A.

    2012-01-01

    Midbrain dopaminergic neurons in vivo exhibit a wide range of firing patterns. They normally fire constantly at a low rate, and speed up, firing a phasic burst when reward exceeds prediction, or pause when an expected reward does not occur. Therefore, the detection of bursts and pauses from spike train data is a critical problem when studying the role of phasic dopamine (DA) in reward related learning, and other DA dependent behaviors. However, few statistical methods have been developed that can identify bursts and pauses simultaneously. We propose a new statistical method, the Robust Gaussian Surprise (RGS) method, which performs an exhaustive search of bursts and pauses in spike trains simultaneously. We found that the RGS method is adaptable to various patterns of spike trains recorded in vivo, and is not influenced by baseline firing rate, making it applicable to all in vivo spike trains where baseline firing rates vary over time. We compare the performance of the RGS method to other methods of detecting bursts, such as the Poisson Surprise (PS), Rank Surprise (RS), and Template methods. Analysis of data using the RGS method reveals potential mechanisms underlying how bursts and pauses are controlled in DA neurons. PMID:22939922
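One of the baseline methods compared above, the Poisson Surprise, is easy to sketch: a candidate burst of n spikes in a window of length t is scored by how improbable that count is under a Poisson null at the baseline firing rate. This is an illustrative sketch of that baseline only, not the RGS method itself (which models log interspike intervals with a Gaussian); the rates and counts below are hypothetical.

```python
import math

def poisson_surprise(n, t, rate):
    """Surprise S = -log10 P(N >= n) for N ~ Poisson(rate * t).
    Large S marks spike counts unlikely under baseline firing."""
    lam = rate * t
    # P(N >= n) = 1 - sum_{k < n} e^(-lam) lam^k / k!
    p_lt = sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(n))
    p_ge = max(1.0 - p_lt, 1e-300)  # guard against rounding to zero
    return -math.log10(p_ge)

# 12 spikes in 0.1 s against a 5 Hz baseline is a strong burst ...
burst = poisson_surprise(12, 0.1, 5.0)
# ... while 1 spike in 0.2 s is unremarkable
quiet = poisson_surprise(1, 0.2, 5.0)
```

A pause detector works symmetrically on improbably low counts, which is the gap the RGS method fills by searching for bursts and pauses simultaneously.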

  8. Eigenvector centrality is a metric of elastomer modulus, heterogeneity, and damage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welch, Jr., Paul Michael; Welch, Cynthia F.

    Here, we present an application of eigenvector centrality to encode the connectivity of polymer networks resolved at the micro- and meso-scopic length scales. This method captures the relative importance of different nodes within the network structure and provides a route toward the development of a statistical mechanics model that correlates connectivity with mechanical response. This scheme may be informed by analytical and semi-analytical models for the network structure, or through direct experimental examination. It may be used to predict the reduction in mechanical performance for heterogeneous materials subjected to specific modes of damage. Here, we develop the method and demonstrate that it leads to the prediction of established trends in elastomers. We also apply the model to the case of a self-healing polymer network reported in the literature, extracting insight about the fraction of bonds broken and re-formed during strain and recovery.
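Eigenvector centrality itself is straightforward to compute: a node's score is proportional to the summed scores of its neighbours, i.e. the leading eigenvector of the adjacency matrix. A minimal power-iteration sketch on a toy network (hypothetical, not the authors' polymer model) is:

```python
import numpy as np

def eigenvector_centrality(adj, iters=1000, tol=1e-10):
    """Eigenvector centrality by power iteration on the adjacency matrix."""
    a = np.asarray(adj, dtype=float)
    x = np.ones(a.shape[0]) / a.shape[0]
    for _ in range(iters):
        y = a @ x
        y /= np.linalg.norm(y)
        if np.linalg.norm(y - x) < tol:
            return y
        x = y
    return x

# Toy network: node 0 is a well-connected junction, node 3 dangles.
adj = np.array([[0, 1, 1, 1],
                [1, 0, 1, 0],
                [1, 1, 0, 0],
                [1, 0, 0, 0]])
c = eigenvector_centrality(adj)
```

In the damage picture described above, removing bonds (zeroing adjacency entries) lowers the centrality of the affected region, which is what correlates with the loss of mechanical performance.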

  9. Eigenvector centrality is a metric of elastomer modulus, heterogeneity, and damage

    DOE PAGES

    Welch, Jr., Paul Michael; Welch, Cynthia F.

    2017-04-27

    Here, we present an application of eigenvector centrality to encode the connectivity of polymer networks resolved at the micro- and meso-scopic length scales. This method captures the relative importance of different nodes within the network structure and provides a route toward the development of a statistical mechanics model that correlates connectivity with mechanical response. This scheme may be informed by analytical and semi-analytical models for the network structure, or through direct experimental examination. It may be used to predict the reduction in mechanical performance for heterogeneous materials subjected to specific modes of damage. Here, we develop the method and demonstrate that it leads to the prediction of established trends in elastomers. We also apply the model to the case of a self-healing polymer network reported in the literature, extracting insight about the fraction of bonds broken and re-formed during strain and recovery.

  10. Quantum approach to classical statistical mechanics.

    PubMed

    Somma, R D; Batista, C D; Ortiz, G

    2007-07-20

    We present a new approach to study the thermodynamic properties of d-dimensional classical systems by reducing the problem to the computation of ground state properties of a d-dimensional quantum model. This classical-to-quantum mapping allows us to extend the scope of standard optimization methods by unifying them under a general framework. The quantum annealing method is naturally extended to simulate classical systems at finite temperatures. We derive the rates to assure convergence to the optimal thermodynamic state using the adiabatic theorem of quantum mechanics. For simulated and quantum annealing, we obtain the asymptotic rates T(t) ~ pN/(k_B log t) and gamma(t) ~ (Nt)^(-c/N) for the temperature and magnetic field, respectively. Other annealing strategies are also discussed.
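The quoted temperature rate is the classic logarithmic cooling schedule under which simulated annealing provably converges. A toy sketch of such an anneal (the constant c, step size, and test function are hypothetical illustrations, not from the paper):

```python
import math
import random

def anneal(energy, x0, steps=20000, c=2.0, step=0.5, seed=1):
    """Metropolis simulated annealing with logarithmic cooling
    T(t) = c / log(t + 2), the schedule form that guarantees
    convergence up to problem-dependent constants."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    x_best, e_best = x, e
    for t in range(steps):
        temp = c / math.log(t + 2)
        x_new = x + rng.uniform(-step, step)
        e_new = energy(x_new)
        # accept downhill always, uphill with Boltzmann weight
        if e_new < e or rng.random() < math.exp(-(e_new - e) / temp):
            x, e = x_new, e_new
            if e < e_best:
                x_best, e_best = x, e
    return x_best, e_best

# Asymmetric double well: global minimum near x = -1.06,
# started in the shallow well at x = +1.
double_well = lambda x: (x**2 - 1.0)**2 + 0.5 * x
x_min, e_min = anneal(double_well, x0=1.0)
```

The slow 1/log t decay is what lets the chain climb out of the shallow well early on; a faster (e.g. exponential) schedule can freeze into the wrong basin.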

  11. Statistical inference of the generation probability of T-cell receptors from sequence repertoires.

    PubMed

    Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G

    2012-10-02

    Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.

  12. [Role of redox- and hormonal metabolism in the mechanisms of skin aging].

    PubMed

    Berianidze, K; Katsitadze, A; Jalaghania, N; Sanikidze, T

    2014-10-01

    The aim of the study was to investigate the role of redox balance in the pathogenesis of skin aging in menopausal women. 30 menopausal women aged 40 to 55 years and 30 women of reproductive age (25 to 35 years) were studied. A qualitative assessment of the skin (moisture, fat, elasticity) was performed; in venous blood, indicators of hormonal metabolism (estradiol, E; testosterone, T; follicle-stimulating hormone, FSH) and redox parameters were studied: oxygen and lipid free radical content (EPR method) and antioxidant enzyme activity (catalase, superoxide dismutase (SOD), and glutathione reductase (GR); spectroscopic method). According to the results of the study, a statistically significant loss of skin elasticity and increase in the number of pores was revealed in menopausal women in comparison to the women of reproductive age. These changes occur against the background of a statistically significant increase in blood testosterone and FSH content; estradiol in menopausal women tends to decrease. Blood redox indicators did not differ statistically significantly between the two groups, although there was a tendency toward increased catalase and GR activity in the menopausal women, indicating an intensification of oxidative processes in this age group. A statistically significant negative correlation between blood estradiol content and SOD activity (r=-0.413, p=0.0017) and a positive correlation between blood estradiol content and GR activity (r=0.565, p=0.002) were revealed. The decrease in estradiol concentration and the imbalance in the redox system in the women's blood correlate with the rate of growth of pigmented spots and the decrease of skin moisture. It is concluded that estrogen-dependent alterations in redox balance play an important role in the mechanisms of skin aging in menopausal women.

  13. Landau's statistical mechanics for quasi-particle models

    NASA Astrophysics Data System (ADS)

    Bannur, Vishnu M.

    2014-04-01

    Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for the pressure and develops all of thermodynamics. It is a general formalism, consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)], in which one starts from the expression for the energy density and develops thermodynamics. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, which are wrongly called the thermodynamic consistency relation, we recover other formalisms for quasi-particle systems, such as that of M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied in quark-gluon plasma.

  14. Biomechanical analysis using FEA and experiments of a standard plate method versus three cable methods for fixing acetabular fractures with simultaneous THA.

    PubMed

    Aziz, Mina S R; Dessouki, Omar; Samiezadeh, Saeid; Bougherara, Habiba; Schemitsch, Emil H; Zdero, Radovan

    2017-08-01

    Acetabular fractures potentially account for up to half of all pelvic fractures, while pelvic fractures potentially account for over one-tenth of all human bone fractures. This is the first biomechanical study to assess acetabular fracture fixation using plates versus cables in the presence of a total hip arthroplasty, as done for the elderly. In Phase 1, finite element (FE) models compared a standard plate method versus 3 cable methods for repairing an acetabular fracture (type: anterior column plus posterior hemi-transverse) subjected to a physiological-type compressive load of 2207 N, representing 3 x body weight for a 75 kg person during walking. FE stress maps were compared to choose the most mechanically stable cable method, i.e. lowest peak bone stress. In Phase 2, mechanical tests were then done in artificial hemipelvises to compare the standard plate method versus the optimal cable method selected from Phase 1. FE analysis results showed peak bone stresses of 255 MPa (Plate method), 205 MPa (Mears cable method), 250 MPa (Kang cable method), and 181 MPa (Mouhsine cable method). Mechanical tests then showed that the Plate method versus the Mouhsine cable method selected from Phase 1 had higher stiffness (662 versus 385 N/mm, p=0.001), strength (3210 versus 2060 N, p=0.009), and failure energy (8.8 versus 6.2 J, p=0.002), whilst they were statistically equivalent for interfragmentary sliding (p≥0.179) and interfragmentary gapping (p≥0.08). The Plate method had superior mechanical properties, but the Mouhsine cable method may be a reasonable alternative if osteoporosis prevents good screw thread interdigitation during plating. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.

  15. Ergodic theorem, ergodic theory, and statistical mechanics

    PubMed Central

    Moore, Calvin C.

    2015-01-01

    This perspective highlights the mean ergodic theorem established by John von Neumann and the pointwise ergodic theorem established by George Birkhoff, proofs of which were published nearly simultaneously in PNAS in 1931 and 1932. These theorems were of great significance both in mathematics and in statistical mechanics. In statistical mechanics they provided a key insight into a 60-y-old fundamental problem of the subject, namely, the rationale for the hypothesis that time averages can be set equal to phase averages. The evolution of this problem is traced from the origins of statistical mechanics and Boltzmann's ergodic hypothesis to the Ehrenfests' quasi-ergodic hypothesis, and then to the ergodic theorems. We discuss communications between von Neumann and Birkhoff in the Fall of 1931 leading up to the publication of these papers and related issues of priority. These ergodic theorems initiated a new field of mathematical research called ergodic theory that has thrived ever since, and we discuss some of the recent developments in ergodic theory that are relevant for statistical mechanics. PMID:25691697
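The time-average/phase-average hypothesis that these theorems justify is easy to illustrate numerically. A hedged toy example (not from the paper): for an irrational rotation of the circle, an ergodic system, the Birkhoff time average along one orbit converges to the phase average over the whole circle.

```python
import math

def time_average(f, alpha, n):
    """Birkhoff time average of f along the orbit x -> x + alpha (mod 1).
    For irrational alpha the rotation is ergodic, so this converges to
    the phase (space) average of f over the circle."""
    x, total = 0.0, 0.0
    for _ in range(n):
        total += f(x)
        x = (x + alpha) % 1.0
    return total / n

# Phase average of sin^2(2*pi*x) over [0, 1) is exactly 1/2.
f = lambda x: math.sin(2 * math.pi * x) ** 2
t_avg = time_average(f, alpha=math.sqrt(2) - 1, n=100_000)
```

With a rational alpha the orbit is periodic and visits only finitely many points, so the same computation generally fails to reproduce the phase average, which is the distinction the ergodic hypothesis turns on.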

  16. A statistical mechanical approach to restricted integer partition functions

    NASA Astrophysics Data System (ADS)

    Zhou, Chi-Chun; Dai, Wu-Sheng

    2018-05-01

    The main aim of this paper is twofold: (1) to suggest a statistical mechanical approach to the calculation of the generating functions of restricted integer partition functions, which count the number of partitions, that is, ways of writing an integer as a sum of other integers under certain restrictions. In this approach, the generating function of a restricted integer partition function is constructed from the canonical partition functions of various quantum gases. (2) To introduce a new type of restricted integer partition function corresponding to general statistics, a generalization of Gentile statistics in statistical mechanics; many kinds of restricted integer partition functions are special cases of this one. Moreover, with statistical mechanics as a bridge, we reveal a mathematical fact: the generating function of a restricted integer partition function is just a symmetric function, a member of the class of functions invariant under the action of permutation groups. Using this approach, we provide some expressions for restricted integer partition functions as examples.
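A concrete instance of such a restriction (chosen here for illustration, not taken from the paper) is to count partitions in which each part occurs at most q times. In the Gentile-statistics picture, q plays the role of a maximum occupation number: q = 1 gives the fermionic (distinct-part) case and large q recovers the bosonic (unrestricted) count. A minimal dynamic-programming sketch, rather than the paper's generating-function construction:

```python
def restricted_partitions(n, q):
    """Count partitions of n in which each part size occurs at most q
    times (Gentile-type restriction: q = 1 fermionic, q >= n bosonic)."""
    # count[m] = number of valid partitions of m using the parts seen so far
    count = [1] + [0] * n
    for part in range(1, n + 1):
        new = count[:]                    # r = 0 copies of this part
        for reps in range(1, q + 1):      # r = 1 .. q copies
            used = part * reps
            if used > n:
                break
            for m in range(used, n + 1):
                new[m] += count[m - used]
        count = new
    return count[n]
```

For example, restricted_partitions(5, 1) counts the distinct-part partitions of 5 (namely 5, 4+1, 3+2), while restricted_partitions(5, 5) recovers the unrestricted p(5) = 7.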

  17. Toward Determining ATPase Mechanism in ABC Transporters: Development of the Reaction Path–Force Matching QM/MM Method

    PubMed Central

    Zhou, Y.; Ojeda-May, P.; Nagaraju, M.; Pu, J.

    2016-01-01

    Adenosine triphosphate (ATP)-binding cassette (ABC) transporters are ubiquitous ATP-dependent membrane proteins involved in translocations of a wide variety of substrates across cellular membranes. To understand the chemomechanical coupling mechanism as well as functional asymmetry in these systems, a quantitative description of how ABC transporters hydrolyze ATP is needed. Complementary to experimental approaches, computer simulations based on combined quantum mechanical and molecular mechanical (QM/MM) potentials have provided new insights into the catalytic mechanism in ABC transporters. Quantitatively reliable determination of the free energy requirement for enzymatic ATP hydrolysis, however, requires substantial statistical sampling on QM/MM potential. A case study shows that brute force sampling of ab initio QM/MM (AI/MM) potential energy surfaces is computationally impractical for enzyme simulations of ABC transporters. On the other hand, existing semiempirical QM/MM (SE/MM) methods, although affordable for free energy sampling, are unreliable for studying ATP hydrolysis. To close this gap, a multiscale QM/MM approach named reaction path–force matching (RP–FM) has been developed. In RP–FM, specific reaction parameters for a selected SE method are optimized against AI reference data along reaction paths by employing the force matching technique. The feasibility of the method is demonstrated for a proton transfer reaction in the gas phase and in solution. The RP–FM method may offer a general tool for simulating complex enzyme systems such as ABC transporters. PMID:27498639
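The force-matching ingredient of RP-FM can be illustrated with a deliberately simple toy: fit a parameter of a cheap model force so that it reproduces reference ("ab initio") forces sampled along a path, in the least-squares sense. The harmonic model, the path points, and all numbers below are hypothetical; this is not the RP-FM implementation.

```python
import numpy as np

# Toy force matching: choose the parameter k of a cheap model force
# F_model(x) = -k * x that best matches reference forces sampled along
# a reaction path, in the least-squares sense.
rng = np.random.default_rng(42)
x = np.linspace(-1.0, 1.0, 50)                       # points along the path
k_true = 3.7
f_ref = -k_true * x + rng.normal(0.0, 0.05, x.size)  # noisy reference forces

# Minimize sum_i (F_model(x_i) - f_ref_i)^2  =>  k = -(x . f_ref)/(x . x)
k_fit = -(x @ f_ref) / (x @ x)
```

In RP-FM the same idea is applied to the specific reaction parameters of a semiempirical QM/MM Hamiltonian, with the reference forces coming from AI/MM calculations along the reaction path.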

  18. On-Orbit System Identification

    NASA Technical Reports Server (NTRS)

    Mettler, E.; Milman, M. H.; Bayard, D.; Eldred, D. B.

    1987-01-01

    Information derived from accelerometer readings benefits important engineering and control functions. Report discusses methodology for detection, identification, and analysis of motions within space station. Techniques of vibration and rotation analyses, control theory, statistics, filter theory, and transform methods integrated to form system for generating models and model parameters that characterize total motion of complicated space station, with respect to both control-induced and random mechanical disturbances.

  19. Hybrid Quantum Mechanics/Molecular Mechanics Solvation Scheme for Computing Free Energies of Reactions at Metal-Water Interfaces.

    PubMed

    Faheem, Muhammad; Heyden, Andreas

    2014-08-12

    We report the development of a quantum mechanics/molecular mechanics free energy perturbation (QM/MM-FEP) method for modeling chemical reactions at metal-water interfaces. This novel solvation scheme combines plane-wave density functional theory (DFT), periodic electrostatic embedded cluster method (PEECM) calculations using Gaussian-type orbitals, and classical molecular dynamics (MD) simulations to obtain a free energy description of a complex metal-water system. We derive a potential of mean force (PMF) of the reaction system within the QM/MM framework. A fixed-size, finite ensemble of MM conformations is used to permit precise evaluation of the PMF of QM coordinates and its gradient defined within this ensemble. Local conformations of adsorbed reaction moieties are optimized using sequential MD-sampling and QM-optimization steps. An approximate reaction coordinate is constructed using a number of interpolated states and the free energy difference between adjacent states is calculated using the QM/MM-FEP method. By avoiding on-the-fly QM calculations and by circumventing the challenges associated with statistical averaging during MD sampling, a computational speedup of multiple orders of magnitude is realized. The method is systematically validated against the results of ab initio QM calculations and demonstrated for C-C cleavage in double-dehydrogenated ethylene glycol on a Pt (111) model surface.
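At the core of any FEP scheme is Zwanzig's exponential-average identity, dA = -kT ln<exp(-dU/kT)>_0, averaged over samples of the reference ensemble. A hedged numpy sketch on synthetic data (not the authors' QM/MM workflow): for a Gaussian dU with mean mu and standard deviation sigma, the analytic answer is dA = mu - sigma^2/(2kT), which the estimator should reproduce.

```python
import numpy as np

def fep_free_energy(delta_u, kT=1.0):
    """Zwanzig free energy perturbation estimator:
    dA = -kT * ln < exp(-dU / kT) >_0 over reference-ensemble samples."""
    delta_u = np.asarray(delta_u, dtype=float)
    # log-sum-exp shift for numerical stability of the exponential average
    m = (-delta_u / kT).max()
    return -kT * (m + np.log(np.mean(np.exp(-delta_u / kT - m))))

# Synthetic check: Gaussian dU with mean mu and std sigma has the
# analytic result dA = mu - sigma^2 / (2 kT).
rng = np.random.default_rng(7)
mu, sigma = 1.0, 0.5
dA = fep_free_energy(rng.normal(mu, sigma, 200_000))
```

The exponential average is dominated by rare low-dU samples, which is why the interpolated intermediate states mentioned in the abstract are needed: each adjacent pair keeps the dU distribution narrow enough for the estimator to converge.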

  20. Evolutionary model selection and parameter estimation for protein-protein interaction network based on differential evolution algorithm

    PubMed Central

    Huang, Lei; Liao, Li; Wu, Cathy H.

    2016-01-01

    Revealing the underlying evolutionary mechanism plays an important role in understanding protein interaction networks in the cell. While many evolutionary models have been proposed, applying these models to real network data, and especially determining which model better describes the evolutionary process behind an observed network, remains a challenge. The traditional way is to use a model with presumed parameters to generate a network and then evaluate the fit by summary statistics, which, however, cannot capture complete network structure information or estimate parameter distributions. In this work we developed a novel method based on Approximate Bayesian Computation and modified Differential Evolution (ABC-DEP) that is capable of conducting model selection and parameter estimation simultaneously and of detecting the underlying evolutionary mechanisms more accurately. We tested our method's power to differentiate models and estimate parameters on simulated data and found significant improvement in performance benchmarks compared with a previous method. We further applied our method to real data of protein interaction networks in human and yeast. Our results show the Duplication Attachment model as the predominant evolutionary mechanism for human PPI networks and the Scale-Free model as the predominant mechanism for yeast PPI networks. PMID:26357273
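The ABC idea at the heart of ABC-DEP can be sketched with plain rejection sampling: draw parameters from the prior, simulate, and keep the draws whose summary statistic lands within a tolerance of the observed one. The Poisson-rate toy below and all its numbers are hypothetical; the paper replaces rejection with a Differential Evolution sampler and richer network statistics.

```python
import random

def abc_rejection(observed, prior, simulate, eps, draws, seed=0):
    """Approximate Bayesian Computation by rejection sampling: the kept
    parameter draws approximate the posterior given |summary - observed| < eps."""
    rng = random.Random(seed)
    kept = []
    for _ in range(draws):
        theta = prior(rng)
        if abs(simulate(rng, theta) - observed) < eps:
            kept.append(theta)
    return kept

# Toy: recover a Poisson rate from the sample mean of 200 observations,
# simulating the mean via its normal approximation.
n_obs = 200
prior = lambda rng: rng.uniform(0.0, 10.0)
simulate = lambda rng, rate: rate + rng.gauss(0.0, (rate / n_obs) ** 0.5)

post = abc_rejection(observed=4.1, prior=prior, simulate=simulate,
                     eps=0.3, draws=20000)
estimate = sum(post) / len(post)
```

Model selection works the same way: draw the model label along with its parameters, and the acceptance frequencies of the labels approximate the posterior model probabilities.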

  1. Rydberg Atoms in Strong Fields: a Testing Ground for Quantum Chaos.

    NASA Astrophysics Data System (ADS)

    Courtney, Michael

    1995-01-01

    Rydberg atoms in strong static electric and magnetic fields provide experimentally accessible systems for studying the connections between classical chaos and quantum mechanics in the semiclassical limit. This experimental accessibility has motivated the development of reliable quantum mechanical solutions. This thesis uses both experimental and computed quantum spectra to test the central approaches to quantum chaos. These approaches consist mainly of developing methods to compute the spectra of quantum systems in non-perturbative regimes, correlating statistical descriptions of eigenvalues with the classical behavior of the same Hamiltonian, and developing semiclassical methods such as periodic-orbit theory. Particular emphasis is given to identifying the spectral signature of recurrences -- quantum wave packets which follow classical orbits. The new findings include: the breakdown of the connection between energy-level statistics and classical chaos in odd-parity diamagnetic lithium, the discovery of the signature of very long period orbits in atomic spectra, quantitative evidence for the scattering of recurrences by the alkali-metal core, a quantitative description of the behavior of recurrences near bifurcations, and a semiclassical interpretation of the evolution of continuum Stark spectra. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)

  2. Computational methods in the exploration of the classical and statistical mechanics of celestial scale strings: Rotating Space Elevators

    NASA Astrophysics Data System (ADS)

    Knudsen, Steven; Golubovic, Leonardo

    2015-04-01

    With the advent of ultra-strong materials, the Space Elevator has changed from science fiction to real science. We discuss computational and theoretical methods we developed to explore the classical and statistical mechanics of Rotating Space Elevators (RSE). An RSE is a loopy string reaching deep into outer space. The floppy RSE loop executes a motion which is nearly a superposition of two rotations: a geosynchronous rotation around the Earth, and a faster rotation of the string about a line perpendicular to the Earth at its equator. Strikingly, objects sliding along the RSE loop spontaneously oscillate between two turning points, one of which is close to the Earth (the starting point) whereas the other lies deep in outer space. The RSE concept thus solves a major problem in space elevator science: how to supply energy to the climbers moving along space elevator strings. The exploration of the dynamics of a floppy string interacting with objects sliding along it has required the development of novel finite element algorithms described in this presentation. We thank Prof. Duncan Lorimer of WVU for kindly providing us access to his computational facility.

  3. Applied Mathematical Methods in Theoretical Physics

    NASA Astrophysics Data System (ADS)

    Masujima, Michio

    2005-04-01

    All there is to know about functional analysis, integral equations and calculus of variations in a single volume. This advanced textbook is divided into two parts: the first on integral equations and the second on the calculus of variations. It begins with a short introduction to functional analysis, including a brief review of complex analysis, before continuing with a systematic discussion of different types of equations, such as Volterra integral equations, singular integral equations of Cauchy type, and integral equations of Fredholm type, with a special emphasis on Wiener-Hopf integral equations and Wiener-Hopf sum equations. After a few remarks on the historical development, the second part starts with an introduction to the calculus of variations and the relationship between integral equations and applications of the calculus of variations. It further covers applications of the calculus of variations developed in the second half of the 20th century in the fields of quantum mechanics, quantum statistical mechanics and quantum field theory. Throughout the book, the author presents over 150 problems and exercises, many from such branches of physics as quantum mechanics, quantum statistical mechanics, and quantum field theory, together with outlines of the solutions in each case. Detailed solutions are given, supplementing the materials discussed in the main text, allowing problems to be solved making direct use of the method illustrated. The original references are given for difficult problems. The result is complete coverage of the mathematical tools and techniques used by physicists and applied mathematicians. Intended for senior undergraduates and first-year graduates in science and engineering, this is equally useful as a reference and self-study guide.

  4. Chapter two: Phenomenology of tsunamis II: scaling, event statistics, and inter-event triggering

    USGS Publications Warehouse

    Geist, Eric L.

    2012-01-01

    Observations related to tsunami catalogs are reviewed and described in a phenomenological framework. An examination of scaling relationships between earthquake size (as expressed by scalar seismic moment and mean slip) and tsunami size (as expressed by mean and maximum local run-up and maximum far-field amplitude) indicates that scaling is significant at the 95% confidence level, although there is uncertainty in how well earthquake size can predict tsunami size (R2 ~ 0.4-0.6). In examining tsunami event statistics, current methods used to estimate the size distribution of earthquakes and landslides and the inter-event time distribution of earthquakes are first reviewed. These methods are adapted to estimate the size and inter-event distribution of tsunamis at a particular recording station. Using a modified Pareto size distribution, the best-fit power-law exponents of tsunamis recorded at nine Pacific tide-gauge stations exhibit marked variation, in contrast to the approximately constant power-law exponent for inter-plate thrust earthquakes. With regard to the inter-event time distribution, significant temporal clustering of tsunami sources is demonstrated. For tsunami sources occurring in close proximity to other sources in both space and time, a physical triggering mechanism, such as static stress transfer, is a likely cause for the anomalous clustering. Mechanisms of earthquake-to-earthquake and earthquake-to-landslide triggering are reviewed. Finally, a modification of statistical branching models developed for earthquake triggering is introduced to describe triggering among tsunami sources.
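The power-law exponent fitting mentioned above can be made concrete with a maximum-likelihood (Hill-type) estimator. This is a generic sketch on invented amplitudes, not the chapter's modified Pareto distribution or its tide-gauge data:

```python
import math

def hill_exponent(sizes, xmin):
    """Maximum-likelihood (Hill) estimate of the exponent alpha of a
    power-law tail p(x) ~ x**(-alpha) for data with x >= xmin."""
    tail = [x for x in sizes if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# invented tsunami amplitudes (m) at one station, threshold 0.1 m
amps = [0.11, 0.15, 0.22, 0.35, 0.5, 0.8, 1.3, 2.1, 4.0]
print(round(hill_exponent(amps, 0.1), 2))
```

Station-to-station variation in the fitted exponent, as the chapter reports, would show up directly as variation in the returned alpha.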

  5. Statistical mechanics of complex neural systems and high dimensional data

    NASA Astrophysics Data System (ADS)

    Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya

    2013-03-01

    Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.

  6. Hybrid PSO-ASVR-based method for data fitting in the calibration of infrared radiometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Sen; Li, Chengwei, E-mail: heikuanghit@163.com

    2016-06-15

    The present paper describes a hybrid particle swarm optimization-adaptive support vector regression (PSO-ASVR)-based method for data fitting in the calibration of an infrared radiometer. The proposed method combines PSO with adaptive processing and support vector regression (SVR). The optimization technique involves setting parameters in the ASVR fitting procedure, which significantly improves the fitting accuracy. However, its use in the calibration of infrared radiometers has not yet been widely explored. Bearing this in mind, the PSO-ASVR-based method, which is grounded in statistical learning theory, is successfully used here to obtain the relationship between the radiation of a standard source and the response of an infrared radiometer. The main advantages of this method are the flexible adjustment mechanism in data processing and the optimization mechanism in the kernel parameter setting of SVR. Numerical examples and applications to the calibration of an infrared radiometer are performed to verify the performance of the PSO-ASVR-based method compared to conventional data fitting methods.
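The PSO component can be sketched generically. The snippet below is a minimal particle swarm minimizer applied to a mock loss surface standing in for the SVR kernel-parameter search; the loss function, parameter ranges, and swarm constants are all invented for illustration, not taken from the paper:

```python
import random

def pso(f, bounds, n_particles=20, iters=60, rng=None):
    """Minimal particle swarm minimizer: each velocity is pulled toward
    the particle's own best position and the swarm's best position."""
    rng = rng or random.Random(42)
    dim = len(bounds)
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                  # per-particle best positions
    pbest = [f(x) for x in X]
    g = P[pbest.index(min(pbest))][:]      # global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (0.7 * V[i][d]
                           + 1.5 * rng.random() * (P[i][d] - X[i][d])
                           + 1.5 * rng.random() * (g[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pbest[i]:
                pbest[i], P[i] = fx, X[i][:]
                if fx < f(g):
                    g = X[i][:]
    return g

# mock loss surface standing in for SVR parameter tuning (C, gamma)
loss = lambda p: (p[0] - 3.0) ** 2 + (p[1] - 0.5) ** 2
best = pso(loss, [(0.0, 10.0), (0.0, 2.0)])
print([round(b, 2) for b in best])
```

In the real calibration problem, `loss` would be the cross-validated SVR fitting error as a function of the kernel parameters.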

  7. SHARE: system design and case studies for statistical health information release

    PubMed Central

    Gardner, James; Xiong, Li; Xiao, Yonghui; Gao, Jingjing; Post, Andrew R; Jiang, Xiaoqian; Ohno-Machado, Lucila

    2013-01-01

    Objectives We present SHARE, a new system for statistical health information release with differential privacy. We present two case studies that evaluate the software on real medical datasets and demonstrate the feasibility and utility of applying the differential privacy framework on biomedical data. Materials and Methods SHARE releases statistical information in electronic health records with differential privacy, a strong privacy framework for statistical data release. It includes a number of state-of-the-art methods for releasing multidimensional histograms and longitudinal patterns. We performed a variety of experiments on two real datasets, the surveillance, epidemiology and end results (SEER) breast cancer dataset and the Emory electronic medical record (EeMR) dataset, to demonstrate the feasibility and utility of SHARE. Results Experimental results indicate that SHARE can deal with heterogeneous data present in medical data, and that the released statistics are useful. The Kullback–Leibler divergence between the released multidimensional histograms and the original data distribution is below 0.5 and 0.01 for seven-dimensional and three-dimensional data cubes generated from the SEER dataset, respectively. The relative error for longitudinal pattern queries on the EeMR dataset varies between 0 and 0.3. While the results are promising, they also suggest that challenges remain in applying statistical data release using the differential privacy framework for higher dimensional data. Conclusions SHARE is one of the first systems to provide a mechanism for custodians to release differentially private aggregate statistics for a variety of use cases in the medical domain. This proof-of-concept system is intended to be applied to large-scale medical data warehouses. PMID:23059729
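The histogram release that SHARE performs can be illustrated with the basic Laplace mechanism, the standard building block for differentially private counts. The sketch below uses invented counts (not the SEER or EeMR data) and scores the released histogram with the same Kullback-Leibler measure used in the evaluation:

```python
import math, random

def dp_histogram(counts, epsilon, rng):
    """Laplace mechanism for a disjoint histogram: sensitivity is 1,
    so each bin gets Laplace noise of scale 1/epsilon."""
    scale = 1.0 / epsilon
    # difference of two iid exponentials is Laplace-distributed
    noisy = [c + rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)
             for c in counts]
    return [max(x, 0.0) for x in noisy]  # clamp impossible negatives

def kl(p, q):
    """Kullback-Leibler divergence between normalized histograms."""
    eps = 1e-9
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def normalize(h):
    total = sum(h)
    return [x / total for x in h]

rng = random.Random(0)
true_counts = [120.0, 80.0, 40.0, 10.0]
released = dp_histogram(true_counts, epsilon=1.0, rng=rng)
print(round(kl(normalize(true_counts), normalize(released)), 4))
```

As in SHARE's results, utility degrades as dimensionality grows: with more (hence sparser) bins, the fixed-scale noise dominates the true counts and the divergence rises.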

  8. The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes

    ERIC Educational Resources Information Center

    Cartier, Stephen F.

    2011-01-01

    A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…

  9. Statistical Learning of Phonetic Categories: Insights from a Computational Approach

    ERIC Educational Resources Information Center

    McMurray, Bob; Aslin, Richard N.; Toscano, Joseph C.

    2009-01-01

    Recent evidence (Maye, Werker & Gerken, 2002) suggests that statistical learning may be an important mechanism for the acquisition of phonetic categories in the infant's native language. We examined the sufficiency of this hypothesis and its implications for development by implementing a statistical learning mechanism in a computational model…

  10. PREFACE: International Workshop on Statistical-Mechanical Informatics 2008 (IW-SMI 2008)

    NASA Astrophysics Data System (ADS)

    Hayashi, Masahito; Inoue, Jun-ichi; Kabashima, Yoshiyuki; Tanaka, Kazuyuki

    2009-01-01

    Statistical mechanical informatics (SMI) is an approach that applies physics to information science, in which many-body problems in information processing are tackled using statistical mechanics methods. In the last decade, the use of SMI has resulted in great advances in research into classical information processing, in particular, theories of information and communications, probabilistic inference and combinatorial optimization problems. It is expected that the success of SMI can be extended to quantum systems. The importance of many-body problems is also being recognized in quantum information theory (QIT), for which quantification of entanglement of bipartite systems has recently been almost completely established after considerable effort. SMI and QIT are sufficiently well developed that it is now appropriate to consider applying SMI to quantum systems and developing many-body theory in QIT. This combination of SMI and QIT is highly likely to contribute significantly to the development of both research fields. The International Workshop on Statistical-Mechanical Informatics has been organized in response to this situation. This workshop, held at Sendai International Conference Center, Sendai, Japan, 14-17 September 2008, and sponsored by the Grant-in-Aid for Scientific Research on Priority Areas `Deepening and Expansion of Statistical Mechanical Informatics (DEX-SMI)' (Head investigator: Yoshiyuki Kabashima, Tokyo Institute of Technology) (Project http://dex-smi.sp.dis.titech.ac.jp/DEX-SMI), was intended to provide leading researchers with strong interdisciplinary interests in QIT and SMI with the opportunity to engage in intensive discussions. The aim of the workshop was to expand SMI to quantum systems and QIT research on quantum (entangled) many-body systems, to discuss possible future directions, and to offer researchers the opportunity to exchange ideas that may lead to joint research initiatives. 
We would like to thank the contributors to the workshop as well as all the participants, who we hope enjoyed both the workshop and their stay in Sendai, one of the most beautiful cities in Japan. This successful workshop will stimulate further development of the interdisciplinary research field of QIT and SMI. Masahito Hayashi, Jun-ichi Inoue, Yoshiyuki Kabashima and Kazuyuki Tanaka Editors The IW-SMI 2008 Organizing Committee Kazuyuki Tanaka, General Chair (Tohoku University) Yoshiyuki Kabashima, Vice-General Chair (Tokyo Institute of Technology) Jun-ichi Inoue, Program Chair (Hokkaido University) Masahito Hayashi, Publications Chair (Tohoku University) Hidetoshi Nishimori (Tokyo Institute of Technology) Toshiyuki Tanaka (Kyoto University)

  11. Climate variability, vulnerability, and coping mechanism in Alaknanda catchment, Central Himalaya, India.

    PubMed

    Kumar, Kireet; Joshi, Sneh; Joshi, Varun

    2008-06-01

    A study was carried out to discover trends in the rainfall and temperature pattern of the Alaknanda catchment in the Central Himalaya. Data on the annual rainfall, monsoon rainfall for the last decade, and average annual temperatures over the last few decades were analyzed. Nonparametric methods (the Mann-Kendall test and Sen's method) were employed to identify trends. The Mann-Kendall test shows a decline in rainfall and a rise in temperature, and these trends were found to be statistically significant at the 95% confidence level for both transects. Sen's method also confirms this trend. This finding has to be taken seriously: if the same trend continues in the future, greater chances of drought are expected. The impact of climate change has been well perceived by the people of the catchment, and a coping mechanism has been developed at the local level.
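Both nonparametric tests named above have compact definitions. A minimal sketch on hypothetical rainfall values (the study's actual series is not reproduced in the abstract):

```python
def mann_kendall_s(x):
    """Mann-Kendall S statistic: concordant minus discordant pairs;
    a strongly negative S indicates a monotonic decline."""
    n = len(x)
    return sum((x[j] > x[i]) - (x[j] < x[i])
               for i in range(n - 1) for j in range(i + 1, n))

def sens_slope(x):
    """Sen's slope estimator: median of all pairwise slopes."""
    slopes = sorted((x[j] - x[i]) / (j - i)
                    for i in range(len(x) - 1) for j in range(i + 1, len(x)))
    m = len(slopes)
    return slopes[m // 2] if m % 2 else 0.5 * (slopes[m // 2 - 1] + slopes[m // 2])

rain = [980, 950, 960, 930, 900, 910, 880, 860]  # hypothetical annual rainfall (mm)
print(mann_kendall_s(rain), round(sens_slope(rain), 2))
```

In practice S is converted to a z-score (with a tie-corrected variance) to test significance at the 95% level, as the study does; the negative Sen's slope is the declining-rainfall trend itself.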

  12. Effects of Microstructural Variability on Thermo-Mechanical Properties of a Woven Ceramic Matrix Composite

    NASA Technical Reports Server (NTRS)

    Goldsmith, Marlana B.; Sankar, Bhavani V.; Haftka, Raphael T.; Goldberg, Robert K.

    2013-01-01

    The objectives of this paper include identifying important architectural parameters that describe the SiC/SiC five-harness satin weave composite and characterizing the statistical distributions and correlations of those parameters from photomicrographs of various cross sections. In addition, realistic artificial cross sections of a 2D representative volume element (RVE) are generated reflecting the variability found in the photomicrographs, which are used to determine the effects of architectural variability on the thermo-mechanical properties. Lastly, preliminary information is obtained on the sensitivity of thermo-mechanical properties to architectural variations. Finite element analysis is used in combination with a response surface and it is shown that the present method is effective in determining the effects of architectural variability on thermo-mechanical properties.

  13. Dynamics and Statistical Mechanics of Rotating and non-Rotating Vortical Flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Chjan

    Three projects were analyzed with the overall aim of developing a computational/analytical model for estimating values of the energy, angular momentum, enstrophy and total variation of fluid height at phase transitions between disordered and self-organized flow states in planetary atmospheres. It is believed that these transitions in equilibrium statistical mechanics models play a role in the construction of large-scale, stable structures including super-rotation in the Venusian atmosphere and the formation of the Great Red Spot on Jupiter. Exact solutions of the spherical energy-enstrophy models for rotating planetary atmospheres by Kac's method of steepest descent predicted phase transitions to super-rotating solid-body flows at high energy-to-enstrophy ratio for all planetary spins, and to sub-rotating modes if the planetary spin is large enough. These canonical statistical ensembles are well-defined for the long-range energy interactions that arise from 2D fluid flows on compact oriented manifolds such as the surface of the sphere and the torus. This is because in Fourier space, available through Hodge theory, the energy terms are exactly diagonalizable and hence have zero range, leading to well-defined heat baths.

  14. Symptom Clusters in Advanced Cancer Patients: An Empirical Comparison of Statistical Methods and the Impact on Quality of Life.

    PubMed

    Dong, Skye T; Costa, Daniel S J; Butow, Phyllis N; Lovell, Melanie R; Agar, Meera; Velikova, Galina; Teckle, Paulos; Tong, Allison; Tebbutt, Niall C; Clarke, Stephen J; van der Hoek, Kim; King, Madeleine T; Fayers, Peter M

    2016-01-01

    Symptom clusters in advanced cancer can influence patient outcomes. There is large heterogeneity in the methods used to identify symptom clusters. The aims were to investigate the consistency of symptom cluster composition in advanced cancer patients using different statistical methodologies for all patients across five primary cancer sites, and to examine which clusters predict functional status, a global assessment of health, and global quality of life. Principal component analysis and exploratory factor analysis (with different rotation and factor selection methods) and hierarchical cluster analysis (with different linkage and similarity measures) were used on a data set of 1562 advanced cancer patients who completed the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire-Core 30. Four clusters consistently formed for many of the methods and cancer sites: tense-worry-irritable-depressed (emotional cluster), fatigue-pain, nausea-vomiting, and concentration-memory (cognitive cluster). The emotional cluster was a stronger predictor of overall quality of life than the other clusters. Fatigue-pain was a stronger predictor of overall health than the other clusters. The cognitive cluster and fatigue-pain predicted physical functioning, role functioning, and social functioning. The four identified symptom clusters were consistent across statistical methods and cancer types, although there were some noteworthy differences. Statistical derivation of symptom clusters is in need of greater methodological guidance. A psychosocial pathway in the management of symptom clusters may improve quality of life. Biological mechanisms underpinning symptom clusters need to be delineated by future research. A framework for evidence-based screening, assessment, treatment, and follow-up of symptom clusters in advanced cancer is essential. Copyright © 2016 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
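Of the methods compared, hierarchical clustering is the most compact to sketch. The distances below are invented to mimic an emotional pair and a fatigue-pain pair; they are not derived from the EORTC QLQ-C30 data:

```python
def single_linkage(dist, k):
    """Greedy agglomerative clustering (single linkage) on a symmetric
    distance matrix, stopping when k clusters remain."""
    clusters = [{i} for i in range(len(dist))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # single linkage: distance between closest members
                d = min(dist[i][j] for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] |= clusters[b]
        del clusters[b]
    return clusters

# invented symptom distances (1 - correlation): 0=tense, 1=worry, 2=fatigue, 3=pain
dist = [[0.0, 0.2, 0.8, 0.9],
        [0.2, 0.0, 0.7, 0.8],
        [0.8, 0.7, 0.0, 0.3],
        [0.9, 0.8, 0.3, 0.0]]
print(single_linkage(dist, 2))
```

Swapping `min` for `max` (complete linkage) or changing the similarity measure is exactly the kind of methodological choice whose effect on cluster composition the paper investigates.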

  15. Estimating statistical uncertainty of Monte Carlo efficiency-gain in the context of a correlated sampling Monte Carlo code for brachytherapy treatment planning with non-normal dose distribution.

    PubMed

    Mukhopadhyay, Nitai D; Sampson, Andrew J; Deniz, Daniel; Alm Carlsson, Gudrun; Williamson, Jeffrey; Malusek, Alexandr

    2012-01-01

    Correlated sampling Monte Carlo methods can shorten computing times in brachytherapy treatment planning. Monte Carlo efficiency is typically estimated via efficiency gain, defined as the reduction in computing time by correlated sampling relative to conventional Monte Carlo methods when equal statistical uncertainties have been achieved. The determination of the efficiency gain uncertainty arising from random effects, however, is not a straightforward task, especially when the error distribution is non-normal. The purpose of this study is to evaluate the applicability of the F distribution and standardized uncertainty propagation methods (widely used in metrology to estimate uncertainty of physical measurements) for predicting confidence intervals about efficiency gain estimates derived from single Monte Carlo runs using fixed-collision correlated sampling in a simplified brachytherapy geometry. A bootstrap-based algorithm was used to simulate the probability distribution of the efficiency gain estimates, and the shortest 95% confidence interval was estimated from this distribution. It was found that the corresponding relative uncertainty was as large as 37% for this particular problem. The uncertainty propagation framework predicted confidence intervals reasonably well; however, its main disadvantage was that uncertainties of input quantities had to be calculated in a separate run via a Monte Carlo method. The F distribution noticeably underestimated the confidence interval. These discrepancies were influenced by several photons with large statistical weights which made extremely large contributions to the scored absorbed dose difference. The mechanism of acquiring high statistical weights in the fixed-collision correlated sampling method was explained and a mitigation strategy was proposed. Copyright © 2011 Elsevier Ltd. All rights reserved.
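The bootstrap construction of a shortest 95% interval can be sketched generically. The timing data below are invented, and the efficiency gain is simplified to a ratio of mean run times; the study's fixed-collision geometry and scored dose differences are not reproduced:

```python
import random

def shortest_ci(samples, level=0.95):
    """Shortest interval containing `level` of the sampled values."""
    s = sorted(samples)
    k = int(len(s) * level)
    i = min(range(len(s) - k), key=lambda j: s[j + k] - s[j])
    return s[i], s[i + k]

def bootstrap_gain_ci(times_ref, times_cs, n_boot=2000, rng=None):
    """Bootstrap the efficiency-gain estimate, here taken as the ratio
    of mean computing times (conventional / correlated sampling)."""
    rng = rng or random.Random(1)
    gains = []
    for _ in range(n_boot):
        r = [rng.choice(times_ref) for _ in times_ref]
        c = [rng.choice(times_cs) for _ in times_cs]
        gains.append((sum(r) / len(r)) / (sum(c) / len(c)))
    return shortest_ci(gains)

ref = [10.2, 9.8, 11.0, 10.5, 9.9, 10.8]  # conventional MC run times (min)
cs = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8]       # correlated-sampling run times
lo, hi = bootstrap_gain_ci(ref, cs)
print(round(lo, 2), round(hi, 2))
```

Because the bootstrap resamples the empirical distribution directly, it needs no normality assumption, which is why it serves as the reference against which the F-distribution and propagation methods are judged.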

  16. U.S. residential consumer product information: Validation of methods for post-stratification weighting of Amazon Mechanical Turk surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenblatt, Jeffery B.; Yang, Hung-Chia; Desroches, Louis-Benoit

    2013-04-01

    We present two post-stratification weighting methods to validate survey data collected using Amazon Mechanical Turk (AMT). Two surveys focused on appliance and consumer electronics devices were administered in the spring and summer of 2012, each to approximately 3,000 U.S. households. Specifically, the surveys asked questions about residential refrigeration products, televisions (TVs) and set-top boxes (STBs). Filtered data were assigned weights using each of two weighting methods, termed “sequential” and “simultaneous,” by examining up to eight demographic variables (income, education, gender, race, Hispanic origin, number of occupants, ages of occupants, and geographic region) in comparison to reference U.S. demographic data from the 2009 Residential Energy Consumption Survey (RECS). Five key questions from the surveys (number of refrigerators, number of freezers, number of TVs, number of STBs and primary service provider) were evaluated with a set of statistical tests to determine whether either method improved the agreement of AMT with reference data, and if so, which method was better. The statistical tests used were: differences in proportions, distributions of proportions (using Pearson’s chi-squared test), and differences in average numbers of devices as functions of all demographic variables. The results indicated that both methods generally improved the agreement between AMT and reference data, sometimes greatly, but that the simultaneous method was usually superior to the sequential method. Some differences in sample populations were found between the AMT and reference data. Differences in the proportion of STBs reflected large changes in the STB market since the time our reference data were acquired in 2009. Differences in the proportions of some primary service providers suggested real sample bias, with the possible explanation that AMT users are more likely to subscribe to providers who also provide home internet service. Differences in other variables, while statistically significant in some cases, were nonetheless considered to be minor. Depending on the intended purpose of the data collected using AMT, these biases may or may not be important; to correct them, additional questions and/or further post-survey adjustments could be employed. In general, based on the analysis methods and the sample datasets used in this study, AMT surveys appeared to provide useful data on appliance and consumer electronics devices.
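The core post-stratification step can be sketched with a single collapsed demographic cell and invented shares; the paper's sequential and simultaneous schemes over eight variables are more involved, so this is only the basic weighting identity they both build on:

```python
from collections import Counter

def poststratify(sample_cells, pop_shares):
    """Cell-based post-stratification: each respondent in cell c gets
    weight pop_share(c) / sample_share(c), so the weighted cell shares
    match the reference population."""
    n = len(sample_cells)
    counts = Counter(sample_cells)
    return [pop_shares[c] / (counts[c] / n) for c in sample_cells]

# hypothetical collapsed cell variable (e.g. income bracket)
sample = ["low"] * 6 + ["high"] * 4   # AMT sample: 60% low income
pop = {"low": 0.4, "high": 0.6}       # reference population (e.g. RECS): 40% low
w = poststratify(sample, pop)
high_share = sum(wi for wi, c in zip(w, sample) if c == "high") / sum(w)
print(round(high_share, 2))
```

With many crossed cells this direct ratio becomes unstable (empty cells, extreme weights), which is the practical motivation for sequential or simultaneous adjustment over the marginals instead.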

  17. Recent Development on O(+) - O Collision Frequency and Ionosphere-Thermosphere Coupling

    NASA Technical Reports Server (NTRS)

    Omidvar, K.; Menard, R.

    1999-01-01

    The collision frequency between an oxygen atom and its singly charged ion controls the momentum transfer between the ionosphere and the thermosphere. There has been a long-standing discrepancy, extending over a decade, between the theoretical and empirical determinations of this frequency: the empirical value exceeded the theoretical value by a factor of 1.7. Recent improvements in theory were obtained by using accurate oxygen ion-oxygen atom potential energy curves and partial wave quantum mechanical calculations. We have now applied three independent statistical methods to the observational data, obtained at the MIT/Millstone Hill Observatory, consisting of two sets A and B. These methods give results consistent with each other and, together with the recent theoretical improvements, bring the ratio close to unity, as it should be. The three statistical methods lead to an average for the ratio of the empirical to the theoretical values equal to 0.98, with an uncertainty of +/-8%, resolving the old discrepancy between theory and observation. The Hines statistics and the lognormal distribution statistics give lower and upper bounds for Set A equal to 0.89 and 1.02, respectively. The related bounds for Set B are 1.06 and 1.17. The average values of these bounds thus bracket the ideal value of the ratio, which should be equal to unity. The main source of uncertainty is the error in the profile of the oxygen atom density, which is of the order of 11%. An alternative method for finding the oxygen atom density is suggested.

  18. Bayesian Sensitivity Analysis of Statistical Models with Missing Data

    PubMed Central

    ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG

    2013-01-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718

  19. Validation of phenol red versus gravimetric method for water reabsorption correction and study of gender differences in Doluisio's absorption technique.

    PubMed

    Tuğcu-Demiröz, Fatmanur; Gonzalez-Alvarez, Isabel; Gonzalez-Alvarez, Marta; Bermejo, Marival

    2014-10-01

    The aim of the present study was to develop a method for water flux reabsorption measurement in Doluisio's Perfusion Technique based on the use of phenol red as a non-absorbable marker and to validate it by comparison with the gravimetric procedure. The compounds selected for the study were metoprolol, atenolol, cimetidine and cefadroxil, in order to include low, intermediate and high permeability drugs absorbed by passive diffusion and by carrier-mediated mechanisms. The intestinal permeabilities (Peff) of the drugs were obtained in male and female Wistar rats and calculated using both methods of water flux correction. The absorption rate coefficients of all the assayed compounds did not show statistically significant differences between male and female rats; consequently, all the individual values were combined to compare the reabsorption methods. The absorption rate coefficients and permeability values did not show statistically significant differences between the two strategies of concentration correction. The apparent zero order water absorption coefficients were also similar in both correction procedures. In conclusion, the gravimetric and phenol red methods for water reabsorption correction are accurate and interchangeable for permeability estimation in the closed loop perfusion method. Copyright © 2014 Elsevier B.V. All rights reserved.
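The marker-based correction rests on a simple mass balance: phenol red is not absorbed, so a rise in its concentration signals water loss from the intestinal loop. A sketch of the usual formulation (variable names and numbers are ours, not the paper's):

```python
def luminal_volume(v0, pr0, pr_t):
    """Non-absorbable marker balance: the phenol red amount is constant,
    so V0 * PR0 = V_t * PR_t, hence V_t = V0 * PR0 / PR_t."""
    return v0 * pr0 / pr_t

def corrected_conc(drug_conc_t, v0, pr0, pr_t):
    """Refer the measured drug concentration back to the initial volume,
    removing the concentrating effect of water reabsorption."""
    return drug_conc_t * luminal_volume(v0, pr0, pr_t) / v0

# water reabsorption raised phenol red from 50 to 62.5 ug/mL (20% volume loss)
print(corrected_conc(drug_conc_t=10.0, v0=5.0, pr0=50.0, pr_t=62.5))
```

The gravimetric alternative measures V_t by weighing the loop contents directly; the study's finding is that both routes to V_t give interchangeable permeability estimates.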

  20. Quantifying Cartilage Contact Modulus, Tension Modulus, and Permeability With Hertzian Biphasic Creep

    PubMed Central

    Moore, A. C.; DeLucca, J. F.; Elliott, D. M.; Burris, D. L.

    2016-01-01

    This paper describes a new method, based on a recent analytical model (Hertzian biphasic theory (HBT)), to simultaneously quantify cartilage contact modulus, tension modulus, and permeability. Standard Hertzian creep measurements were performed on 13 osteochondral samples from three mature bovine stifles. Each creep dataset was fit for material properties using HBT. A subset of the dataset (N = 4) was also fit using Oyen's method and FEBio, an open-source finite element package designed for soft tissue mechanics. The HBT method demonstrated statistically significant sensitivity to differences between cartilage from the tibial plateau and cartilage from the femoral condyle. Based on the four samples used for comparison, no statistically significant differences were detected between properties from the HBT and FEBio methods. While the finite element method is considered the gold standard for analyzing this type of contact, the expertise and time required to set up and solve can be prohibitive, especially for large datasets. The HBT method agreed quantitatively with FEBio but also offers ease of use by nonexperts, rapid solutions, and exceptional fit quality (R2 = 0.999 ± 0.001, N = 13). PMID:27536012

  1. Convolutionless Nakajima-Zwanzig equations for stochastic analysis in nonlinear dynamical systems.

    PubMed

    Venturi, D; Karniadakis, G E

    2014-06-08

    Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima-Zwanzig-Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection-reaction problems.

  3. Playing at Statistical Mechanics

    ERIC Educational Resources Information Center

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)

  4. Evaluating the Amount of Tooth Movement and Root Resorption during Canine Retraction with Friction versus Frictionless Mechanics Using Cone Beam Computed Tomography

    PubMed Central

    Makhlouf, Mohamed; Aboul–Ezz, Amr; Fayed, Mona Salah; Hafez, Hend

    2018-01-01

    BACKGROUND: The current study was carried out to compare the amount of tooth movement during canine retraction under two different retraction mechanics, friction mechanics represented by a NiTi closed-coil spring versus frictionless mechanics represented by a T-loop, and their effect on root resorption using Cone Beam Computed Tomography (CBCT). METHOD: Ten patients with a malocclusion necessitating the extraction of the maxillary first premolars and retraction of the maxillary canines were selected in a split-mouth study design. The right maxillary canines were retracted using T-loops fabricated from 0.017 × 0.025 TMA wires. The left maxillary canines received a NiTi coil spring delivering 150 g of retraction force. Pre-retraction and post-retraction CBCT scans were taken to evaluate the amount of tooth movement and root resorption in three-dimensional planes. RESULTS: The T-loop side showed a statistically significantly higher mean anteroposterior measurement than the NiTi coil spring side, indicating less canine movement between the pre- and post-retraction scans. Concerning root resorption, there was no statistically significant change in the mean canine root length post retraction. CONCLUSION: The NiTi coil spring side showed more distal movement than the T-loop side. With a controlled retraction force, neither retraction mechanics caused root resorption. PMID:29531610

  5. Resistance gene identification from Larimichthys crocea with machine learning techniques

    NASA Astrophysics Data System (ADS)

    Cai, Yinyin; Liao, Zhijun; Ju, Ying; Liu, Juan; Mao, Yong; Liu, Xiangrong

    2016-12-01

    Research on resistance genes (R-genes) plays a vital role in bioinformatics, since these genes enable an organism to cope with adverse changes in the external environment by encoding the corresponding resistance proteins through transcription and translation. Identifying and predicting the R-genes of Larimichthys crocea (L. crocea) is therefore meaningful, benefiting both breeding and the marine environment. Many of L. crocea's immune mechanisms have been explored by biological methods; however, much about them is still unclear. In order to broaden this limited understanding and to detect new R-genes and R-gene-like genes, this paper proposes a combined prediction method that extracts features from the available genomic data and classifies them by machine learning. The effectiveness of the feature extraction and classification methods in identifying potential novel R-genes was evaluated, and different statistical analyses were utilized to explore the reliability of the prediction method, which can help us further understand the immune mechanisms of L. crocea against pathogens. A webserver called LCRG-Pred is available at http://server.malab.cn/rg_lc/.

  6. Experimental and numerical characterization of expanded glass granules

    NASA Astrophysics Data System (ADS)

    Chaudry, Mohsin Ali; Woitzik, Christian; Düster, Alexander; Wriggers, Peter

    2018-07-01

    In this paper, the material response of expanded glass granules at different scales and under different boundary conditions is investigated. At grain scale, single particle tests can be used to determine properties like Young's modulus or crushing strength. With experiments like triaxial and oedometer tests, it is possible to examine the bulk mechanical behaviour of the granular material. Our experimental investigation is complemented by a numerical simulation where the discrete element method is used to compute the mechanical behaviour of such materials. In order to improve the simulation quality, effects such as rolling resistance, inelastic behaviour, damage, and crushing are also included in the discrete element method. Furthermore, the variation of the material properties of granules is modelled by a statistical distribution and included in our numerical simulation.

  7. Random mechanics: Nonlinear vibrations, turbulences, seisms, swells, fatigue

    NASA Astrophysics Data System (ADS)

    Kree, P.; Soize, C.

    The random modeling of physical phenomena, together with probabilistic methods for the numerical calculation of random mechanical forces, are analytically explored. Attention is given to theoretical examinations such as probabilistic concepts, linear filtering techniques, and trajectory statistics. Applications of the methods to structures experiencing atmospheric turbulence, the quantification of turbulence, and the dynamic responses of the structures are considered. A probabilistic approach is taken to study the effects of earthquakes on structures and to the forces exerted by ocean waves on marine structures. Theoretical analyses by means of vector spaces and stochastic modeling are reviewed, as are Markovian formulations of Gaussian processes and the definition of stochastic differential equations. Finally, random vibrations with a variable number of links and linear oscillators undergoing the square of Gaussian processes are investigated.

  8. Dielectric properties of classical and quantized ionic fluids.

    PubMed

    Høye, Johan S

    2010-06-01

    We study time-dependent correlation functions of classical and quantum gases using methods of equilibrium statistical mechanics for systems of uniform as well as nonuniform densities. The basis for our approach is the path integral formalism of quantum mechanical systems. With this approach the statistical mechanics of a quantum mechanical system becomes the equivalent of a classical polymer problem in four dimensions, where imaginary time is the fourth dimension. Several nontrivial results for quantum systems have been obtained earlier by this analogy. Here, we focus upon a time-dependent electromagnetic pair interaction, in which the electromagnetic vector potential, which depends upon currents, is present. Thus both density and current correlations are needed to evaluate the influence of this interaction. We then utilize the fact that densities and currents can be expressed through polarizations, by which the ionic fluid can be regarded as a dielectric one for which a nonlocal susceptibility is found. This nonlocality has as a consequence that we find no contribution from a possible transverse electric zero-frequency mode to the Casimir force between metallic plates. Further, we establish expressions for a leading correction to ab initio calculations of the energies of the quantized electrons of molecules, where retardation effects are now also taken into account.

  9. Inverse statistical physics of protein sequences: a key issues review.

    PubMed

    Cocco, Simona; Feinauer, Christoph; Figliuzzi, Matteo; Monasson, Rémi; Weigt, Martin

    2018-03-01

    In the course of evolution, proteins undergo important changes in their amino acid sequences, while their three-dimensional folded structure and their biological function remain remarkably conserved. Thanks to modern sequencing techniques, sequence data accumulate at unprecedented pace. This provides large sets of so-called homologous, i.e. evolutionarily related protein sequences, to which methods of inverse statistical physics can be applied. Using sequence data as the basis for the inference of Boltzmann distributions from samples of microscopic configurations or observables, it is possible to extract information about evolutionary constraints and thus protein function and structure. Here we give an overview over some biologically important questions, and how statistical-mechanics inspired modeling approaches can help to answer them. Finally, we discuss some open questions, which we expect to be addressed over the next years.

  10. Inverse statistical physics of protein sequences: a key issues review

    NASA Astrophysics Data System (ADS)

    Cocco, Simona; Feinauer, Christoph; Figliuzzi, Matteo; Monasson, Rémi; Weigt, Martin

    2018-03-01

    In the course of evolution, proteins undergo important changes in their amino acid sequences, while their three-dimensional folded structure and their biological function remain remarkably conserved. Thanks to modern sequencing techniques, sequence data accumulate at unprecedented pace. This provides large sets of so-called homologous, i.e. evolutionarily related protein sequences, to which methods of inverse statistical physics can be applied. Using sequence data as the basis for the inference of Boltzmann distributions from samples of microscopic configurations or observables, it is possible to extract information about evolutionary constraints and thus protein function and structure. Here we give an overview over some biologically important questions, and how statistical-mechanics inspired modeling approaches can help to answer them. Finally, we discuss some open questions, which we expect to be addressed over the next years.
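
    As a minimal illustration of the inference direction described in this review, and not of the pairwise direct-coupling models it actually covers, the sketch below fits an independent-site "profile" model to a toy alignment: column frequencies with pseudocounts define fields h_i(a) = ln f_i(a), the maximum-entropy Boltzmann model when only single-site frequencies are constrained. The alphabet and alignment are made up.

```python
import numpy as np

ALPHABET = "ACDE"  # toy 4-letter alphabet standing in for the 20 amino acids

def profile_model(alignment, pseudocount=0.5):
    """Fit an independent-site model: column frequencies f_i(a) with
    pseudocounts, and Boltzmann fields h_i(a) = ln f_i(a)."""
    seqs = np.array([[ALPHABET.index(c) for c in s] for s in alignment])
    n_seq, length = seqs.shape
    q = len(ALPHABET)
    counts = np.zeros((length, q))
    for i in range(length):
        counts[i] = np.bincount(seqs[:, i], minlength=q)
    freqs = (counts + pseudocount) / (n_seq + q * pseudocount)
    fields = np.log(freqs)   # fields of the independent-site Boltzmann model
    return freqs, fields

alignment = ["ACDA", "ACDE", "ACDA", "ADDA", "ACEA"]  # toy "homologs"
freqs, fields = profile_model(alignment)
print(freqs[0])   # column 0 is conserved 'A', so its frequency dominates
```

    The review's central point is that going beyond this independent-site picture, to models with pairwise couplings, is what exposes structural contacts.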

  11. Viewing health expenditures, payment and coping mechanisms with an equity lens in Nigeria

    PubMed Central

    2013-01-01

    Background This paper examines socio-economic and geographic differences in payment and payment coping mechanisms for health services in southeast Nigeria. It shows the extent to which the poor and rural dwellers disproportionately bear the burden of health care costs and offers policy recommendations for improvements. Methods Questionnaires were used to collect data from 3071 randomly selected households in six communities in southeast Nigeria using a four week recall. The sample was divided into quintiles (Q1-Q5) using a socio-economic status (SES) index as well as into geographic groups (rural, peri-urban and urban). Tabulations and logistic regression were used to determine the relationships between payment and payment coping mechanisms and key independent variables. Q1/Q5 and rural/urban ratios were the measures of equity. Results Most of the respondents used out-of-pocket spending (OOPS) and own money to pay for healthcare. There were statistically significant geographic differences in the use of own money to pay for health services indicating more use among rural dwellers. Logistic regression showed statistically significant geographic differences in the use of both OOPS and own money when controlling for the effects of potential confounders. Conclusions This study shows statistically significant geographic differences in the use of OOPS and own money to pay for health services. Though the SES differences were not statistically significant, they showed high equity ratios indicating more use among poor and rural dwellers. The high expenditure incurred on drugs alone highlights the need for expediting pro-poor interventions like exemptions and waivers aimed at improving access to health care for the vulnerable poor and rural dwellers. PMID:23497246

  12. Quantum mechanics as classical statistical mechanics with an ontic extension and an epistemic restriction.

    PubMed

    Budiyono, Agung; Rohrlich, Daniel

    2017-11-03

    Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.

  13. Studies in Non-Equilibrium Statistical Mechanics.

    DTIC Science & Technology

    1982-09-01

    in the formalism, and this is used to simulate the effects of rotational states and collisions. At each stochastic step the energy changes in the...uses of this method. 10. A Scaling Theoretical Analysis of Vibrational Relaxation Experiments: Rotational Effects and Long-Range Collisions 0...in- elude rotational effects through the rotational energy gaps and the rotational distributions. The variables in this theory are a fundamental set

  14. Chemical Potential for the Interacting Classical Gas and the Ideal Quantum Gas Obeying a Generalized Exclusion Principle

    ERIC Educational Resources Information Center

    Sevilla, F. J.; Olivares-Quiroz, L.

    2012-01-01

    In this work, we address the concept of the chemical potential [mu] in classical and quantum gases towards the calculation of the equation of state [mu] = [mu](n, T) where n is the particle density and "T" the absolute temperature using the methods of equilibrium statistical mechanics. Two cases seldom discussed in elementary textbooks are…

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hallerman, G.; Gray, R.J.

    An instrument for crushing-strength determinations of uncoated and pyrolytic-carbon-coated fuel particles (50 to 500 μ in diameter) was developed to relate the crushing strength of the particles to their fabricability. The instrument consists of a loading mechanism, a load cell, and a power supply-readout unit. The information that can be obtained by statistical analysis of the data is illustrated by results on two batches of fuel particles. (auth)

  16. Interior Noise

    NASA Technical Reports Server (NTRS)

    Mixson, John S.; Wilby, John F.

    1991-01-01

    The generation and control of flight vehicle interior noise is discussed. Emphasis is placed on the mechanisms of transmission through airborne and structure-borne paths and the control of cabin noise by path modification. Techniques for identifying the relative contributions of the various source-path combinations are also discussed along with methods for the prediction of aircraft interior noise such as those based on the general modal theory and statistical energy analysis.

  17. Numerical and Statistical Analysis of Fractures in Mechanically Dissimilar Rocks of Limestone Interbedded with Shale from Nash Point in Bristol Channel, South Wales, UK.

    NASA Astrophysics Data System (ADS)

    Adeoye-Akinde, K.; Gudmundsson, A.

    2017-12-01

    Heterogeneity and anisotropy, especially with layered strata within the same reservoir, make the geometry and permeability of an in-situ fracture network challenging to forecast. This study looks at outcrops analogous to reservoir rocks for a better understanding of in-situ fracture networks and permeability, especially fracture formation, propagation, and arrest/deflection. Here, fracture geometry (e.g. length and aperture) from interbedded limestone and shale is combined with statistical and numerical modelling (using the Finite Element Method) to better forecast fracture network properties and permeability. The main aim is to bridge the gap between fracture data obtained at the core level (cm-scale) and at the seismic level (km-scale). Analysis has been made of the geometric properties of over 250 fractures from the Blue Lias at Nash Point, UK. As fractures propagate, energy is required to keep them going, and according to the laws of thermodynamics this energy can be linked to entropy. As fractures grow, entropy increases; accordingly, the results show a strong linear correlation between entropy and the scaling exponents of the fracture length and aperture-size distributions. Modelling is used to numerically simulate the stress/fracture behaviour in mechanically dissimilar rocks. Results show that the maximum principal compressive stress orientation changes in the host rock as the stress field induced at the fracture tip moves towards a more compliant (shale) layer. This behaviour can be related to the three mechanisms of fracture arrest/deflection at an interface, namely elastic mismatch, stress barrier, and Cook-Gordon debonding. Tensile stress concentrates at the contact between the stratigraphic layers, ahead of and around the propagating fracture. However, as shale stiffens with time, the stresses concentrated at the contact start to dissipate into it. This can happen in nature through diagenesis and with greater depth of burial.
This study also investigates how induced fractures propagate and interact with existing discontinuities in layered rocks using analogue modelling. Further work will introduce the Maximum Entropy Method for more accurate statistical modelling. This method is mainly useful to forecast likely fracture-size probability distributions from incomplete subsurface information.
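
    The scaling exponents of the fracture length and aperture-size distributions mentioned above are usually estimated by maximum likelihood rather than log-log regression. A generic sketch of the standard Pareto-tail MLE (the Hill estimator), run on synthetic data rather than the Nash Point measurements:

```python
import numpy as np

def powerlaw_mle(x, xmin=1.0):
    """Maximum-likelihood shape exponent for a Pareto tail
    p(x) = a * xmin**a * x**(-a-1), x >= xmin (Hill estimator)."""
    x = np.asarray(x, dtype=float)
    x = x[x >= xmin]
    return len(x) / np.log(x / xmin).sum()

rng = np.random.default_rng(42)
true_a = 2.5
# numpy's pareto() draws Lomax samples; adding 1 gives Pareto with xmin = 1
samples = rng.pareto(true_a, size=50000) + 1.0
a_hat = powerlaw_mle(samples)
print(f"estimated exponent: {a_hat:.2f}  (true: {true_a})")
```

    With 50,000 samples the estimator is accurate to a few percent; field datasets of a few hundred fractures carry correspondingly larger uncertainty.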

  18. [Hormonal (levonorgestrel) emergency contraception--effectiveness and mechanism of action].

    PubMed

    Medard, Lech M; Ostrowska, Lucyna

    2010-07-01

    Periodic abstinence and coitus interruptus are the most popular methods of contraception in Poland. Recent studies have provided us with evidence that the so-called "menstrual calendar" may be much less effective than it was believed. In these circumstances, promotion and use of safe and truly effective contraceptives is very important for Polish women. Emergency contraception (EC) is a method which could be used even in cases when other contraception methods have failed. Mechanism of action of levonorgestrel used for EC and possible disturbances in the process of implantation of the blastocyst in the endometrium, remain the source of heated discussion among medical professionals. The latest publications provide us with evidence that the use of levonorgestrel in EC neither alters endometrial receptivity nor impedes implantation. Hormonal EC effectiveness is another hot topic of gynecological endocrinology and statistics. There is, however, no better, safer, and more ethically accepted method of preventing unwanted pregnancy for patients in need of postcoital contraception.

  19. Thermal helium clusters at 3.2 Kelvin in classical and semiclassical simulations

    NASA Astrophysics Data System (ADS)

    Schulte, J.

    1993-03-01

    The thermodynamic stability of ⁴He₄₋₁₃ at 3.2 K is investigated with the classical Monte Carlo method, with the semiclassical path-integral Monte Carlo (PIMC) method, and with the semiclassical all-order many-body method. In the all-order many-body simulation the dipole-dipole approximation including a short-range correction is used. The resulting stability plots are discussed and related to recent TOF experiments by Stephens and King. It is found that classical Monte Carlo, of course, cannot resolve the characteristics of the measured mass spectrum. With PIMC, switching on more and more quantum mechanics by raising the number of virtual time steps results in more structure in the stability plot, but this did not lead to sufficient agreement with the TOF experiment. Only the all-order many-body method resolved the characteristic structures of the measured mass spectrum, including magic numbers. The result shows the influence of quantum statistics and quantum mechanics on the stability of small neutral helium clusters.

  20. Is using multiple imputation better than complete case analysis for estimating a prevalence (risk) difference in randomized controlled trials when binary outcome observations are missing?

    PubMed

    Mukaka, Mavuto; White, Sarah A; Terlouw, Dianne J; Mwapasa, Victor; Kalilani-Phiri, Linda; Faragher, E Brian

    2016-07-22

    Missing outcomes can seriously impair the ability to make correct inferences from randomized controlled trials (RCTs). Complete case (CC) analysis is commonly used, but it reduces sample size and is perceived to lead to reduced statistical efficiency of estimates while increasing the potential for bias. As multiple imputation (MI) methods preserve sample size, they are generally viewed as the preferred analytical approach. We examined this assumption, comparing the performance of CC and MI methods to determine risk difference (RD) estimates in the presence of missing binary outcomes. We conducted simulation studies of 5000 simulated data sets with 50 imputations of RCTs with one primary follow-up endpoint at different underlying levels of RD (3-25 %) and missing outcomes (5-30 %). For missing at random (MAR) or missing completely at random (MCAR) outcomes, CC method estimates generally remained unbiased and achieved precision similar to or better than MI methods, and high statistical coverage. Missing not at random (MNAR) scenarios yielded invalid inferences with both methods. Effect size estimate bias was reduced in MI methods by always including group membership even if this was unrelated to missingness. Surprisingly, under MAR and MCAR conditions in the assessed scenarios, MI offered no statistical advantage over CC methods. While MI must inherently accompany CC methods for intention-to-treat analyses, these findings endorse CC methods for per protocol risk difference analyses in these conditions. These findings provide an argument for the use of the CC approach to always complement MI analyses, with the usual caveat that the validity of the mechanism for missingness be thoroughly discussed. More importantly, researchers should strive to collect as much data as possible.
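
    The complete-case risk-difference estimator examined above is easy to reproduce in a small simulation. The sketch below is an illustrative single-trial setup of my own, not the authors' 5000-dataset design: one large two-arm trial with a binary outcome and MCAR missingness, for which the complete-case risk difference should stay close to the true RD.

```python
import numpy as np

def cc_risk_difference(n_per_arm=200_000, p_treat=0.30, p_ctrl=0.20,
                       p_missing=0.20, seed=7):
    """Complete-case risk difference for a two-arm trial with a binary
    outcome whose observations are missing completely at random (MCAR)."""
    rng = np.random.default_rng(seed)
    y_treat = rng.random(n_per_arm) < p_treat       # binary outcomes
    y_ctrl = rng.random(n_per_arm) < p_ctrl
    obs_treat = rng.random(n_per_arm) >= p_missing  # MCAR observation masks
    obs_ctrl = rng.random(n_per_arm) >= p_missing
    # Analyze only the observed (complete) cases in each arm
    return y_treat[obs_treat].mean() - y_ctrl[obs_ctrl].mean()

rd = cc_risk_difference()
print(f"complete-case risk difference: {rd:.3f}  (true RD: 0.100)")
```

    Under MNAR missingness, making the observation masks depend on the outcomes themselves would bias this estimate, which is the failure mode the authors report for both CC and MI.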

  1. Statistical Thermodynamics and Microscale Thermophysics

    NASA Astrophysics Data System (ADS)

    Carey, Van P.

    1999-08-01

    Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.

  2. Application of statistical mechanical methods to the modeling of social networks

    NASA Astrophysics Data System (ADS)

    Strathman, Anthony Robert

    With the recent availability of large-scale social data sets, social networks have become open to quantitative analysis via the methods of statistical physics. We examine the statistical properties of a real large-scale social network, generated from cellular phone call-trace logs. We find this network, like many other social networks to be assortative (r = 0.31) and clustered (i.e., strongly transitive, C = 0.21). We measure fluctuation scaling to identify the presence of internal structure in the network and find that structural inhomogeneity effectively disappears at the scale of a few hundred nodes, though there is no sharp cutoff. We introduce an agent-based model of social behavior, designed to model the formation and dissolution of social ties. The model is a modified Metropolis algorithm containing agents operating under the basic sociological constraints of reciprocity, communication need and transitivity. The model introduces the concept of a social temperature. We go on to show that this simple model reproduces the global statistical network features (incl. assortativity, connected fraction, mean degree, clustering, and mean shortest path length) of the real network data and undergoes two phase transitions, one being from a "gas" to a "liquid" state and the second from a liquid to a glassy state as function of this social temperature.
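
    The global statistics quoted in this abstract, assortativity r and clustering C, can be computed directly from an adjacency matrix. A pure-NumPy sketch on a toy four-node graph (not the call-trace data):

```python
import numpy as np

def transitivity(adj):
    """Global clustering: 3 * triangles / wedges (paths of length 2)."""
    adj = np.asarray(adj)
    deg = adj.sum(axis=1)
    triangles = np.trace(adj @ adj @ adj) / 6.0  # each triangle counted 6x
    wedges = (deg * (deg - 1) / 2.0).sum()
    return 3.0 * triangles / wedges

def degree_assortativity(adj):
    """Pearson correlation of endpoint degrees over the directed edge list."""
    adj = np.asarray(adj)
    deg = adj.sum(axis=1)
    src, dst = np.nonzero(adj)  # each undirected edge appears in both orders
    return np.corrcoef(deg[src], deg[dst])[0, 1]

# Triangle 0-1-2 with a pendant node 3 attached to node 0
adj = np.array([[0, 1, 1, 1],
                [1, 0, 1, 0],
                [1, 1, 0, 0],
                [1, 0, 0, 0]])
print(transitivity(adj), degree_assortativity(adj))
```

    The pendant node makes this toy graph disassortative (r < 0); social networks like the one studied here are typically assortative (r > 0).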

  3. Region-specific network plasticity in simulated and living cortical networks: comparison of the center of activity trajectory (CAT) with other statistics

    NASA Astrophysics Data System (ADS)

    Chao, Zenas C.; Bakkum, Douglas J.; Potter, Steve M.

    2007-09-01

    Electrically interfaced cortical networks cultured in vitro can be used as a model for studying the network mechanisms of learning and memory. Lasting changes in functional connectivity have been difficult to detect with extracellular multi-electrode arrays using standard firing rate statistics. We used both simulated and living networks to compare the ability of various statistics to quantify functional plasticity at the network level. Using a simulated integrate-and-fire neural network, we compared five established statistical methods to one of our own design, called center of activity trajectory (CAT). CAT, which depicts dynamics of the location-weighted average of spatiotemporal patterns of action potentials across the physical space of the neuronal circuitry, was the most sensitive statistic for detecting tetanus-induced plasticity in both simulated and living networks. By reducing the dimensionality of multi-unit data while still including spatial information, CAT allows efficient real-time computation of spatiotemporal activity patterns. Thus, CAT will be useful for studies in vivo or in vitro in which the locations of recording sites on multi-electrode probes are important.
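
    The center of activity itself is a one-line computation: in each time bin, the spike-count-weighted mean of the electrode positions. A minimal sketch with made-up electrode coordinates and spike counts:

```python
import numpy as np

def center_of_activity_trajectory(counts, positions):
    """counts: (T, N) spikes per time bin per electrode;
    positions: (N, 2) electrode coordinates on the array.
    Returns (T, 2): location-weighted mean of activity in each bin."""
    counts = np.asarray(counts, dtype=float)
    positions = np.asarray(positions, dtype=float)
    totals = counts.sum(axis=1, keepdims=True)            # spikes per bin
    weights = counts / np.where(totals > 0, totals, 1.0)  # avoid 0-division
    return weights @ positions

positions = np.array([[0.0, 0.0], [1.0, 1.0]])  # two hypothetical electrodes
counts = np.array([[1, 1],    # equal activity -> CAT at the midpoint
                   [0, 2]])   # all activity at electrode 1
print(center_of_activity_trajectory(counts, positions))
```

    Because each time bin collapses to a single 2D point, the trajectory can be tracked in real time regardless of how many electrodes the array has, which is the dimensionality-reduction advantage the abstract describes.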

  4. Nonparametric identification of nonlinear dynamic systems using a synchronisation-based method

    NASA Astrophysics Data System (ADS)

    Kenderi, Gábor; Fidlin, Alexander

    2014-12-01

    The present study proposes an identification method for highly nonlinear mechanical systems that does not require a priori knowledge of the underlying nonlinearities to reconstruct arbitrary restoring force surfaces between degrees of freedom. This approach is based on the master-slave synchronisation between a dynamic model of the system as the slave and the real system as the master using measurements of the latter. As the model synchronises to the measurements, it becomes an observer of the real system. The optimal observer algorithm in a least-squares sense is given by the Kalman filter. Using the well-known state augmentation technique, the Kalman filter can be turned into a dual state and parameter estimator to identify parameters of a priori characterised nonlinearities. The paper proposes an extension of this technique towards nonparametric identification. A general system model is introduced by describing the restoring forces as bilateral spring-dampers with time-variant coefficients, which are estimated as augmented states. The estimation procedure is followed by an a posteriori statistical analysis to reconstruct noise-free restoring force characteristics using the estimated states and their estimated variances. Observability is provided using only one measured mechanical quantity per degree of freedom, which makes this approach less demanding in the number of necessary measurement signals compared with truly nonparametric solutions, which typically require displacement, velocity and acceleration signals. Additionally, due to the statistical rigour of the procedure, it successfully addresses signals corrupted by significant measurement noise. In the present paper, the method is described in detail, which is followed by numerical examples of one degree of freedom (1DoF) and 2DoF mechanical systems with strong nonlinearities of vibro-impact type to demonstrate the effectiveness of the proposed technique.
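
    The dual Kalman state-and-parameter estimator described here is too involved for a short sketch, but the object it reconstructs, a restoring force characteristic, can be illustrated with the older restoring-force-surface idea, a simpler technique than the paper's, shown only for orientation: with the mass, excitation, and acceleration known, the restoring force follows from Newton's second law as r = F_ext - m·a, and a least-squares fit over a chosen basis recovers the nonlinearity. The force model and coefficients below are hypothetical, and the data are noise-free for brevity; handling heavy measurement noise is precisely what the paper's Kalman approach adds.

```python
import numpy as np

# Hypothetical Duffing-type restoring force: r(x, v) = k*x + c*v + k3*x**3
k, c, k3 = 4.0, 0.3, 1.5
m = 1.0

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 500)      # sampled displacements
v = rng.uniform(-2.0, 2.0, 500)      # sampled velocities
f_ext = rng.uniform(-5.0, 5.0, 500)  # known external excitation

# Newton's second law gives the acceleration (measured in an experiment)
accel = (f_ext - (k * x + c * v + k3 * x**3)) / m
# Restoring force reconstructed purely from measured quantities
r = f_ext - m * accel

# Least-squares fit of the restoring force over the basis [x, v, x^3]
basis = np.column_stack([x, v, x**3])
coeffs, *_ = np.linalg.lstsq(basis, r, rcond=None)
print(coeffs)   # recovers [k, c, k3] in this noise-free sketch
```

    Note this classical approach needs displacement, velocity, and acceleration signals, which is exactly the measurement burden the paper's one-signal-per-DoF method is designed to avoid.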

  5. Methodological reporting of randomized trials in five leading Chinese nursing journals.

    PubMed

    Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu

    2014-01-01

    Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate adherence of methodological reporting to the CONSORT statement and to explore associated trial-level variables in the Chinese nursing care field. In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials that reported details of their randomization methods. The quality of methodological reporting was measured against the methods section of the CONSORT checklist, and the overall CONSORT methodological items score was calculated and expressed as a percentage. We also hypothesized that certain general and methodological characteristics were associated with reporting quality and explored these correlations by regression. The descriptive and regression statistics were calculated with SPSS 13.0. In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34 ± 0.97 (mean ± SD). No RCT reported descriptions of or changes to the "trial design," changes to the "outcomes," details of "implementation," or descriptions of the similarity of interventions for "blinding." Poor reporting was found in detailing the "settings of participants" (13.1%), "type of randomization sequence generation" (1.8%), calculation methods for "sample size" (0.4%), explanation of any interim analyses and stopping guidelines for "sample size" (0.3%), "allocation concealment mechanism" (0.3%), additional analyses in "statistical methods" (2.1%), and targeted subjects and methods of "blinding" (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of "participants," "interventions," and definitions of the "outcomes" and "statistical methods." The regression analysis found that publication year and intention-to-treat (ITT) analysis were weakly associated with the CONSORT score. Overall, the completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods.

  6. DNA viewed as an out-of-equilibrium structure

    NASA Astrophysics Data System (ADS)

    Provata, A.; Nicolis, C.; Nicolis, G.

    2014-05-01

    The complexity of the primary structure of human DNA is explored using methods from nonequilibrium statistical mechanics, dynamical systems theory, and information theory. A collection of statistical analyses is performed on the DNA data and the results are compared with sequences derived from different stochastic processes. The use of χ² tests shows that DNA cannot be described as a low-order Markov chain of order up to r = 6. Although detailed balance seems to hold at the level of a binary alphabet, it fails when all four base pairs are considered, suggesting spatial asymmetry and irreversibility. Furthermore, the block entropy does not increase linearly with the block size, reflecting the long-range nature of the correlations in the human genomic sequences. To probe locally the spatial structure of the chain, we study the exit distances from a specific symbol, the distribution of recurrence distances, and the Hurst exponent, all of which show power-law tails and long-range characteristics. These results suggest that human DNA can be viewed as a nonequilibrium structure maintained in its state through interactions with a constantly changing environment. Based solely on the exit distance distribution accounting for the nonequilibrium statistics and using the Monte Carlo rejection sampling method, we construct a model DNA sequence. This method allows us to keep both long- and short-range statistical characteristics of the native DNA data. The model sequence presents the same characteristic exponents as the natural DNA but fails to capture spatial correlations and point-to-point details.
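
The block-entropy diagnostic described above is straightforward to reproduce. The sketch below is not the authors' code and uses a synthetic sequence: it computes the Shannon entropy of length-n blocks and the entropy gain per added symbol, which stays near 2 bits for an uncorrelated four-letter sequence; the sublinear growth reported for human DNA would appear as a decreasing gain.

```python
import math
import random
from collections import Counter

def block_entropy(seq, n):
    """Shannon entropy (bits) of the distribution of length-n blocks in seq."""
    counts = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# For an i.i.d. uniform 4-letter sequence the gain H(n+1) - H(n) remains
# close to 2 bits per symbol (up to finite-sample bias).
random.seed(0)
seq = "".join(random.choice("ACGT") for _ in range(20000))
gains = [block_entropy(seq, n + 1) - block_entropy(seq, n) for n in (1, 2, 3)]
```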

  8. The Standard Model in the history of the Natural Sciences, Econometrics, and the social sciences

    NASA Astrophysics Data System (ADS)

    Fisher, W. P., Jr.

    2010-07-01

    In the late 18th and early 19th centuries, scientists appropriated Newton's laws of motion as a model for the conduct of any other field of investigation that would purport to be a science. This early form of a Standard Model eventually informed the basis of analogies for the mathematical expression of phenomena previously studied qualitatively, such as cohesion, affinity, heat, light, electricity, and magnetism. James Clerk Maxwell is known for his repeated use of a formalized version of this method of analogy in lectures, teaching, and the design of experiments. Economists transferring skills learned in physics made use of the Standard Model, especially after Maxwell demonstrated the value of conceiving it in abstract mathematics instead of as a concrete and literal mechanical analogy. Haavelmo's probability approach in econometrics and R. Fisher's Statistical Methods for Research Workers brought a statistical approach to bear on the Standard Model, quietly reversing the perspective of economics and the social sciences relative to that of physics. Where physicists, and Maxwell in particular, intuited scientific method as imposing stringent demands on the quality and interrelations of data, instruments, and theory in the name of inferential and comparative stability, statistical models and methods disconnected theory from data by removing the instrument as an essential component. New possibilities for reconnecting economics and the social sciences to Maxwell's sense of the method of analogy are found in Rasch's probabilistic models for measurement.

  9. New auto-segment method of cerebral hemorrhage

    NASA Astrophysics Data System (ADS)

    Wang, Weijiang; Shen, Tingzhi; Dang, Hua

    2007-12-01

    A novel method for automatic segmentation of cerebral hemorrhage (CH) in computerized tomography (CT) images is presented, built on an expert system that models human knowledge about the CH segmentation problem. The algorithm proceeds through a series of specialized steps and extracts easily overlooked CH features identified from statistics over a large number of real CH images, such as region area, region CT number, region smoothness, and statistical relationships between CH regions. A seven-step extraction mechanism ensures that these features are obtained correctly and efficiently. From these features, a decision tree modeling the human knowledge about CH segmentation is built, which underpins the soundness and accuracy of the algorithm. Finally, experiments were conducted to verify the correctness and reasonableness of the automatic segmentation; the high accuracy and fast speed make the method well suited for practical application.

  10. From plant traits to plant communities: a statistical mechanistic approach to biodiversity.

    PubMed

    Shipley, Bill; Vile, Denis; Garnier, Eric

    2006-11-03

    We developed a quantitative method, analogous to those used in statistical mechanics, to predict how biodiversity will vary across environments, which plant species from a species pool will be found in which relative abundances in a given environment, and which plant traits determine community assembly. This provides a scaling from plant traits to ecological communities while bypassing the complications of population dynamics. Our method treats community development as a sorting process involving species that are ecologically equivalent except with respect to particular functional traits, which leads to a constrained random assembly of species; the relative abundance of each species adheres to a general exponential distribution as a function of its traits. Using data for eight functional traits of 30 herbaceous species and community-aggregated values of these traits in 12 sites along a 42-year chronosequence of secondary succession, we predicted 94% of the variance in the relative abundances.
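
The core prediction, that each species' relative abundance is an exponential function of its traits, can be sketched directly. In the toy example below, the trait values and Lagrange multipliers are invented for illustration; in the actual method the multipliers are fitted so that the predicted community-aggregated traits match the observed ones.

```python
import math

def relative_abundances(traits, lambdas):
    """Maximum-entropy prediction: relative abundance p_i is proportional
    to exp(sum_k lambda_k * t_ik), an exponential function of the traits
    t_ik of species i."""
    weights = [math.exp(sum(l * t for l, t in zip(lambdas, ti))) for ti in traits]
    z = sum(weights)
    return [w / z for w in weights]

# Hypothetical 3-species pool with two traits each, and made-up multipliers.
traits = [(2.0, 0.5), (1.0, 1.5), (0.0, 1.0)]
p = relative_abundances(traits, (0.8, -0.3))
```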

  11. Automated sampling assessment for molecular simulations using the effective sample size

    PubMed Central

    Zhang, Xin; Bhatt, Divesh; Zuckerman, Daniel M.

    2010-01-01

    To quantify the progress in the development of algorithms and forcefields used in molecular simulations, a general method for the assessment of the sampling quality is needed. Statistical mechanics principles suggest the populations of physical states characterize equilibrium sampling in a fundamental way. We therefore develop an approach for analyzing the variances in state populations, which quantifies the degree of sampling in terms of the effective sample size (ESS). The ESS estimates the number of statistically independent configurations contained in a simulated ensemble. The method is applicable to traditional dynamics simulations as well as more modern (e.g., multi-canonical) approaches. Our procedure is tested in a variety of systems from toy models to atomistic protein simulations. We also introduce a simple automated procedure to obtain approximate physical states from dynamic trajectories: this allows sample-size estimation in systems for which physical states are not known in advance. PMID:21221418
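
One simple way to turn the state-population idea into an ESS number is to compare the across-block variance of each state's population with the binomial variance expected from independent samples. The sketch below illustrates that idea only; it is not the paper's exact estimator, and the block count is an arbitrary choice.

```python
def effective_sample_size(state_traj, n_blocks=10):
    """ESS sketch: for each state, compare the across-block variance of its
    population fraction with the binomial variance p(1-p)/block expected if
    samples were independent; the ratio estimates independent samples per
    block.  Highly correlated trajectories yield a small ESS."""
    N = len(state_traj)
    block = N // n_blocks
    ess_per_state = []
    for s in set(state_traj):
        p = state_traj.count(s) / N
        fracs = [state_traj[i * block:(i + 1) * block].count(s) / block
                 for i in range(n_blocks)]
        mean = sum(fracs) / n_blocks
        var = sum((f - mean) ** 2 for f in fracs) / (n_blocks - 1)
        if var > 0:
            ess_per_state.append(p * (1 - p) / var)
    return min(ess_per_state) * n_blocks if ess_per_state else float(N)
```

A fully sorted trajectory (all of state 0, then all of state 1) gives an ESS near the number of blocks, while a rapidly mixing one approaches the trajectory length.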

  12. Fault Diagnosis for Rotating Machinery Using Vibration Measurement Deep Statistical Feature Learning.

    PubMed

    Li, Chuan; Sánchez, René-Vinicio; Zurita, Grover; Cerrada, Mariela; Cabrera, Diego

    2016-06-17

    Fault diagnosis is important for the maintenance of rotating machinery. The detection of faults and fault patterns is a challenging part of machinery fault diagnosis. To tackle this problem, a model for deep statistical feature learning from vibration measurements of rotating machinery is presented in this paper. Vibration sensor signals collected from rotating mechanical systems are represented in the time, frequency, and time-frequency domains, each of which is then used to produce a statistical feature set. For learning statistical features, real-value Gaussian-Bernoulli restricted Boltzmann machines (GRBMs) are stacked to develop a Gaussian-Bernoulli deep Boltzmann machine (GDBM). The suggested approach is applied as a deep statistical feature learning tool for both gearbox and bearing systems. The fault classification performances in experiments using this approach are 95.17% for the gearbox, and 91.75% for the bearing system. The proposed approach is compared to such standard methods as a support vector machine, GRBM and a combination model. In experiments, the best fault classification rate was detected using the proposed model. The results show that deep learning with statistical feature extraction has an essential improvement potential for diagnosing rotating machinery faults.
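
A minimal example of the first stage, computing a time-domain statistical feature set from a vibration signal, might look as follows. The feature choice here is illustrative only; the paper builds a larger set across the time, frequency, and time-frequency domains before feeding it to the GDBM.

```python
import math
from statistics import mean, pstdev

def time_domain_features(x):
    """A few standard time-domain statistical features of a vibration signal."""
    mu, sigma = mean(x), pstdev(x)
    rms = math.sqrt(mean(v * v for v in x))          # root-mean-square level
    peak = max(abs(v) for v in x)
    kurt = mean(((v - mu) / sigma) ** 4 for v in x) if sigma else 0.0
    return {"rms": rms, "peak": peak,
            "crest_factor": peak / rms, "kurtosis": kurt}

# Synthetic signal: a 0.5-amplitude sinusoid (RMS ~ 0.354, crest ~ 1.41,
# kurtosis ~ 1.5); impulsive bearing faults would raise crest and kurtosis.
features = time_domain_features([0.5 * math.sin(0.1 * i) for i in range(1000)])
```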

  14. In silico environmental chemical science: properties and processes from statistical and computational modelling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tratnyek, Paul G.; Bylaska, Eric J.; Weber, Eric J.

    2017-01-01

    Quantitative structure–activity relationships (QSARs) have long been used in the environmental sciences. More recently, molecular modeling and chemoinformatic methods have become widespread. These methods have the potential to expand and accelerate advances in environmental chemistry because they complement observational and experimental data with “in silico” results and analysis. The opportunities and challenges that arise at the intersection between statistical and theoretical in silico methods are most apparent in the context of properties that determine the environmental fate and effects of chemical contaminants (degradation rate constants, partition coefficients, toxicities, etc.). The main example of this is the calibration of QSARs using descriptor variable data calculated from molecular modeling, which can make QSARs more useful for predicting property data that are unavailable, but also can make them more powerful tools for diagnosis of fate determining pathways and mechanisms. Emerging opportunities for “in silico environmental chemical science” are to move beyond the calculation of specific chemical properties using statistical models and toward more fully in silico models, prediction of transformation pathways and products, incorporation of environmental factors into model predictions, integration of databases and predictive models into more comprehensive and efficient tools for exposure assessment, and extending the applicability of all the above from chemicals to biologicals and materials.

  15. Computational Modeling of Statistical Learning: Effects of Transitional Probability versus Frequency and Links to Word Learning

    ERIC Educational Resources Information Center

    Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.

    2010-01-01

    Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…

  16. Generalized statistical mechanics approaches to earthquakes and tectonics.

    PubMed

    Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios

    2016-12-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity at local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale for the macroscopic properties frequently observed. The scale-invariant properties, the (multi)fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.
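
The central object of NESM is the Tsallis q-exponential, which replaces the Boltzmann factor and produces the heavy, power-law-like tails seen in earthquake statistics. A minimal sketch (not tied to any particular dataset in the review):

```python
import math

def q_exponential(x, q):
    """Tsallis q-exponential exp_q(x) = [1 + (1-q)x]_+^(1/(1-q)); recovers
    the ordinary exponential as q -> 1 and gives power-law tails for q > 1."""
    if q == 1.0:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0
```

For q > 1 the decay of exp_q(-x) follows a power law with exponent 1/(q - 1), which is how NESM accommodates scale-invariant magnitude and distance distributions that an exponential (Boltzmann) factor cannot capture.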

  18. Interference in the classical probabilistic model and its representation in complex Hilbert space

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei Yu.

    2005-10-01

    The notion of a context (complex of physical conditions, that is to say: specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are present already in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: “first context and only then probability”. We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus the dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as a Hilbert space projection of a realistic dynamics in a “prespace”. The basic condition for representing the prespace dynamics is the law of statistical conservation of energy: conservation of probabilities. In general the Hilbert space projection of the “prespace” dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.
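
The key quantity in this contextual construction is the interference coefficient measuring the violation of the classical formula of total probability; when its magnitude does not exceed 1 it can be written as cos θ, yielding the quantum-like interference rule. A sketch for the two-context case (illustrative only, with hypothetical probability values):

```python
import math

def interference_coefficient(p_b, p_a, p_b_given_a):
    """Deviation from the classical formula of total probability,
    P(b) = sum_a P(a) P(b|a), normalized in the quantum-like way:
    lambda = (P(b) - sum_a P(a)P(b|a)) / (2 sqrt(prod_a P(a)P(b|a))).
    If |lambda| <= 1 one may write lambda = cos(theta), recovering the
    interference-of-probabilities rule."""
    classical = sum(pa * pba for pa, pba in zip(p_a, p_b_given_a))
    norm = 2.0 * math.sqrt(math.prod(pa * pba
                                     for pa, pba in zip(p_a, p_b_given_a)))
    return (p_b - classical) / norm
```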

  19. A multibody knee model with discrete cartilage prediction of tibio-femoral contact mechanics.

    PubMed

    Guess, Trent M; Liu, Hongzeng; Bhashyam, Sampath; Thiagarajan, Ganesh

    2013-01-01

    Combining musculoskeletal simulations with anatomical joint models capable of predicting cartilage contact mechanics would provide a valuable tool for studying the relationships between muscle force and cartilage loading. As a step towards producing multibody musculoskeletal models that include representation of cartilage tissue mechanics, this research developed a subject-specific multibody knee model that represented the tibia plateau cartilage as discrete rigid bodies that interacted with the femur through deformable contacts. Parameters for the compliant contact law were derived using three methods: (1) simplified Hertzian contact theory, (2) simplified elastic foundation contact theory and (3) parameter optimisation from a finite element (FE) solution. The contact parameters and contact friction were evaluated during a simulated walk in a virtual dynamic knee simulator, and the resulting kinematics were compared with measured in vitro kinematics. The effects on predicted contact pressures and cartilage-bone interface shear forces during the simulated walk were also evaluated. The compliant contact stiffness parameters had a statistically significant effect on predicted contact pressures as well as on all tibio-femoral motions except flexion-extension. Contact friction had no statistically significant effect on contact pressures, but significantly affected medial-lateral translation and all rotations except flexion-extension. The magnitude of kinematic differences between model formulations was relatively small, but contact pressure predictions were sensitive to model formulation. The developed multibody knee model was computationally efficient, with a computation time 283 times faster than a FE simulation using the same geometries and boundary conditions.
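
A compliant contact law of the kind calibrated here can be sketched as a Hertzian spring with penetration-velocity damping (a simplified Hunt-Crossley form; the exponent n = 1.5 follows from Hertzian theory for a sphere-plane contact, and the parameter values below are illustrative, not those derived in the paper):

```python
def contact_force(delta, delta_dot, k, n=1.5, c=0.5):
    """Compliant (penalty) contact: F = k * delta^n * (1 + c * delta_dot),
    where delta is penetration depth and delta_dot its rate.  Stiffness k,
    exponent n, and damping c are the parameters tuned against the FE
    solution in approaches like the one described above."""
    if delta <= 0.0:          # bodies separated: no contact force
        return 0.0
    return k * delta ** n * (1.0 + c * delta_dot)
```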

  20. Subcritical crack growth in SiNx thin-film barriers studied by electro-mechanical two-point bending

    NASA Astrophysics Data System (ADS)

    Guan, Qingling; Laven, Jozua; Bouten, Piet C. P.; de With, Gijsbertus

    2013-06-01

    Mechanical failure resulting from subcritical crack growth in the SiNx inorganic barrier layer applied on a flexible multilayer structure was studied by an electro-mechanical two-point bending method. A 10 nm conducting tin-doped indium oxide layer was sputtered as an electrical probe to monitor the subcritical crack growth in the 150 nm dielectric SiNx layer carried by a polyethylene naphthalate substrate. In the electro-mechanical two-point bending test, dynamic and static loads were applied to investigate the crack propagation in the barrier layer. As a consequence of using two loading modes, the characteristic failure strain and failure time could be determined. The failure probability distribution of strain and lifetime under each loading condition was described by Weibull statistics. In this study, results from the tests in dynamic and static loading modes were linked by a power law description to determine the critical failure over a range of conditions. The fatigue parameter n from the power law reduces greatly from 70 to 31 upon correcting for internal strain. The testing method and analysis tool as described in this paper can be used to understand the limits of thin-film barriers in terms of their mechanical properties.

  1. Statistical Analysis on the Mechanical Properties of Magnesium Alloys

    PubMed Central

    Liu, Ruoyu; Jiang, Xianquan; Zhang, Hongju; Zhang, Dingfei; Wang, Jingfeng; Pan, Fusheng

    2017-01-01

    Knowledge of the statistical characteristics of mechanical properties is very important for the practical application of structural materials. Unfortunately, the scatter in the mechanical performance of magnesium alloys has remained poorly understood until now. In this study, the mechanical reliability of magnesium alloys is systematically estimated using Weibull statistical analysis. Interestingly, the Weibull modulus, m, of strength for magnesium alloys is as high as that for aluminum alloys and steels, confirming the very high reliability of magnesium alloys. The high predictability of the tensile strength of magnesium alloys represents the capability of preventing catastrophic premature failure during service, which is essential for safety and reliability assessment. PMID:29113116
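
The Weibull modulus m quoted in such analyses is commonly estimated by linear regression on the linearized Weibull CDF with median-rank plotting positions. The sketch below shows that standard textbook procedure, not necessarily the exact estimator used in the paper, and checks it on synthetic strengths lying exactly on a Weibull curve.

```python
import math

def weibull_fit(strengths):
    """Least-squares estimate of the Weibull modulus m and scale sigma_0 from
    the linearized CDF  ln(-ln(1 - F)) = m ln(sigma) - m ln(sigma_0),
    using median-rank plotting positions F_i = (i - 0.3) / (n + 0.4)."""
    data = sorted(strengths)
    n = len(data)
    xs = [math.log(s) for s in data]
    ys = [math.log(-math.log(1.0 - (i + 0.7) / (n + 0.4))) for i in range(n)]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(xs, ys))
    sxx = sum((xi - xbar) ** 2 for xi in xs)
    m = sxy / sxx                       # slope = Weibull modulus
    sigma0 = math.exp(xbar - ybar / m)  # scale from the intercept
    return m, sigma0

# Synthetic strengths on an exact Weibull curve with m = 8, sigma_0 = 300 MPa.
n = 30
data = [300.0 * (-math.log(1.0 - (i + 0.7) / (n + 0.4))) ** (1.0 / 8.0)
        for i in range(n)]
m_hat, s0_hat = weibull_fit(data)
```

A higher m means a narrower strength distribution, which is the basis of the reliability claim above.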

  2. Interlaboratory round robin study on axial tensile properties of SiC-SiC CMC tubular test specimens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Gyanender P.; Gonczy, Steve T.; Deck, Christian P.

    An interlaboratory round robin study was conducted on the tensile strength of SiC-SiC ceramic matrix composite (CMC) tubular test specimens at room temperature with the objective of expanding the database of mechanical properties of nuclear grade SiC-SiC and establishing the precision and bias statement for standard test method ASTM C1773. The mechanical properties statistics from the round robin study and the precision statistics and precision statement are presented herein. The data show reasonable consistency across the laboratories, indicating that the current C1773-13 ASTM standard is adequate for testing ceramic fiber reinforced ceramic matrix composite tubular test specimens. Furthermore, it was found that the distribution of ultimate tensile strength data was best described with a two-parameter Weibull distribution, while a lognormal distribution provided a good description of the distribution of proportional limit stress data.

  3. Finding equilibrium in the spatiotemporal chaos of the complex Ginzburg-Landau equation

    NASA Astrophysics Data System (ADS)

    Ballard, Christopher C.; Esty, C. Clark; Egolf, David A.

    2016-11-01

    Equilibrium statistical mechanics allows the prediction of collective behaviors of large numbers of interacting objects from just a few system-wide properties; however, a similar theory does not exist for far-from-equilibrium systems exhibiting complex spatial and temporal behavior. We propose a method for predicting behaviors in a broad class of such systems and apply these ideas to an archetypal example, the spatiotemporal chaotic 1D complex Ginzburg-Landau equation in the defect chaos regime. Building on the ideas of Ruelle and of Cross and Hohenberg that a spatiotemporal chaotic system can be considered a collection of weakly interacting dynamical units of a characteristic size, the chaotic length scale, we identify underlying, mesoscale, chaotic units and effective interaction potentials between them. We find that the resulting equilibrium Takahashi model accurately predicts distributions of particle numbers. These results suggest the intriguing possibility that a class of far-from-equilibrium systems may be well described at coarse-grained scales by the well-established theory of equilibrium statistical mechanics.

  6. Geodesic Monte Carlo on Embedded Manifolds

    PubMed Central

    Byrne, Simon; Girolami, Mark

    2013-01-01

    Markov chain Monte Carlo methods explicitly defined on the manifold of probability distributions have recently been established. These methods are constructed from diffusions across the manifold and the solution of the equations describing geodesic flows in the Hamilton–Jacobi representation. This paper takes the differential geometric basis of Markov chain Monte Carlo further by considering methods to simulate from probability distributions that themselves are defined on a manifold, with common examples being classes of distributions describing directional statistics. Proposal mechanisms are developed based on the geodesic flows over the manifolds of support for the distributions, and illustrative examples are provided for the hypersphere and Stiefel manifold of orthonormal matrices. PMID:25309024
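
On the unit hypersphere the geodesic flow has a closed form, which is what makes the proposal mechanism exact there. A sketch of the great-circle update only (the full method also resamples the tangent velocity and applies a Metropolis correction, omitted here):

```python
import math

def sphere_geodesic(x, v, t):
    """Geodesic flow on the unit hypersphere: starting at x with tangent
    velocity v (v orthogonal to x), the great-circle solution is
    x(t) = x cos(|v| t) + (v / |v|) sin(|v| t), which stays exactly on
    the sphere for any step size t."""
    speed = math.sqrt(sum(vi * vi for vi in v))
    if speed == 0.0:
        return list(x)
    c, s = math.cos(speed * t), math.sin(speed * t)
    return [xi * c + (vi / speed) * s for xi, vi in zip(x, v)]

# Quarter-turn example: from the pole along a tangent direction.
y = sphere_geodesic([1.0, 0.0, 0.0], [0.0, 2.0, 0.0], math.pi / 4)
```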

  7. North American Extreme Temperature Events and Related Large Scale Meteorological Patterns: A Review of Statistical Methods, Dynamics, Modeling, and Trends

    NASA Technical Reports Server (NTRS)

    Grotjahn, Richard; Black, Robert; Leung, Ruby; Wehner, Michael F.; Barlow, Mathew; Bosilovich, Michael G.; Gershunov, Alexander; Gutowski, William J., Jr.; Gyakum, John R.; Katz, Richard W.; hide

    2015-01-01

    The objective of this paper is to review statistical methods, dynamics, modeling efforts, and trends related to temperature extremes, with a focus upon extreme events of short duration that affect parts of North America. These events are associated with large-scale meteorological patterns (LSMPs). The statistics, dynamics, and modeling sections of this paper are written to be autonomous and so can be read separately. Methods to define extreme event statistics and to identify and connect LSMPs to extreme temperature events are presented. Recent advances in statistical techniques connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. Various LSMPs, ranging from synoptic to planetary scale structures, are associated with extreme temperature events. Current knowledge about the synoptics and the dynamical mechanisms leading to the associated LSMPs is incomplete. Systematic studies of the physics of LSMP life cycles, comprehensive model assessment of LSMP-extreme temperature event linkages, and LSMP properties are needed. Generally, climate models capture observed properties of heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency and underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Modeling studies have identified the impact of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs to more specifically understand the role of LSMPs on past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated. The paper concludes with unresolved issues and research questions.

  8. Prediction of high-frequency vibration transmission across coupled, periodic ribbed plates by incorporating tunneling mechanisms.

    PubMed

    Yin, Jianfei; Hopkins, Carl

    2013-04-01

    Prediction of structure-borne sound transmission on built-up structures at audio frequencies is well-suited to Statistical Energy Analysis (SEA) although the inclusion of periodic ribbed plates presents challenges. This paper considers an approach using Advanced SEA (ASEA) that can incorporate tunneling mechanisms within a statistical approach. The coupled plates used for the investigation form an L-junction comprising a periodic ribbed plate with symmetric ribs and an isotropic homogeneous plate. Experimental SEA (ESEA) is carried out with input data from Finite Element Methods (FEM). This indicates that indirect coupling is significant at high frequencies where bays on the periodic ribbed plate can be treated as individual subsystems. SEA using coupling loss factors from wave theory leads to significant underestimates in the energy of the bays when the isotropic homogeneous plate is excited. This is due to the absence of tunneling mechanisms in the SEA model. In contrast, ASEA shows close agreement with FEM and laboratory measurements. The errors incurred with SEA rapidly increase as the bays become more distant from the source subsystem. ASEA provides significantly more accurate predictions by accounting for the spatial filtering that leads to non-diffuse vibration fields on these more distant bays.
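The SEA bookkeeping underlying such models is a steady-state power balance between subsystem energies. A minimal two-subsystem sketch follows (loss factors and input power are invented for illustration; indirect "tunneling" coupling is deliberately absent, which is exactly the limitation ASEA addresses):

```python
def sea_two_subsystems(omega, eta1, eta2, eta12, eta21, p1, p2=0.0):
    """Steady-state SEA power balance for two coupled subsystems:
        p1 = omega * ((eta1 + eta12) * E1 - eta21 * E2)
        p2 = omega * ((eta2 + eta21) * E2 - eta12 * E1)
    eta1, eta2 are internal loss factors; eta12, eta21 are coupling loss
    factors. Solved for the subsystem energies E1, E2 by Cramer's rule."""
    a11 = omega * (eta1 + eta12)
    a12 = -omega * eta21
    a21 = -omega * eta12
    a22 = omega * (eta2 + eta21)
    det = a11 * a22 - a12 * a21
    e1 = (p1 * a22 - a12 * p2) / det
    e2 = (a11 * p2 - p1 * a21) / det
    return e1, e2

# Illustrative values (not from the paper): 1 W injected into subsystem 1 at 1 kHz
e1, e2 = sea_two_subsystems(omega=2 * 3.141592653589793 * 1000,
                            eta1=0.01, eta2=0.01, eta12=0.001, eta21=0.001, p1=1.0)
```

Because only the direct coupling loss factors appear, energy in subsystems far from the source is systematically underestimated when indirect paths matter, which is the effect the paper models with ASEA.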

  9. Mechanical parameters and flight phase characteristics in aquatic plyometric jumping.

    PubMed

    Louder, Talin J; Searle, Cade J; Bressel, Eadric

    2016-09-01

    Plyometric jumping is a commonly prescribed method of training focused on the development of reactive strength and high-velocity concentric power. Literature suggests that aquatic plyometric training may be a low-impact, effective supplement to land-based training. The purpose of the present study was to quantify acute, biomechanical characteristics of the take-off and flight phase for plyometric movements performed in the water. Kinetic force platform data from 12 young, male adults were collected for counter-movement jumps performed on land and in water at two different immersion depths. The specificity of jumps between environmental conditions was assessed using kinetic measures, temporal characteristics, and an assessment of the statistical relationship between take-off velocity and time in the air. Greater peak mechanical power was observed for jumps performed in the water, and was influenced by immersion depth. Additionally, the data suggest that, in the water, the statistical relationship between take-off velocity and time in air is quadratic. Results highlight the potential application of aquatic plyometric training as a cross-training tool for improving mechanical power and suggest that water immersion depth and fluid drag play key roles in the specificity of the take-off phase for jumping movements performed in the water.

  10. Australian Apprentice & Trainee Statistics: Mechanical Engineering and Fabrication Trades, 1995-1999. Australian Vocational Education & Training.

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research, Leabrook (Australia).

    Statistics regarding Australians participating in apprenticeships and traineeships in the mechanical engineering and fabrication trades in 1995-1999 were reviewed to provide an indication of where skill shortages may be occurring or will likely occur in relation to the following occupations: mechanical engineering trades; fabrication engineering…

  11. A functional U-statistic method for association analysis of sequencing data.

    PubMed

    Jadhav, Sneha; Tong, Xiaoran; Lu, Qing

    2017-11-01

Although sequencing studies hold great promise for uncovering novel variants predisposing to human diseases, the high dimensionality of the sequencing data brings tremendous challenges to data analysis. Moreover, for many complex diseases (e.g., psychiatric disorders) multiple related phenotypes are collected. These phenotypes can be different measurements of an underlying disease, or measurements characterizing multiple related diseases for studying common genetic mechanisms. Although jointly analyzing these phenotypes could potentially increase the power of identifying disease-associated genes, the different types of phenotypes pose challenges for association analysis. To address these challenges, we propose a nonparametric method, the functional U-statistic method (FU), for multivariate analysis of sequencing data. It first constructs smooth functions from individuals' sequencing data, and then tests the association of these functions with multiple phenotypes by using a U-statistic. The method provides a general framework for analyzing various types of phenotypes (e.g., binary and continuous phenotypes) with unknown distributions. Fitting the genetic variants within a gene using a smoothing function also allows us to capture complexities of gene structure (e.g., linkage disequilibrium, LD), which could potentially increase the power of association analysis. Through simulations, we compared our method to the multivariate outcome score test (MOST), and found that our test attained better performance than MOST. In a real data application, we applied our method to sequencing data from the Minnesota Twin Study (MTS) and found potential associations of several nicotine receptor subunit (CHRN) genes, including CHRNB3, with nicotine dependence and/or alcohol dependence.
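A U-statistic generalizes the sample mean to an average of a symmetric kernel over all subsets of observations. A minimal order-2 sketch (generic, not the FU construction itself; the variance kernel is a textbook example):

```python
from itertools import combinations

def u_statistic(sample, kernel):
    """Order-2 U-statistic: average a symmetric kernel over all unordered pairs."""
    pairs = list(combinations(sample, 2))
    return sum(kernel(x, y) for x, y in pairs) / len(pairs)

# Classic example: the kernel h(x, y) = (x - y)^2 / 2 yields an unbiased
# estimator of the population variance (the usual n-1 sample variance).
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
var_u = u_statistic(data, lambda x, y: (x - y) ** 2 / 2)
```

Swapping in other kernels gives other unbiased estimators, which is what makes the U-statistic framework attractive for phenotypes with unknown distributions.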

  12. Method Designed to Respect Molecular Heterogeneity Can Profoundly Correct Present Data Interpretations for Genome-Wide Expression Analysis

    PubMed Central

    Chen, Chih-Hao; Hsu, Chueh-Lin; Huang, Shih-Hao; Chen, Shih-Yuan; Hung, Yi-Lin; Chen, Hsiao-Rong; Wu, Yu-Chung

    2015-01-01

Although genome-wide expression analysis has become a routine tool for gaining insight into molecular mechanisms, extraction of information remains a major challenge. It has been unclear why standard statistical methods, such as the t-test and ANOVA, often lead to low levels of reproducibility, how likely applying fold-change cutoffs to enhance reproducibility is to miss key signals, and how adversely using such methods has affected data interpretations. We broadly examined expression data to investigate the reproducibility problem and discovered that molecular heterogeneity, a biological property of genetically different samples, has been improperly handled by the statistical methods. Here we give a mathematical description of the discovery and report the development of a statistical method, named HTA, for better handling molecular heterogeneity. We broadly demonstrate the improved sensitivity and specificity of HTA over the conventional methods and show that using fold-change cutoffs discards much information. We illustrate the particular usefulness of HTA for heterogeneous diseases by applying it to existing data sets of schizophrenia, bipolar disorder and Parkinson's disease, and show it can abundantly and reproducibly uncover disease signatures not previously detectable. Based on 156 biological data sets, we estimate that the methodological issue has affected over 96% of expression studies and that HTA can profoundly correct 86% of the affected data interpretations. The methodological advancement can better facilitate systems understandings of biological processes, render more reliable biological inferences and engender more robust translational medical applications, such as identifying diagnostic biomarkers and drug prediction. PMID:25793610
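The point about fold-change cutoffs missing key signals can be made concrete with a toy example (hypothetical expression values, not data from the paper): a gene with a small but highly consistent shift between groups yields a large Welch t statistic yet falls below a typical 1.5-fold cutoff.

```python
def welch_t(a, b):
    """Welch two-sample t statistic (unequal variances allowed)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / (va / na + vb / nb) ** 0.5

def fold_change(a, b):
    """Ratio of group means (expression assumed on a linear scale)."""
    return (sum(a) / len(a)) / (sum(b) / len(b))

# Hypothetical gene: small but highly consistent shift between groups
case    = [1.10, 1.20, 1.15, 1.18]
control = [1.00, 1.02, 0.98, 1.01]
t_stat = welch_t(case, control)   # large t: the shift is consistent
fc = fold_change(case, control)   # ~1.15: below a typical 1.5-fold cutoff
```

A hard fold-change filter would discard this gene despite the strong statistical evidence, which is one form of the information loss the authors quantify.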

  13. Neuroimaging in epilepsy.

    PubMed

    Sidhu, Meneka Kaur; Duncan, John S; Sander, Josemir W

    2018-05-17

Epilepsy neuroimaging is important for detecting the seizure onset zone, predicting and preventing deficits from surgery and illuminating mechanisms of epileptogenesis. An aspiration is to integrate imaging and genetic biomarkers to enable personalized epilepsy treatments. The ability to detect lesions, particularly focal cortical dysplasia and hippocampal sclerosis, is increased using ultra high-field imaging and postprocessing techniques such as automated volumetry, T2 relaxometry, voxel-based morphometry and surface-based techniques. Statistical analysis of PET and single photon emission computed tomography (STATISCOM) is superior to qualitative analysis alone in identifying focal abnormalities in MRI-negative patients. These methods have also been used to study mechanisms of epileptogenesis and pharmacoresistance. Recent language fMRI studies aim to localize as well as lateralize language functions. Memory fMRI has been recommended to lateralize mnemonic function and predict outcome after surgery in temporal lobe epilepsy. Combinations of structural, functional and post-processing methods have been used in multimodal and machine learning models to improve the identification of the seizure onset zone and increase understanding of mechanisms underlying structural and functional aberrations in epilepsy.

  14. Crustal deformation at the terminal stage before earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Chen, C. H.; Meng, G.; Su, X.

    2016-12-01

GPS data retrieved from 300 stations in China are used in this work to study stressed areas during earthquake preparation periods. Surface deformation data are derived using the standard method and smoothed with a temporal moving average to mitigate the influence of noise. A statistical method is used to distinguish significant variations in the smoothed data. The spatial distributions of these significant variations show that the diameter of the stressed area preparing an earthquake is about 3500 km for an M6 event. The deformation deduced from the significant variations is closely related to the slip direction of the fault plane determined from the focal mechanism solutions of earthquakes. Although the causal mechanism of such large, rapidly changing stressed areas is not fully understood, the analytical results suggest that earthquake preparation may be one of the factors dominating the common mode error in GPS studies. Mechanisms and/or numerical models of some pre-earthquake anomalous phenomena may need to be reconsidered in light of this novel observation.
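The smoothing-and-thresholding pipeline described here can be sketched as follows (a boxcar moving average plus a residual z-score test; the window length and 2-sigma threshold are illustrative assumptions, not values from the paper):

```python
def moving_average(series, window):
    """Centered boxcar smoother; the window shrinks at the series edges."""
    half = window // 2
    out = []
    for i in range(len(series)):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out

def significant(series, smoothed, n_sigma=2.0):
    """Flag points whose residual from the smoothed curve exceeds
    n_sigma standard deviations of the residuals."""
    resid = [x - m for x, m in zip(series, smoothed)]
    mean = sum(resid) / len(resid)
    sd = (sum((r - mean) ** 2 for r in resid) / len(resid)) ** 0.5
    return [abs(r - mean) > n_sigma * sd for r in resid]

series = [0.0] * 10 + [5.0] + [0.0] * 10   # a single jump in a flat record
flags = significant(series, moving_average(series, 5))
```

Only the jump at index 10 is flagged; the smoothing leakage around it stays below the threshold, which is the intended behavior of this kind of significance screen.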

  15. Dynamic principle for ensemble control tools.

    PubMed

    Samoletov, A; Vasiev, B

    2017-11-28

    Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
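The canonical-sampling idea behind such thermostats can be illustrated with a generic underdamped Langevin scheme (a standard textbook Euler-Maruyama discretization, not the dynamic principle proposed in the paper; all parameter values are illustrative, with k_B = 1):

```python
import math
import random

def langevin_step(x, v, dt, gamma=1.0, kT=1.0, mass=1.0, force=lambda x: -x):
    """One Euler-Maruyama step of underdamped Langevin dynamics:
        dv = (F/m) dt - gamma v dt + sqrt(2 gamma kT / m) dW,
    whose stationary distribution is the canonical measure."""
    noise = math.sqrt(2.0 * gamma * kT * dt / mass) * random.gauss(0.0, 1.0)
    v = v + (force(x) / mass - gamma * v) * dt + noise
    x = x + v * dt
    return x, v

random.seed(0)
x, v = 0.0, 0.0
samples = []
for step in range(200000):
    x, v = langevin_step(x, v, dt=0.01)
    if step > 10000:            # discard equilibration
        samples.append(v * v)
kinetic_temperature = sum(samples) / len(samples)   # <v^2> should approach kT = 1
```

The equipartition check at the end (mean squared velocity close to kT) is the simplest diagnostic that a thermostat scheme leaves the canonical measure invariant, up to discretization error.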

  16. Quantum mechanics/coarse-grained molecular mechanics (QM/CG-MM)

    NASA Astrophysics Data System (ADS)

    Sinitskiy, Anton V.; Voth, Gregory A.

    2018-01-01

    Numerous molecular systems, including solutions, proteins, and composite materials, can be modeled using mixed-resolution representations, of which the quantum mechanics/molecular mechanics (QM/MM) approach has become the most widely used. However, the QM/MM approach often faces a number of challenges, including the high cost of repetitive QM computations, the slow sampling even for the MM part in those cases where a system under investigation has a complex dynamics, and a difficulty in providing a simple, qualitative interpretation of numerical results in terms of the influence of the molecular environment upon the active QM region. In this paper, we address these issues by combining QM/MM modeling with the methodology of "bottom-up" coarse-graining (CG) to provide the theoretical basis for a systematic quantum-mechanical/coarse-grained molecular mechanics (QM/CG-MM) mixed resolution approach. A derivation of the method is presented based on a combination of statistical mechanics and quantum mechanics, leading to an equation for the effective Hamiltonian of the QM part, a central concept in the QM/CG-MM theory. A detailed analysis of different contributions to the effective Hamiltonian from electrostatic, induction, dispersion, and exchange interactions between the QM part and the surroundings is provided, serving as a foundation for a potential hierarchy of QM/CG-MM methods varying in their accuracy and computational cost. A relationship of the QM/CG-MM methodology to other mixed resolution approaches is also discussed.

  18. Development of a Titanium Plate for Mandibular Angle Fractures with a Bone Defect in the Lower Border: Finite Element Analysis and Mechanical Test

    PubMed Central

    Goulart, Douglas Rangel; Kemmoku, Daniel Takanori; Noritomi, Pedro Yoshito

    2015-01-01

Objectives The aim of the present study was to develop a plate to treat mandibular angle fractures using the finite element method and mechanical testing. Material and Methods A three-dimensional model of a fractured mandible was generated using Rhinoceros 4.0 software. The models were exported to ANSYS®, in which a static application of displacement (3 mm) was performed in the first molar region. Three groups were assessed according to the method of internal fixation (2 mm system): two non-locking plates; two locking plates and a new design locking plate. The computational model was transferred to an in vitro experiment with polyurethane mandibles. Each group contained five samples and was subjected to a linear loading test in a universal testing machine. Results A balanced distribution of stress was associated with the new plate design. This plate modified the mechanical behavior of the fractured region, with less displacement between the fractured segments. In the mechanical test, the group with two locking plates exhibited greater resistance to the 3 mm displacement, with a statistically significant difference when compared with the new plate group (ANOVA, P = 0.016). Conclusions The new plate exhibited a more balanced distribution of stress. However, the group with two locking plates exhibited greater mechanical resistance. PMID:26539287

  19. Extended Kalman filtering for the detection of damage in linear mechanical structures

    NASA Astrophysics Data System (ADS)

    Liu, X.; Escamilla-Ambrosio, P. J.; Lieven, N. A. J.

    2009-09-01

This paper addresses the problem of assessing the location and extent of damage in a vibrating structure by means of vibration measurements. Frequency domain identification methods (e.g. finite element model updating) have been widely used in this area, while time domain methods such as the extended Kalman filter (EKF) are more sparsely represented. The difficulty of applying the EKF to damage identification and localisation in mechanical systems lies in the high computational cost and in the dependence of the estimation results on the initial estimation error covariance matrix P(0), the initial values of the parameters to be estimated, and the statistics of the measurement noise R and process noise Q. To resolve these problems, a multiple model adaptive estimator consisting of a bank of EKFs in the modal domain was designed, in which each filter in the bank is based on a different P(0). The algorithm was iterated using the weighted global iteration method. A fuzzy logic model was incorporated in each filter to estimate the variance of the measurement noise R. The application of the method is illustrated by simulated and real examples.
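The filter-bank idea can be sketched in miniature. The following toy example (a linear, scalar random-constant model with invented noise statistics, not the authors' modal-domain EKF formulation) runs several Kalman filters that differ only in P(0) and weights them by the Gaussian likelihood of their innovations:

```python
import math
import random

class ScalarKF:
    """Kalman filter for a scalar random-constant state: x_k = x_{k-1}, z_k = x_k + noise."""
    def __init__(self, x0, p0, q=1e-6, r=0.25):
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def update(self, z):
        self.p += self.q                  # predict
        s = self.p + self.r               # innovation variance
        innov = z - self.x
        k = self.p / s                    # Kalman gain
        self.x += k * innov
        self.p *= (1.0 - k)
        # Gaussian likelihood of the innovation, used to weight the bank
        return math.exp(-0.5 * innov * innov / s) / math.sqrt(2.0 * math.pi * s)

random.seed(1)
truth = 3.0
bank = [ScalarKF(x0=0.0, p0=p0) for p0 in (0.01, 1.0, 100.0)]  # same model, different P(0)
weights = [1.0 / len(bank)] * len(bank)
for _ in range(200):
    z = truth + random.gauss(0.0, 0.5)
    likes = [f.update(z) for f in bank]
    weights = [w * l for w, l in zip(weights, likes)]
    total = sum(weights)
    weights = [w / total for w in weights]
fused = sum(w * f.x for w, f in zip(weights, bank))  # likelihood-weighted estimate
```

The overconfident filter (tiny P(0) with a wrong initial state) produces implausible innovations and its weight collapses, so the fused estimate is dominated by the better-initialized filters, the mechanism the multiple model adaptive estimator exploits.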

  20. Trends in statistical methods in articles published in Archives of Plastic Surgery between 2012 and 2017.

    PubMed

    Han, Kyunghwa; Jung, Inkyung

    2018-05-01

    This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical method. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes, and 139 applications of statistical methods with categorical outcome. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses. SPSS was predominantly used in the articles that presented statistical analyses. We found that the use of statistical methods in APS has increased over the last 6 years. It seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.
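For reference, the most frequently used method above, the Pearson chi-square test, reduces to a short computation. A sketch of the test statistic for a contingency table (the counts are invented for illustration):

```python
def pearson_chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table of counts:
    sum over cells of (observed - expected)^2 / expected, with
    expected = row_total * col_total / grand_total."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    return chi2

# Hypothetical 2x2 example: treatment vs control, success vs failure
chi2 = pearson_chi_square([[30, 10], [20, 20]])
```

The statistic is then compared against a chi-square distribution with (r-1)(c-1) degrees of freedom to obtain the P-value reported in such articles.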

  1. Comparison of Animal Discs Used in Disc Research to Human Lumbar Disc: Torsion Mechanics and Collagen Content

    PubMed Central

    Showalter, Brent L.; Beckstein, Jesse C.; Martin, John T.; Beattie, Elizabeth E.; Orías, Alejandro A. Espinoza; Schaer, Thomas P.; Vresilovic, Edward J.; Elliott, Dawn M.

    2012-01-01

    Study Design Experimental measurement and normalization of in vitro disc torsion mechanics and collagen content for several animal species used in intervertebral disc research and comparing these to the human disc. Objective To aid in the selection of appropriate animal models for disc research by measuring torsional mechanical properties and collagen content. Summary of Background Data There is lack of data and variability in testing protocols for comparing animal and human disc torsion mechanics and collagen content. Methods Intervertebral disc torsion mechanics were measured and normalized by disc height and polar moment of inertia for 11 disc types in 8 mammalian species: the calf, pig, baboon, goat, sheep, rabbit, rat, and mouse lumbar, and cow, rat, and mouse caudal. Collagen content was measured and normalized by dry weight for the same discs except the rat and mouse. Collagen fiber stretch in torsion was calculated using an analytical model. Results Measured torsion parameters varied by several orders of magnitude across the different species. After geometric normalization, only the sheep and pig discs were statistically different from human. Fiber stretch was found to be highly dependent on the assumed initial fiber angle. The collagen content of the discs was similar, especially in the outer annulus where only the calf and goat discs were statistically different from human. Disc collagen content did not correlate with torsion mechanics. Conclusion Disc torsion mechanics are comparable to human lumbar discs in 9 of 11 disc types after normalization by geometry. The normalized torsion mechanics and collagen content of the multiple animal discs presented is useful for selecting and interpreting results for animal models of the disc. Structural composition of the disc, such as initial fiber angle, may explain the differences that were noted between species after geometric normalization. PMID:22333953

  2. Composite Materials Handbook. Volume 1. Polymer Matrix Composites Guidelines for Characterization of Structural Materials

    DTIC Science & Technology

    2002-06-17

    power law type (References 6.8.6.1(h) and (i)). Various attempts have been made to use fracture mechanics based methods for predicting failure of...participate in the MIL-HDBK-17 coordination activity . 7. All information and data contained in this handbook have been coordinated with industry and the U.S...for statistically- based properties ............................. 6 2.2.3 Issues of data equivalence

  3. Establishment of a Uniform Format for Data Reporting of Structural Material Properties for Reliability Analysis

    DTIC Science & Technology

    1994-06-30

    tip Opening Displacement (CTOD) Fracture Toughness Measurement". 48 The method has found application in the elastic-plastic fracture mechanics ( EPFM ...68 6.1 Proposed Material Property Database Format and Hierarchy .............. 68 6.2 Sample Application of the Material Property Database...the E 49.05 sub-committee. The relevant quality indicators applicable to the present program are: source of data, statistical basis of data

  4. Defense Small Business Innovation Research Program (SBIR), Volume 4, Defense Agencies Abstracts of Phase 1 Awards 1991

    DTIC Science & Technology

    1991-01-01

    EXPERIENCE IN DEVELOPING INTEGRATED OPTICAL DEVICES, NONLINEAR MAGNETIC-OPTIC MATERIALS, HIGH FREQUENCY MODULATORS, COMPUTER-AIDED MODELING AND SOPHISTICATED... HIGH -LEVEL PRESENTATION AND DISTRIBUTED CONTROL MODELS FOR INTEGRATING HETEROGENEOUS MECHANICAL ENGINEERING APPLICATIONS AND TOOLS. THE DESIGN IS FOCUSED...STATISTICALLY ACCURATE WORST CASE DEVICE MODELS FOR CIRCUIT SIMULATION. PRESENT METHODS OF WORST CASE DEVICE DESIGN ARE AD HOC AND DO NOT ALLOW THE

  5. Temperature and magnetic-field driven dynamics in artificial magnetic square ice

    DOE PAGES

    Drouhin, Henri-Jean; Wegrowe, Jean-Eric; Razeghi, Manijeh; ...

    2015-09-08

Artificial spin ices are often spoken of as being realisations of some of the celebrated vertex models of statistical mechanics, where the exact microstate of the system can be imaged using advanced magnetic microscopy methods. The fact that a stable image can be formed means that the system is in fact athermal and not undergoing the usual finite-temperature fluctuations of a statistical mechanical system. In this paper we report on the preparation of artificial spin ices with islands that are thermally fluctuating due to their very small size. The relaxation rate of these islands was determined using variable-frequency focused magneto-optic Kerr measurements. We performed magnetic imaging of artificial spin ice under varied temperature and magnetic field using X-ray transmission microscopy, which uses X-ray magnetic circular dichroism to generate magnetic contrast. Furthermore, we have developed an on-membrane heater in order to apply temperatures in excess of 700 K and have shown increased dynamics at higher temperature. Owing to the 'photon-in, photon-out' method employed here, this is the first report in which it is possible to image the microstates of an ASI system under the simultaneous application of temperature and magnetic field, enabling the determination of relaxation rates, coercivities, and the analysis of vertex populations during reversal.

  7. The boundary is mixed

    NASA Astrophysics Data System (ADS)

    Bianchi, Eugenio; Haggard, Hal M.; Rovelli, Carlo

    2017-08-01

    We show that in Oeckl's boundary formalism the boundary vectors that do not have a tensor form represent, in a precise sense, statistical states. Therefore the formalism incorporates quantum statistical mechanics naturally. We formulate general-covariant quantum statistical mechanics in this language. We illustrate the formalism by showing how it accounts for the Unruh effect. We observe that the distinction between pure and mixed states weakens in the general covariant context, suggesting that local gravitational processes are naturally statistical without a sharp quantal versus probabilistic distinction.

  8. Statistical methods used in articles published by the Journal of Periodontal and Implant Science.

    PubMed

    Choi, Eunsil; Lyu, Jiyoung; Park, Jinyoung; Kim, Hae-Young

    2014-12-01

    The purposes of this study were to assess the trend of use of statistical methods including parametric and nonparametric methods and to evaluate the use of complex statistical methodology in recent periodontal studies. This study analyzed 123 articles published in the Journal of Periodontal & Implant Science (JPIS) between 2010 and 2014. Frequencies and percentages were calculated according to the number of statistical methods used, the type of statistical method applied, and the type of statistical software used. Most of the published articles considered (64.4%) used statistical methods. Since 2011, the percentage of JPIS articles using statistics has increased. On the basis of multiple counting, we found that the percentage of studies in JPIS using parametric methods was 61.1%. Further, complex statistical methods were applied in only 6 of the published studies (5.0%), and nonparametric statistical methods were applied in 77 of the published studies (38.9% of a total of 198 studies considered). We found an increasing trend towards the application of statistical methods and nonparametric methods in recent periodontal studies and thus, concluded that increased use of complex statistical methodology might be preferred by the researchers in the fields of study covered by JPIS.

  9. Introduction to the topical issue: Nonadditive entropy and nonextensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Sugiyama, Masaru

Dear CMT readers, it is my pleasure to introduce you to this topical issue dealing with a new research field of great interest, nonextensive statistical mechanics. This theory was initiated by Constantino Tsallis' work in 1988, as a possible generalization of Boltzmann-Gibbs thermostatistics. It is based on a nonadditive entropy, nowadays referred to as the Tsallis entropy. Nonextensive statistical mechanics is expected to be a consistent and unified theoretical framework for describing the macroscopic properties of complex systems that are anomalous in view of ordinary thermostatistics. In such systems, the long-standing problem regarding the relationship between statistical and dynamical laws becomes highlighted, since ergodicity and mixing may not be well realized in situations such as the edge of chaos. The phase space appears to self-organize in a structure that is not simply Euclidean but (multi)fractal. Due to this nontrivial structure, the concept of homogeneity of the system, which is the basic premise in ordinary thermodynamics, is violated and accordingly the additivity postulate for the thermodynamic quantities such as the internal energy and entropy may not be justified, in general. (Physically, nonadditivity is deeply relevant to nonextensivity of a system, in which the thermodynamic quantities do not scale with size in a simple way. Typical examples are systems with long-range interactions like self-gravitating systems as well as nonneutral charged ones.) A point of crucial importance here is that, phenomenologically, such an exotic phase-space structure has a fairly long lifetime. Therefore, this state, referred to as a metaequilibrium state or a nonequilibrium stationary state, appears to be described by a generalized entropic principle different from the traditional Boltzmann-Gibbs form, even though it may eventually approach the Boltzmann-Gibbs equilibrium state.
The limits t → ∞ and N → ∞ do not commute, where t and N are time and the number of particles, respectively. The present topical issue is devoted to summarizing the current status of nonextensive statistical mechanics from various perspectives. It is my hope that this issue can inform the reader of one of the foremost research areas in thermostatistics. This issue consists of eight articles. The first one by Tsallis and Brigatti presents a general introduction and an overview of nonextensive statistical mechanics. At first glance, generalization of the ordinary Boltzmann-Gibbs-Shannon entropy might be completely arbitrary. But Abe's article explains how Tsallis' generalization of the statistical entropy can uniquely be characterized by both physical and mathematical principles. Then, the article by Pluchino, Latora, and Rapisarda presents strong evidence that nonextensive statistical mechanics is in fact relevant to nonextensive systems with long-range interactions. The articles by Rajagopal, by Wada, and by Plastino, Miller, and Plastino are concerned with the macroscopic thermodynamic properties of nonextensive statistical mechanics. Rajagopal discusses the first and second laws of thermodynamics. Wada develops a discussion about the condition under which the nonextensive statistical-mechanical formalism is thermodynamically stable. The work of Plastino, Miller, and Plastino addresses the thermodynamic Legendre-transform structure and its robustness for generalizations of entropy. After these fundamental investigations, Sakagami and Taruya examine the theory for self-gravitating systems. Finally, Beck presents a novel idea of the so-called superstatistics, which provides nonextensive statistical mechanics with a physical interpretation based on nonequilibrium concepts including temperature fluctuations. Its applications to hydrodynamic turbulence and pattern formation in thermal convection states are also discussed.
Nonextensive statistical mechanics is already a well-studied field, and a large number of works are available in the literature. The interested reader is encouraged to visit http://tsallis.cat.cbpf.br/TEMUCO.pdf, which contains a comprehensive list of references to more than one thousand papers, including important results that, for lack of space, could not be mentioned in the present issue. Despite this large body of published work, nonextensive statistical mechanics is still a developing field. This is natural, since the program that has been undertaken is an extremely ambitious one: a serious attempt to enlarge the horizons of the realm of statistical mechanics. The possible influence of nonextensive statistical mechanics on continuum mechanics and thermodynamics appears to be wide and deep. I will therefore be happy if this issue contributes to attracting the interest of researchers and stimulates research activity, not only in nonextensive statistical mechanics itself but also in continuum mechanics and thermodynamics in a wider context. As the editor of the present topical issue, I would like to express my sincere thanks to all those who contributed to this issue. I cordially thank Professor S. Abe for advising me on the editorial policy. Without his help, the present topical issue would never have been brought out.
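    The nonadditive (Tsallis) entropy at the heart of this issue is easy to probe numerically. The sketch below is an illustration added here rather than anything from the issue itself: it computes S_q = (1 - sum_i p_i^q)/(q - 1) with k_B = 1 and checks the pseudo-additivity rule S_q(A+B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B) for independent subsystems; the distributions and the value of q are arbitrary examples.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1), with k_B = 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return -np.sum(p * np.log(p))  # Boltzmann-Gibbs-Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Two independent subsystems A and B (illustrative distributions)
pA = np.array([0.5, 0.5])
pB = np.array([0.25, 0.75])
pAB = np.outer(pA, pB).ravel()  # joint distribution of the composite system

q = 1.5
SA, SB, SAB = (tsallis_entropy(x, q) for x in (pA, pB, pAB))

# Pseudo-additivity: S_q(A+B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B)
assert np.isclose(SAB, SA + SB + (1 - q) * SA * SB)

# As q -> 1 the entropy approaches the Shannon value
assert np.isclose(tsallis_entropy(pA, 1 + 1e-9), np.log(2), atol=1e-6)
print("pseudo-additivity verified for q =", q)
```

    For q = 1 the additivity of the composite entropy is restored, which is the Boltzmann-Gibbs special case mentioned in the preface.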

  10. Learning Predictive Statistics: Strategies and Brain Mechanisms.

    PubMed

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-08-30

    When immersed in a new environment, we are challenged to decipher initially incomprehensible streams of sensory information. However, quite rapidly, the brain finds structure and meaning in these incoming signals, helping us to predict and prepare ourselves for future actions. This skill relies on extracting the statistics of event streams in the environment that contain regularities of variable complexity from simple repetitive patterns to complex probabilistic combinations. Here, we test the brain mechanisms that mediate our ability to adapt to the environment's statistics and predict upcoming events. By combining behavioral training and multisession fMRI in human participants (male and female), we track the corticostriatal mechanisms that mediate learning of temporal sequences as they change in structure complexity. We show that learning of predictive structures relates to individual decision strategy; that is, selecting the most probable outcome in a given context (maximizing) versus matching the exact sequence statistics. These strategies engage distinct human brain regions: maximizing engages dorsolateral prefrontal, cingulate, sensory-motor regions, and basal ganglia (dorsal caudate, putamen), whereas matching engages occipitotemporal regions (including the hippocampus) and basal ganglia (ventral caudate). Our findings provide evidence for distinct corticostriatal mechanisms that facilitate our ability to extract behaviorally relevant statistics to make predictions. SIGNIFICANCE STATEMENT Making predictions about future events relies on interpreting streams of information that may initially appear incomprehensible. Past work has studied how humans identify repetitive patterns and associative pairings. However, the natural environment contains regularities that vary in complexity from simple repetition to complex probabilistic combinations. 
Here, we combine behavior and multisession fMRI to track the brain mechanisms that mediate our ability to adapt to changes in the environment's statistics. We provide evidence for an alternate route for learning complex temporal statistics: extracting the most probable outcome in a given context is implemented by interactions between executive and motor corticostriatal mechanisms compared with visual corticostriatal circuits (including hippocampal cortex) that support learning of the exact temporal statistics. Copyright © 2017 Wang et al.
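    The maximizing-versus-matching distinction can be made concrete with a toy simulation (illustrative only; the outcome probability and trial count are invented, not taken from the study): a maximizer always predicts the most probable outcome of a context, while a matcher reproduces the outcome statistics, which yields a lower expected accuracy of p^2 + (1-p)^2 versus p.

```python
import random

random.seed(0)

P_OUTCOME = 0.8   # probability of the more likely outcome in a given context
N_TRIALS = 100_000

def run(strategy):
    correct = 0
    for _ in range(N_TRIALS):
        outcome = random.random() < P_OUTCOME          # True = frequent outcome
        if strategy == "maximize":
            prediction = True                          # always pick most probable
        else:                                          # "match": reproduce statistics
            prediction = random.random() < P_OUTCOME
        correct += (prediction == outcome)
    return correct / N_TRIALS

acc_max = run("maximize")   # expected ~ 0.80
acc_match = run("match")    # expected ~ 0.8^2 + 0.2^2 = 0.68
print(f"maximizing: {acc_max:.3f}, matching: {acc_match:.3f}")
```

    The accuracy gap is why the two strategies are behaviorally separable in the training data described above.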

  11. Rigorous force field optimization principles based on statistical distance minimization

    DOE PAGES

    Vlcek, Lukas; Chialvo, Ariel A.

    2015-10-12

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model's static measurable properties to those of the target. Here we exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
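    As a minimal illustration of the underlying idea (not the authors' force-field machinery), one can measure the distinguishability of two discrete distributions with the statistical (Bhattacharyya-angle) distance s = arccos(sum_i sqrt(p_i q_i)) and recover a model parameter by minimizing it against a target; the two-state Boltzmann "systems" and the grid search below are invented for the example.

```python
import numpy as np

def statistical_distance(p, q):
    """Statistical (Bhattacharyya-angle) distance between two discrete
    distributions: s = arccos( sum_i sqrt(p_i q_i) )."""
    bc = np.sum(np.sqrt(np.asarray(p) * np.asarray(q)))
    return np.arccos(np.clip(bc, -1.0, 1.0))

# "Target" system: a two-state Boltzmann distribution with level spacing eps
def model(eps, beta=1.0):
    w = np.exp(-beta * np.array([0.0, eps]))
    return w / w.sum()

target = model(eps=1.3)

# Optimize the model parameter by minimizing distinguishability on a grid
grid = np.linspace(0.0, 3.0, 3001)
distances = [statistical_distance(model(e), target) for e in grid]
best = grid[int(np.argmin(distances))]

print(f"recovered parameter: {best:.3f}")  # should sit near the target value 1.3
```

    The distance vanishes only when the model and target distributions coincide, which is what makes minimization drive the model's observables toward the target's.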

  12. Six new mechanics corresponding to further shape theories

    NASA Astrophysics Data System (ADS)

    Anderson, Edward

    2016-02-01

    In this paper, a suite of relational notions of shape is presented at the level of configuration space geometry, with corresponding new theories of shape mechanics and shape statistics. These further generalize two quite well known examples: (i) Kendall’s (metric) shape space with his shape statistics and Barbour’s mechanics thereupon. (ii) Leibnizian relational space alias metric scale-and-shape space to which corresponds Barbour-Bertotti mechanics. This paper’s new theories include, using the invariant and group namings, (iii) Angle alias conformal shape mechanics. (iv) Area ratio alias e shape mechanics. (v) Area alias e scale-and-shape mechanics. (iii)-(v) rest respectively on angle space, area-ratio space, and area space configuration spaces. Probability and statistics applications are also pointed to in outline. (vi) Various supersymmetric counterparts of (i)-(v) are considered. Since supergravity differs considerably from GR-based conceptions of background independence, some of the new supersymmetric shape mechanics are compared with both. These reveal compatibility between supersymmetry and GR-based conceptions of background independence, at least within these simpler model arenas.

  13. Generalized memory associativity in a network model for the neuroses

    NASA Astrophysics Data System (ADS)

    Wedemann, Roseli S.; Donangelo, Raul; de Carvalho, Luís A. V.

    2009-03-01

    We review concepts introduced in earlier work, where a neural network mechanism describes some mental processes in neurotic pathology and psychoanalytic working-through, as associative memory functioning, according to the findings of Freud. We developed a complex network model, where modules corresponding to sensorial and symbolic memories interact, representing unconscious and conscious mental processes. The model illustrates Freud's idea that consciousness is related to symbolic and linguistic memory activity in the brain. We have introduced a generalization of the Boltzmann machine to model memory associativity. Model behavior is illustrated with simulations and some of its properties are analyzed with methods from statistical mechanics.
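    A standard Hopfield-style network (shown here as a minimal stand-in; the authors use a generalization of the Boltzmann machine, which this sketch does not reproduce) illustrates the associative-memory mechanism the abstract refers to: a pattern stored with the Hebbian rule is recalled from a corrupted cue by deterministic relaxation, the zero-temperature limit of Boltzmann-machine dynamics.

```python
import numpy as np

rng = np.random.default_rng(1)

# Store one pattern with the Hebbian rule; weights are symmetric, zero diagonal
pattern = rng.choice([-1, 1], size=32)
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)

# Corrupt the cue by flipping a few units, then relax with asynchronous updates
state = pattern.copy()
flip = rng.choice(len(state), size=5, replace=False)
state[flip] *= -1

for _ in range(5):                      # a few deterministic sweeps (T -> 0 limit)
    for i in range(len(state)):
        state[i] = 1 if W[i] @ state >= 0 else -1

assert np.array_equal(state, pattern)   # the memory is recalled exactly
print("pattern recovered")
```

    With five of 32 units corrupted and a single stored pattern, every local field points back toward the stored configuration, so recall is guaranteed here; stochastic (finite-temperature) units are what the Boltzmann-machine generalization adds.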

  14. Bootstrapping on Undirected Binary Networks Via Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Fushing, Hsieh; Chen, Chen; Liu, Shan-Yu; Koehl, Patrice

    2014-09-01

    We propose a new method inspired from statistical mechanics for extracting geometric information from undirected binary networks and generating random networks that conform to this geometry. In this method an undirected binary network is perceived as a thermodynamic system with a collection of permuted adjacency matrices as its states. The task of extracting information from the network is then reformulated as a discrete combinatorial optimization problem of searching for its ground state. To solve this problem, we apply multiple ensembles of temperature regulated Markov chains to establish an ultrametric geometry on the network. This geometry is equipped with a tree hierarchy that captures the multiscale community structure of the network. We translate this geometry into a Parisi adjacency matrix, which has a relatively low energy level and is in the vicinity of the ground state. The Parisi adjacency matrix is then further optimized by making block permutations subject to the ultrametric geometry. The optimal matrix corresponds to the macrostate of the original network. An ensemble of random networks is then generated such that each of these networks conforms to this macrostate; the corresponding algorithm also provides an estimate of the size of this ensemble. By repeating this procedure at different scales of the ultrametric geometry of the network, it is possible to compute its evolution entropy, i.e., to estimate the evolution of its complexity as we move from a coarse to a fine description of its geometric structure. We demonstrate the performance of this method on simulated as well as real data networks.
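    The core step, treating node orderings (permutations of the adjacency matrix) as states and searching for a low-energy arrangement with a temperature-regulated Markov chain, can be caricatured as follows. This sketch uses an invented one-dimensional ordering cost rather than the paper's Parisi-matrix machinery; for the two-clique toy graph below the provable ground-state energy is 9, reached when each clique occupies a contiguous block.

```python
import math, random

random.seed(3)

# Two 3-cliques joined by a single bridge edge
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6

def energy(pos):
    """Cost of a node ordering: connected nodes should sit close together,
    so a low-energy state exposes the block (community) structure."""
    return sum(abs(pos[u] - pos[v]) for u, v in edges)

pos = list(range(n))
random.shuffle(pos)
e = energy(pos)
e_start = e

# Temperature-regulated Markov chain: Metropolis swaps with geometric cooling
T = 2.0
for step in range(5000):
    i, j = random.sample(range(n), 2)
    pos[i], pos[j] = pos[j], pos[i]
    e_new = energy(pos)
    if e_new <= e or random.random() < math.exp((e - e_new) / T):
        e = e_new                        # accept the move
    else:
        pos[i], pos[j] = pos[j], pos[i]  # reject: undo the swap
    T = max(0.01, T * 0.999)

print(f"energy: {e_start} -> {e}")       # typically reaches the ground state, 9
assert e <= e_start and e >= 9
```

    In the paper the same logic runs on full adjacency matrices at multiple temperatures, and the resulting low-energy states are assembled into the ultrametric tree hierarchy.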

  15. Scanning electron microscopic evaluation of the influence of manual and mechanical glide path on the surface of nickel-titanium rotary instruments in moderately curved root canals: An in-vivo study

    PubMed Central

    Patel, Dishant; Bashetty, Kusum; Srirekha, A.; Archana, S.; Savitha, B.; Vijay, R.

    2016-01-01

    Aim: The aim of this study was to evaluate the influence of a manual versus mechanical glide path (GP) on the surface changes of two different nickel-titanium rotary instruments used during root canal therapy in moderately curved root canals. Materials and Methods: Sixty systemically healthy patients were selected for the study and divided randomly into four groups: Group 1: Manual GP followed by RaCe rotary instruments, Group 2: Manual GP followed by HyFlex rotary instruments, Group 3: Mechanical GP followed by RaCe rotary instruments, Group 4: Mechanical GP followed by HyFlex rotary instruments. After access opening, the GP was prepared and rotary instruments were used according to the manufacturer's instructions. All instruments were evaluated for defects under a scanning electron microscope before use and after a single use, with scores given for the apical and middle thirds of the files. Statistical Analysis Used: Chi-squared test. Results: There was no statistically significant difference between any of the groups. Irrespective of the GP and rotary files used, more defects were present in the apical third than in the middle third of the rotary instruments. Conclusion: Within the limitations of this study, it can be concluded that manual versus mechanical GP preparation had no effect on the surface defects of the subsequently used rotary file system. PMID:27994317

  16. Statistical mechanics of binary mixture adsorption in metal-organic frameworks in the osmotic ensemble.

    PubMed

    Dunne, Lawrence J; Manos, George

    2018-03-13

    Although crucial for designing separation processes, little is known experimentally about multi-component adsorption isotherms in comparison with pure single components. Very few binary mixture adsorption isotherms are to be found in the literature, and information about isotherms over a wide range of gas-phase compositions, mechanical pressures, and temperatures is lacking. Here, we present a quasi-one-dimensional statistical mechanical model of binary mixture adsorption in metal-organic frameworks (MOFs) treated exactly by a transfer matrix method in the osmotic ensemble. The experimental parameter space may be very complex, and investigations into multi-component mixture adsorption may be guided by theoretical insights. The approach successfully models breathing structural transitions induced by adsorption, giving a good account of the shape of adsorption isotherms of CO2 and CH4 in MIL-53(Al). Binary mixture isotherms and co-adsorption phase diagrams are also calculated and found to give a good description of the experimental trends in these properties; the wide range of model parameters that reproduces this behaviour suggests that it is generic to MOFs. Finally, a study is made of the influence of mechanical pressure on the shape of CO2 and CH4 adsorption isotherms in MIL-53(Al). Quite modest mechanical pressures can induce significant changes to isotherm shapes in MOFs, with implications for binary mixture separation processes. This article is part of the theme issue 'Modern theoretical chemistry'. © 2018 The Author(s).
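    The transfer-matrix idea can be illustrated on the simplest quasi-one-dimensional model: a 1D lattice gas of adsorption sites with a nearest-neighbour coupling J (a generic textbook system, not the paper's MOF model or its osmotic-ensemble treatment). The coverage follows from the largest eigenvalue of the 2x2 transfer matrix and reduces to the Langmuir isotherm z/(1+z) when J = 0.

```python
import numpy as np

def coverage(mu, J=0.0, beta=1.0, h=1e-6):
    """Coverage of a 1D lattice gas with nearest-neighbour coupling J,
    from the transfer-matrix largest eigenvalue:
    theta = (1/beta) * d(ln lambda_max)/d(mu)."""
    def ln_lam(m):
        z = np.exp(beta * m)                       # fugacity
        T = np.array([[1.0, np.sqrt(z)],
                      [np.sqrt(z), z * np.exp(-beta * J)]])
        return np.log(np.max(np.linalg.eigvalsh(T)))
    return (ln_lam(mu + h) - ln_lam(mu - h)) / (2 * beta * h)

# With non-interacting sites (J = 0) the Langmuir isotherm z/(1+z) is recovered
mu = 0.5
z = np.exp(mu)
assert abs(coverage(mu, J=0.0) - z / (1 + z)) < 1e-6

# An attractive coupling (J < 0 in this sign convention) enhances adsorption
assert coverage(-1.0, J=-2.0) > coverage(-1.0, J=0.0)
print("Langmuir limit and attraction check passed")
```

    The paper's model enlarges the site states and couples them to the framework's breathing degree of freedom, but the exact-eigenvalue route to the isotherm is the same.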

  17. Statistical mechanics of binary mixture adsorption in metal-organic frameworks in the osmotic ensemble

    NASA Astrophysics Data System (ADS)

    Dunne, Lawrence J.; Manos, George

    2018-03-01

    Although crucial for designing separation processes, little is known experimentally about multi-component adsorption isotherms in comparison with pure single components. Very few binary mixture adsorption isotherms are to be found in the literature, and information about isotherms over a wide range of gas-phase compositions, mechanical pressures, and temperatures is lacking. Here, we present a quasi-one-dimensional statistical mechanical model of binary mixture adsorption in metal-organic frameworks (MOFs) treated exactly by a transfer matrix method in the osmotic ensemble. The experimental parameter space may be very complex, and investigations into multi-component mixture adsorption may be guided by theoretical insights. The approach successfully models breathing structural transitions induced by adsorption, giving a good account of the shape of adsorption isotherms of CO2 and CH4 in MIL-53(Al). Binary mixture isotherms and co-adsorption phase diagrams are also calculated and found to give a good description of the experimental trends in these properties; the wide range of model parameters that reproduces this behaviour suggests that it is generic to MOFs. Finally, a study is made of the influence of mechanical pressure on the shape of CO2 and CH4 adsorption isotherms in MIL-53(Al). Quite modest mechanical pressures can induce significant changes to isotherm shapes in MOFs, with implications for binary mixture separation processes. This article is part of the theme issue `Modern theoretical chemistry'.

  18. Assessment of the mechanics of a tissue-engineered rat trachea in an image-processing environment.

    PubMed

    Silva, Thiago Henrique Gomes da; Pazetti, Rogerio; Aoki, Fabio Gava; Cardoso, Paulo Francisco Guerreiro; Valenga, Marcelo Henrique; Deffune, Elenice; Evaristo, Thaiane; Pêgo-Fernandes, Paulo Manuel; Moriya, Henrique Takachi

    2014-07-01

    Despite the recent success regarding the transplantation of tissue-engineered airways, the mechanical properties of these grafts are not well understood. Mechanical assessment of a tissue-engineered airway graft before implantation may be used in the future as a predictor of function. The aim of this preliminary work was to develop a noninvasive image-processing environment for the assessment of airway mechanics. Decellularized, recellularized and normal tracheas (groups DECEL, RECEL, and CONTROL, respectively) immersed in Krebs-Henseleit solution were ventilated by a small-animal ventilator connected to a Fleisch pneumotachograph and two pressure transducers (differential and gauge). A camera connected to a stereomicroscope captured images of the pulsation of the trachea before instillation of saline solution and after instillation of Krebs-Henseleit solution, followed by instillation with Krebs-Henseleit with methacholine 0.1 M (protocols A, K and KMCh, respectively). The data were post-processed with computer software and statistical comparisons between groups and protocols were performed. There were statistically significant variations in the image measurements of the medial region of the trachea between the groups (two-way analysis of variance [ANOVA], p<0.01) and of the proximal region between the groups and protocols (two-way ANOVA, p<0.01). The technique developed in this study is an innovative method for performing a mechanical assessment of engineered tracheal grafts that will enable evaluation of the viscoelastic properties of neo-tracheas prior to transplantation.

  19. A Bayesian perspective on Markovian dynamics and the fluctuation theorem

    NASA Astrophysics Data System (ADS)

    Virgo, Nathaniel

    2013-08-01

    One of E. T. Jaynes' most important achievements was to derive statistical mechanics from the maximum entropy (MaxEnt) method. I re-examine a relatively new result in statistical mechanics, the Evans-Searles fluctuation theorem, from a MaxEnt perspective. This is done in the belief that interpreting such results in Bayesian terms will lead to new advances in statistical physics. The version of the fluctuation theorem that I will discuss applies to discrete, stochastic systems that begin in a non-equilibrium state and relax toward equilibrium. I will show that for such systems the fluctuation theorem can be seen as a consequence of the fact that the equilibrium distribution must obey the property of detailed balance. Although the principle of detailed balance applies only to equilibrium ensembles, it puts constraints on the form of non-equilibrium trajectories. This will be made clear by taking a novel kind of Bayesian perspective, in which the equilibrium distribution is seen as a prior over the system's set of possible trajectories. Non-equilibrium ensembles are calculated from this prior using Bayes' theorem, with the initial conditions playing the role of the data. I will also comment on the implications of this perspective for the question of how to derive the second law.
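    The discrete, stochastic setting described here can be checked by brute force. The sketch below (an added illustration, not the article's derivation) enumerates all trajectories of a two-state Markov chain obeying detailed balance, starts it from an invented non-equilibrium distribution, and verifies the Evans-Searles-type symmetry P(Omega = A) = e^A P(Omega = -A) for the dissipation-like trajectory log-ratio Omega.

```python
import itertools, math
from collections import defaultdict

# Two-state Markov chain obeying detailed balance w.r.t. pi = (2/3, 1/3)
P = [[0.9, 0.1],
     [0.2, 0.8]]
pi = [2 / 3, 1 / 3]
p0 = [0.5, 0.5]           # non-equilibrium initial distribution

L = 4                     # trajectory length (number of states visited)
hist = defaultdict(float)

for traj in itertools.product((0, 1), repeat=L):
    prob = p0[traj[0]]
    for a, b in zip(traj, traj[1:]):
        prob *= P[a][b]
    # Dissipation function: log-ratio of forward to time-reversed trajectory weight
    rev = traj[::-1]
    prob_rev = p0[rev[0]]
    for a, b in zip(rev, rev[1:]):
        prob_rev *= P[a][b]
    omega = math.log(prob / prob_rev)
    hist[round(omega, 10)] += prob

# Evans-Searles symmetry: P(Omega = A) = exp(A) * P(Omega = -A)
for A, pA in hist.items():
    assert math.isclose(pA, math.exp(A) * hist[round(-A, 10)], rel_tol=1e-9)
print("fluctuation symmetry verified for", len(hist), "distinct Omega values")
```

    Because detailed balance fixes the ratio of each forward path to its reverse via the equilibrium distribution, the symmetry holds exactly at any trajectory length, which is the constraint-on-trajectories point made in the abstract.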

  20. A novel complete-case analysis to determine statistical significance between treatments in an intention-to-treat population of randomized clinical trials involving missing data.

    PubMed

    Liu, Wei; Ding, Jinhui

    2018-04-01

    The application of the intention-to-treat (ITT) principle to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing-data mechanisms for all clinical trials, because the human body reacts to drugs through complex biological networks, leading to data that are missing either randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing-data mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.
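    For contrast with the paper's generalized complete-case (GCC) proposal, which is not reproduced here, the following sketch shows the plain complete-case analysis it builds on: missing outcomes are simply dropped and the completers are compared. All numbers (treatment effect, dropout rate, sample size) are simulated for illustration.

```python
import math, random, statistics

random.seed(7)

# Simulated trial: treatment shifts the outcome by +0.5; ~20% of outcomes missing
def simulate(n, shift):
    data = []
    for _ in range(n):
        y = random.gauss(shift, 1.0)
        data.append(y if random.random() > 0.2 else None)   # None = dropout
    return data

treat, control = simulate(200, 0.5), simulate(200, 0.0)

def complete_case(values):
    """Drop missing outcomes entirely: the completers-only analysis set."""
    return [v for v in values if v is not None]

t_c, c_c = complete_case(treat), complete_case(control)

# Two-sample z-test on completers (normal approximation)
diff = statistics.fmean(t_c) - statistics.fmean(c_c)
se = math.sqrt(statistics.variance(t_c) / len(t_c) +
               statistics.variance(c_c) / len(c_c))
z = diff / se
p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"completers: {len(t_c)} vs {len(c_c)}, z = {z:.2f}, p = {p:.4f}")
```

    The paper's contribution is the condition under which a significant result in this completers-only comparison carries over to the full ITT population.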

  1. North American extreme temperature events and related large scale meteorological patterns: A review of statistical methods, dynamics, modeling, and trends

    DOE PAGES

    Grotjahn, Richard; Black, Robert; Leung, Ruby; ...

    2015-05-22

    This paper reviews research approaches and open questions regarding data, statistical analyses, dynamics, modeling efforts, and trends in relation to temperature extremes. Our specific focus is upon extreme events of short duration (roughly less than 5 days) that affect parts of North America. These events are associated with large-scale meteorological patterns (LSMPs). Methods used to define extreme-event statistics and to identify and connect LSMPs to extreme temperatures are presented. Recent advances in statistical techniques can connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. A wide array of LSMPs, ranging from synoptic to planetary-scale phenomena, have been implicated as contributors to extreme temperature events. Current knowledge about the physical nature of these contributions and the dynamical mechanisms leading to the implicated LSMPs is incomplete. There is a pressing need for (a) systematic study of the physics of LSMP life cycles and (b) comprehensive model assessment of LSMP-extreme temperature event linkages and LSMP behavior. Generally, climate models capture the observed heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency, underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Climate models have been used to investigate past changes and project future trends in extreme temperatures. Overall, modeling studies have identified important mechanisms, such as the effects of large-scale circulation anomalies and land-atmosphere interactions, on changes in extreme temperatures. However, few studies have examined changes in LSMPs more specifically to understand the role of LSMPs in past and future extreme temperature changes.
Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated, so more research is needed to understand the limitations of climate models and to improve model skill in simulating extreme temperatures and their associated LSMPs. The paper concludes with unresolved issues and research questions.

  2. Statistical mechanics in the context of special relativity.

    PubMed

    Kaniadakis, G

    2002-11-01

    In Ref. [Physica A 296, 405 (2001)], starting from the one-parameter deformation of the exponential function exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), a statistical mechanics has been constructed which reduces to the ordinary Boltzmann-Gibbs statistical mechanics as the deformation parameter κ approaches zero. The distribution f = exp_κ(−βE + βμ) obtained within this statistical mechanics shows a power-law tail and depends on the nonspecified parameter β, containing all the information about the temperature of the system. On the other hand, the entropic form S_κ = ∫ d³p (c_κ f^(1+κ) + c_(−κ) f^(1−κ)), which after maximization produces the distribution f and reduces to the standard Boltzmann-Shannon entropy S₀ as κ → 0, contains the coefficient c_κ whose expression involves, besides the Boltzmann constant, another nonspecified parameter α. In the present effort we show that S_κ is the unique existing entropy obtained by a continuous deformation of S₀ that preserves unaltered its fundamental properties of concavity, additivity, and extensivity. These properties of S_κ permit us to determine unequivocally the values of the above-mentioned parameters β and α. Subsequently, we explain the origin of the deformation mechanism introduced by κ and show that this deformation emerges naturally within Einstein's special relativity. Furthermore, we extend the theory in order to treat statistical systems in a time-dependent and relativistic context. Then, we show that it is possible to determine in a self-consistent scheme within special relativity the value of the free parameter κ, which turns out to depend on the light speed c and reduces to zero as c → ∞, recovering in this way the ordinary statistical mechanics and thermodynamics.
The statistical mechanics here presented does not contain free parameters, preserves unaltered the mathematical and epistemological structure of the ordinary statistical mechanics, and is suitable to describe a very large class of experimentally observed phenomena in low- and high-energy physics and in the natural, economic, and social sciences. Finally, in order to test the correctness and predictability of the theory, as a working example we consider the cosmic-ray spectrum, which spans 13 decades in energy and 33 decades in flux, finding high-quality agreement between our predictions and the observed data.
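    The κ-deformed exponential is simple to implement, and its two defining features, recovery of the ordinary exponential as κ → 0 and a power-law tail for large arguments, can be verified directly. This is an added illustrative check, not code from the paper.

```python
import math

def exp_kappa(x, kappa):
    """Kaniadakis kappa-exponential:
    exp_k(x) = (sqrt(1 + k^2 x^2) + k x)^(1/k), which -> exp(x) as k -> 0."""
    if kappa == 0.0:
        return math.exp(x)
    return (math.sqrt(1 + (kappa * x) ** 2) + kappa * x) ** (1.0 / kappa)

# Ordinary exponential recovered in the kappa -> 0 limit
for x in (-2.0, 0.5, 3.0):
    assert math.isclose(exp_kappa(x, 1e-6), math.exp(x), rel_tol=1e-6)

# Power-law tail for kappa > 0: exp_k(x) ~ (2 kappa x)^(1/kappa) for large x
k, x = 0.5, 1e6
assert math.isclose(exp_kappa(x, k), (2 * k * x) ** (1 / k), rel_tol=1e-6)

# The deformed exponential still maps 0 to 1 and stays positive
assert exp_kappa(0.0, 0.7) == 1.0 and exp_kappa(-5.0, 0.7) > 0.0
print("kappa-exponential limits verified")
```

    The power-law tail is what lets the distribution f = exp_κ(−βE + βμ) fit spectra, such as the cosmic-ray flux, that ordinary Boltzmann factors cannot.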

  3. Infant Statistical Learning

    PubMed Central

    Saffran, Jenny R.; Kirkham, Natasha Z.

    2017-01-01

    Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812

  4. Plackett-Burman experimental design to facilitate syntactic foam development

    DOE PAGES

    Smith, Zachary D.; Keller, Jennie R.; Bello, Mollie; ...

    2015-09-14

    An eight-run Plackett–Burman design is used to assess six experimental variables and eight responses in a polysiloxane-glass microsphere syntactic foam. The approach aims to decrease the time required to develop a tunable polymer composite by identifying a reduced set of variables and responses suitable for future predictive modeling. The statistical design assesses the main effects of mixing process parameters, polymer matrix composition, microsphere density and volume loading, and the blending of two grades of microspheres, using a dummy factor in the statistical calculations. Responses cover rheological, physical, thermal, and mechanical properties. The cure accelerator content of the polymer matrix and the volume loading of the microspheres have the largest effects on foam properties. These factors are the most suitable for controlling the gel point of the curing foam and the density of the cured foam. The mixing parameters introduce widespread variability and therefore should be fixed at effective levels during follow-up testing. Some responses may require greater contrast in microsphere-related factors. As a result, compared to other possible statistical approaches, the run economy of the Plackett–Burman method makes it a valuable tool for rapidly characterizing new foams.
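    The eight-run Plackett-Burman matrix itself is easy to construct: cyclic shifts of the standard N = 8 generator row (+ + + - + - -) plus a final row of all low levels. The sketch below builds it, confirms column orthogonality, and recovers main effects from hypothetical noise-free responses; the factor coefficients are invented for illustration and are unrelated to the foam study.

```python
import numpy as np

# 8-run Plackett-Burman design: cyclic shifts of the N=8 generator row,
# plus a final row of all -1 (low) levels
gen = np.array([1, 1, 1, -1, 1, -1, -1])
design = np.vstack([np.roll(gen, i) for i in range(7)] + [-np.ones(7, dtype=int)])

# Columns are balanced and mutually orthogonal, so the 7 main effects
# (6 real factors + 1 dummy) are estimated independently
assert np.allclose(design.T @ design, 8 * np.eye(7))

# Hypothetical noise-free responses: factors 1 and 4 matter, the rest do not
beta = np.array([3.0, 0.0, 0.0, -2.0, 0.0, 0.0, 0.0])
y = 10.0 + design @ beta

# Main effect of each factor = mean(high runs) - mean(low runs)
effects = design.T @ y / 4.0
print(np.round(effects, 6))   # factors 1 and 4 show effects 6 and -4, rest 0
assert np.allclose(effects, 2 * beta)
```

    A nonzero effect estimate on the dummy column would flag noise or aliasing, which is exactly how the dummy factor is used in the screening described above.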

  5. Order statistics inference for describing topological coupling and mechanical symmetry breaking in multidomain proteins

    NASA Astrophysics Data System (ADS)

    Kononova, Olga; Jones, Lee; Barsegov, V.

    2013-09-01

    Cooperativity is a hallmark of proteins, many of which show a modular architecture comprising discrete structural domains. Detecting and describing dynamic couplings between structural regions is difficult in view of the many-body nature of protein-protein interactions. By utilizing GPU-based computational acceleration, we carried out simulations of forced protein unfolding for the WW-WW dimer of all-β-sheet WW domains, used as a model multidomain protein. We found that while the physically non-interacting identical protein domains (WW) show nearly symmetric mechanical properties at low tension, reflected, e.g., in the similarity of their distributions of unfolding times, these properties become distinctly different when tension is increased. Moreover, the uncorrelated unfolding transitions at a low pulling force become increasingly more correlated (dependent) at higher forces. Hence, the applied force not only breaks "the mechanical symmetry" but also couples the physically non-interacting protein domains forming a multi-domain protein. We call this effect "the topological coupling." We developed a new theory, inspired by order statistics, to characterize protein-protein interactions in multi-domain proteins. The method utilizes the squared-Gaussian model, but it can also be used in conjunction with other parametric models for the distribution of unfolding times. The formalism can be taken to the single-molecule experimental lab to probe mechanical cooperativity and domain communication in multi-domain proteins.
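    Order statistics enter through the distribution of the k-th unfolding event among n domains. As a self-contained check (using an exponential parent model as a stand-in, not the authors' squared-Gaussian model), the exact order-statistic CDF F_(k)(t) = sum_{j=k..n} C(n,j) F(t)^j (1-F(t))^(n-j) is compared with direct simulation of a two-domain dimer.

```python
import math, random

random.seed(11)

def order_stat_cdf(t, k, n, F):
    """Exact CDF of the k-th smallest of n i.i.d. unfolding times with
    parent CDF F: sum_{j=k..n} C(n,j) F(t)^j (1-F(t))^(n-j)."""
    p = F(t)
    return sum(math.comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

# Parent model: exponential unfolding times (a stand-in parametric model)
rate = 2.0
F = lambda t: 1 - math.exp(-rate * t)

n, k, t0 = 2, 1, 0.3          # first unfolding event in a two-domain dimer
exact = order_stat_cdf(t0, k, n, F)

# Monte Carlo check: simulate dimers, record the k-th ordered unfolding time
trials = 200_000
hits = 0
for _ in range(trials):
    times = sorted(random.expovariate(rate) for _ in range(n))
    hits += times[k - 1] <= t0
mc = hits / trials

print(f"exact: {exact:.4f}, simulated: {mc:.4f}")
```

    In the paper's usage, deviations of the measured ordered unfolding times from the independent-domain prediction above are what signal topological coupling between domains.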

  6. Treatment of missing data in follow-up studies of randomised controlled trials: A systematic review of the literature.

    PubMed

    Sullivan, Thomas R; Yelland, Lisa N; Lee, Katherine J; Ryan, Philip; Salter, Amy B

    2017-08-01

    After completion of a randomised controlled trial, an extended follow-up period may be initiated to learn about longer term impacts of the intervention. Since extended follow-up studies often involve additional eligibility restrictions and consent processes for participation, and a longer duration of follow-up entails a greater risk of participant attrition, missing data can be a considerable threat in this setting. As a potential source of bias, it is critical that missing data are appropriately handled in the statistical analysis, yet little is known about the treatment of missing data in extended follow-up studies. The aims of this review were to summarise the extent of missing data in extended follow-up studies and the use of statistical approaches to address this potentially serious problem. We performed a systematic literature search in PubMed to identify extended follow-up studies published from January to June 2015. Studies were eligible for inclusion if the original randomised controlled trial results were also published and if the main objective of extended follow-up was to compare the original randomised groups. We recorded information on the extent of missing data and the approach used to treat missing data in the statistical analysis of the primary outcome of the extended follow-up study. Of the 81 studies included in the review, 36 (44%) reported additional eligibility restrictions and 24 (30%) consent processes for entry into extended follow-up. Data were collected at a median of 7 years after randomisation. Excluding 28 studies with a time to event primary outcome, 51/53 studies (96%) reported missing data on the primary outcome. The median percentage of randomised participants with complete data on the primary outcome was just 66% in these studies. The most common statistical approach to address missing data was complete case analysis (51% of studies), while likelihood-based analyses were also well represented (25%). 
Sensitivity analyses around the missing data mechanism were rarely performed (25% of studies), and when they were, they often involved unrealistic assumptions about the mechanism. Despite missing data being a serious problem in extended follow-up studies, statistical approaches to addressing missing data were often inadequate. We recommend researchers clearly specify all sources of missing data in follow-up studies and use statistical methods that are valid under a plausible assumption about the missing data mechanism. Sensitivity analyses should also be undertaken to assess the robustness of findings to assumptions about the missing data mechanism.

  7. Effect of different aging methods on the mechanical behavior of multi-layered ceramic structures.

    PubMed

    Borba, Márcia; de Araújo, Maico D; Fukushima, Karen A; Yoshimura, Humberto N; Griggs, Jason A; Della Bona, Álvaro; Cesar, Paulo F

    2016-12-01

    To evaluate the effect of two aging methods (mechanical cycling and autoclave) on the mechanical behavior of veneer and framework ceramic specimens with different configurations (monolithic, two and three-layers). Three ceramics used as framework for fixed dental prostheses (YZ-Vita In-Ceram YZ; IZ-Vita In-Ceram Zirconia; AL-Vita In-Ceram AL) and two veneering porcelains (VM7 and VM9) were studied. Bar-shaped specimens were produced in three different designs: monolithic, two layers (porcelain-framework) and three layers (porcelain-framework-porcelain). Specimens were tested for three-point flexural strength at 1MPa/s in 37°C artificial saliva. Three different experimental conditions were evaluated (n=10): control; mechanical cycling (2Hz, 37°C artificial saliva); and autoclave aging (134°C, 2 bars, 5h). Bi-layered specimens were tested in both conditions: with porcelain or framework ceramic under tension. Fracture surfaces were analyzed using stereomicroscope and scanning electron microscopy. Results were statistically analyzed using Kruskal-Wallis and Student-Newman-Keuls tests. Only for AL group, mechanical cycling and autoclave aging significantly decreased the flexural strength values in comparison to the control (p<0.01). YZ, AL, VM7 and VM9 monolithic groups showed no strength degradation. For multi-layered specimens, when the porcelain layer was tested in tension (bi and tri-layers), the aging methods evaluated also had no effect on strength (p≥0.05). Total and partial failure modes were identified. Mechanical cycling and autoclave aging protocols had no effect on the flexural strength values and failure behavior of YZ and IZ ceramic structures. Yet, AL monolithic structures showed a significant decrease in flexural strength with any of the aging methods. Copyright © 2016. Published by Elsevier Ltd.
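
    The abstract's groups were compared with the Kruskal-Wallis test. A minimal hand-rolled version of the H statistic (no tie correction, so it assumes distinct values; the sample data are invented, not the study's strength measurements):

```python
import numpy as np

def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction; assumes distinct values)."""
    pooled = np.concatenate(groups)
    n_total = pooled.size
    ranks = np.empty(n_total)
    ranks[np.argsort(pooled)] = np.arange(1, n_total + 1)  # ranks 1..N
    h = 0.0
    start = 0
    for g in groups:
        r_sum = ranks[start:start + g.size].sum()
        h += r_sum**2 / g.size
        start += g.size
    return 12.0 / (n_total * (n_total + 1)) * h - 3.0 * (n_total + 1)

# Hypothetical strength-like samples for three groups.
a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.array([5.0, 6.0, 7.0, 8.0])
c = np.array([9.0, 10.0, 11.0, 12.0])
H = kruskal_wallis_h(a, b, c)
print(f"H = {H:.3f}")  # compare against a chi-square(df = k - 1) critical value
```

    For these well-separated groups H is about 9.85, above the 5.99 critical value of chi-square with 2 degrees of freedom at the 0.05 level.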

  8. Applications of Principled Search Methods in Climate Influences and Mechanisms

    NASA Technical Reports Server (NTRS)

    Glymour, Clark

    2005-01-01

    Forest and grass fires cause economic losses in the billions of dollars in the U.S. alone. In addition, boreal forests constitute a large carbon store; it has been estimated that, were no burning to occur, an additional 7 gigatons of carbon would be sequestered in boreal soils each century. Effective wildfire suppression requires anticipation of locales and times for which wildfire is most probable, preferably with a two to four week forecast, so that limited resources can be efficiently deployed. The United States Forest Service (USFS) and other experts and agencies have developed several measures of fire risk combining physical principles and expert judgment, and have used them in automated procedures for forecasting fire risk. Forecasting accuracies for some fire risk indices in combination with climate and other variables have been estimated for specific locations, with the value of fire risk index variables assessed by their statistical significance in regressions. In other cases, for example the MAPSS forecasts [23, 24], forecasting accuracy has been estimated only with simulated data. We describe alternative forecasting methods that predict fire probability by locale and time using statistical or machine learning procedures trained on historical data, and we give comparative assessments of their forecasting accuracy for one fire season, April to October 2003, for all U.S. Forest Service lands. Aside from providing an accuracy baseline for other forecasting methods, the results illustrate the interdependence between the statistical significance of prediction variables and the forecasting method used.
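
    A hedged sketch of the general idea, not the USFS or MAPSS models: learn a fire probability from historical predictor variables with logistic regression trained by gradient descent. The predictors, coefficients, and data below are entirely synthetic.

```python
import numpy as np

# Synthetic illustration: logistic regression for event probability,
# fitted by plain batch gradient descent. Variable names are hypothetical.
rng = np.random.default_rng(1)

n = 2000
drought_index = rng.normal(size=n)    # made-up standardized predictors
fuel_load = rng.normal(size=n)
logits = 1.5 * drought_index + 1.0 * fuel_load - 0.5
fire = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

X = np.column_stack([np.ones(n), drought_index, fuel_load])
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - fire) / n   # gradient of the log-loss

pred = (1 / (1 + np.exp(-X @ w))) > 0.5
accuracy = (pred == fire.astype(bool)).mean()
print(f"fitted weights: {w.round(2)}, training accuracy: {accuracy:.2f}")
```

    The recovered weights sit near the generating coefficients, and the in-sample accuracy approaches the Bayes rate implied by the simulated logits.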

  9. A Linguistic Truth-Valued Temporal Reasoning Formalism and Its Implementation

    NASA Astrophysics Data System (ADS)

    Lu, Zhirui; Liu, Jun; Augusto, Juan C.; Wang, Hui

    Temporality and uncertainty are important features of many real world systems. Solving problems in such systems requires the use of formal mechanisms such as logic systems, statistical methods, or other reasoning and decision-making methods. In this paper, we propose a linguistic truth-valued temporal reasoning formalism to enable the management of both features concurrently using a linguistic truth-valued logic and a temporal logic. We also provide a backward reasoning algorithm which allows the answering of user queries. A simple but realistic scenario in a smart home application is used to illustrate our work.

  10. The theoretical and experimental study of a material structure evolution in gigacyclic fatigue regime

    NASA Astrophysics Data System (ADS)

    Plekhov, Oleg; Naimark, Oleg; Narykova, Maria; Kadomtsev, Andrey; Betekhtin, Vladimir

    2015-10-01

    The work is devoted to the study of metal structure evolution in the gigacycle fatigue (VHCF) regime. The mechanical properties of Armco iron samples at different stages of their fatigue life were studied using the acoustic resonance method. Damage accumulation (porosity of the samples) was studied by hydrostatic weighing. A statistical model of damage accumulation was proposed to describe this process. The model describes the influence of the sample surface on the location of fatigue crack initiation.

  11. Signal analysis techniques for incipient failure detection in turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1985-01-01

    Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
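
    The "statistical classification in terms of moments" mentioned above can be sketched as follows (a generic illustration, not the SSME analysis): the fourth standardized moment, kurtosis, rises sharply when a vibration-like signal contains the impulsive events typical of incipient bearing damage. The "faulty" signal here is synthetic.

```python
import numpy as np

# Moment-based waveform features of the kind used to classify machinery
# signals. A synthetic "faulty" signal adds rare impulsive spikes,
# which raises the kurtosis well above the Gaussian value of 3.
rng = np.random.default_rng(2)

def moments(x):
    mu = x.mean()
    sigma = x.std()
    z = (x - mu) / sigma
    return mu, sigma**2, (z**3).mean(), (z**4).mean()  # kurtosis = 3 for Gaussian

n = 50_000
healthy = rng.normal(size=n)
faulty = healthy.copy()
spikes = rng.random(n) < 0.01                   # 1% impulsive events
faulty[spikes] += rng.choice([-8.0, 8.0], size=spikes.sum())

_, _, _, kurt_healthy = moments(healthy)
_, _, _, kurt_faulty = moments(faulty)
print(f"kurtosis healthy: {kurt_healthy:.2f}, faulty: {kurt_faulty:.2f}")
```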

  12. From Mechanical Motion to Brownian Motion, Thermodynamics and Particle Transport Theory

    ERIC Educational Resources Information Center

    Bringuier, E.

    2008-01-01

    The motion of a particle in a medium is dealt with either as a problem of mechanics or as a transport process in non-equilibrium statistical physics. The two kinds of approach are often unrelated as they are taught in different textbooks. The aim of this paper is to highlight the link between the mechanical and statistical treatments of particle…

  13. Developing the WCRF International/University of Bristol Methodology for Identifying and Carrying Out Systematic Reviews of Mechanisms of Exposure-Cancer Associations.

    PubMed

    Lewis, Sarah J; Gardner, Mike; Higgins, Julian; Holly, Jeff M P; Gaunt, Tom R; Perks, Claire M; Turner, Suzanne D; Rinaldi, Sabina; Thomas, Steve; Harrison, Sean; Lennon, Rosie J; Tan, Vanessa; Borwick, Cath; Emmett, Pauline; Jeffreys, Mona; Northstone, Kate; Mitrou, Giota; Wiseman, Martin; Thompson, Rachel; Martin, Richard M

    2017-11-01

    Background: Human, animal, and cell experimental studies; human biomarker studies; and genetic studies complement epidemiologic findings and can offer insights into biological plausibility and pathways between exposure and disease, but methods for synthesizing such studies are lacking. We, therefore, developed a methodology for identifying mechanisms and carrying out systematic reviews of mechanistic studies that underpin exposure-cancer associations. Methods: A multidisciplinary team with expertise in informatics, statistics, epidemiology, systematic reviews, cancer biology, and nutrition was assembled. Five 1-day workshops were held to brainstorm ideas; in the intervening periods we carried out searches and applied our methods to a case study to test our ideas. Results: We have developed a two-stage framework, the first stage of which is designed to identify mechanisms underpinning a specific exposure-disease relationship; the second stage is a targeted systematic review of studies on a specific mechanism. As part of the methodology, we also developed an online tool for text mining for mechanism prioritization (TeMMPo) and a new graph for displaying related but heterogeneous data from epidemiologic studies (the Albatross plot). Conclusions: We have developed novel tools for identifying mechanisms and carrying out systematic reviews of mechanistic studies of exposure-disease relationships. In doing so, we have outlined how we have overcome the challenges that we faced and provided researchers with practical guides for conducting mechanistic systematic reviews. Impact: The aforementioned methodology and tools will allow potential mechanisms to be identified and the strength of the evidence underlying a particular mechanism to be assessed. Cancer Epidemiol Biomarkers Prev; 26(11); 1667-75. ©2017 AACR . ©2017 American Association for Cancer Research.

  14. Unfolding single RNA molecules: bridging the gap between equilibrium and non-equilibrium statistical thermodynamics.

    PubMed

    Bustamante, Carlos

    2005-11-01

    During the last 15 years, scientists have developed methods that permit the direct mechanical manipulation of individual molecules. Using this approach, they have begun to investigate the effect of force and torque in chemical and biochemical reactions. These studies span from the study of the mechanical properties of macromolecules, to the characterization of molecular motors, to the mechanical unfolding of individual proteins and RNA. Here I present a review of some of our most recent results using mechanical force to unfold individual molecules of RNA. These studies make it possible to follow in real time the trajectory of each molecule as it unfolds and characterize the various intermediates of the reaction. Moreover, if the process takes place reversibly, it is possible to extract both kinetic and thermodynamic information from these experiments at the same time that we characterize the forces that maintain the three-dimensional structure of the molecule in solution. These studies bring us closer to the biological unfolding processes in the cell as they simulate, in vitro, the mechanical unfolding of RNAs carried out in the cell by helicases. If the unfolding process occurs irreversibly, I show here that single-molecule experiments can still provide equilibrium, thermodynamic information from non-equilibrium data by using recently discovered fluctuation theorems. Such theorems represent a bridge between equilibrium and non-equilibrium statistical mechanics. In fact, although first derived in 1997, the fluctuation theorems received their first experimental demonstration through the mechanical unfolding of a single RNA molecule. It is perhaps a sign of the times that important physical results are these days used to extract information about biological systems and that biological systems are being used to test and confirm fundamental new laws in physics.
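
    The key identity behind extracting equilibrium free energies from irreversible pulling data is the Jarzynski equality, exp(-ΔF/kT) = ⟨exp(-W/kT)⟩. A numerical sketch with a synthetic work distribution (not experimental RNA data): for W ~ Normal(μ, σ²) with kT = 1, the exact answer is ΔF = μ - σ²/2.

```python
import numpy as np

# Jarzynski estimator on synthetic Gaussian work values, kT = 1.
# Exact result for this distribution: dF = mu - sigma^2 / 2.
rng = np.random.default_rng(3)

mu, sigma, kT = 2.0, 0.5, 1.0
work = rng.normal(mu, sigma, size=200_000)

dF_jarzynski = -kT * np.log(np.mean(np.exp(-work / kT)))
dF_exact = mu - sigma**2 / (2 * kT)
print(f"Jarzynski estimate: {dF_jarzynski:.3f}, exact: {dF_exact:.3f}")
```

    Note that the exponential average is dominated by rare low-work trajectories, which is why the estimator degrades quickly as σ grows relative to kT.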

  15. Meta-analysis and The Cochrane Collaboration: 20 years of the Cochrane Statistical Methods Group

    PubMed Central

    2013-01-01

    The Statistical Methods Group has played a pivotal role in The Cochrane Collaboration over the past 20 years. The Statistical Methods Group has determined the direction of statistical methods used within Cochrane reviews, developed guidance for these methods, provided training, and continued to discuss and consider new and controversial issues in meta-analysis. The contribution of Statistical Methods Group members to the meta-analysis literature has been extensive and has helped to shape the wider meta-analysis landscape. In this paper, marking the 20th anniversary of The Cochrane Collaboration, we reflect on the history of the Statistical Methods Group, beginning in 1993 with the identification of aspects of statistical synthesis for which consensus was lacking about the best approach. We highlight some landmark methodological developments that Statistical Methods Group members have contributed to in the field of meta-analysis. We discuss how the Group implements and disseminates statistical methods within The Cochrane Collaboration. Finally, we consider the importance of robust statistical methodology for Cochrane systematic reviews, note research gaps, and reflect on the challenges that the Statistical Methods Group faces in its future direction. PMID:24280020
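
    As an editorial illustration of the most basic synthesis method the Group standardised, here is a minimal inverse-variance fixed-effect meta-analysis; the effect sizes and standard errors are made up.

```python
import numpy as np

# Fixed-effect (inverse-variance) pooling of study-level estimates.
effects = np.array([0.30, 0.10, 0.25, 0.40])   # hypothetical study effects
se = np.array([0.10, 0.15, 0.12, 0.20])        # their standard errors

weights = 1.0 / se**2
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(f"pooled effect: {pooled:.3f} (95% CI {ci[0]:.3f} to {ci[1]:.3f})")
```

    Random-effects models add a between-study variance component to these weights; the fixed-effect form above is the limiting case of zero heterogeneity.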

  16. KECSA-Movable Type Implicit Solvation Model (KMTISM)

    PubMed Central

    2015-01-01

    Computation of the solvation free energy for chemical and biological processes has long been of significant interest. The key challenges to effective solvation modeling center on the choice of potential function and configurational sampling. Herein, an energy sampling approach termed the “Movable Type” (MT) method, and a statistical energy function for solvation modeling, “Knowledge-based and Empirical Combined Scoring Algorithm” (KECSA) are developed and utilized to create an implicit solvation model: KECSA-Movable Type Implicit Solvation Model (KMTISM) suitable for the study of chemical and biological systems. KMTISM is an implicit solvation model, but the MT method performs energy sampling at the atom pairwise level. For a specific molecular system, the MT method collects energies from prebuilt databases for the requisite atom pairs at all relevant distance ranges, which by its very construction encodes all possible molecular configurations simultaneously. Unlike traditional statistical energy functions, KECSA converts structural statistical information into categorized atom pairwise interaction energies as a function of the radial distance instead of a mean force energy function. Within the implicit solvent model approximation, aqueous solvation free energies are then obtained from the NVT ensemble partition function generated by the MT method. Validation is performed against several subsets selected from the Minnesota Solvation Database v2012. Results are compared with several solvation free energy calculation methods, including a one-to-one comparison against two commonly used classical implicit solvation models: MM-GBSA and MM-PBSA. Comparison against a quantum mechanics based polarizable continuum model is also discussed (Cramer and Truhlar’s Solvation Model 12). PMID:25691832

  17. Does money matter in inflation forecasting?

    NASA Astrophysics Data System (ADS)

    Binner, J. M.; Tino, P.; Tepper, J.; Anderson, R.; Jones, B.; Kendall, G.

    2010-11-01

    This paper provides the most fully comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely, recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best fitting US inflation forecasting models and are then compared to forecasts from a naïve random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists’ long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies.
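
    A batch kernel least-squares (kernel ridge) regressor is a close, non-recursive relative of the kernel recursive least squares predictors used in the study; the sketch below fits a synthetic nonlinear series, not the monetary aggregates themselves.

```python
import numpy as np

# Kernel ridge regression with an RBF kernel on a synthetic series.
rng = np.random.default_rng(4)

def rbf_kernel(a, b, length=0.5):
    return np.exp(-(a[:, None] - b[None, :])**2 / (2 * length**2))

x_train = np.linspace(0, 2 * np.pi, 60)
y_train = np.sin(x_train) + 0.05 * rng.normal(size=x_train.size)

lam = 1e-3                                     # ridge regularisation
K = rbf_kernel(x_train, x_train)
alpha = np.linalg.solve(K + lam * np.eye(x_train.size), y_train)

x_test = np.linspace(0, 2 * np.pi, 200)
y_pred = rbf_kernel(x_test, x_train) @ alpha
rmse = np.sqrt(np.mean((y_pred - np.sin(x_test))**2))
print(f"test RMSE against the noise-free signal: {rmse:.3f}")
```

    The recursive variant updates `alpha` one observation at a time under a fixed memory budget, which is what makes it attractive for streaming macroeconomic data.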

  18. Non-Born-Oppenheimer molecular dynamics of the spin-forbidden reaction O(3P) + CO(X 1Σ+) → CO2(tilde X{}^1Σ _g^ +)

    NASA Astrophysics Data System (ADS)

    Jasper, Ahren W.; Dawes, Richard

    2013-10-01

    The lowest-energy singlet (1 1A') and two lowest-energy triplet (1 3A' and 1 3A″) electronic states of CO2 are characterized using dynamically weighted multireference configuration interaction (dw-MRCI+Q) electronic structure theory calculations extrapolated to the complete basis set (CBS) limit. Global analytic representations of the dw-MRCI+Q/CBS singlet and triplet surfaces and of their CASSCF/aug-cc-pVQZ spin-orbit coupling surfaces are obtained via the interpolated moving least squares (IMLS) semiautomated surface fitting method. The spin-forbidden kinetics of the title reaction is calculated using the coupled IMLS surfaces and coherent switches with decay of mixing non-Born-Oppenheimer molecular dynamics. The calculated spin-forbidden association rate coefficient (corresponding to the high pressure limit of the rate coefficient) is 7-35 times larger at 1000-5000 K than the rate coefficient used in many detailed chemical models of combustion. A dynamical analysis of the multistate trajectories is presented. The trajectory calculations reveal direct (nonstatistical) and indirect (statistical) spin-forbidden reaction mechanisms and may be used to test the suitability of transition-state-theory-like statistical methods for spin-forbidden kinetics. Specifically, we consider the appropriateness of the "double passage" approximation, of assuming statistical distributions of seam crossings, and of applications of the unified statistical model for spin-forbidden reactions.

  19. Statistical analysis and handling of missing data in cluster randomized trials: a systematic review.

    PubMed

    Fiero, Mallorie H; Huang, Shuang; Oren, Eyal; Bell, Melanie L

    2016-02-09

    Cluster randomized trials (CRTs) randomize participants in groups rather than as individuals, and are key tools for assessing interventions in health research when treatment contamination is likely or individual randomization is not feasible. Two potential major pitfalls exist regarding CRTs, namely handling missing data and not accounting for clustering in the primary analysis. The aim of this review was to evaluate approaches for handling missing data and statistical analysis with respect to the primary outcome in CRTs. We systematically searched for CRTs published between August 2013 and July 2014 using PubMed, Web of Science, and PsycINFO. For each trial, two independent reviewers assessed the extent of the missing data and method(s) used for handling missing data in the primary and sensitivity analyses. We evaluated the primary analysis and determined whether it was at the cluster or individual level. Of the 86 included CRTs, 80 (93%) trials reported some missing outcome data. Of those reporting missing data, the median percent of individuals with a missing outcome was 19% (range 0.5 to 90%). The most common way to handle missing data in the primary analysis was complete case analysis (44, 55%), whereas 18 (22%) used mixed models, six (8%) used single imputation, four (5%) used unweighted generalized estimating equations, and two (2%) used multiple imputation. Fourteen (16%) trials reported a sensitivity analysis for missing data, but most assumed the same missing data mechanism as in the primary analysis. Overall, 67 (78%) trials accounted for clustering in the primary analysis. High rates of missing outcome data are present in the majority of CRTs, yet handling missing data in practice remains suboptimal. Researchers and applied statisticians should carry out appropriate missing data methods, which are valid under plausible assumptions in order to increase statistical power in trials and reduce the possibility of bias.
Sensitivity analysis should be performed, with weakened assumptions regarding the missing data mechanism to explore the robustness of results reported in the primary analysis.
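
    The cost of ignoring clustering can be summarised with the design effect, DEFF = 1 + (m - 1) × ICC, where m is the cluster size and ICC the intracluster correlation. A back-of-envelope sketch with illustrative numbers (not from the review):

```python
# Design effect and effective sample size for a hypothetical CRT.
n_individuals = 1200
cluster_size = 30
icc = 0.05

deff = 1 + (cluster_size - 1) * icc        # variance inflation from clustering
effective_n = n_individuals / deff
print(f"design effect: {deff:.2f}, effective sample size: {effective_n:.0f}")
```

    Even a modest ICC of 0.05 with 30 participants per cluster more than doubles the variance, shrinking 1200 randomised individuals to an effective sample of roughly 490.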

  20. Mechanical properties of experimental composites with different calcium phosphates fillers.

    PubMed

    Okulus, Zuzanna; Voelkel, Adam

    2017-09-01

    Calcium phosphates (CaPs)-containing composites have already shown good properties from the point of view of dental restorative materials. The purpose of this study was to examine the crucial mechanical properties of twelve hydroxyapatite- or tricalcium phosphate-filled composites. The raw and surface-treated forms of both CaP fillers were applied. As a reference materials two experimental glass-containing composites and one commercial dental restorative composite were applied. Nano-hardness, elastic modulus, compressive, flexural and diametral tensile strength of all studied materials were determined. Application of statistical methods (one-way analysis of variance and cluster agglomerative analysis) allowed for assessing the similarities between examined materials according to the values of studied parameters. The obtained results show that in almost all cases the mechanical properties of experimental CaPs-composites are comparable or even better than mechanical properties of examined reference materials. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Emergent kink statistics at finite temperature

    DOE PAGES

    Lopez-Ruiz, Miguel Angel; Yepez-Martinez, Tochtli; Szczepaniak, Adam; ...

    2017-07-25

    In this paper we use 1D quantum mechanical systems with Higgs-like interaction potential to study the emergence of topological objects at finite temperature. Two different model systems are studied, the standard double-well potential model and a newly introduced discrete kink model. Using Monte-Carlo simulations as well as analytic methods, we demonstrate how kinks become abundant at low temperatures. These results may offer useful insights into how topological phenomena may occur in QCD.
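
    A toy Metropolis simulation in the same spirit (our sketch, not the paper's discrete kink model): a 1D lattice field in a double-well potential, with kinks counted as sign changes of the field. In this classical-lattice toy, kinks proliferate as the temperature is raised.

```python
import numpy as np

# 1D lattice phi^4 field with Metropolis updates; kinks = sign changes.
rng = np.random.default_rng(5)

N, kappa, lam = 64, 2.0, 1.0

def energy_change(phi, i, new):
    left, right = phi[(i - 1) % N], phi[(i + 1) % N]
    def local(v):
        return 0.5 * kappa * ((v - left)**2 + (right - v)**2) + lam * (v**2 - 1)**2
    return local(new) - local(phi[i])

def mean_kinks(temperature, sweeps=4000):
    phi = np.ones(N)                     # start in one well
    counts = []
    for sweep in range(sweeps):
        for i in range(N):
            new = phi[i] + rng.normal(scale=0.5)
            delta = energy_change(phi, i, new)
            if delta <= 0 or rng.random() < np.exp(-delta / temperature):
                phi[i] = new
        if sweep >= sweeps // 2:         # discard burn-in
            counts.append(np.count_nonzero(np.diff(np.sign(phi))))
    return np.mean(counts)

low, high = mean_kinks(0.2), mean_kinks(2.0)
print(f"mean kinks at T=0.2: {low:.2f}, at T=2.0: {high:.2f}")
```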

  2. Finite Element Analysis of Reverberation Chambers

    NASA Technical Reports Server (NTRS)

    Bunting, Charles F.; Nguyen, Duc T.

    2000-01-01

    The primary motivating factor behind the initiation of this work was to provide a deterministic means of establishing the validity of the statistical methods that are recommended for the determination of fields that interact in an avionics system. The application of finite element analysis to reverberation chambers is the initial step required to establish a reasonable course of inquiry in this particularly data-intensive study. The use of computational electromagnetics provides a high degree of control of the "experimental" parameters that can be utilized in a simulation of reverberating structures. As the work evolved, there were four primary focus areas: 1. The eigenvalue problem for the source free problem. 2. The development of a complex efficient eigensolver. 3. The application of a source for the TE and TM fields for statistical characterization. 4. The examination of shielding effectiveness in a reverberating environment. One early purpose of this work was to establish the utility of finite element techniques in the development of an extended low frequency statistical model for reverberation phenomena. By employing finite element techniques, structures of arbitrary complexity can be analyzed due to the use of triangular shape functions in the spatial discretization. The effects of both frequency stirring and mechanical stirring are presented. It is suggested that for the low frequency operation the typical tuner size is inadequate to provide a sufficiently random field and that frequency stirring should be used. The results of the finite element analysis of the reverberation chamber illustrate the potential utility of a 2D representation for enhancing the basic statistical characteristics of the chamber when operating in a low frequency regime. The basic field statistics are verified for frequency stirring over a wide range of frequencies. Mechanical stirring is shown to provide an effective frequency deviation.

  3. Systematic review of the use of Statistical Process Control methods to measure the success of pressure ulcer prevention.

    PubMed

    Clark, Michael; Young, Trudie; Fallon, Maureen

    2018-06-01

    Successful prevention of pressure ulcers is the end product of a complex series of care processes including, but not limited to, the assessment of vulnerability to pressure damage; skin assessment and care; nutritional support; repositioning; and the use of beds, mattresses, and cushions to manage mechanical loads on the skin and soft tissues. The purpose of this review was to examine where and how Statistical Process Control (SPC) measures have been used to assess the success of quality improvement initiatives intended to improve pressure ulcer prevention. A search of 7 electronic bibliographic databases was performed on May 17th, 2017, for studies that met the inclusion criteria. SPC methods have been reported in 9 publications since 2010 to interpret changes in the incidence of pressure ulcers over time. While these methods offer more rapid interpretation of changes in incidence than is gained from a comparison of 2 arbitrarily selected time points pre- and post-implementation of change, more work is required to ensure that the clinical and scientific communities adopt the most appropriate SPC methods. © 2018 Medicalhelplines.com Inc and John Wiley & Sons Ltd.
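
    One common SPC tool for event rates of this kind is the u-chart. A minimal sketch with invented monthly counts and patient-day denominators (not data from the reviewed studies): a sustained drop below the lower control limit signals special-cause improvement rather than noise.

```python
import numpy as np

# u-chart: events per unit of exposure with 3-sigma Poisson-based limits.
events = np.array([40, 38, 42, 39, 41, 40, 38, 41, 39, 15, 12, 10])
patient_days = np.full(12, 3.0)       # thousand patient-days/month (constant)

center = events.sum() / patient_days.sum()   # ulcers per 1000 patient-days
ucl = center + 3 * np.sqrt(center / patient_days)
lcl = np.maximum(center - 3 * np.sqrt(center / patient_days), 0)

rates = events / patient_days
below = rates < lcl                   # candidate special-cause improvement
print(f"centre line: {center:.2f}, months signalling low: {np.flatnonzero(below)}")
```

    In this toy series the last three months fall below the lower limit, which is the kind of signal used to credit a prevention initiative sooner than a simple before/after comparison would.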

  4. First principles statistical mechanics of alloys and magnetism

    NASA Astrophysics Data System (ADS)

    Eisenbach, Markus; Khan, Suffian N.; Li, Ying Wai

    Modern high performance computing resources are enabling the exploration of the statistical physics of phase spaces with increasing size and higher fidelity of the Hamiltonian of the systems. For selected systems, this now allows the combination of Density Functional based first principles calculations with classical Monte Carlo methods for parameter free, predictive thermodynamics of materials. We combine our locally self-consistent real space multiple scattering method for solving the Kohn-Sham equation with Wang-Landau Monte-Carlo calculations (WL-LSMS). In the past we have applied this method to the calculation of Curie temperatures in magnetic materials. Here we will present direct calculations of the chemical order-disorder transitions in alloys. We present our calculated transition temperature for the chemical ordering in CuZn and the temperature dependence of the short-range order parameter and specific heat. Finally we will present the extension of the WL-LSMS method to magnetic alloys, thus allowing the investigation of the interplay of magnetism, structure and chemical order in ferrous alloys. This research was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Science and Engineering Division and it used Oak Ridge Leadership Computing Facility resources at Oak Ridge National Laboratory.
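
    The Wang-Landau idea behind WL-LSMS can be shown on a toy system (a 1D periodic Ising chain, not the first-principles alloy Hamiltonian): a random walk in energy space with on-the-fly refinement of the density of states g(E), which then yields thermodynamics at any temperature.

```python
import numpy as np

# Wang-Landau sampling of g(E) for a 1D periodic Ising chain of N spins.
# Energy is parametrised by the (even) number of unsatisfied bonds k;
# the exact density of states is g(k) = 2 * C(N, k).
rng = np.random.default_rng(6)

N = 8
def n_walls(s):
    return int(np.count_nonzero(s != np.roll(s, 1)))

levels = np.arange(0, N + 1, 2)          # wall counts 0, 2, ..., N
idx = {k: i for i, k in enumerate(levels)}
log_g = np.zeros(levels.size)
hist = np.zeros(levels.size)

s = rng.choice([-1, 1], size=N)
k = n_walls(s)
log_f = 1.0
while log_f > 1e-5:
    for _ in range(20_000):
        i = rng.integers(N)
        s[i] *= -1                       # propose a single spin flip
        k_new = n_walls(s)
        if rng.random() < np.exp(log_g[idx[k]] - log_g[idx[k_new]]):
            k = k_new
        else:
            s[i] *= -1                   # reject: restore the spin
        log_g[idx[k]] += log_f
        hist[idx[k]] += 1
    if hist.min() > 0.8 * hist.mean():   # histogram flatness check
        log_f /= 2.0                     # refine the modification factor
        hist[:] = 0.0

g = np.exp(log_g - log_g.max())
g *= 2**N / g.sum()                      # normalise: 2^N states in total
print("estimated g(k):", g.round(1))     # exact: [2, 56, 140, 56, 2]
```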

  5. Boosting Bayesian parameter inference of stochastic differential equation models with methods from statistical physics

    NASA Astrophysics Data System (ADS)

    Albert, Carlo; Ulzega, Simone; Stoop, Ruedi

    2016-04-01

    Measured time-series of both precipitation and runoff are known to exhibit highly non-trivial statistical properties. For making reliable probabilistic predictions in hydrology, it is therefore desirable to have stochastic models with output distributions that share these properties. When parameters of such models have to be inferred from data, we also need to quantify the associated parametric uncertainty. For non-trivial stochastic models, however, this latter step is typically very demanding, both conceptually and numerically, and almost never done in hydrology. Here, we demonstrate that methods developed in statistical physics make a large class of stochastic differential equation (SDE) models amenable to a full-fledged Bayesian parameter inference. For concreteness we demonstrate these methods by means of a simple yet non-trivial toy SDE model. We consider a natural catchment that can be described by a linear reservoir, at the scale of observation. All the neglected processes are assumed to happen at much shorter time-scales and are therefore modeled with a Gaussian white noise term, the standard deviation of which is assumed to scale linearly with the system state (water volume in the catchment). Even for constant input, the outputs of this simple non-linear SDE model show a wealth of desirable statistical properties, such as fat-tailed distributions and long-range correlations. Standard algorithms for Bayesian inference fail, for models of this kind, because their likelihood functions are extremely high-dimensional intractable integrals over all possible model realizations. The use of Kalman filters is illegitimate due to the non-linearity of the model. Particle filters could be used but become increasingly inefficient with growing number of data points.
Hamiltonian Monte Carlo algorithms allow us to translate this inference problem to the problem of simulating the dynamics of a statistical mechanics system and give us access to the most sophisticated methods that have been developed in the statistical physics community over the last few decades. We demonstrate that such methods, along with automated differentiation algorithms, allow us to perform a full-fledged Bayesian inference, for a large class of SDE models, in a highly efficient and largely automated manner. Furthermore, our algorithm is highly parallelizable. For our toy model, discretized with a few hundred points, a full Bayesian inference can be performed in a matter of seconds on a standard PC.
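
    The core Hamiltonian Monte Carlo machinery the abstract alludes to can be sketched minimally: leapfrog integration of Hamiltonian dynamics plus a Metropolis correction. The target here is a 1D standard-normal posterior with potential U(q) = q²/2, not the SDE model of the paper.

```python
import numpy as np

# Minimal HMC sampler for a 1D standard normal, U(q) = q^2/2, grad U = q.
rng = np.random.default_rng(7)

def leapfrog(q, p, step, n_steps):
    p = p - 0.5 * step * q            # initial half-step in momentum
    for _ in range(n_steps - 1):
        q = q + step * p
        p = p - step * q
    q = q + step * p
    p = p - 0.5 * step * q            # final half-step in momentum
    return q, p

samples = []
q = 0.0
for _ in range(5000):
    p0 = rng.normal()                 # resample momentum
    q_new, p_new = leapfrog(q, p0, step=0.2, n_steps=10)
    h_old = 0.5 * q**2 + 0.5 * p0**2          # H = U + kinetic energy
    h_new = 0.5 * q_new**2 + 0.5 * p_new**2
    if rng.random() < np.exp(h_old - h_new):  # Metropolis correction
        q = q_new
    samples.append(q)

samples = np.array(samples[500:])             # drop burn-in
print(f"mean: {samples.mean():.3f}, variance: {samples.var():.3f}")
```

    For a real model the gradient of U is supplied by automatic differentiation, which is exactly the combination the abstract describes.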

  6. [Application of finite element method in spinal biomechanics].

    PubMed

    Liu, Qiang; Zhang, Jun; Sun, Shu-Chun; Wang, Fei

    2017-02-25

    The finite element model is one of the most important methods in modern spinal biomechanics: it can simulate the various states of the spine, calculate the stress and strain distributions of different structures in each state, and explore the underlying mechanics, injury mechanisms, and treatment effectiveness. In the study of pathological states of the spine, the finite element method is mainly used to understand the mechanism at the lesion site, evaluate the effects of different therapeutic tools, and assist in the selection and improvement of those tools, in order to provide a theoretical basis for the rehabilitation of spinal lesions. The finite element method can further serve patients through spinal correction, surgical planning, and individualized implant design; in the design and performance evaluation of implants, attention must be paid to individual differences, and the evaluation system needs to be refined. At present, establishing a model closer to the real situation remains the focus and difficulty of finite element studies of the human body. Although the finite element method can simulate complex working conditions well, the realism of models and the sharing of model libraries need to be improved by combining many kinds of methods, such as imaging, statistics, kinematics and so on. Copyright© 2017 by the China Journal of Orthopaedics and Traumatology Press.

  7. Study of optimum methods of optical communication

    NASA Technical Reports Server (NTRS)

    Harger, R. O.

    1972-01-01

    Optimum methods of optical communication accounting for the effects of the turbulent atmosphere and quantum mechanics, both by the semi-classical method and the full-fledged quantum theoretical model are described. A concerted effort to apply the techniques of communication theory to the novel problems of optical communication by a careful study of realistic models and their statistical descriptions, the finding of appropriate optimum structures and the calculation of their performance and, insofar as possible, comparing them to conventional and other suboptimal systems are discussed. In this unified way the bounds on performance and the structure of optimum communication systems for transmission of information, imaging, tracking, and estimation can be determined for optical channels.

  8. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including the design of test structures and test masks to gather electrical or optical data; techniques for statistical decomposition and analysis of the data; and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
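The statistical decomposition phase described above can be sketched with a toy nested model; the layout (10 dies × 8 sites per die), the thickness values, and the noise levels are all hypothetical, and the abstract's actual decomposition techniques are more elaborate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ILD thickness data: 10 dies x 8 measurement sites per die (nm).
# True structure: wafer-level die-to-die offsets plus a repeating within-die pattern.
die_offset = rng.normal(0, 5, size=(10, 1))      # wafer-level (die-to-die) variation
pattern = np.array([0, 2, 4, 6, 6, 4, 2, 0.0])   # systematic within-die pattern
thickness = 800 + die_offset + pattern + rng.normal(0, 1, size=(10, 8))

# Decomposition: die means capture wafer-level variation; site means of the
# die-centered residuals capture the systematic within-die (pattern) component.
die_means = thickness.mean(axis=1, keepdims=True)
pattern_est = (thickness - die_means).mean(axis=0)
residual = thickness - die_means - pattern_est

print("estimated within-die pattern:", np.round(pattern_est, 1))
print("random residual std:", residual.std())
```

The recovered pattern is centered (its die-mean is removed), so it reports relative site-to-site systematic variation, which is the quantity dummy-fill rules target.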

  9. A review on lithium-ion battery ageing mechanisms and estimations for automotive applications

    NASA Astrophysics Data System (ADS)

    Barré, Anthony; Deguilhem, Benjamin; Grolleau, Sébastien; Gérard, Mathias; Suard, Frédéric; Riu, Delphine

    2013-11-01

    Lithium-ion batteries have become a focus of research interest thanks to their numerous benefits for vehicle applications. One main limitation of these technologies resides in battery ageing. The effects of ageing limit performance and occur throughout the battery's whole life, whether it is used or not, which is a major drawback in real usage. Furthermore, degradation takes place under every condition, but in different proportions, as usage and external conditions interact to provoke it. The ageing phenomena are highly complicated to characterize due to the cross-dependence of these factors. This paper reviews various aspects of recent research and developments, from different fields, on lithium-ion battery ageing mechanisms and estimations. It presents a summary of techniques, models and algorithms used for battery ageing estimation (SOH, RUL), ranging from detailed electrochemical approaches to statistical methods based on data. To convey the accuracy of currently used methods, their respective characteristics are discussed. Remaining challenges are detailed in depth, along with a discussion of an ideal method drawing on existing ones.

  10. Synthetic velocity gradient tensors and the identification of statistically significant aspects of the structure of turbulence

    NASA Astrophysics Data System (ADS)

    Keylock, Christopher J.

    2017-08-01

    A method is presented for deriving random velocity gradient tensors given a source tensor. These synthetic tensors are constrained to lie within mathematical bounds of the non-normality of the source tensor, but we do not impose direct constraints upon scalar quantities typically derived from the velocity gradient tensor and studied in fluid mechanics. Hence, it becomes possible to test hypotheses on data at a point regarding the statistical significance of these scalar quantities. Having presented our method and the associated mathematical concepts, we apply it to homogeneous, isotropic turbulence to test the utility of the approach for a case where the behavior of the tensor is understood well. We show that, as well as the concentration of data along the Vieillefosse tail, actual turbulence is also preferentially located in the quadrant where there is both excess enstrophy (Q > 0) and excess enstrophy production (R < 0). We also examine the topology implied by the strain eigenvalues and find that for the statistically significant results there is a particularly strong relative preference for the formation of disklike structures in the (Q < 0, R < 0) quadrant. With the method shown to be useful for a turbulence that is already understood well, it should be of even greater utility for studying complex flows seen in industry and the environment.
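The scalar quantities referred to above are the invariants Q and R of the velocity gradient tensor and the strain-rate eigenvalues. For a traceless (incompressible-flow) tensor they can be computed as follows; the tensor here is randomly generated for illustration, not turbulence data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical velocity gradient tensor A_ij = du_i/dx_j at one point,
# made traceless to respect incompressibility: tr(A) = 0.
A = rng.normal(size=(3, 3))
A -= np.trace(A) / 3 * np.eye(3)

# Standard invariants used in Q-R analysis:
# Q > 0 means rotation (enstrophy) exceeds strain at this point;
# the position in the (Q, R) plane classifies the local flow topology.
Q = -0.5 * np.trace(A @ A)
R = -np.trace(A @ A @ A) / 3.0

# Eigenvalues of the strain-rate tensor (symmetric part) give the
# local straining topology (e.g., disk-like vs. rod-like deformation).
S = 0.5 * (A + A.T)
strain_eigs = np.sort(np.linalg.eigvalsh(S))
print("Q =", Q, " R =", R, " strain eigenvalues:", strain_eigs)
```

For a traceless tensor, R equals minus the determinant of A, and the strain eigenvalues sum to zero, which provides a quick consistency check on the computation.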

  11. Occupational Dermatoses by Type of Work in Greece

    PubMed Central

    Zorba, Eleni; Karpouzis, Antony; Zorbas, Alexandros; Bazas, Theodore; Zorbas, Sam; Alexopoulos, Elias; Zorbas, Ilias; Kouskoukis, Konstantinos; Konstandinidis, Theodoros

    2013-01-01

    Background To elucidate the relationship between seven occupational dermatoses (ODs) and 20 types of work in Greece. Methods This was a prevalence epidemiologic study of certain ODs among 4,000 workers employed in 20 types of enterprise, in 104 companies, in 2006–2012, using data from company medical records, questionnaires, occupational medical examinations, and special examinations. The χ2 test was applied to reveal statistically significant relationships between types of enterprises and occurrence of ODs. Results A high percentage (39.9%) of employees included in the study population suffered from ODs. The highest prevalence rates were noted among hairdressers (of contact dermatitis: 30%), cooks (of contact dermatitis: 29.5%), bitumen workers (of acne: 23.5%), car industry workers (of mechanical injury: 15%), construction workers (of contact urticaria: 29.5%), industrial cleaning workers (of chemical burns: 13%), and farmers (of malignant tumors: 5.5%). We observed several statistically significant correlations between ODs (acute and chronic contact dermatitis, urticaria, mechanical injury, acne, burns, skin cancer) and certain types of enterprises. There was no statistically significant correlation between gender and prevalence of ODs, except for dermatoses caused by mechanical injuries, afflicting mainly men [χ2 (1) = 13.40, p < 0.001], and for chronic contact dermatitis, afflicting mainly women [χ2 (1) = 5.53, p = 0.019]. Conclusion Prevalence of ODs is high in Greece, contrary to all official reports by the Greek National Institute of Health. There is a need to introduce a nationwide voluntary surveillance system for reporting ODs and to enhance skin protection measures at work. PMID:24106644
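The χ2 tests of association reported above can be reproduced in outline with a contingency-table test; the counts below are purely illustrative stand-ins, not the study's data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative 2x2 table (hypothetical counts, not the study's data):
# rows = gender (men, women); columns = mechanical-injury dermatosis (yes, no).
table = np.array([[70, 1930],
                  [30, 1970]])

# chi2_contingency returns the statistic, p-value, degrees of freedom,
# and the expected counts under independence.
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")
```

A result of the form χ2(1) with a small p-value, as in the abstract, indicates that OD prevalence differs by gender more than chance alone would explain.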

  12. Renormalization Group Tutorial

    NASA Technical Reports Server (NTRS)

    Bell, Thomas L.

    2004-01-01

    Complex physical systems sometimes have statistical behavior characterized by power- law dependence on the parameters of the system and spatial variability with no particular characteristic scale as the parameters approach critical values. The renormalization group (RG) approach was developed in the fields of statistical mechanics and quantum field theory to derive quantitative predictions of such behavior in cases where conventional methods of analysis fail. Techniques based on these ideas have since been extended to treat problems in many different fields, and in particular, the behavior of turbulent fluids. This lecture will describe a relatively simple but nontrivial example of the RG approach applied to the diffusion of photons out of a stellar medium when the photons have wavelengths near that of an emission line of atoms in the medium.

  13. Dependence of prevalence of contiguous pathways in proteins on structural complexity.

    PubMed

    Thayer, Kelly M; Galganov, Jesse C; Stein, Avram J

    2017-01-01

    Allostery is a regulatory mechanism in proteins where an effector molecule binds distal from an active site to modulate its activity. Allosteric signaling may occur via a continuous path of residues linking the active and allosteric sites, which has been suggested by large conformational changes evident in crystal structures. An alternate possibility is that the signal occurs in the realm of ensemble dynamics via an energy landscape change. While the latter was first proposed on theoretical grounds, increasing evidence suggests that such a control mechanism is plausible. A major difficulty for testing the two methods is the ability to definitively determine that a residue is directly involved in allosteric signal transduction. Statistical Coupling Analysis (SCA) is a method that has been successful at predicting pathways, and experimental tests involving mutagenesis or domain substitution provide the best available evidence of signaling pathways. However, ascertaining energetic pathways which need not be contiguous is far more difficult. To date, simple estimates of the statistical significance of a pathway in a protein remain to be established. The focus of this work is to estimate such benchmarks for the statistical significance of contiguous pathways for the null model of selecting residues at random. We found that when 20% of residues in proteins are randomly selected, contiguous pathways at the 6 Å cutoff level were found with success rates of 51% in PDZ, 30% in p53, and 3% in MutS. The results suggest that the significance of pathways may have system specific factors involved. Furthermore, the possible existence of false positives for contiguous pathways implies that signaling could be occurring via alternate routes including those consistent with the energetic landscape model.
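A minimal sketch of the null model described above (random selection of 20% of residues, contiguity judged by a 6 Å cutoff) might look like the following; the coordinates are a random-coil stand-in rather than real PDB structures, and `has_contiguous_path` uses a simplified single-cluster notion of contiguity:

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(2)

# Hypothetical C-alpha coordinates for a 100-residue protein
# (a random-walk chain standing in for a real structure).
coords = np.cumsum(rng.normal(scale=2.2, size=(100, 3)), axis=0)

def has_contiguous_path(coords, selected, cutoff=6.0):
    """True if the selected residues form a single connected cluster under
    the distance cutoff (a simplified notion of a contiguous pathway)."""
    pts = coords[selected]
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    adj = d < cutoff
    seen, queue = {0}, deque([0])
    while queue:            # breadth-first search over the contact graph
        i = queue.popleft()
        for j in np.flatnonzero(adj[i]):
            if j not in seen:
                seen.add(j)
                queue.append(j)
    return len(seen) == len(pts)

# Null model: select 20% of residues at random, many times.
trials = [has_contiguous_path(coords, rng.choice(100, size=20, replace=False))
          for _ in range(200)]
print("fraction of random selections forming a contiguous path:",
      np.mean(trials))
```

The paper's success rates (51% in PDZ, 30% in p53, 3% in MutS) come from an analogous but more careful procedure on real structures; this sketch only illustrates the shape of the null-model computation.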

  14. Development and Validation of a Statistical Shape Modeling-Based Finite Element Model of the Cervical Spine Under Low-Level Multiple Direction Loading Conditions

    PubMed Central

    Bredbenner, Todd L.; Eliason, Travis D.; Francis, W. Loren; McFarland, John M.; Merkle, Andrew C.; Nicolella, Daniel P.

    2014-01-01

    Cervical spinal injuries are a significant concern in all trauma injuries. Recent military conflicts have demonstrated the substantial risk of spinal injury for the modern warfighter. Finite element models used to investigate injury mechanisms often fail to examine the effects of variation in geometry or material properties on mechanical behavior. The goals of this study were to model geometric variation for a set of cervical spines, to extend this model to a parametric finite element model, and, as a first step, to validate the parametric model against experimental data for low-loading conditions. Individual finite element models were created using cervical spine (C3–T1) computed tomography data for five male cadavers. Statistical shape modeling (SSM) was used to generate a parametric finite element model incorporating variability of spine geometry, and soft-tissue material property variation was also included. The probabilistic loading response of the parametric model was determined under flexion-extension, axial rotation, and lateral bending and validated by comparison to experimental data. Based on qualitative and quantitative comparison of the experimental loading response and model simulations, we suggest that the model performs adequately under relatively low-level loading conditions in multiple loading directions. In conclusion, SSM methods coupled with finite element analyses within a probabilistic framework, along with the ability to statistically validate the overall model performance, provide innovative and important steps toward describing the differences in vertebral morphology, spinal curvature, and variation in material properties. We suggest that these methods, with additional investigation and validation under injurious loading conditions, will lead to understanding and mitigating the risks of injury in the spine and other musculoskeletal structures. PMID:25506051

  15. Biophotons, coherence and photocount statistics: A critical review

    NASA Astrophysics Data System (ADS)

    Cifra, Michal; Brouder, Christian; Nerudová, Michaela; Kučera, Ondřej

    2015-08-01

    Biological samples continuously emit ultra-weak photon emission (UPE, or "biophotons"), which stems from electronic excited states generated chemically during oxidative metabolism and stress. Thus, UPE can potentially serve as a method for non-invasive diagnostics of oxidative processes or, if discovered, also of other processes capable of electron excitation. While the fundamental generating mechanisms of UPE are fairly well elucidated, together with their approximate ranges of intensities and spectra, the statistical properties of UPE are still a highly challenging topic. Here we review claims about nontrivial statistical properties of UPE, such as coherence and squeezed states of light. After an introduction to the necessary theory, we categorize the experimental works of all authors into those with solid, conventional interpretation and those with unconventional and even speculative interpretation. The conclusion of our review is twofold: while the phenomenon of UPE from biological systems can be considered experimentally well established, no reliable evidence for the coherence or nonclassicality of UPE has actually been achieved up to now. Furthermore, we propose perspective avenues in the research of statistical properties of biological UPE.

  16. Accounting for the Multiple Natures of Missing Values in Label-Free Quantitative Proteomics Data Sets to Compare Imputation Strategies.

    PubMed

    Lazar, Cosmin; Gatto, Laurent; Ferro, Myriam; Bruley, Christophe; Burger, Thomas

    2016-04-01

    Missing values are a genuine issue in label-free quantitative proteomics. Recent works have surveyed the different statistical methods to conduct imputation, have compared them on real or simulated data sets, and have recommended a list of missing value imputation methods for proteomics applications. Although insightful, these comparisons do not account for two important facts: (i) depending on the proteomics data set, the missingness mechanism may be of different natures and (ii) each imputation method is devoted to a specific type of missingness mechanism. As a result, we believe that the question at stake is not to find the most accurate imputation method in general but instead the most appropriate one. We describe a series of comparisons that support our views: for instance, we show that a supposedly "under-performing" method (i.e., giving baseline average results), if applied at the "appropriate" time in the data-processing pipeline (before or after peptide aggregation) on a data set with the "appropriate" nature of missing values, can outperform a blindly applied, supposedly "better-performing" method (i.e., the reference method from the state of the art). This leads us to formulate a few practical guidelines regarding the choice and the application of an imputation method in a proteomics context.
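The point that the best imputation method depends on the missingness mechanism can be illustrated with a small simulation; the intensity scale, missingness rates, and the two imputation rules (column-mean vs. global-minimum) are simplified stand-ins for the methods compared in the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical log-scale peptide intensities: 1000 peptides x 3 replicates.
true = rng.normal(loc=25, scale=2, size=(1000, 3))

# Two missingness mechanisms acting on the same data:
mcar = rng.random(true.shape) < 0.1       # missing completely at random
mnar = true < np.quantile(true, 0.1)      # left-censoring of low intensities

def impute_mean(x, mask):
    """Replace masked entries with their column (replicate) mean."""
    out = x.copy()
    col_mean = np.nanmean(np.where(mask, np.nan, x), axis=0)
    out[mask] = np.broadcast_to(col_mean, x.shape)[mask]
    return out

def impute_min(x, mask):
    """Replace masked entries with the global observed minimum."""
    out = x.copy()
    out[mask] = np.nanmin(np.where(mask, np.nan, x))
    return out

results = {}
for mech, mask in [("MCAR", mcar), ("MNAR", mnar)]:
    for name, imp in [("mean", impute_mean), ("min", impute_min)]:
        rmse = np.sqrt(np.mean((imp(true, mask)[mask] - true[mask]) ** 2))
        results[(mech, name)] = rmse
        print(f"{mech} + {name}-imputation RMSE: {rmse:.2f}")
```

In this toy setting, mean imputation wins under MCAR while minimum imputation wins under MNAR (left-censoring), mirroring the paper's point that the appropriate method depends on the nature of the missing values.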

  17. Physical concepts in the development of constitutive equations

    NASA Technical Reports Server (NTRS)

    Cassenti, B. N.

    1985-01-01

    Proposed viscoplastic material models include in their formulation observed material response but do not generally incorporate principles from thermodynamics, statistical mechanics, and quantum mechanics. Numerous hypotheses about material response have been made on the basis of first principles, and many of them have been tested experimentally. The proposed viscoplastic theories must therefore be checked against these hypotheses and their experimental basis. The physics of thermodynamics, statistical mechanics, and quantum mechanics, along with the effects of defects, is reviewed for its application to the development of constitutive laws.

  18. Methods of comparing associative models and an application to retrospective revaluation.

    PubMed

    Witnauer, James E; Hutchings, Ryan; Miller, Ralph R

    2017-11-01

    Contemporary theories of associative learning are increasingly complex, which necessitates the use of computational methods to reveal predictions of these models. We argue that comparisons across multiple models in terms of goodness of fit to empirical data from experiments often reveal more about the actual mechanisms of learning and behavior than do simulations of only a single model. Such comparisons are best made when the values of free parameters are discovered through some optimization procedure based on the specific data being fit (e.g., hill climbing), so that the comparisons hinge on the psychological mechanisms assumed by each model rather than being biased by using parameters that differ in quality across models with respect to the data being fit. Statistics like the Bayesian information criterion facilitate comparisons among models that have different numbers of free parameters. These issues are examined using retrospective revaluation data. Copyright © 2017 Elsevier B.V. All rights reserved.
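A hedged sketch of the BIC-based comparison described above; the data, the two candidate models, and the parameter counts are all invented for illustration, with a crude grid search standing in for the hill-climbing optimization mentioned in the abstract:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical learning-curve data: response strength over 40 trials,
# generated from a saturating process v = t / (t + 5) plus noise.
trials = np.arange(1, 41, dtype=float)
data = trials / (trials + 5.0) + rng.normal(0, 0.05, size=trials.size)

def bic(rss, n, k):
    """BIC for a Gaussian-error model with k free parameters on n points
    (lower is better); constants shared by all models are dropped."""
    return n * np.log(rss / n) + k * np.log(n)

# Model A (1 free parameter): v = t / (t + c), c fit by grid search.
cs = np.linspace(0.1, 20.0, 400)
rss_a = min(np.sum((data - trials / (trials + c)) ** 2) for c in cs)

# Model B (4 free parameters): cubic polynomial fit by least squares.
rss_b = np.sum((data - np.polyval(np.polyfit(trials, data, 3), trials)) ** 2)

bic_a = bic(rss_a, trials.size, 1)
bic_b = bic(rss_b, trials.size, 4)
print(f"BIC A (saturating, k=1): {bic_a:.1f}")
print(f"BIC B (cubic, k=4):      {bic_b:.1f}")
```

Because BIC penalizes the cubic's three extra free parameters by 3 ln(n), the simpler model that matches the generative process is favored even if the flexible model fits the noise slightly better.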

  19. The better way to determine the validity, reliability, objectivity and accuracy of measuring devices.

    PubMed

    Pazira, Parvin; Rostami Haji-Abadi, Mahdi; Zolaktaf, Vahid; Sabahi, Mohammadfarzan; Pazira, Toomaj

    2016-06-08

    In statistical analyses, studies that determine the validity, reliability, objectivity, and precision of new measuring devices are often incomplete, due in part to using only the correlation coefficient and ignoring the dispersion of the data. The aim of this study was to demonstrate the best way to determine the validity, reliability, objectivity, and accuracy of an electro-inclinometer or other measuring devices, and to answer whether reliability and objectivity by themselves represent the accuracy of a measuring device. The validity of an electro-inclinometer was examined by mechanical and geometric methods. The objectivity and reliability of the device were assessed by calculating Cronbach's alpha for repeated measurements by three raters and for measurements on the same person with a mechanical goniometer and the electro-inclinometer. Measurements were performed on "hip flexion with the extended knee" and "shoulder abduction with the extended elbow"; the raters measured every angle three times within an interval of two hours. Three-way ANOVA was used to determine accuracy. The mechanical and geometric analysis showed that the validity of the electro-inclinometer was 1.00, with an error of less than one degree. Objectivity and reliability of the electro-inclinometer were 0.999, while the objectivity of the mechanical goniometer ranged from 0.802 to 0.966 and its reliability from 0.760 to 0.961. For hip flexion, the differences between raters in joint-angle measurement by electro-inclinometer and mechanical goniometer were 1.74 and 16.33 degrees (P<0.05), respectively; for shoulder abduction, the differences were 0.35 and 4.40 degrees (P<0.05). Although both the objectivity and the reliability were acceptable, measurement error was very high for the mechanical goniometer. Therefore, objectivity and reliability alone cannot determine the accuracy of a device, and it is preferable to use additional statistical methods to compare and evaluate the accuracy of the two devices.

  20. Maximum entropy models of ecosystem functioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertram, Jason, E-mail: jason.bertram@anu.edu.au

    2014-12-05

    Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense, using a savanna plant ecology model as an example.
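The MaxEnt algorithm mentioned above can be illustrated with a minimal discrete example: maximizing entropy subject to a single mean constraint yields a Boltzmann-form distribution whose Lagrange multiplier can be found by bisection. The state space and target mean here are illustrative, not taken from the cited work:

```python
import numpy as np

# MaxEnt sketch: among distributions over discrete states x = 1..10,
# find the one with maximum entropy subject to a fixed mean.
# The constrained solution has the Boltzmann form p_i ∝ exp(-lam * x_i).
x = np.arange(1, 11, dtype=float)
target_mean = 3.0

def mean_for(lam):
    w = np.exp(-lam * x)
    return np.sum(x * w) / np.sum(w)

# mean_for is monotonically decreasing in lam, so bisect for the
# multiplier that enforces the mean constraint.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) > target_mean:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)
p = np.exp(-lam * x)
p /= p.sum()
print("lambda:", lam)
print("MaxEnt distribution:", np.round(p, 3))
print("achieved mean:", np.sum(x * p))
```

Because the target mean (3.0) lies below the uniform-distribution mean (5.5), the multiplier comes out positive and the resulting distribution decays with x, exactly the exponential shape familiar from statistical mechanics.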

  1. Mechanical characterization of structurally porous biomaterials built via additive manufacturing: experiments, predictive models, and design maps for load-bearing bone replacement implants.

    PubMed

    Melancon, D; Bagheri, Z S; Johnston, R B; Liu, L; Tanzer, M; Pasini, D

    2017-11-01

    Porous biomaterials can be additively manufactured with micro-architecture tailored to satisfy the stringent mechano-biological requirements imposed by bone replacement implants. In a previous investigation, we introduced structurally porous biomaterials, featuring five times the strength of commercially available porous materials, and confirmed their bone ingrowth capability in an in vivo canine model. While encouraging, the manufactured biomaterials showed geometric mismatches between their internal porous architecture and that of their as-designed counterparts, as well as discrepancies between predicted and tested mechanical properties, issues that were not fully elucidated. In this work, we propose a systematic approach integrating computed tomography, mechanical testing, and statistical analysis of geometric imperfections to generate statistics-based numerical models of high-strength additively manufactured porous biomaterials. The method is used to develop morphology and mechanical maps that illustrate the role played by pore size, porosity, strut thickness, and topology in the relations governing their elastic modulus and compressive yield strength. Overall, there are mismatches between the mechanical properties of ideal-geometry models and as-manufactured porous biomaterials, with average errors of 49% and 41%, respectively, for compressive elastic modulus and yield strength. The proposed methodology gives more accurate predictions for the compressive stiffness and compressive strength properties, reducing the average error to 11% and 7.6%, respectively. The implications of the results and the methodology here introduced are discussed in the relevant biomechanical and clinical context, with insight that highlights promises and limitations of additively manufactured porous biomaterials for load-bearing bone replacement implants. 
In this work, we perform mechanical characterization of load-bearing porous biomaterials for bone replacement over their entire design space. Results capture the shift in geometry and mechanical properties between as-designed and as-manufactured biomaterials induced by additive manufacturing. Characterization of this shift is crucial to ensure appropriate manufacturing of bone replacement implants that enable biological fixation through bone ingrowth as well as mechanical property harmonization with the native bone tissue. In addition, we propose a method to include manufacturing imperfections in the numerical models that can reduce the discrepancy between predicted and tested properties. The results give insight into the use of structurally porous biomaterials for the design and additive fabrication of load-bearing implants for bone replacement. Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  2. The effects of modeling instruction on high school physics academic achievement

    NASA Astrophysics Data System (ADS)

    Wright, Tiffanie L.

    The purpose of this study was to explore whether Modeling Instruction, compared to traditional lecturing, is an effective instructional method to promote academic achievement in selected high school physics classes at a rural middle Tennessee high school. This study used an ex post facto, quasi-experimental research methodology. The independent variables in this study were the instructional methods of teaching. The treatment variable was Modeling Instruction and the control variable was traditional lecture instruction. The Treatment Group consisted of participants in Physical World Concepts who received Modeling Instruction. The Control Group consisted of participants in Physical Science who received traditional lecture instruction. The dependent variable was gains scores on the Force Concepts Inventory (FCI). The participants for this study were 133 students each in the Treatment and Control Groups (n = 266), who attended a public high school in rural middle Tennessee. The participants were administered the Force Concepts Inventory (FCI) prior to being taught the mechanics of physics. The FCI data were entered into the computer-based Statistical Package for the Social Sciences (SPSS). Two independent-samples t-tests were conducted to answer the research questions. There was a statistically significant difference between the treatment and control groups concerning the instructional method. Modeling Instruction methods were found to be effective in increasing the academic achievement of students in high school physics. There was no statistically significant difference between FCI gains scores for gender; gender was found to have no effect on the academic achievement of students in high school physics classes. However, even though the difference was not statistically significant, female students' gains scores were higher than male students' gains scores when Modeling Instruction methods of teaching were used. 
Based on these findings, it is recommended that high school science teachers should use Modeling Instructional methods of teaching daily in their classrooms. A recommendation for further research is to expand the Modeling Instructional methods of teaching into different content areas, (i.e., reading and language arts) to explore academic achievement gains.

  3. Statistical Mechanics of Disordered Systems - Series: Cambridge Series in Statistical and Probabilistic Mathematics (No. 18)

    NASA Astrophysics Data System (ADS)

    Bovier, Anton

    2006-06-01

    Our mathematical understanding of the statistical mechanics of disordered systems is going through a period of stunning progress. This self-contained book is a graduate-level introduction for mathematicians and for physicists interested in the mathematical foundations of the field, and can be used as a textbook for a two-semester course on mathematical statistical mechanics. It assumes only basic knowledge of classical physics and, on the mathematics side, a good working knowledge of graduate-level probability theory. The book starts with a concise introduction to statistical mechanics, proceeds to disordered lattice spin systems, and concludes with a presentation of the latest developments in the mathematical understanding of mean-field spin glass models. In particular, recent progress towards a rigorous understanding of the replica symmetry-breaking solutions of the Sherrington-Kirkpatrick spin glass models, due to Guerra, Aizenman-Sims-Starr and Talagrand, is reviewed in some detail. It is a comprehensive introduction to an active and fascinating area of research, with a clear exposition that builds to the state of the art in the mathematics of spin glasses, written by a well-known and active researcher in the field.

  4. Epidemics in Ming and Qing China: Impacts of changes of climate and economic well-being.

    PubMed

    Pei, Qing; Zhang, David D; Li, Guodong; Winterhalder, Bruce; Lee, Harry F

    2015-07-01

    We investigated the mechanism of epidemics under the impacts of climate change and socio-economic fluctuations in the Ming and Qing Dynasties in China (AD 1368-1901). Using long-term and high-quality datasets, this study is the first quantitative research to verify the 'climate change → economy → epidemics' mechanism in historical China by statistical methods, including correlation analysis, Granger causality analysis, ARX, and Poisson-ARX modeling. The analysis provides evidence that climate change fundamentally drove the spread and occurrence of epidemics, while depressed economic well-being was their direct trigger at the national and long-term scale in historical China. Moreover, statistical modeling shows that economic well-being is more important than population pressure in the mechanism of epidemics. However, population pressure remains a key element in determining the social vulnerability to epidemic occurrence under climate change. Notably, the findings not only support adaptation theories but also enhance our confidence in addressing climatic shocks if economic buffering capacity can be promoted steadily. The findings can serve as a basis for scientists and policymakers in addressing global and regional environmental changes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Competitive Processes in Cross-Situational Word Learning

    PubMed Central

    Yurovsky, Daniel; Yu, Chen; Smith, Linda B.

    2013-01-01

    Cross-situational word learning, like any statistical learning problem, involves tracking the regularities in the environment. But the information that learners pick up from these regularities is dependent on their learning mechanism. This paper investigates the role of one type of mechanism in statistical word learning: competition. Competitive mechanisms would allow learners to find the signal in noisy input, and would help to explain the speed with which learners succeed in statistical learning tasks. Because cross-situational word learning provides information at multiple scales – both within and across trials/situations – learners could implement competition at either or both of these scales. A series of four experiments demonstrate that cross-situational learning involves competition at both levels of scale, and that these mechanisms interact to support rapid learning. The impact of both of these mechanisms is then considered from the perspective of a process-level understanding of cross-situational learning. PMID:23607610

  6. Competitive processes in cross-situational word learning.

    PubMed

    Yurovsky, Daniel; Yu, Chen; Smith, Linda B

    2013-07-01

    Cross-situational word learning, like any statistical learning problem, involves tracking the regularities in the environment. However, the information that learners pick up from these regularities is dependent on their learning mechanism. This article investigates the role of one type of mechanism in statistical word learning: competition. Competitive mechanisms would allow learners to find the signal in noisy input and would help to explain the speed with which learners succeed in statistical learning tasks. Because cross-situational word learning provides information at multiple scales-both within and across trials/situations-learners could implement competition at either or both of these scales. A series of four experiments demonstrate that cross-situational learning involves competition at both levels of scale, and that these mechanisms interact to support rapid learning. The impact of both of these mechanisms is considered from the perspective of a process-level understanding of cross-situational learning. Copyright © 2013 Cognitive Science Society, Inc.

  7. A Conditional Curie-Weiss Model for Stylized Multi-group Binary Choice with Social Interaction

    NASA Astrophysics Data System (ADS)

    Opoku, Alex Akwasi; Edusei, Kwame Owusu; Ansah, Richard Kwame

    2018-04-01

    This paper proposes a conditional Curie-Weiss model as a model for decision making in a stylized society made up of binary decision makers that face a particular dichotomous choice between two options. Following Brock and Durlauf (Discrete choice with social interaction I: theory, 1955), we set up both socio-economic and statistical mechanical models for the choice problem. We point out when both the socio-economic and statistical mechanical models give rise to the same self-consistent equilibrium mean choice level(s). The phase diagram of the associated statistical mechanical model and its socio-economic implications are discussed.
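
The self-consistent equilibrium mean choice level mentioned above can be found by fixed-point iteration of the standard Curie-Weiss equation m = tanh(beta_J*m + h). This is a generic sketch of that equation, not the paper's conditional model; the parameter values are hypothetical.

```python
import math

def mean_choice(beta_J, h=0.0, m0=0.5, tol=1e-10, max_iter=10000):
    """Iterate the Curie-Weiss self-consistency equation m = tanh(beta_J*m + h)."""
    m = m0
    for _ in range(max_iter):
        m_new = math.tanh(beta_J * m + h)
        if abs(m_new - m) < tol:
            return m_new
        m = m_new
    return m

# Below the critical point (beta_J < 1) the only equilibrium is m = 0;
# above it (beta_J > 1) a nonzero consensus choice level appears.
```

The bifurcation at beta_J = 1 is the statistical-mechanical phase transition whose socio-economic reading is a sudden emergence of consensus.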

  8. Ballistic and diffusive dynamics in a two-dimensional ideal gas of macroscopic chaotic Faraday waves.

    PubMed

    Welch, Kyle J; Hastings-Hauss, Isaac; Parthasarathy, Raghuveer; Corwin, Eric I

    2014-04-01

    We have constructed a macroscopic driven system of chaotic Faraday waves whose statistical mechanics, we find, are surprisingly simple, mimicking those of a thermal gas. We use real-time tracking of a single floating probe, energy equipartition, and the Stokes-Einstein relation to define and measure a pseudotemperature and diffusion constant and then self-consistently determine a coefficient of viscous friction for a test particle in this pseudothermal gas. Because of its simplicity, this system can serve as a model for direct experimental investigation of nonequilibrium statistical mechanics, much as the ideal gas epitomizes equilibrium statistical mechanics.
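
The probe-tracking pipeline described above (track a particle, extract a diffusion constant, then a pseudotemperature via Stokes-Einstein) can be sketched on a synthetic track. This is a minimal illustration with hypothetical parameters, not the authors' experimental analysis: a 2D Brownian walk is generated and the diffusion constant is recovered from the mean-squared displacement, MSD(tau) = 4*D*tau in two dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)
D_true, dt, n = 0.5, 0.05, 20000   # hypothetical diffusion constant, frame interval, frames

# Synthetic 2D Brownian track: increments have variance 2*D*dt per coordinate.
steps = rng.normal(scale=np.sqrt(2 * D_true * dt), size=(n, 2))
track = np.cumsum(steps, axis=0)

# Mean-squared displacement at small lags; in 2D, MSD(tau) = 4*D*tau.
lags = np.arange(1, 11)
msd = np.array([np.mean(np.sum((track[k:] - track[:-k]) ** 2, axis=1)) for k in lags])
D_est = np.mean(msd / (4 * lags * dt))
```

Given D and an equipartition-based pseudotemperature, a Stokes-Einstein relation would then yield the friction coefficient, as in the abstract.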

  9. Joint Inversion of Source Location and Source Mechanism of Induced Microseismics

    NASA Astrophysics Data System (ADS)

    Liang, C.

    2014-12-01

    Seismic source mechanism is a useful property for indicating source physics and the stress and strain distribution at regional, local and micro scales. In this study we jointly invert source mechanisms and locations for microseismic events induced by fluid-fracturing treatments in the oil and gas industry. For events large enough to show clear waveforms, quite a few techniques can be applied to invert the source mechanism, including waveform inversion, first-polarity inversion and many variants of these methods. However, for events too small to identify in seismic traces, such as the microseismicity induced by fluid fracturing, a source scanning algorithm (SSA) with waveform stacking is usually applied. A joint inversion of location and source mechanism is also possible, but at the cost of a high computational budget; the resulting algorithm is called the Source Location and Mechanism Scanning Algorithm (SLMSA). For a given velocity structure, all possible combinations of source location (X, Y, Z) and source mechanism (strike, dip, rake) are used to compute travel times and waveform polarities. After correcting normal-moveout times and polarities and stacking all waveforms, the (X, Y, Z, strike, dip, rake) combination that gives the strongest stacked waveform is identified as the solution. To handle the high computational cost, CPU-GPU programming is applied. Numerical datasets are used to test the algorithm. The SLMSA has also been applied to a fluid-fracturing dataset and reveals several advantages over the location-only method: (1) for shear sources, the location-only program can hardly locate events because positive and negative polarized traces cancel out, but the SLMSA can successfully pick up those events; (2) microseismic locations alone may not be enough to indicate the directionality of micro-fractures, whereas the statistics of source mechanisms provide more knowledge of fracture orientation; (3) in our practice, the joint inversion method almost always yields more events than the location-only method, and for events that are also picked by the SSA, the stacking power of the SLMSA is consistently higher than that obtained with the SSA.
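
The scan-and-stack idea, and in particular why polarity-aware stacking recovers shear events that location-only stacking cancels, can be sketched in one dimension. Everything here is a hypothetical toy (straight-line receivers, constant velocity, a sign flip standing in for shear radiation), not the SLMSA implementation.

```python
import numpy as np

def shear_traces(xr, xs, v=2.0, n=400, dt=0.01, width=0.03):
    """Synthetic traces: a pulse at the travel time, with opposite polarity
    on either side of the source, mimicking shear-source radiation."""
    t = np.arange(n) * dt
    out = []
    for x in xr:
        pol = 1.0 if x >= xs else -1.0
        out.append(pol * np.exp(-((t - abs(x - xs) / v) / width) ** 2))
    return t, np.array(out)

def stack_power(traces, t, xr, xs, v=2.0, correct_polarity=True):
    """Moveout-correct (and optionally polarity-correct) all traces, stack them,
    and return the peak amplitude of the stack."""
    dt = t[1] - t[0]
    stack = np.zeros_like(t)
    for x, tr in zip(xr, traces):
        shift = int(round(abs(x - xs) / v / dt))
        aligned = np.roll(tr, -shift)
        if correct_polarity:
            aligned = aligned * (1.0 if x >= xs else -1.0)
        stack += aligned
    return np.max(np.abs(stack))

xr = np.linspace(-1.0, 1.0, 8)           # receiver line
t, traces = shear_traces(xr, xs=0.37)    # "true" source at x = 0.37

# Brute-force scan over candidate source positions, keeping the strongest stack.
grid = np.linspace(-1.0, 1.0, 201)
best = max(grid, key=lambda xs: stack_power(traces, t, xr, xs))
```

Without the polarity correction, the positive and negative lobes cancel in the stack even at the true location, which is exactly advantage (1) claimed in the abstract.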

  10. Data Science in the Research Domain Criteria Era: Relevance of Machine Learning to the Study of Stress Pathology, Recovery, and Resilience

    PubMed Central

    Galatzer-Levy, Isaac R.; Ruggles, Kelly; Chen, Zhe

    2017-01-01

    Diverse environmental and biological systems interact to influence individual differences in response to environmental stress. Understanding the nature of these complex relationships can enhance the development of methods to: (1) identify risk, (2) classify individuals as healthy or ill, (3) understand mechanisms of change, and (4) develop effective treatments. The Research Domain Criteria (RDoC) initiative provides a theoretical framework to understand health and illness as the product of multiple inter-related systems but does not provide a framework to characterize or statistically evaluate such complex relationships. Characterizing and statistically evaluating models that integrate multiple levels (e.g. synapses, genes, environmental factors) as they relate to outcomes that are free from prior diagnostic benchmarks represents a challenge requiring new computational tools capable of capturing complex relationships and identifying clinically relevant populations. In the current review, we summarize machine learning methods that can achieve these goals. PMID:29527592

  11. Quantum Biometrics with Retinal Photon Counting

    NASA Astrophysics Data System (ADS)

    Loulakis, M.; Blatsios, G.; Vrettou, C. S.; Kominis, I. K.

    2017-10-01

    It is known that the eye's scotopic photodetectors, rhodopsin molecules, and their associated phototransduction mechanism leading to light perception, are efficient single-photon counters. We here use the photon-counting principles of human rod vision to propose a secure quantum biometric identification based on the quantum-statistical properties of retinal photon detection. The photon path along the human eye until its detection by rod cells is modeled as a filter having a specific transmission coefficient. Precisely determining its value from the photodetection statistics registered by the conscious observer is a quantum parameter estimation problem that leads to a quantum secure identification method. The probabilities for false-positive and false-negative identification of this biometric technique can readily approach 10⁻¹⁰ and 10⁻⁴, respectively. The security of the biometric method can be further quantified by the physics of quantum measurements. An impostor must be able to perform quantum thermometry and quantum magnetometry with energy resolution better than 10⁻⁹ℏ, in order to foil the device by noninvasively monitoring the biometric activity of a user.
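
The core inference task (estimate a transmission coefficient from binary "seen/not seen" responses) can be caricatured classically. This is a deliberately crude Bernoulli stand-in with hypothetical numbers, not the paper's quantum estimator: each stimulus is assumed seen with probability alpha, and alpha is estimated by maximum likelihood.

```python
import numpy as np

rng = np.random.default_rng(42)
alpha_true = 0.3          # hypothetical ocular transmission coefficient
n_stimuli = 10000         # hypothetical number of dim-light stimuli delivered

# Toy model: each stimulus is perceived with probability alpha.
seen = rng.random(n_stimuli) < alpha_true

# The maximum-likelihood estimate of a Bernoulli parameter is the sample mean,
# with standard error sqrt(alpha*(1-alpha)/n).
alpha_hat = seen.mean()
stderr = np.sqrt(alpha_hat * (1 - alpha_hat) / n_stimuli)
```

The shrinking standard error with n is what lets the false-positive and false-negative rates of an identification test based on alpha be driven down, in the spirit of the figures quoted in the abstract.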

  12. Preanalytical errors in medical laboratories: a review of the available methodologies of data collection and analysis.

    PubMed

    West, Jamie; Atherton, Jennifer; Costelloe, Seán J; Pourmahram, Ghazaleh; Stretton, Adam; Cornes, Michael

    2017-01-01

    Preanalytical errors have previously been shown to contribute a significant proportion of errors in laboratory processes and contribute to a number of patient safety risks. Accreditation against ISO 15189:2012 requires that laboratory Quality Management Systems consider the impact of preanalytical processes in areas such as the identification and control of non-conformances, continual improvement, internal audit and quality indicators. Previous studies have shown that there is a wide variation in the definition, repertoire and collection methods for preanalytical quality indicators. The International Federation of Clinical Chemistry Working Group on Laboratory Errors and Patient Safety has defined a number of quality indicators for the preanalytical stage, and the adoption of harmonized definitions will support interlaboratory comparisons and continual improvement. There are a variety of data collection methods, including audit, manual recording processes, incident reporting mechanisms and laboratory information systems. Quality management processes such as benchmarking, statistical process control, Pareto analysis and failure mode and effect analysis can be used to review data and should be incorporated into clinical governance mechanisms. In this paper, the Association for Clinical Biochemistry and Laboratory Medicine PreAnalytical Specialist Interest Group reviews the various data collection methods available. Our recommendation is to use the laboratory information management system as the recording mechanism for preanalytical errors, as this provides the easiest and most standardized means of data capture.
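
The Pareto analysis mentioned above amounts to ranking error categories by count and accumulating their share of the total. A minimal sketch follows; the category names and counts are hypothetical, not data from the review.

```python
# Hypothetical monthly preanalytical error counts by category.
errors = {
    "haemolysed sample": 120,
    "insufficient volume": 70,
    "mislabelled tube": 40,
    "wrong container": 15,
    "clotted sample": 10,
    "missing request": 5,
}

# Pareto analysis: rank categories by count and accumulate their share of the
# total, to identify the "vital few" causes worth targeting first.
total = sum(errors.values())
cumulative, running = [], 0
for name, count in sorted(errors.items(), key=lambda kv: -kv[1]):
    running += count
    cumulative.append((name, count, round(100 * running / total, 1)))
```

Here the top two categories already account for roughly three quarters of all errors, which is the kind of finding that would steer a continual-improvement programme.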

  13. Characterizing the lung tissue mechanical properties using a micromechanical model of alveolar sac

    NASA Astrophysics Data System (ADS)

    Karami, Elham; Seify, Behzad; Moghadas, Hadi; Sabsalinejad, Masoomeh; Lee, Ting-Yim; Samani, Abbas

    2017-03-01

    According to statistics, lung disease is among the leading causes of death worldwide. As such, many research groups are developing powerful tools for understanding, diagnosis and treatment of various lung diseases. Recently, biomechanical modeling has emerged as an effective tool for better understanding of human physiology, disease diagnosis and computer-assisted medical intervention. Mechanical properties of lung tissue are important requirements for methods developed for lung disease diagnosis and medical intervention. As such, the main objective of this study is to develop an effective tool for estimating the mechanical properties of normal and pathological lung parenchyma tissue based on its microstructure. For this purpose, a micromechanical model of the lung tissue was developed using the finite element (FE) method, and the model was demonstrated to have application in estimating the mechanical properties of the lung alveolar wall. The proposed model was developed by assembling truncated-octahedron tissue units resembling the alveoli. A compression test was simulated on the created geometry using the FE method, and the hyper-elastic parameters of the alveolar wall were calculated using reported alveolar wall stress-strain data and an inverse optimization framework. Preliminary results indicate that the proposed model can potentially be used to reconstruct microstructural images of lung tissue from macro-scale tissue response for normal and different pathological conditions. Such images can be used for effective diagnosis of lung diseases such as Chronic Obstructive Pulmonary Disease (COPD).
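
The inverse-optimization step (fit hyperelastic parameters so that the model reproduces measured stress-strain data) can be sketched with a stand-in material law. This toy uses an incompressible neo-Hookean uniaxial response with a hypothetical modulus and synthetic noisy data; it is not the paper's FE model or material model.

```python
import numpy as np

rng = np.random.default_rng(3)

def neo_hookean(lam, mu):
    """Incompressible neo-Hookean uniaxial nominal stress: P = mu*(lam - lam^-2).
    A stand-in for the alveolar-wall hyperelastic model; mu is the shear modulus."""
    return mu * (lam - lam ** -2)

# Synthetic "measured" stress-stretch data with noise (mu_true is hypothetical, in kPa).
mu_true = 2.5
lam = np.linspace(1.05, 1.5, 20)
P_meas = neo_hookean(lam, mu_true) + rng.normal(scale=0.05, size=lam.size)

# Inverse problem: least-squares fit of mu. Since the model is linear in mu,
# the optimum is available in closed form: mu_hat = sum(P*f)/sum(f*f), f = lam - lam^-2.
f = lam - lam ** -2
mu_hat = (P_meas @ f) / (f @ f)
```

With a multi-parameter hyperelastic law the same residual would be minimized iteratively (e.g. with a nonlinear least-squares solver) rather than in closed form.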

  14. Strength/Brittleness Classification of Igneous Intact Rocks Based on Basic Physical and Dynamic Properties

    NASA Astrophysics Data System (ADS)

    Aligholi, Saeed; Lashkaripour, Gholam Reza; Ghafoori, Mohammad

    2017-01-01

    This paper sheds further light on the fundamental relationships between simple methods, rock strength, and brittleness of igneous rocks. In particular, the relationship between mechanical (point load strength index I_s(50) and brittleness value S_20), basic physical (dry density and porosity), and dynamic properties (P-wave velocity and Schmidt rebound values) for a wide range of Iranian igneous rocks is investigated. First, 30 statistical models (including simple and multiple linear regression analyses) were built to identify the relationships between mechanical properties and simple methods. The results imply that rocks with different Schmidt hardness (SH) rebound values have different physicomechanical properties or relations. Second, using these results, it was proved that dry density, P-wave velocity, and SH rebound value provide a fine complement to mechanical properties classification of rock materials. Further, a detailed investigation was conducted on the relationships between mechanical and simple tests, which are established with limited ranges of P-wave velocity and dry density. The results show that strength values decrease with the SH rebound value. In addition, there is a systematic trend between dry density, P-wave velocity, rebound hardness, and brittleness value of the studied rocks, and rocks with medium hardness have a higher brittleness value. Finally, a strength classification chart and a brittleness classification table are presented, providing reliable and low-cost methods for the classification of igneous rocks.

  15. Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS).

    PubMed

    Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M; Khan, Ajmal

    2016-01-01

    This article compares the study design and statistical methods used in the 2005, 2010 and 2015 issues of Pakistan Journal of Medical Sciences (PJMS). Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods was found in the analysis. The most frequent methods include: descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the time period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. This study shows that a diverse variety of statistical methods has been used in the research articles of PJMS and their frequency improved from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis in the published articles, while the cross-sectional design was the most common study design.
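
The tabulation behind the reported percentages is a simple frequency count over the article pool. The sketch below reproduces the abstract's own figures; only the counts stated in the abstract are used.

```python
from collections import Counter

# Counts reported in the abstract, out of 429 original articles.
method_counts = Counter({
    "descriptive statistics": 315,
    "chi-square/Fisher's exact": 205,
    "Student's t-test": 186,
})
n_articles = 429

# Percentage of articles using each method (articles may use several methods,
# so percentages need not sum to 100).
percentages = {m: round(100 * c / n_articles, 1) for m, c in method_counts.items()}
```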

  16. Very large scale characterization of graphene mechanical devices using a colorimetry technique.

    PubMed

    Cartamil-Bueno, Santiago Jose; Centeno, Alba; Zurutuza, Amaia; Steeneken, Peter Gerard; van der Zant, Herre Sjoerd Jan; Houri, Samer

    2017-06-08

    We use a scalable optical technique to characterize more than 21 000 circular nanomechanical devices made of suspended single- and double-layer graphene on cavities with different diameters (D) and depths (g). To maximize the contrast between suspended and broken membranes we used a model for selecting the optimal color filter. The method enables parallel and automatized image processing for yield statistics. We find the survival probability to be correlated with a structural mechanics scaling parameter given by D⁴/g³. Moreover, we extract a median adhesion energy of Γ = 0.9 J m⁻² between the membrane and the native SiO₂ at the bottom of the cavities.

  17. Statistical-mechanical predictions and Navier-Stokes dynamics of two-dimensional flows on a bounded domain.

    PubMed

    Brands, H; Maassen, S R; Clercx, H J

    1999-09-01

    In this paper the applicability of a statistical-mechanical theory to freely decaying two-dimensional (2D) turbulence on a bounded domain is investigated. We consider an ensemble of direct numerical simulations in a square box with stress-free boundaries, with a Reynolds number that is of the same order as in experiments on 2D decaying Navier-Stokes turbulence. The results of these simulations are compared with the corresponding statistical equilibria, calculated from different stages of the evolution. It is shown that the statistical equilibria calculated from early times of the Navier-Stokes evolution do not correspond to the dynamical quasistationary states. At best, the global topological structure is correctly predicted from a relatively late time in the Navier-Stokes evolution, when the quasistationary state has almost been reached. This failure of the (basically inviscid) statistical-mechanical theory is related to viscous dissipation and net leakage of vorticity in the Navier-Stokes dynamics at moderate values of the Reynolds number.

  18. Nonlinear sigma models with compact hyperbolic target spaces

    NASA Astrophysics Data System (ADS)

    Gubser, Steven; Saleem, Zain H.; Schoenholz, Samuel S.; Stoica, Bogdan; Stokes, James

    2016-06-01

    We explore the phase structure of nonlinear sigma models with target spaces corresponding to compact quotients of hyperbolic space, focusing on the case of a hyperbolic genus-2 Riemann surface. The continuum theory of these models can be approximated by a lattice spin system which we simulate using Monte Carlo methods. The target space possesses interesting geometric and topological properties which are reflected in novel features of the sigma model. In particular, we observe a topological phase transition at a critical temperature, above which vortices proliferate, reminiscent of the Kosterlitz-Thouless phase transition in the O(2) model [1, 2]. Unlike in the O(2) case, there are many different types of vortices, suggesting a possible analogy to the Hagedorn treatment of statistical mechanics of a proliferating number of hadron species. Below the critical temperature the spins cluster around six special points in the target space known as Weierstrass points. The diversity of compact hyperbolic manifolds suggests that our model is only the simplest example of a broad class of statistical mechanical models whose main features can be understood essentially in geometric terms.
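
The abstract compares the observed transition to the Kosterlitz-Thouless transition of the O(2) (XY) model. A minimal Metropolis Monte Carlo sketch of that reference model is below; lattice size, sweep count, and temperatures are hypothetical, and this is the O(2) analogy, not the authors' hyperbolic sigma-model simulation.

```python
import numpy as np

def xy_energy_per_site(theta):
    """Energy per site of the O(2)/XY model with J = 1 and periodic boundaries."""
    return -np.mean(np.cos(theta - np.roll(theta, 1, axis=0))
                    + np.cos(theta - np.roll(theta, 1, axis=1)))

def metropolis(T, L=16, sweeps=200, seed=0):
    """Single-spin-flip Metropolis sampling of the XY model at temperature T."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0, 2 * np.pi, (L, L))
    for _ in range(sweeps * L * L):
        i, j = rng.integers(L, size=2)
        old, new = theta[i, j], theta[i, j] + rng.uniform(-1, 1)
        dE = 0.0
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = theta[(i + di) % L, (j + dj) % L]
            dE += np.cos(old - nb) - np.cos(new - nb)   # E_new - E_old for this bond set
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            theta[i, j] = new
    return xy_energy_per_site(theta)

e_cold, e_hot = metropolis(T=0.1), metropolis(T=2.0)
```

Above the transition vortices proliferate and the energy per site rises markedly, which is the qualitative signature the sigma-model study tracks in its richer vortex zoo.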

  19. Statistical thermodynamics unveils the dissolution mechanism of cellobiose.

    PubMed

    Nicol, Thomas W J; Isobe, Noriyuki; Clark, James H; Shimizu, Seishi

    2017-08-30

    In the study of the cellulose dissolution mechanism, opinion is still divided. Here, the solution interaction components of the most prominent hypotheses for the driving force of cellulose dissolution were evaluated quantitatively. Combining a rigorous statistical thermodynamic theory and cellobiose solubility data in the presence of chloride salts, whose cations progress in the Hofmeister series (KCl, NaCl, LiCl and ZnCl₂), we have shown that cellobiose solubilization is driven by the preferential accumulation of salts around the solutes, which is stronger than cellobiose hydration. Yet contrary to the classical chaotropy hypothesis, increasing salt concentration leads to cellobiose dehydration in the presence of the strongest solubilizer, ZnCl₂. However, thanks to cellobiose dehydration, the cellobiose-salt interaction still remains preferential despite weakening salt accumulation. Based on such insights, the previous hypotheses based on hydrophobicity and polymer charging have also been evaluated quantitatively. Thus, our present study successfully paved the way towards identifying the basic driving forces for cellulose solubilization in a quantitative manner for the first time. When combined with unit additivity methods, this quantitative information could lead to a full understanding of cellulose solubility.

  20. Examination of Anxiety Sensitivity and Distress Tolerance as Transdiagnostic Mechanisms Linking Multiple Anxiety Pathologies to Alcohol Use Problems in Adolescents

    PubMed Central

    Wolitzky-Taylor, Kate; Guillot, Casey R.; Pang, Raina D.; Kirkpatrick, Matthew G.; Zvolensky, Michael J.; Buckner, Julia D.; Leventhal, Adam M.

    2015-01-01

    Background Multiple forms of anxiety psychopathology are associated with alcohol use problems in adolescents. Yet, the mechanisms underlying this association are unclear. Anxiety sensitivity (AS) and distress tolerance (DT) represent 2 distinct, conceptually relevant transdiagnostic constructs implicated in multiple manifestations of anxiety that may also underlie alcohol use problems and thereby explain why people with anxiety are more likely to have alcohol problems. Methods The current cross-sectional study examined whether AS and DT accounted for (i.e., statistically mediated) the relationship between manifest indicators of the 3 common anxiety phenotypes (generalized anxiety, social anxiety, and panic disorders) and alcohol problems in a sample of 534 high school students (14 to 15 years old). Results Multiple manifestations of anxiety were associated with greater alcohol use problems. AS statistically mediated multiple anxiety–alcohol associations, but DT did not. Conclusions These findings provide preliminary evidence suggesting AS may be an important transdiagnostic target for alcohol prevention programs for those in early adolescence who experience elevated anxiety symptoms. PMID:25706521

  1. Statistical Mechanical Analysis of Online Learning with Weight Normalization in Single Layer Perceptron

    NASA Astrophysics Data System (ADS)

    Yoshida, Yuki; Karakida, Ryo; Okada, Masato; Amari, Shun-ichi

    2017-04-01

    Weight normalization, an optimization method for neural networks recently proposed by Salimans and Kingma (2016), decomposes the weight vector of a neural network into a radial length and a direction vector, and the decomposed parameters follow their own steepest-descent updates. They reported that learning with weight normalization converges faster in several tasks, including image recognition and reinforcement learning, than learning with the conventional parameterization. However, it has remained theoretically unexplained how weight normalization improves convergence speed. In this study, we applied a statistical mechanical technique to analyze on-line learning in single-layer linear and nonlinear perceptrons with weight normalization. By deriving order parameters of the learning dynamics, we confirmed quantitatively that weight normalization achieves fast convergence by automatically tuning the effective learning rate, regardless of the nonlinearity of the neural network. This property is realized when the initial value of the radial length is near the global minimum; our theory therefore suggests that it is important to choose the initial value of the radial length appropriately when using weight normalization.
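
The parameterization under study, w = g * v/||v|| with separate gradients for g and v, follows Salimans and Kingma; the teacher-student linear perceptron below is a hypothetical stand-in for the on-line setting analyzed in the abstract, with made-up dimension and learning rate.

```python
import numpy as np

rng = np.random.default_rng(0)
d, eta, steps = 20, 0.05, 3000

# Teacher defines the task; the student is parameterized as w = g * v/||v||.
w_star = rng.normal(size=d)
w_star /= np.linalg.norm(w_star)
g, v = 1.0, rng.normal(size=d)

losses = []
for _ in range(steps):
    x = rng.normal(size=d)                  # one fresh example per step (on-line learning)
    err = (g * v / np.linalg.norm(v)) @ x - w_star @ x
    grad_w = err * x                        # gradient w.r.t. the effective weight w
    nv = np.linalg.norm(v)
    grad_g = grad_w @ v / nv                # chain rule through w = g * v/||v||
    grad_v = (g / nv) * grad_w - (g * grad_g / nv ** 2) * v
    g -= eta * grad_g
    v -= eta * grad_v
    losses.append(0.5 * err ** 2)
```

Note that grad_v is orthogonal to v, so ||v|| can only grow under SGD; this growth is the self-tuning of the effective learning rate that the statistical-mechanical analysis quantifies.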

  2. Nonlinear sigma models with compact hyperbolic target spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gubser, Steven; Saleem, Zain H.; Schoenholz, Samuel S.

    We explore the phase structure of nonlinear sigma models with target spaces corresponding to compact quotients of hyperbolic space, focusing on the case of a hyperbolic genus-2 Riemann surface. The continuum theory of these models can be approximated by a lattice spin system which we simulate using Monte Carlo methods. The target space possesses interesting geometric and topological properties which are reflected in novel features of the sigma model. In particular, we observe a topological phase transition at a critical temperature, above which vortices proliferate, reminiscent of the Kosterlitz-Thouless phase transition in the O(2) model [1, 2]. Unlike in the O(2) case, there are many different types of vortices, suggesting a possible analogy to the Hagedorn treatment of statistical mechanics of a proliferating number of hadron species. Below the critical temperature the spins cluster around six special points in the target space known as Weierstrass points. In conclusion, the diversity of compact hyperbolic manifolds suggests that our model is only the simplest example of a broad class of statistical mechanical models whose main features can be understood essentially in geometric terms.

  3. Nonlinear sigma models with compact hyperbolic target spaces

    DOE PAGES

    Gubser, Steven; Saleem, Zain H.; Schoenholz, Samuel S.; ...

    2016-06-23

    We explore the phase structure of nonlinear sigma models with target spaces corresponding to compact quotients of hyperbolic space, focusing on the case of a hyperbolic genus-2 Riemann surface. The continuum theory of these models can be approximated by a lattice spin system which we simulate using Monte Carlo methods. The target space possesses interesting geometric and topological properties which are reflected in novel features of the sigma model. In particular, we observe a topological phase transition at a critical temperature, above which vortices proliferate, reminiscent of the Kosterlitz-Thouless phase transition in the O(2) model [1, 2]. Unlike in the O(2) case, there are many different types of vortices, suggesting a possible analogy to the Hagedorn treatment of statistical mechanics of a proliferating number of hadron species. Below the critical temperature the spins cluster around six special points in the target space known as Weierstrass points. In conclusion, the diversity of compact hyperbolic manifolds suggests that our model is only the simplest example of a broad class of statistical mechanical models whose main features can be understood essentially in geometric terms.

  4. Proceedings of the Annual Mechanics of Composites Review (12th) Held in Wright-Patterson AFB, Ohio on 16-17 October 1987

    DTIC Science & Technology

    1988-01-01

    ignored but the Volkersen model is extended to include adherend deformations will be discussed. STATISTICAL METHODOLOGY FOR DESIGN ALLOWABLES [15-17...structure. In the certification methodology , the development test program and the calculation of composite design allowables is orchestrated to support...Development of design methodology of thick composites and their test methods. (b) Role of interface in emerging composite systems. *CONTRACTS IMPROVED DAMAGE

  5. Bose-Einstein distribution of money in a free-market economy. II

    NASA Astrophysics Data System (ADS)

    Kürten, K. E.; Kusmartsev, F. V.

    2011-01-01

    We discuss the application of methods of statistical mechanics to the free-market economy (Kusmartsev F. V., Phys. Lett. A, 375 (2011) 966) and find that the most general distribution of money or income in a free-market economy has the form of a Bose-Einstein distribution. The market is thereby described by three parameters: temperature, chemical potential and the space dimensionality. Numerical simulations and a detailed analysis of a generic model confirm this finding.
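
The Bose-Einstein form referred to above is n(m) = 1/(exp((m - mu)/T) - 1). A minimal numerical sketch follows; the "market temperature" and chemical potential values are hypothetical, chosen only to illustrate the crossover to a Boltzmann-Gibbs tail at high money levels.

```python
import numpy as np

def bose_einstein(m, T, mu):
    """Average occupation of money level m: n(m) = 1/(exp((m - mu)/T) - 1)."""
    return 1.0 / (np.exp((m - mu) / T) - 1.0)

T, mu = 1.0, -0.1                  # hypothetical market temperature and chemical potential
m = np.linspace(0.0, 10.0, 200)    # money levels
n = bose_einstein(m, T, mu)

# For (m - mu) >> T the distribution crosses over to a Boltzmann-Gibbs tail.
boltzmann = np.exp(-(m - mu) / T)
```

The requirement mu < 0 keeps n(m) positive for all m >= 0, mirroring the usual constraint on the bosonic chemical potential.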

  6. Unbiased estimators for spatial distribution functions of classical fluids

    NASA Astrophysics Data System (ADS)

    Adib, Artur B.; Jarzynski, Christopher

    2005-01-01

    We use a statistical-mechanical identity closely related to the familiar virial theorem, to derive unbiased estimators for spatial distribution functions of classical fluids. In particular, we obtain estimators for both the fluid density ρ(r) in the vicinity of a fixed solute and the pair correlation g(r) of a homogeneous classical fluid. We illustrate the utility of our estimators with numerical examples, which reveal advantages over traditional histogram-based methods of computing such distributions.
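
For contrast with the virial-based estimators proposed in the abstract, the "traditional histogram-based method" it mentions can be sketched directly. This is a generic baseline with hypothetical parameters (an ideal gas of uniform random points in a periodic box, for which g(r) should be 1), not the authors' estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
N, L = 800, 1.0                     # particles in a periodic unit box (ideal gas)
x = rng.random((N, 3)) * L

# Minimum-image pairwise separations.
d = x[:, None, :] - x[None, :, :]
d -= L * np.round(d / L)
r = np.sqrt((d ** 2).sum(-1))[np.triu_indices(N, k=1)]

# Histogram estimator: count pairs per radial shell, then normalize by the
# ideal-gas expectation n_pairs * 4*pi*r^2*dr / V.
edges = np.arange(0.05, 0.325, 0.025)
counts, _ = np.histogram(r, bins=edges)
r_mid = 0.5 * (edges[1:] + edges[:-1])
shell_vol = 4 * np.pi * r_mid ** 2 * np.diff(edges)
g = counts * L ** 3 / (N * (N - 1) / 2 * shell_vol)
```

The binning is what introduces the bias and noise that the virial-identity estimators of the paper are designed to avoid.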

  7. Spin Glass a Bridge Between Quantum Computation and Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Ohzeki, Masayuki

    2013-09-01

    In this chapter, we present two fascinating topics lying between quantum information processing and statistical mechanics. First, we introduce an elaborate technique, the surface code, to prepare a particular quantum state with robustness against decoherence. Interestingly, the theoretical limitation of the surface code, the accuracy threshold for restoring the quantum state, has a close connection with the problem of the phase transition in a special model known as a spin glass, one of the most active research areas in statistical mechanics. The phase transition in spin glasses is an intractable problem, since we must deal with a many-body system whose complicated interactions change sign depending on the distance between spins. Fortunately, recent progress in spin-glass theory enables us to predict the precise location of the critical point at which the phase transition occurs. This means that statistical mechanics can reveal one of the most interesting parts of quantum information processing. We show how to import this special tool from statistical mechanics into the problem of the accuracy threshold in quantum computation. Second, we show another interesting technique that employs quantum nature: quantum annealing. The purpose of quantum annealing is to search for the most favored solution of a multivariable function, namely an optimization problem. The most typical instance is the traveling salesman problem, finding the shortest tour that visits all the cities. In quantum annealing, we introduce quantum fluctuations to drive a particular system with an artificial Hamiltonian whose ground state represents the optimal solution of the specific problem we desire to solve. The quantum fluctuations give rise to the quantum tunneling effect, which allows nontrivial hopping from state to state. We then sketch a strategy to control the quantum fluctuations so as to reach the ground state efficiently. Such a generic framework is called quantum annealing. The most typical instance is quantum adiabatic computation, based on the adiabatic theorem. Quantum adiabatic computation, as discussed in another chapter, unfortunately has a crucial bottleneck for a subset of optimization problems. We introduce several recent attempts to overcome this weak point by using developments in statistical mechanics. Through both topics, we shed light on the birth of an interdisciplinary field between quantum mechanics and statistical mechanics.
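
The thermal counterpart of the annealing idea (drive a system with fluctuations that are gradually switched off so it settles into the ground state) is classical simulated annealing. The sketch below is that classical analog with a made-up rugged objective, not quantum annealing itself, where thermal hopping would be replaced by quantum tunneling.

```python
import math
import random

random.seed(0)

def f(x):
    """A rugged toy objective: global minimum f(0) = 0, with local minima elsewhere."""
    return x * x + 1.0 - math.cos(3.0 * x)

x = 3.0                      # start in a local basin away from the optimum
best_x, best_f = x, f(x)
T = 2.0
for _ in range(20000):
    x_new = x + random.uniform(-0.5, 0.5)
    dE = f(x_new) - f(x)
    # Metropolis rule: always accept downhill moves; accept uphill with prob exp(-dE/T).
    if dE <= 0 or random.random() < math.exp(-dE / T):
        x = x_new
    if f(x) < best_f:
        best_x, best_f = x, f(x)
    T = max(0.999 * T, 1e-3)  # geometric cooling schedule
```

Early high-temperature acceptance of uphill moves is what lets the walker escape local minima, just as quantum fluctuations allow nontrivial hopping between states in the quantum setting.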

  8. Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS)

    PubMed Central

    Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M.; Khan, Ajmal

    2016-01-01

    Objective: This article compares the study design and statistical methods used in the 2005, 2010 and 2015 issues of Pakistan Journal of Medical Sciences (PJMS). Methods: Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. Results: A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods was found in the analysis. The most frequent methods include: descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the time period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. Conclusion: This study shows that a diverse variety of statistical methods has been used in the research articles of PJMS and their frequency improved from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis in the published articles, while the cross-sectional design was the most common study design. PMID:27022365

  9. Third law of thermodynamics as a key test of generalized entropies.

    PubMed

    Bento, E P; Viswanathan, G M; da Luz, M G E; Silva, R

    2015-02-01

    The laws of thermodynamics constrain the formulation of statistical mechanics at the microscopic level. The third law of thermodynamics states that the entropy must vanish at absolute zero temperature for systems with nondegenerate ground states in equilibrium. Conversely, the entropy can vanish only at absolute zero temperature. Here we ask whether or not generalized entropies satisfy this fundamental property. We propose a direct analytical procedure to test if a generalized entropy satisfies the third law, making only very general assumptions about the entropy S and energy U of an arbitrary N-level classical system. Mathematically, the method relies on exact calculation of β=dS/dU in terms of the microstate probabilities p_i. To illustrate this approach, we present exact results for the two best known generalizations of statistical mechanics. Specifically, we study the Kaniadakis entropy S(κ), which is additive, and the Tsallis entropy S(q), which is nonadditive. We show that the Kaniadakis entropy correctly satisfies the third law only for -1<κ<+1, thereby shedding light on why κ is conventionally restricted to this interval. Surprisingly, however, the Tsallis entropy violates the third law for q<1. Finally, we give a concrete example of the power of our proposed method by applying it to a paradigmatic system: the one-dimensional ferromagnetic Ising model with nearest-neighbor interactions.
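
The two generalized entropies named above have standard closed forms, and both reduce to the Boltzmann-Gibbs-Shannon entropy in the appropriate limit (q → 1 for Tsallis, κ → 0 for Kaniadakis). A minimal numerical check on a hypothetical 3-level system:

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])   # microstate probabilities of a toy 3-level system

def shannon(p):
    """Boltzmann-Gibbs-Shannon entropy (k_B = 1)."""
    return -np.sum(p * np.log(p))

def tsallis(p, q):
    """Nonadditive Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1)."""
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def kaniadakis(p, k):
    """Additive Kaniadakis entropy S_k = -sum p_i ln_k p_i,
    with the deformed logarithm ln_k x = (x^k - x^-k) / (2k)."""
    return -np.sum(p * (p ** k - p ** -k) / (2.0 * k))
```

The paper's test of the third law then amounts to computing β = dS/dU from these expressions and examining the limit of vanishing temperature.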

  10. Quantum Mechanics and the Principle of Least Radix Economy

    NASA Astrophysics Data System (ADS)

    Garcia-Morales, Vladimir

    2015-03-01

    A new variational method, the principle of least radix economy, is formulated. The mathematical and physical relevance of the radix economy, also called digit capacity, is established, showing how physical laws can be derived from this concept in a unified way. The principle reinterprets and generalizes the principle of least action yielding two classes of physical solutions: least action paths and quantum wavefunctions. A new physical foundation of the Hilbert space of quantum mechanics is then accomplished and it is used to derive the Schrödinger and Dirac equations and the breaking of the commutativity of spacetime geometry. The formulation provides an explanation of how determinism and random statistical behavior coexist in spacetime and a framework is developed that allows dynamical processes to be formulated in terms of chains of digits. These methods lead to a new (pre-geometrical) foundation for Lorentz transformations and special relativity. The Parker-Rhodes combinatorial hierarchy is encompassed within our approach and this leads to an estimate of the interaction strength of the electromagnetic and gravitational forces that agrees with the experimental values to an error of less than one thousandth. Finally, it is shown how the principle of least-radix economy naturally gives rise to Boltzmann's principle of classical statistical thermodynamics. A new expression for a general (path-dependent) nonequilibrium entropy is proposed satisfying the Second Law of Thermodynamics.

  11. eLearning course may shorten the duration of mechanical restraint among psychiatric inpatients: a cluster-randomized trial.

    PubMed

    Kontio, Raija; Pitkänen, Anneli; Joffe, Grigori; Katajisto, Jouko; Välimäki, Maritta

    2014-10-01

    The management of psychiatric inpatients exhibiting severely disturbed and aggressive behaviour is an important educational topic. Well structured, IT-based educational programmes (eLearning) often ensure quality and may make training more affordable and accessible. The aim of this study was to explore the impact of an eLearning course for personnel on the rates and duration of seclusion and mechanical restraint among psychiatric inpatients. In a cluster-randomized intervention trial, the nursing personnel on 10 wards were randomly assigned to eLearning (intervention) or training-as-usual (control) groups. The eLearning course comprised six modules with specific topics (legal and ethical issues, behaviour-related factors, therapeutic relationship and self-awareness, teamwork and integrating knowledge with practice) and specific learning methods. The rates (incidents per 1000 occupied bed days) and durations of the coercion incidents were examined before and after the course. A total of 1283 coercion incidents (1143 seclusions [89%] and 140 incidents involving the use of mechanical restraints [11%]) were recorded on the study wards during the data collection period. On the intervention wards, there were no statistically significant changes in the rates of seclusion and mechanical restraint. However, the duration of incidents involving mechanical restraints shortened from 36.0 to 4.0 h (median) (P < 0.001). No statistically significant changes occurred on the control wards. After our eLearning course, the duration of incidents involving the use of mechanical restraints decreased. However, more studies are needed to ensure that the content of the course focuses on the most important factors associated with the seclusion-related elements. The eLearning course deserves further development and further studies. The duration of coercion incidents merits attention in future research.

  12. Outcomes of early carotid stenting and angioplasty in large-vessel anterior circulation strokes treated with mechanical thrombectomy and intravenous thrombolytics.

    PubMed

    Mehta, T; Desai, N; Mehta, K; Parikh, R; Male, S; Hussain, M; Ollenschleger, M; Spiegel, G; Grande, A; Ezzeddine, M; Jagadeesan, B; Tummala, R; McCullough, L

    2018-01-01

Introduction Proximal cervical internal carotid artery stenosis greater than 50% merits revascularization to mitigate the risk of stroke recurrence among large-vessel anterior circulation strokes undergoing mechanical thrombectomy. Carotid artery stenting necessitates the use of antiplatelets, and there is a theoretical increased risk of hemorrhagic transformation given that such patients may already have received intravenous thrombolytics and have a significant infarct burden. We investigate the outcomes of large-vessel anterior circulation stroke patients treated with intravenous thrombolytics receiving same-day carotid stenting or selective angioplasty compared to no carotid intervention. Materials and methods The study cohort was obtained from the National (Nationwide) Inpatient Sample database between 2006 and 2014, using International Statistical Classification of Diseases, ninth revision discharge diagnosis and procedure codes. A total of 11,825 patients with large-vessel anterior circulation stroke treated with intravenous thrombolytic and mechanical thrombectomy on the same day were identified. The study population was subdivided into three subgroups: no carotid intervention, same-day carotid angioplasty without carotid stenting, and same-day carotid stenting. Outcomes were assessed with respect to mortality, significant disability at discharge, hemorrhagic transformation, and requirement of percutaneous endoscopic gastrostomy tube placement, prolonged mechanical ventilation, or craniotomy. Results This study found no statistically significant difference in patient outcomes in those treated with concurrent carotid stenting compared to no carotid intervention in terms of morbidity or mortality. Conclusions If indicated, it is reasonable to consider concurrent carotid stenting and/or angioplasty for large-vessel anterior circulation stroke patients treated with mechanical thrombectomy who also receive intravenous thrombolytics.

  13. A novel chi-square statistic for detecting group differences between pathways in systems epidemiology.

    PubMed

    Yuan, Zhongshang; Ji, Jiadong; Zhang, Tao; Liu, Yi; Zhang, Xiaoshuai; Chen, Wei; Xue, Fuzhong

    2016-12-20

Traditional epidemiology often pays more attention to the identification of a single factor rather than to the pathway that is related to a disease, and therefore, it is difficult to explore the disease mechanism. Systems epidemiology aims to integrate putative lifestyle exposures and biomarkers extracted from multiple omics platforms to offer new insights into the pathway mechanisms that underlie disease at the human population level. One key but inadequately addressed question is how to develop powerful statistics to identify whether one candidate pathway is associated with a disease. Bearing in mind that a pathway difference can result from not only changes in the nodes but also changes in the edges, we propose a novel statistic for detecting group differences between pathways, which, in principle, captures both node changes and edge changes while simultaneously accounting for the pathway structure. The proposed test has been proven to follow the chi-square distribution, and various simulations have shown that it performs better than other existing methods. Integrating genome-wide DNA methylation data, we analyzed one real data set from the Bogalusa cohort study and significantly identified a potential pathway, Smoking → SOCS3 → PIK3R1, which was strongly associated with abdominal obesity. The proposed test was powerful and efficient at identifying pathway differences between two groups, and it can be extended to other disciplines that involve statistical comparisons between pathways. The source code in R is available on our website. Copyright © 2016 John Wiley & Sons, Ltd.

  14. Statistical mechanics of multipartite entanglement

    NASA Astrophysics Data System (ADS)

    Facchi, P.; Florio, G.; Marzolino, U.; Parisi, G.; Pascazio, S.

    2009-02-01

    We characterize the multipartite entanglement of a system of n qubits in terms of the distribution function of the bipartite purity over all balanced bipartitions. We search for those (maximally multipartite entangled) states whose purity is minimum for all bipartitions and recast this optimization problem into a problem of statistical mechanics.

  15. An Experimental Approach to Teaching and Learning Elementary Statistical Mechanics

    ERIC Educational Resources Information Center

    Ellis, Frank B.; Ellis, David C.

    2008-01-01

    Introductory statistical mechanics is studied for a simple two-state system using an inexpensive and easily built apparatus. A large variety of demonstrations, suitable for students in high school and introductory university chemistry courses, are possible. This article details demonstrations for exothermic and endothermic reactions, the dynamic…

  16. Orbital Roof Fractures as an Indicator for Concomitant Ocular Injury

    DTIC Science & Technology

    2017-11-12

    between these two groups to indicate a statistically significant difference in mechanism of injury, subjective symptoms, CT and exam findings, and...using Pearson’s χ2 test or Fisher’s exact test to indicate a statistically significant difference in mechanism of injury, subjective symptoms, CT and

  17. Epithelial apoptosis in mechanistically distinct methods of injury in the murine small intestine

    PubMed Central

    Vyas, Dinesh; Robertson, Charles M; Stromberg, Paul E; Martin, James R.; Dunne, W. Michael; Houchen, Courtney W; Barrett, Terrence A; Ayala, Alfred; Perl, Mario; Buchman, Timothy G; Coopersmith, Craig M

    2007-01-01

Gut epithelial apoptosis is involved in the pathophysiology of multiple diseases. This study characterized intestinal apoptosis in three mechanistically distinct injuries with different kinetics of cell death. FVB/N mice were subjected to gamma radiation, Pseudomonas aeruginosa pneumonia or injection of monoclonal anti-CD3 antibody and sacrificed 4, 12, or 24 hours post-injury (n=10/time point). Apoptosis was quantified in the jejunum by hematoxylin and eosin (H&E), active caspase-3, terminal deoxynucleotidyl transferase dUTP-mediated nick end labeling (TUNEL), in situ oligoligation reaction (ISOL), cytokeratin 18, and annexin V staining. Reproducible results were obtained only for H&E, active caspase-3, TUNEL and ISOL, which were quantified and compared against each other for each injury at each time point. Kinetics of injury were different, with early apoptosis highest following radiation, late apoptosis highest following anti-CD3, and more consistent levels following pneumonia. ISOL was the most consistent stain and was always statistically indistinguishable from at least 2 stains. In contrast, active caspase-3 demonstrated lower levels of apoptosis, while the TUNEL assay had higher levels of apoptosis in the most severely injured intestine regardless of mechanism of injury. H&E was a statistical outlier more commonly than any other stain. This suggests that regardless of mechanism or kinetics of injury, ISOL correlates to other quantification methods of detecting gut epithelial apoptosis more than any other method studied and compares favorably to other commonly accepted techniques of quantifying apoptosis in a large intestinal cross section by balancing sensitivity and specificity across a range of times and levels of death. PMID:17357092

  18. Maximum caliber inference of nonequilibrium processes

    NASA Astrophysics Data System (ADS)

    Otten, Moritz; Stock, Gerhard

    2010-07-01

    Thirty years ago, Jaynes suggested a general theoretical approach to nonequilibrium statistical mechanics, called maximum caliber (MaxCal) [Annu. Rev. Phys. Chem. 31, 579 (1980)]. MaxCal is a variational principle for dynamics in the same spirit that maximum entropy is a variational principle for equilibrium statistical mechanics. Motivated by the success of maximum entropy inference methods for equilibrium problems, in this work the MaxCal formulation is applied to the inference of nonequilibrium processes. That is, given some time-dependent observables of a dynamical process, one constructs a model that reproduces these input data and moreover, predicts the underlying dynamics of the system. For example, the observables could be some time-resolved measurements of the folding of a protein, which are described by a few-state model of the free energy landscape of the system. MaxCal then calculates the probabilities of an ensemble of trajectories such that on average the data are reproduced. From this probability distribution, any dynamical quantity of the system can be calculated, including population probabilities, fluxes, or waiting time distributions. After briefly reviewing the formalism, the practical numerical implementation of MaxCal in the case of an inference problem is discussed. Adopting various few-state models of increasing complexity, it is demonstrated that the MaxCal principle indeed works as a practical method of inference: The scheme is fairly robust and yields correct results as long as the input data are sufficient. As the method is unbiased and general, it can deal with any kind of time dependency such as oscillatory transients and multitime decays.
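The inference scheme described above can be illustrated on a toy system: enumerate all binary trajectories, weight each one by exp(λA), and tune the Lagrange multiplier λ until the ensemble average of the observable A matches the measured value. This is a minimal stdlib-only sketch of the MaxCal idea for a tiny trajectory space, not the authors' implementation; the function name and target value are illustrative.

```python
import itertools
import math

def maxcal_weights(n_steps, observable, target, lo=-20.0, hi=20.0, tol=1e-10):
    # All binary trajectories of length n_steps form the trajectory ensemble.
    trajs = list(itertools.product((0, 1), repeat=n_steps))

    def ensemble_avg(lam):
        # Caliber-maximizing weights are proportional to exp(lam * A(traj)).
        w = [math.exp(lam * observable(t)) for t in trajs]
        z = sum(w)
        return sum(wi * observable(t) for wi, t in zip(w, trajs)) / z

    # <A> is monotone increasing in lam (its derivative is Var(A) >= 0),
    # so the multiplier can be found by bisection.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if ensemble_avg(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * observable(t)) for t in trajs]
    z = sum(w)
    return lam, [wi / z for wi in w]
```

From the resulting trajectory probabilities, any dynamical quantity (populations, fluxes, waiting-time distributions) can in principle be computed by averaging, as the abstract describes.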

  19. [Correlation coefficient-based principle and method for the classification of jump degree in hydrological time series].

    PubMed

    Wu, Zi Yi; Xie, Ping; Sang, Yan Fang; Gu, Hai Ting

    2018-04-01

The phenomenon of jump is one of the important external forms of hydrological variability under environmental changes, representing the adaption of hydrological nonlinear systems to the influence of external disturbances. Presently, the related studies mainly focus on the methods for identifying the jump positions and jump times in hydrological time series. In contrast, few studies have focused on the quantitative description and classification of jump degree in hydrological time series, which makes it difficult to understand the environmental changes and evaluate their potential impacts. Here, we proposed a theoretically reliable and easy-to-apply method for the classification of jump degree in hydrological time series, using the correlation coefficient as a basic index. The statistical tests verified the accuracy, reasonability, and applicability of this method. The relationship between the correlation coefficient and the jump degree of a series was described by a mathematical equation obtained by derivation. After that, several thresholds of correlation coefficients under different statistical significance levels were chosen, based on which the jump degree could be classified into five levels: no, weak, moderate, strong and very strong. Finally, our method was applied to five different observed hydrological time series, with diverse geographic and hydrological conditions in China. The results of the classification of jump degrees in those series accorded closely with their physical hydrological mechanisms, indicating the practicability of our method.
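The basic idea can be sketched as follows: correlate the series with a unit step placed at the candidate jump point, then bin the correlation coefficient into the five levels. This is an illustrative sketch only; the thresholds below are hypothetical placeholders, not the significance-level-derived values from the paper.

```python
def pearson(x, y):
    # Plain Pearson correlation coefficient, stdlib only.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def jump_correlation(series, jump_index):
    # Correlation between the series and a 0/1 step at jump_index;
    # |r| near 1 means the jump dominates the within-segment scatter.
    step = [0.0] * jump_index + [1.0] * (len(series) - jump_index)
    return pearson(series, step)

def classify(r, thresholds=(0.2, 0.4, 0.6, 0.8)):
    # Hypothetical thresholds standing in for the paper's values.
    labels = ["no", "weak", "moderate", "strong", "very strong"]
    return labels[sum(abs(r) >= t for t in thresholds)]
```

For example, a series that steps cleanly from one level to another at the chosen index yields r close to 1 and is classified as a very strong jump.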

  20. Advancing a Model-Validated Statistical Method for Decomposing the Key Oceanic Drivers of Regional Climate: Focus on Northern and Tropical African Climate Variability in the Community Earth System Model (CESM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Fuyao; Yu, Yan; Notaro, Michael

This study advances the practicality and stability of the traditional multivariate statistical method, generalized equilibrium feedback assessment (GEFA), for decomposing the key oceanic drivers of regional atmospheric variability, especially when available data records are short. An advanced stepwise GEFA methodology is introduced, in which unimportant forcings within the forcing matrix are eliminated through stepwise selection. Method validation of stepwise GEFA is performed using the CESM, with a focused application to northern and tropical Africa (NTA). First, a statistical assessment of the atmospheric response to each primary oceanic forcing is carried out by applying stepwise GEFA to a fully coupled control run. Then, a dynamical assessment of the atmospheric response to individual oceanic forcings is performed through ensemble experiments by imposing sea surface temperature anomalies over focal ocean basins. Finally, to quantify the reliability of stepwise GEFA, the statistical assessment is evaluated against the dynamical assessment in terms of four metrics: the percentage of grid cells with consistent response sign, the spatial correlation of atmospheric response patterns, the area-averaged seasonal cycle of response magnitude, and consistency in associated mechanisms between assessments. In CESM, tropical modes, namely El Niño–Southern Oscillation and the tropical Indian Ocean Basin, tropical Indian Ocean dipole, and tropical Atlantic Niño modes, are the dominant oceanic controls of NTA climate. In complementary studies, stepwise GEFA is validated in terms of isolating terrestrial forcings on the atmosphere, and observed oceanic and terrestrial drivers of NTA climate are extracted to establish an observational benchmark for subsequent coupled model evaluation and development of process-based weights for regional climate projections.

  1. Advancing a Model-Validated Statistical Method for Decomposing the Key Oceanic Drivers of Regional Climate: Focus on Northern and Tropical African Climate Variability in the Community Earth System Model (CESM)

    DOE PAGES

    Wang, Fuyao; Yu, Yan; Notaro, Michael; ...

    2017-09-27

This study advances the practicality and stability of the traditional multivariate statistical method, generalized equilibrium feedback assessment (GEFA), for decomposing the key oceanic drivers of regional atmospheric variability, especially when available data records are short. An advanced stepwise GEFA methodology is introduced, in which unimportant forcings within the forcing matrix are eliminated through stepwise selection. Method validation of stepwise GEFA is performed using the CESM, with a focused application to northern and tropical Africa (NTA). First, a statistical assessment of the atmospheric response to each primary oceanic forcing is carried out by applying stepwise GEFA to a fully coupled control run. Then, a dynamical assessment of the atmospheric response to individual oceanic forcings is performed through ensemble experiments by imposing sea surface temperature anomalies over focal ocean basins. Finally, to quantify the reliability of stepwise GEFA, the statistical assessment is evaluated against the dynamical assessment in terms of four metrics: the percentage of grid cells with consistent response sign, the spatial correlation of atmospheric response patterns, the area-averaged seasonal cycle of response magnitude, and consistency in associated mechanisms between assessments. In CESM, tropical modes, namely El Niño–Southern Oscillation and the tropical Indian Ocean Basin, tropical Indian Ocean dipole, and tropical Atlantic Niño modes, are the dominant oceanic controls of NTA climate. In complementary studies, stepwise GEFA is validated in terms of isolating terrestrial forcings on the atmosphere, and observed oceanic and terrestrial drivers of NTA climate are extracted to establish an observational benchmark for subsequent coupled model evaluation and development of process-based weights for regional climate projections.

  2. An injury mortality prediction based on the anatomic injury scale

    PubMed Central

    Wang, Muding; Wu, Dan; Qiu, Wusi; Wang, Weimi; Zeng, Yunji; Shen, Yi

    2017-01-01

Abstract To determine whether the injury mortality prediction (IMP) statistically outperforms the trauma mortality prediction model (TMPM) as a predictor of mortality. The TMPM is currently the best trauma score method and is based on anatomic injury. Its mortality prediction is superior to the injury severity score (ISS) and to the new injury severity score (NISS). However, despite its statistical significance, the predictive power of TMPM needs to be further improved. This retrospective cohort study is based on data from 1,148,359 injured patients in the National Trauma Data Bank hospitalized from 2010 to 2011. Sixty percent of the data was used to derive an empiric measure of severity for different Abbreviated Injury Scale predot codes by taking the weighted average death probabilities of trauma patients. Twenty percent of the data was used to develop the computational method of the IMP model. The remaining 20% of the data was used to evaluate the statistical performance of IMP and then compare it with the TMPM and the single worst injury by examining the area under the receiver operating characteristic curve (ROC), the Hosmer–Lemeshow (HL) statistic, and the Akaike information criterion. IMP exhibited significantly better discrimination (ROC-IMP, 0.903 [0.899–0.907] vs ROC-TMPM, 0.890 [0.886–0.895]) and calibration (HL-IMP, 9.9 [4.4–14.7] vs HL-TMPM, 197 [143–248]) than TMPM. All models showed slight changes after the addition of age, gender, and mechanism of injury, but the extended IMP still dominated TMPM in every performance metric. The IMP shows a slight improvement in discrimination and calibration compared with the TMPM and can accurately predict mortality. Therefore, we consider it a new feasible scoring method in trauma research. PMID:28858124
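The discrimination comparison above rests on the area under the ROC curve. A minimal sketch of AUC via its Mann-Whitney interpretation (the probability that a randomly chosen positive case outscores a randomly chosen negative one, counting ties as one half) is shown below; the scores are hypothetical, and this is a generic illustration rather than the study's analysis code.

```python
def auc_rank(scores_pos, scores_neg):
    # Mann-Whitney formulation: AUC = P(score_pos > score_neg),
    # with ties contributing 1/2. O(n*m), fine for small illustrations.
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

A model whose predicted death probabilities perfectly separate deaths from survivors scores AUC = 1.0; random scoring gives 0.5, so the reported 0.903 vs 0.890 difference is a small but real gain in discrimination.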

  3. Functional Logistic Regression Approach to Detecting Gene by Longitudinal Environmental Exposure Interaction in a Case-Control Study

    PubMed Central

    Wei, Peng; Tang, Hongwei; Li, Donghui

    2014-01-01

    Most complex human diseases are likely the consequence of the joint actions of genetic and environmental factors. Identification of gene-environment (GxE) interactions not only contributes to a better understanding of the disease mechanisms, but also improves disease risk prediction and targeted intervention. In contrast to the large number of genetic susceptibility loci discovered by genome-wide association studies, there have been very few successes in identifying GxE interactions which may be partly due to limited statistical power and inaccurately measured exposures. While existing statistical methods only consider interactions between genes and static environmental exposures, many environmental/lifestyle factors, such as air pollution and diet, change over time, and cannot be accurately captured at one measurement time point or by simply categorizing into static exposure categories. There is a dearth of statistical methods for detecting gene by time-varying environmental exposure interactions. Here we propose a powerful functional logistic regression (FLR) approach to model the time-varying effect of longitudinal environmental exposure and its interaction with genetic factors on disease risk. Capitalizing on the powerful functional data analysis framework, our proposed FLR model is capable of accommodating longitudinal exposures measured at irregular time points and contaminated by measurement errors, commonly encountered in observational studies. We use extensive simulations to show that the proposed method can control the Type I error and is more powerful than alternative ad hoc methods. We demonstrate the utility of this new method using data from a case-control study of pancreatic cancer to identify the windows of vulnerability of lifetime body mass index on the risk of pancreatic cancer as well as genes which may modify this association. PMID:25219575

  4. Functional logistic regression approach to detecting gene by longitudinal environmental exposure interaction in a case-control study.

    PubMed

    Wei, Peng; Tang, Hongwei; Li, Donghui

    2014-11-01

    Most complex human diseases are likely the consequence of the joint actions of genetic and environmental factors. Identification of gene-environment (G × E) interactions not only contributes to a better understanding of the disease mechanisms, but also improves disease risk prediction and targeted intervention. In contrast to the large number of genetic susceptibility loci discovered by genome-wide association studies, there have been very few successes in identifying G × E interactions, which may be partly due to limited statistical power and inaccurately measured exposures. Although existing statistical methods only consider interactions between genes and static environmental exposures, many environmental/lifestyle factors, such as air pollution and diet, change over time, and cannot be accurately captured at one measurement time point or by simply categorizing into static exposure categories. There is a dearth of statistical methods for detecting gene by time-varying environmental exposure interactions. Here, we propose a powerful functional logistic regression (FLR) approach to model the time-varying effect of longitudinal environmental exposure and its interaction with genetic factors on disease risk. Capitalizing on the powerful functional data analysis framework, our proposed FLR model is capable of accommodating longitudinal exposures measured at irregular time points and contaminated by measurement errors, commonly encountered in observational studies. We use extensive simulations to show that the proposed method can control the Type I error and is more powerful than alternative ad hoc methods. We demonstrate the utility of this new method using data from a case-control study of pancreatic cancer to identify the windows of vulnerability of lifetime body mass index on the risk of pancreatic cancer as well as genes that may modify this association. © 2014 Wiley Periodicals, Inc.

  5. THEORETICAL AND EXPERIMENTAL ASPECTS OF ISOTOPIC FRACTIONATION.

    USGS Publications Warehouse

    O'Neil, James R.

    1986-01-01

Essential to the interpretation of natural variations of light stable isotope ratios is knowledge of the magnitude and temperature dependence of isotopic fractionation factors between the common minerals and fluids. These fractionation factors are obtained in three ways: (1) Semi-empirical calculations using spectroscopic data and the methods of statistical mechanics. (2) Laboratory calibration studies. (3) Measurements of natural samples whose formation conditions are well-known or highly constrained. In this chapter methods (1) and (2) are evaluated and a review is given of the present state of knowledge of the theory of isotopic fractionation and the factors that influence the isotopic properties of minerals.
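Method (1), the statistical-mechanical route, is conventionally computed from vibrational frequencies via the harmonic Bigeleisen-Mayer reduced partition function ratio. The sketch below is an illustrative single-molecule, harmonic version with hypothetical frequencies (not values from this chapter); frequencies are in cm⁻¹ and the heavy isotopologue has the lower frequencies.

```python
import math

H = 6.62607015e-34   # Planck constant, J s
KB = 1.380649e-23    # Boltzmann constant, J/K
C = 2.99792458e10    # speed of light, cm/s

def rpfr(freqs_light, freqs_heavy, temp):
    # Harmonic Bigeleisen-Mayer reduced partition function ratio (beta factor):
    # product over modes of (u2/u1) * e^(-u2/2)/(1-e^(-u2)) * (1-e^(-u1))/e^(-u1/2),
    # where u = h*c*nu / (kB*T), 1 = light isotopologue, 2 = heavy.
    f = 1.0
    for nu1, nu2 in zip(freqs_light, freqs_heavy):
        u1 = H * C * nu1 / (KB * temp)
        u2 = H * C * nu2 / (KB * temp)
        f *= (u2 / u1) * math.exp(-u2 / 2) / (1 - math.exp(-u2)) \
             * (1 - math.exp(-u1)) / math.exp(-u1 / 2)
    return f
```

The ratio exceeds 1 (the heavy isotope concentrates in the stiffer bond) and decreases toward 1 with rising temperature, which is the temperature dependence the chapter emphasizes.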

  6. Exploring the temperature dependence of failure mechanisms in fragmenting metal cylinders

    NASA Astrophysics Data System (ADS)

    Jones, David; Chapman, David; Hazell, Paul; Bland, Simon; Eakins, Daniel

    2011-06-01

    We present current work to investigate the influence of temperature on the dynamic fragmentation of metals. Pre-heated/cooled cylinders of Ti-6Al-4V were subjected to rapid radial expansion up to and past the point of failure using a modified expanding insert method on a single stage gas gun. Additional experiments were performed using an electromagnetic drive system to produce uniform deformations on targets of differing dimensions (radius, wall thickness). Issues concerning the geometry of the experiments, methods of heating and cooling the sample and diagnostics are covered. Finally, the role of temperature on adiabatic shear banding and fragment distribution statistics is discussed.

  7. External trial deep brain stimulation device for the application of desynchronizing stimulation techniques.

    PubMed

    Hauptmann, C; Roulet, J-C; Niederhauser, J J; Döll, W; Kirlangic, M E; Lysyansky, B; Krachkovskyi, V; Bhatti, M A; Barnikol, U B; Sasse, L; Bührle, C P; Speckmann, E-J; Götz, M; Sturm, V; Freund, H-J; Schnell, U; Tass, P A

    2009-12-01

    In the past decade deep brain stimulation (DBS)-the application of electrical stimulation to specific target structures via implanted depth electrodes-has become the standard treatment for medically refractory Parkinson's disease and essential tremor. These diseases are characterized by pathological synchronized neuronal activity in particular brain areas. We present an external trial DBS device capable of administering effectively desynchronizing stimulation techniques developed with methods from nonlinear dynamics and statistical physics according to a model-based approach. These techniques exploit either stochastic phase resetting principles or complex delayed-feedback mechanisms. We explain how these methods are implemented into a safe and user-friendly device.

  8. Quantum mechanics of black holes.

    PubMed

    Witten, Edward

    2012-08-03

    The popular conception of black holes reflects the behavior of the massive black holes found by astronomers and described by classical general relativity. These objects swallow up whatever comes near and emit nothing. Physicists who have tried to understand the behavior of black holes from a quantum mechanical point of view, however, have arrived at quite a different picture. The difference is analogous to the difference between thermodynamics and statistical mechanics. The thermodynamic description is a good approximation for a macroscopic system, but statistical mechanics describes what one will see if one looks more closely.

  9. A Revelation: Quantum-Statistics and Classical-Statistics are Analytic-Geometry Conic-Sections and Numbers/Functions: Euler, Riemann, Bernoulli Generating-Functions: Conics to Numbers/Functions Deep Subtle Connections

    NASA Astrophysics Data System (ADS)

    Descartes, R.; Rota, G.-C.; Euler, L.; Bernoulli, J. D.; Siegel, Edward Carl-Ludwig

    2011-03-01

Quantum-statistics Dichotomy: Fermi-Dirac(FDQS) Versus Bose-Einstein(BEQS), respectively with contact-repulsion/non-condensation(FDCR) versus attraction/ condensationBEC are manifestly-demonstrated by Taylor-expansion ONLY of their denominator exponential, identified BOTH as Descartes analytic-geometry conic-sections, FDQS as Ellipse (homotopy to rectangle FDQS distribution-function), VIA Maxwell-Boltzmann classical-statistics(MBCS) to Parabola MORPHISM, VS. BEQS to Hyperbola, Archimedes' HYPERBOLICITY INEVITABILITY, and as well generating-functions[Abramowitz-Stegun, Handbook Math.-Functions--p. 804!!!], respectively of Euler-numbers/functions, (via Riemann zeta-function(domination of quantum-statistics: [Pathria, Statistical-Mechanics; Huang, Statistical-Mechanics]) VS. Bernoulli-numbers/ functions. Much can be learned about statistical-physics from Euler-numbers/functions via Riemann zeta-function(s) VS. Bernoulli-numbers/functions [Conway-Guy, Book of Numbers] and about Euler-numbers/functions, via Riemann zeta-function(s) MORPHISM, VS. Bernoulli-numbers/ functions, visa versa!!! Ex.: Riemann-hypothesis PHYSICS proof PARTLY as BEQS BEC/BEA!!!

  10. An approach for the assessment of the statistical aspects of the SEA coupling loss factors and the vibrational energy transmission in complex aircraft structures: Experimental investigation and methods benchmark

    NASA Astrophysics Data System (ADS)

    Bouhaj, M.; von Estorff, O.; Peiffer, A.

    2017-09-01

In the application of Statistical Energy Analysis "SEA" to complex assembled structures, a purely predictive model often exhibits errors. These errors are mainly due to a lack of accurate modelling of the power transmission mechanism described through the Coupling Loss Factors (CLF). Experimental SEA (ESEA) is used in practice by the automotive and aerospace industry to verify and update the model or to derive the CLFs for use in an SEA predictive model when analytical estimates cannot be made. This work is particularly motivated by the lack of procedures that allow an estimate to be made of the variance and confidence intervals of the statistical quantities when using the ESEA technique. The aim of this paper is to introduce procedures enabling a statistical description of measured power input, vibration energies and the derived SEA parameters. Particular emphasis is placed on the identification of structural CLFs of complex built-up structures comparing different methods. By adopting a Stochastic Energy Model (SEM), the ensemble average in ESEA is also addressed. For this purpose, expressions are obtained to randomly perturb the energy matrix elements and generate individual samples for the Monte Carlo (MC) technique applied to derive the ensemble averaged CLF. From results of ESEA tests conducted on an aircraft fuselage section, the SEM approach provides a better performance of estimated CLFs compared to classical matrix inversion methods. The expected range of CLF values and the synthesized energy are used as quality criteria of the matrix inversion, allowing critical SEA subsystems to be assessed, which might require a more refined statistical description of the excitation and the response fields. Moreover, the impact of the variance of the normalized vibration energy on uncertainty of the derived CLFs is outlined.
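The classical matrix-inversion ESEA that the abstract compares against recovers the loss-factor matrix from measured input powers and subsystem energies. A minimal two-subsystem sketch under one common textbook arrangement of the SEA power balance (P = ω η E, so η = P E⁻¹ / ω) is shown below; all matrix values are hypothetical, and real implementations use larger systems, regularization, and the paper's stochastic perturbation on top of this.

```python
def invert_2x2(m):
    # Explicit inverse of a 2x2 matrix given as nested lists.
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def esea_loss_factors(P, E, omega):
    # Classical ESEA inversion for two subsystems:
    # P[i][j] = input power into subsystem i when subsystem j is excited,
    # E[i][j] = energy of subsystem i when subsystem j is excited,
    # returns eta = (1/omega) * P * E^{-1} (loss-factor matrix).
    Einv = invert_2x2(E)
    return [[sum(P[i][k] * Einv[k][j] for k in range(2)) / omega
             for j in range(2)]
            for i in range(2)]
```

Perturbing the entries of E randomly and repeating this inversion many times is the kind of Monte Carlo loop the abstract's SEM approach builds on to attach variances to the derived CLFs.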

  11. Effectiveness of adjunctive subgingival administration of amino acids and sodium hyaluronate gel on clinical and immunological parameters in the treatment of chronic periodontitis

    PubMed Central

    Bevilacqua, Lorenzo; Eriani, Jessica; Serroni, Ilde; Liani, Giuliana; Borelli, Violetta; Castronovo, Gaetano; Di Lenarda, Roberto

    2012-01-01

    Summary Aims: The aim of this clinical trial was to compare clinical and biochemical healing outcomes following ultrasonic mechanical instrumentation alone versus ultrasonic mechanical instrumentation combined with topical subgingival application of amino acids and sodium hyaluronate gel. Methods: Eleven systemically healthy subjects with moderate to severe chronic periodontitis, each with four sites having pocket probing depth and clinical attachment level greater than or equal to 5 mm, were randomly assigned to two treatments: two pockets were treated with ultrasonic debridement (Control Group) and two pockets with ultrasonic mechanical instrumentation combined with 0.5 ml of amino acids and sodium hyaluronate gel (Test Group). Probing depth, clinical attachment level, plaque index and bleeding on probing were recorded at baseline and at 45 and 90 days. Levels of calprotectin and myeloperoxidase activity in gingival crevicular fluid were assessed at baseline and on days 7 and 45. Results: Statistically significant reductions in probing depth and bleeding on probing were found between baseline and day 45 for both treatments. Significant reductions in μg/sample of calprotectin and myeloperoxidase were found after 1 week, followed by an increase at 45 days in both groups. There were no statistically significant differences between the other variables evaluated in this study. Conclusions: These data suggest that subgingival application of hyaluronic acid following ultrasonic mechanical instrumentation is beneficial for improving periodontal parameters. PMID:23087790

  12. ZERODUR strength modeling with Weibull statistical distributions

    NASA Astrophysics Data System (ADS)

    Hartmann, Peter

    2016-07-01

    The decisive influence on the breakage strength of brittle materials such as the low-expansion glass ceramic ZERODUR is the surface condition. For polished or etched surfaces it is essential whether micro cracks are present and how deep they are. Ground surfaces have many micro cracks caused by the generation process; here only the depths of the micro cracks are relevant. In any case, the presence and depths of micro cracks are statistical in nature. The Weibull distribution, based on the weakest-link ansatz, is the model traditionally used to represent such data sets. Whether the two- or three-parameter Weibull distribution should be used for data representation and reliability prediction depends on the underlying crack generation mechanisms. Before choosing the model for a specific evaluation, some checks should be made. Is there only one mechanism present, or is it to be expected that an additional mechanism might contribute deviating results? For ground surfaces the main mechanism is the action of the diamond grains on the surface. However, grains breaking loose from their bonding may be dragged across the surface by the tool, introducing slightly deeper cracks. These scratches cannot be expected to follow the same statistical distribution as the grinding process, so describing them with the same distribution parameters is not adequate; a dedicated assessment should be made before including them. If additional information is available that influences the selection of the model, for example the existence of a maximum crack depth, this should also be taken into account. Micro cracks introduced by small diamond grains on tools working with limited forces cannot be arbitrarily deep. For data obtained with such surfaces, the existence of a threshold breakage stress should be part of the hypothesis. This leads to the use of the three-parameter Weibull distribution.
A differentiation based on the data set alone, without pre-existing information, is possible but requires a large data set. With only 20 specimens per sample such differentiation is not possible; it requires at least 100 specimens per set, and more is better. The validity of the statistical evaluation methods is discussed with several examples. These considerations are of special importance because of their consequences for the prognosis methods and results. In particular, the use of the two-parameter Weibull distribution for high-strength surfaces has led to unrealistic results. Extrapolation down to a low acceptable probability of failure covers a wide range in which no data points exist and is mainly influenced by the slope determined from the high-strength specimens. In the past this misconception has prevented the use of brittle materials under stress loads that they could have endured easily.
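    The two- versus three-parameter Weibull choice discussed above can be illustrated with a short fit. This is a sketch using synthetic breakage stresses and SciPy's `weibull_min`; the sample values and the 40 MPa threshold are invented for illustration, not data from the paper.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(1)

# Synthetic breakage stresses [MPa]: a three-parameter Weibull sample with
# a 40 MPa threshold, standing in for a large ground-surface data set.
data = weibull_min.rvs(c=2.5, loc=40.0, scale=60.0, size=200,
                       random_state=rng)

# Two-parameter fit: the threshold stress (location) is fixed at zero.
c2, _, s2 = weibull_min.fit(data, floc=0)

# Three-parameter fit: the threshold stress is estimated from the data.
c3, loc3, s3 = weibull_min.fit(data)

print(f"2-parameter: shape={c2:.2f}, scale={s2:.1f}")
print(f"3-parameter: shape={c3:.2f}, threshold={loc3:.1f}, scale={s3:.1f}")
```

With a genuine threshold in the data, the two-parameter fit typically compensates by inflating the shape parameter, which is one way the misfit shows up.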

  13. Body Weight Reducing Effect of Oral Boric Acid Intake

    PubMed Central

    Aysan, Erhan; Sahin, Fikrettin; Telci, Dilek; Yalvac, Mehmet Emir; Emre, Sinem Hocaoglu; Karaca, Cetin; Muslumanoglu, Mahmut

    2011-01-01

    Background: Boric acid is widely used in biology, but its body-weight-reducing effect has not been researched. Methods: Twenty mice were divided into two equal groups. Control-group mice drank standard tap water, while study-group mice drank tap water with 0.28 mg/250 ml of boric acid added over five days. Total body weight changes, major organ histopathology, blood biochemistry, and urine and feces analyses were compared. Results: Study-group mice lost a mean of 28.1% of body weight, whereas control-group mice showed no weight loss and instead gained a mean of 0.09% (p<0.001). Total drinking water and urine outputs were not statistically different. Cholesterol, LDL, AST, ALT, LDH, amylase and urobilinogen levels were statistically significantly higher in the study group. Other variables were not statistically different. No histopathologic differences were detected in evaluations of all resected major organs. Conclusion: Low-dose oral boric acid intake causes serious body weight reduction. Blood and urine analyses support elevated glucose and lipid catabolism and moderate protein catabolism, but the mechanism is unclear. PMID:22135611

  14. Statistical summaries of fatigue data for design purposes

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.

    1983-01-01

    Two methods are discussed for constructing a design curve on the safe side of fatigue data. Both the tolerance interval and equivalent prediction interval (EPI) concepts provide such a curve while accounting for both the distribution of the estimators in small samples and the data scatter. The EPI is also useful as a mechanism for providing the necessary statistics on S-N data for a full reliability analysis which includes uncertainty in all fatigue design factors. Examples of statistical analyses of the general strain-life relationship are presented. The tolerance limit and EPI techniques for defining a design curve are demonstrated. Examples using WASPALOY B and RQC-100 data demonstrate that a reliability model could be constructed by considering the fatigue strength and fatigue ductility coefficients as two independent random variables. A technique given for establishing the fatigue strength for high cycle lives relies on an extrapolation technique and also accounts for "runouts" (specimens that did not fail during testing). A reliability model or design value can be specified.
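    A one-sided lower tolerance limit of the kind used for such safe-side design curves can be sketched as follows. The sketch assumes normally distributed log-life data at a single strain level and uses the standard noncentral-t tolerance factor; the sample values are invented, and this is not the paper's specific procedure.

```python
import numpy as np
from scipy import stats

def lower_tolerance_limit(x, coverage=0.95, confidence=0.95):
    """One-sided lower tolerance limit for a normal sample: with the given
    confidence, at least `coverage` of the population lies above it."""
    n = len(x)
    z = stats.norm.ppf(coverage)  # standard normal quantile for coverage
    k = stats.nct.ppf(confidence, df=n - 1, nc=z * np.sqrt(n)) / np.sqrt(n)
    return np.mean(x) - k * np.std(x, ddof=1)

# Hypothetical log10 fatigue lives at one strain level (illustrative only).
rng = np.random.default_rng(2)
log_life = rng.normal(5.0, 0.2, size=15)
design_value = lower_tolerance_limit(log_life)
```

Because the tolerance factor k grows as the sample size shrinks, the design value is pushed further below the sample mean for small samples, which is exactly the small-sample conservatism the abstract refers to.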

  15. Statistical Approaches to Interpretation of Local, Regional, and National Highway-Runoff and Urban-Stormwater Data

    USGS Publications Warehouse

    Tasker, Gary D.; Granato, Gregory E.

    2000-01-01

    Decision makers need viable methods for the interpretation of local, regional, and national highway-runoff and urban-stormwater data including flows, concentrations and loads of chemical constituents and sediment, potential effects on receiving waters, and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoff planning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for computation of Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect: their predictive ability is poor without suitable site-specific data for calibration. To properly apply the correct model, one must understand the classification of variables, the unique characteristics of water-resources data, and the concept of population structure and analysis. Classifying the variables used to analyze data may determine which statistical methods are appropriate for data analysis. 
An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques, and to use tools and techniques that account for the unique nature of water-resources data sets. Populations of data on stormwater-runoff quantity and quality are often best modeled as logarithmic transformations. Therefore, these factors need to be considered to form valid, current, and technically defensible stormwater-runoff models. Regression analysis is an accepted method for interpretation of water-resources data and for prediction of current or future conditions at sites that fit the input data model. Regression analysis is designed to provide an estimate of the average response of a system as it relates to variation in one or more known variables. To produce valid models, however, regression analysis should include visual analysis of scatterplots, an examination of the regression equation, evaluation of the method design assumptions, and regression diagnostics. A number of statistical techniques are described in the text and in the appendixes to provide information necessary to interpret data by use of appropriate methods. Uncertainty is an important part of any decision-making process. In order to deal with uncertainty problems, the analyst needs to know the severity of the statistical uncertainty of the methods used to predict water quality. Statistical models need to be based on information that is meaningful, representative, complete, precise, accurate, and comparable to be deemed valid, up to date, and technically supportable. To assess uncertainty in the analytical tools, the modeling methods, and the underlying data set, all of these components need to be documented and communicated in an accessible format within project publications.
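    The log-transform regression approach described above can be sketched briefly. The flow and load values below are synthetic stand-ins generated from an assumed power-law relation, not data from the report; the diagnostics shown are a minimal subset of those the text recommends.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic storm-runoff data (illustrative): constituent load tends to
# follow a power law in flow, so both variables are log-transformed.
flow = rng.lognormal(mean=2.0, sigma=0.8, size=120)          # [cfs]
load = 0.5 * flow**1.3 * rng.lognormal(0.0, 0.3, size=120)   # [kg/day]

x, y = np.log(flow), np.log(load)
slope, intercept = np.polyfit(x, y, 1)   # ordinary least squares in log space
resid = y - (slope * x + intercept)

# Basic diagnostics an analyst would inspect alongside scatterplots:
r2 = 1 - resid.var() / y.var()
print(f"log-log slope={slope:.2f}, R^2={r2:.2f}")
```

In practice one would also check residual normality and retransformation bias before using such a model for load prediction.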

  16. Statistical Mechanics of Prion Diseases

    NASA Astrophysics Data System (ADS)

    Slepoy, A.; Singh, R. R.; Pázmándi, F.; Kulkarni, R. V.; Cox, D. L.

    2001-07-01

    We present a two-dimensional, lattice-based, protein-level statistical mechanical model for prion diseases (e.g., mad cow disease) with concomitant prion protein misfolding and aggregation. Our studies lead us to the hypothesis that the observed broad incubation time distribution in epidemiological data reflects fluctuation-dominated growth seeded by a few nanometer-scale aggregates, while the much narrower incubation time distributions for inoculated lab animals arise from statistical self-averaging. We model ``species barriers'' to prion infection and assess a related treatment protocol.

  17. Quarks, Symmetries and Strings - a Symposium in Honor of Bunji Sakita's 60th Birthday

    NASA Astrophysics Data System (ADS)

    Kaku, M.; Jevicki, A.; Kikkawa, K.

    1991-04-01

    The Table of Contents for the full book PDF is as follows: * Preface * Evening Banquet Speech * I. Quarks and Phenomenology * From the SU(6) Model to Uniqueness in the Standard Model * A Model for Higgs Mechanism in the Standard Model * Quark Mass Generation in QCD * Neutrino Masses in the Standard Model * Solar Neutrino Puzzle, Horizontal Symmetry of Electroweak Interactions and Fermion Mass Hierarchies * State of Chiral Symmetry Breaking at High Temperatures * Approximate |ΔI| = 1/2 Rule from a Perspective of Light-Cone Frame Physics * Positronium (and Some Other Systems) in a Strong Magnetic Field * Bosonic Technicolor and the Flavor Problem * II. Strings * Supersymmetry in String Theory * Collective Field Theory and Schwinger-Dyson Equations in Matrix Models * Non-Perturbative String Theory * The Structure of Non-Perturbative Quantum Gravity in One and Two Dimensions * Noncritical Virasoro Algebra of d < 1 Matrix Model and Quantized String Field * Chaos in Matrix Models? * On the Non-Commutative Symmetry of Quantum Gravity in Two Dimensions * Matrix Model Formulation of String Field Theory in One Dimension * Geometry of the N = 2 String Theory * Modular Invariance from Gauge Invariance in the Non-Polynomial String Field Theory * Stringy Symmetry and Off-Shell Ward Identities * q-Virasoro Algebra and q-Strings * Self-Tuning Fields and Resonant Correlations in 2d-Gravity * III. Field Theory Methods * Linear Momentum and Angular Momentum in Quaternionic Quantum Mechanics * Some Comments on Real Clifford Algebras * On the Quantum Group p-adics Connection * Gravitational Instantons Revisited * A Generalized BBGKY Hierarchy from the Classical Path-Integral * A Quantum Generated Symmetry: Group-Level Duality in Conformal and Topological Field Theory * Gauge Symmetries in Extended Objects * Hidden BRST Symmetry and Collective Coordinates * Towards Stochastically Quantizing Topological Actions * IV. 
Statistical Methods * A Brief Summary of the s-Channel Theory of Superconductivity * Neural Networks and Models for the Brain * Relativistic One-Body Equations for Planar Particles with Arbitrary Spin * Chiral Property of Quarks and Hadron Spectrum in Lattice QCD * Scalar Lattice QCD * Semi-Superconductivity of a Charged Anyon Gas * Two-Fermion Theory of Strongly Correlated Electrons and Charge-Spin Separation * Statistical Mechanics and Error-Correcting Codes * Quantum Statistics

  18. Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.

    PubMed

    Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory

    2017-01-01

    Many statistical studies report p-values for inferential purposes. In several scenarios the stochastic aspect of p-values is neglected, which may contribute to wrong conclusions being drawn in real data experiments. The stochastic nature of p-values makes it difficult to use them to examine the performance of given testing procedures or the associations between investigated factors. We turn our focus to the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g. partial EPVs; (2) development of optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures and their constructions and properties, with an eye towards practical applications.
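    The EPV and its ROC connection can be illustrated by simulation for a one-sided one-sample z-test. The sample size and effect size below are invented; the sketch numerically checks the textbook identity that the EPV equals P(T0 >= T1) for independent null and alternative test statistics (i.e. one minus the AUC of the test statistic), which is the kind of link the abstract exploits.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n, effect = 25, 0.5  # illustrative sample size and effect size

# One-sided one-sample z-test with known unit variance:
# p = 1 - Phi(sqrt(n) * xbar).  Simulate sample means under H1.
xbar_alt = rng.normal(effect, 1 / np.sqrt(n), size=20000)
p_alt = 1 - stats.norm.cdf(np.sqrt(n) * xbar_alt)

# Expected p-value under the alternative: the smaller, the better the test.
epv = p_alt.mean()

# ROC connection: the EPV equals P(T0 >= T1) for a null statistic T0 drawn
# independently of the alternative statistic T1 (i.e. 1 - AUC).
xbar_null = rng.normal(0.0, 1 / np.sqrt(n), size=20000)
epv_roc = np.mean(xbar_null >= xbar_alt)
```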

  19. Uniting statistical and individual-based approaches for animal movement modelling.

    PubMed

    Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel

    2014-01-01

    The dynamic nature of their internal states and of the environment directly shapes animals' spatial behaviours and gives rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, because field work to access internal states is often impractical and current statistical models cannot produce dynamic outputs. To address these issues, we developed a robust method that combines statistical and individual-based modelling (IBM). Using a statistical technique for forward modelling of the IBM has the advantage of being faster to parameterize than a pure inverse modelling technique and allows robust selection of parameters. Using GPS locations from caribou monitored in Québec, we modelled caribou movements based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation, and was tested for two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provided projections on future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems.

  1. Modelling multiple sources of dissemination bias in meta-analysis.

    PubMed

    Bowden, Jack; Jackson, Dan; Thompson, Simon G

    2010-03-30

    Asymmetry in the funnel plot for a meta-analysis suggests the presence of dissemination bias. This may be caused by publication bias through the decisions of journal editors, by selective reporting of research results by authors, or by a combination of both. Typically, study results that are statistically significant or have larger estimated effect sizes are more likely to appear in the published literature, giving a biased picture of the evidence base. Previous statistical approaches for addressing dissemination bias have assumed only a single selection mechanism. Here we consider a more realistic scenario in which multiple dissemination processes, involving both the publishing authors and journals, are operating. In practical applications, the methods can be used to provide sensitivity analyses for the potential effects of multiple dissemination biases operating in meta-analysis.
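    The effect of stacked dissemination mechanisms can be sketched by simulation. All numbers below (true effect, study count, selection probabilities) are invented, and the two threshold-style selection rules are a deliberate simplification standing in for the authors' model-based approach, not a reproduction of it.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Simulate 200 studies of a true effect of 0.2 with varying precision.
true_effect, n_stud = 0.2, 200
se = rng.uniform(0.05, 0.4, n_stud)          # study standard errors
est = rng.normal(true_effect, se)            # estimated effects
pvals = 2 * stats.norm.sf(np.abs(est / se))  # two-sided p-values

# Two stacked dissemination mechanisms (probabilities are invented):
# authors preferentially write up significant results, and journals
# preferentially accept larger estimated effects.
reported = (pvals < 0.05) | (rng.random(n_stud) < 0.3)
published = reported & ((est > 0.3) | (rng.random(n_stud) < 0.6))

# Fixed-effect (inverse-variance) pooled estimates with and without bias.
pooled_all = np.average(est, weights=1 / se**2)
pooled_pub = np.average(est[published], weights=1 / se[published]**2)
print(f"all studies: {pooled_all:.3f}, published only: {pooled_pub:.3f}")
```

Comparing the two pooled estimates across a grid of assumed selection probabilities is the shape a sensitivity analysis of this kind takes.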

  2. Flexible kinematic earthquake rupture inversion of tele-seismic waveforms: Application to the 2013 Balochistan, Pakistan earthquake

    NASA Astrophysics Data System (ADS)

    Shimizu, K.; Yagi, Y.; Okuwaki, R.; Kasahara, A.

    2017-12-01

    Kinematic earthquake rupture models are useful for deriving statistics and scaling properties of large and great earthquakes. However, the kinematic rupture models for the same earthquake often differ from one another. Such sensitivity of the modeling prevents us from understanding the statistics and scaling properties of earthquakes. Yagi and Fukahata (2011) introduce the uncertainty of the Green's function into tele-seismic waveform inversion and show that a stable spatiotemporal distribution of slip-rate can be obtained by using an empirical Bayesian scheme. One of the unsolved problems in the inversion arises from the modeling error originating in the uncertainty of the fault-model setting. The Green's function near the nodal plane of the focal mechanism is known to be sensitive to slight changes in the assumed fault geometry, and thus the spatiotemporal distribution of slip-rate can be distorted by the modeling error originating in the uncertainty of the fault model. We propose a new method accounting for complexity in the fault geometry by additionally solving for the focal mechanism on each space knot. Since the solution of a finite source inversion becomes unstable as the flexibility of the model increases, we estimate a stable spatiotemporal distribution of focal mechanisms in the framework of Yagi and Fukahata (2011). We applied the proposed method to 52 tele-seismic P-waveforms of the 2013 Balochistan, Pakistan earthquake. The inverted potency distribution shows unilateral rupture propagation toward the southwest of the epicenter, and the spatial variation of the focal mechanisms shares the same pattern as the fault curvature along the tectonic fabric. On the other hand, the broad pattern of the rupture process, including the direction of rupture propagation, cannot be reproduced by an inversion analysis under the assumption that the faulting occurred on a single flat plane. 
These results show that the modeling error caused by simplifying the fault model is non-negligible in the tele-seismic waveform inversion of the 2013 Balochistan, Pakistan earthquake.

  3. A primer on the study of transitory dynamics in ecological series using the scale-dependent correlation analysis.

    PubMed

    Rodríguez-Arias, Miquel Angel; Rodó, Xavier

    2004-03-01

    Here we describe a practical, step-by-step primer on scale-dependent correlation (SDC) analysis. The analysis of transitory processes is an important but often neglected topic in ecological studies, because only a few statistical techniques can detect temporary features accurately enough. We introduce the SDC analysis, a statistical and graphical method for studying transitory processes at any temporal or spatial scale. Thanks to its combination of conventional procedures and simple, well-known statistical techniques, SDC analysis becomes an improved time-domain analogue of wavelet analysis. We use several simple synthetic series to describe the method, a more complex example full of transitory features to compare SDC and wavelet analysis, and finally some selected ecological series to illustrate the methodology. The SDC analysis of time series of copepod abundances in the North Sea indicates that ENSO is the main climatic driver of short-term changes in population dynamics. SDC also uncovers some long-term, unexpected features in the population. Similarly, the SDC analysis of Nicholson's blowflies data locates where the proposed models fail and provides new insights into the mechanism that drives the apparent vanishing of the population cycle during the second half of the series.
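    The core of an SDC computation, correlating paired windows of two series at a fixed scale, can be sketched as follows. The series are synthetic and coupled only in their middle third to mimic a transitory process; the sketch omits the lag exploration and significance testing of the full method.

```python
import numpy as np

def sdc_profile(x, y, s):
    """Pearson correlation between aligned length-s windows of x and y,
    evaluated at every start position (zero-lag SDC profile)."""
    r = np.empty(len(x) - s + 1)
    for i in range(len(r)):
        r[i] = np.corrcoef(x[i:i + s], y[i:i + s])[0, 1]
    return r

# Two series that are coupled only in the middle third: a transitory
# association that a global correlation would largely miss.
rng = np.random.default_rng(5)
x = rng.normal(size=300)
y = rng.normal(size=300)
y[100:200] = x[100:200] + 0.3 * rng.normal(size=100)

r = sdc_profile(x, y, s=25)  # high values only where the coupling acts
```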

  4. Statistical learning using real-world scenes: extracting categorical regularities without conscious intent.

    PubMed

    Brady, Timothy F; Oliva, Aude

    2008-07-01

    Recent work has shown that observers can parse streams of syllables, tones, or visual shapes and learn statistical regularities in them without conscious intent (e.g., learn that A is always followed by B). Here, we demonstrate that these statistical-learning mechanisms can operate at an abstract, conceptual level. In Experiments 1 and 2, observers incidentally learned which semantic categories of natural scenes covaried (e.g., kitchen scenes were always followed by forest scenes). In Experiments 3 and 4, category learning with images of scenes transferred to words that represented the categories. In each experiment, the category of the scenes was irrelevant to the task. Together, these results suggest that statistical-learning mechanisms can operate at a categorical level, enabling generalization of learned regularities using existing conceptual knowledge. Such mechanisms may guide learning in domains as disparate as the acquisition of causal knowledge and the development of cognitive maps from environmental exploration.

  5. Infinite-mode squeezed coherent states and non-equilibrium statistical mechanics (phase-space-picture approach)

    NASA Technical Reports Server (NTRS)

    Yeh, Leehwa

    1993-01-01

    The phase-space-picture approach to quantum non-equilibrium statistical mechanics via the characteristic function of infinite-mode squeezed coherent states is introduced. We use quantum Brownian motion as an example to show how this approach provides an interesting geometrical interpretation of quantum non-equilibrium phenomena.

  6. Hitting Is Contagious in Baseball: Evidence from Long Hitting Streaks

    PubMed Central

    Bock, Joel R.; Maewal, Akhilesh; Gough, David A.

    2012-01-01

    Data analysis is used to test the hypothesis that “hitting is contagious”. A statistical model is described to study the effect of a hot hitter upon his teammates’ batting during a consecutive game hitting streak. Box score data for entire seasons comprising streaks of length games, including a total observations were compiled. Treatment and control sample groups () were constructed from core lineups of players on the streaking batter’s team. The percentile method bootstrap was used to calculate confidence intervals for statistics representing differences in the mean distributions of two batting statistics between groups. Batters in the treatment group (hot streak active) showed statistically significant improvements in hitting performance, as compared against the control. Mean for the treatment group was found to be to percentage points higher during hot streaks (mean difference increased points), while the batting heat index introduced here was observed to increase by points. For each performance statistic, the null hypothesis was rejected at the significance level. We conclude that the evidence suggests the potential existence of a “statistical contagion effect”. Psychological mechanisms essential to the empirical results are suggested, as several studies from the scientific literature lend credence to contagious phenomena in sports. Causal inference from these results is difficult, but we suggest and discuss several latent variables that may contribute to the observed results, and offer possible directions for future research. PMID:23251507
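    The percentile-method bootstrap used in this study can be sketched generically. The batting-average samples below are invented, and the statistic is a plain difference in means rather than the paper's specific batting statistics.

```python
import numpy as np

rng = np.random.default_rng(6)

# Illustrative batting averages: treatment (streak active) vs control.
treatment = rng.normal(0.285, 0.03, size=150)
control = rng.normal(0.270, 0.03, size=150)

def percentile_ci(a, b, n_boot=5000, alpha=0.05):
    """Percentile-method bootstrap CI for the difference in group means:
    resample each group with replacement and take empirical quantiles."""
    diffs = np.array([
        rng.choice(a, len(a)).mean() - rng.choice(b, len(b)).mean()
        for _ in range(n_boot)])
    return np.quantile(diffs, [alpha / 2, 1 - alpha / 2])

lo, hi = percentile_ci(treatment, control)
```

If the resulting interval excludes zero, the bootstrap supports a real difference between the groups at the chosen level, which is the logic applied to the streak comparisons above.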

  7. Inferring explicit weighted consensus networks to represent alternative evolutionary histories

    PubMed Central

    2013-01-01

    Background: The advent of molecular biology techniques and the constant increase in availability of genetic material have triggered the development of many phylogenetic tree inference methods. However, several reticulate evolution processes, such as horizontal gene transfer and hybridization, have been shown to blur the species' evolutionary history by causing discordance among phylogenies inferred from different genes. Methods: To tackle this problem, we hereby describe a new method for inferring and representing alternative (reticulate) evolutionary histories of species as an explicit weighted consensus network, which can be constructed from a collection of gene trees with or without prior knowledge of the species phylogeny. Results: We provide a way of building a weighted phylogenetic network for each of the following reticulation mechanisms: diploid hybridization, intragenic recombination and complete or partial horizontal gene transfer. We successfully tested our method on synthetic and real datasets to infer the above-mentioned evolutionary events which may have influenced the evolution of many species. Conclusions: Our weighted consensus network inference method allows one to infer, visualize and statistically validate major conflicting signals induced by the mechanisms of reticulate evolution. The results provided by the new method can be used to represent the inferred conflicting signals by means of explicit and easy-to-interpret phylogenetic networks. PMID:24359207

  8. Exploring Religious Mechanisms for Healthy Alcohol Use: Religious Messages and Drinking Among Korean Women in California*

    PubMed Central

    Ayers, John W.; Hofstetter, C. Richard; Hughes, Suzanne C.; Irvin, Veronica L.; Kang Sim, D. Eastern; Hovell, Melbourne F.

    2009-01-01

    Objective: This research identifies social reinforcers within religious institutions associated with alcohol consumption among Korean women in California. Method: Data were drawn from telephone interviews with female adults (N = 591) selected from a random sampling of persons in California with Korean surnames during 2007. Approximately 70% of attempted interviews were completed, with 92% conducted in Korean. Respondents were asked about any lifetime drinking (yes/no), drinking rate (typical number of drinks consumed on drinking days among current drinkers), and messages discouraging "excessive drinking" from religious leaders or congregants. Bivariable and multivariable regressions were used for analysis. Results: Approximately 70.4% of women reported any lifetime drinking, and drinkers drank a mean (SD) of 1.10 (1.22) drinks on drinking days. About 30.8% reported any exposure to religious leaders' messages discouraging excessive drinking, and 28.2% reported any exposure to similar messages from congregants. Each congregant's message was statistically significantly associated with a 5.1% lower probability (odds ratio = 0.775, 95% confidence interval [CI]: 0.626, 0.959) of any lifetime drinking. Also, each congregant's message was associated with a 13.8% (B = -0.138; 95% CI: -0.306, 0.029) lower drinking rate, which was statistically significant after adjusting for covariates using a one-tailed test. Exposure to leaders' messages was not statistically significantly associated with any lifetime drinking or drinking rate. Conclusions: Social reinforcement in the form of religious messages may be one mechanism by which religious institutions influence drinking behaviors. For Korean women, messages from congregants had a unique impact beyond the traditional religiosity indicators. These social mechanisms provide public health interventionists with religious pathways to improve drinking behaviors. PMID:19895765

  9. Improved QM Methods and Their Application in QM/MM Studies of Enzymatic Reactions

    NASA Astrophysics Data System (ADS)

    Jorgensen, William L.

    2007-03-01

    Quantum mechanics (QM) and Monte Carlo statistical mechanics (MC) simulations have been used by us since the early 1980s to study reaction mechanisms and the origin of solvent effects on reaction rates. A goal was always to perform the QM and MC/MM calculations simultaneously in order to obtain free-energy surfaces in solution with no geometrical restrictions. This was achieved by 2002 and complete free-energy profiles and surfaces with full sampling of solute and solvent coordinates can now be obtained through one job submission using BOSS [JCC 2005, 26, 1689]. Speed and accuracy demands also led to development of the improved semiempirical QM method, PDDG-PM3 [JCC 1601 (2002); JCTC 817 (2005)]. The combined PDDG-PM3/MC/FEP methodology has provided excellent results for free energies of activation for many reactions in numerous solvents. Recent examples include Cope, Kemp and E1cb eliminations [JACS 8829 (2005), 6141 (2006); JOC 4896 (2006)], as well as enzymatic reactions catalyzed by the putative Diels-Alderase, macrophomate synthase, and fatty-acid amide hydrolase [JACS 3577 (2005); JACS (2006)]. The presentation will focus on the accuracy that is currently achievable in such QM/MM studies and the accuracy of the underlying QM methodology including extensive comparisons of results from PDDG-PM3 and ab initio DFT methods.

  10. oPOSSUM: identification of over-represented transcription factor binding sites in co-expressed genes

    PubMed Central

    Ho Sui, Shannan J.; Mortimer, James R.; Arenillas, David J.; Brumm, Jochen; Walsh, Christopher J.; Kennedy, Brian P.; Wasserman, Wyeth W.

    2005-01-01

    Targeted transcript profiling studies can identify sets of co-expressed genes; however, identification of the underlying functional mechanism(s) is a significant challenge. Established methods for the analysis of gene annotations, particularly those based on the Gene Ontology, can identify functional linkages between genes. Similar methods for the identification of over-represented transcription factor binding sites (TFBSs) have been successful in yeast, but extension to human genomics has largely proved ineffective. Creation of a system for the efficient identification of common regulatory mechanisms in a subset of co-expressed human genes promises to break a roadblock in functional genomics research. We have developed an integrated system that searches for evidence of co-regulation by one or more transcription factors (TFs). oPOSSUM combines a pre-computed database of conserved TFBSs in human and mouse promoters with statistical methods for identification of sites over-represented in a set of co-expressed genes. The algorithm successfully identified mediating TFs in control sets of tissue-specific genes and in sets of co-expressed genes from three transcript profiling studies. Simulation studies indicate that oPOSSUM produces few false positives using empirically defined thresholds and can tolerate up to 50% noise in a set of co-expressed genes. PMID:15933209
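
    oPOSSUM's own over-representation statistics are not reproduced here; as a simplified sketch of the underlying idea, a one-sided binomial test can ask whether a binding site occurs in a set of co-expressed genes more often than a background rate would predict (the counts and background rate below are hypothetical):

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): one-sided over-representation p-value."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical numbers: a site found in 12 of 50 co-expressed genes,
# against a background occurrence rate of 10%.
p_value = binom_sf(12, 50, 0.10)
print(p_value)
```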

  11. Quantum Mechanics From the Cradle?

    ERIC Educational Resources Information Center

    Martin, John L.

    1974-01-01

    States that the major problem in learning quantum mechanics is often the student's ignorance of classical mechanics and that one conceptual hurdle in quantum mechanics is its statistical nature, in contrast to the determinism of classical mechanics. (MLH)

  12. Comparison of Artificial Neural Networks and ARIMA statistical models in simulations of target wind time series

    NASA Astrophysics Data System (ADS)

    Kolokythas, Kostantinos; Vasileios, Salamalikis; Athanassios, Argiriou; Kazantzidis, Andreas

    2015-04-01

    The wind is the result of complex interactions of numerous mechanisms taking place on small or large scales, so better knowledge of its behavior is essential in a variety of applications, especially in the field of power production from wind turbines. In the literature there is a considerable number of models, either physical or statistical, dealing with the simulation and prediction of wind speed. Among others, Artificial Neural Networks (ANNs) are widely used for wind forecasting and, in the great majority of cases, outperform conventional statistical models. In this study, a number of ANNs with different architectures, created and applied to a dataset of wind time series, are compared to Auto Regressive Integrated Moving Average (ARIMA) statistical models. The data consist of mean hourly wind speeds from a wind farm in a hilly Greek region and cover a period of one year (2013). The main goal is to evaluate the models' ability to successfully simulate the wind speed at a significant point (target). Goodness-of-fit statistics are computed for the comparison of the different methods. In general, the ANNs showed the best performance in the estimation of wind speed, prevailing over the ARIMA models.
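
    Fitting full ARIMA or ANN models requires dedicated libraries; the core of such a comparison can be sketched with a minimal AR(1) model fitted by least squares and evaluated against a naive persistence baseline on held-out data. The series below is synthetic, standing in for the (unavailable) wind-farm measurements:

```python
import math, random

random.seed(0)
# Synthetic hourly "wind speed" series (m/s), a stand-in for real data.
v = [5.0]
for _ in range(999):
    v.append(max(0.0, 0.9 * v[-1] + 0.5 + random.gauss(0, 0.8)))

train, test = v[:800], v[800:]

# Least-squares fit of v[t] = a * v[t-1] + b on the training segment.
x, y = train[:-1], train[1:]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
b = my - a * mx

def rmse(pred, actual):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(pred, actual)) / len(actual))

# One-step-ahead forecasts over the held-out segment.
ar1 = [a * test[t - 1] + b for t in range(1, len(test))]
persist = test[:-1]                      # naive "no change" baseline
print(rmse(ar1, test[1:]), rmse(persist, test[1:]))
```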

  13. The change and development of statistical methods used in research articles in child development 1930-2010.

    PubMed

    Køppe, Simo; Dammeyer, Jesper

    2014-09-01

    The evolution of developmental psychology has been characterized by the use of different quantitative and qualitative methods and procedures. But how does the use of methods and procedures change over time? This study explores the change and development of statistical methods used in articles published in Child Development from 1930 to 2010. The methods used in every article in the first issue of every volume were categorized into four categories. Until 1980, relatively simple statistical methods were used. During the last 30 years there has been explosive growth in the use of more advanced statistical methods. Articles with no statistical methods, or with only simple methods, have been all but eliminated.

  14. Outcomes of planetary close encounters - A systematic comparison of methodologies

    NASA Technical Reports Server (NTRS)

    Greenberg, Richard; Carusi, Andrea; Valsecchi, G. B.

    1988-01-01

    Several methods for estimating the outcomes of close planetary encounters are compared on the basis of the numerical integration of a range of encounter types. An attempt is made to lay the foundation for the development of predictive rules concerning the encounter outcomes applicable to the refinement of the statistical mechanics that apply to planet-formation and similar problems concerning planetary swarms. Attention is given to Oepik's (1976) formulation of the two-body approximation, whose predicted motion differs from the correct three-body behavior.

  15. Statistical Mechanics and Dynamics of the Outer Solar System.I. The Jupiter/Saturn Zone

    NASA Technical Reports Server (NTRS)

    Grazier, K. R.; Newman, W. I.; Kaula, W. M.; Hyman, J. M.

    1996-01-01

    We report on numerical simulations designed to understand how the solar system evolved through a winnowing of planetesimals accreted from the early solar nebula. This sorting process is driven by the energy and angular momentum and continues to the present day. We reconsider the existence and importance of stable niches in the Jupiter/Saturn Zone using greatly improved numerical techniques based on high-order optimized multi-step integration schemes coupled to roundoff-error-minimizing methods.

  16. [The occupational aspect of sudden cardiac death in coal miners].

    PubMed

    Cherkesov, V V; Kobets, G P; Kopytina, R A; Kamkov, V P; Fufaeva, I G; Danilik, V M; Sizonenko, L N; Tsygankov, V A

    1993-09-01

    By means of epidemiological, clinico-functional, experimental, pathomorphological, histological and mathematical-statistical methods, the authors showed that hard physical work in a heating microclimate promotes the rapid development and progression of coronary heart disease in coal miners working deep underground. A negative trend in the rate of sudden coronary death (SCD) was established, and its pathophysiological mechanisms were specified. SCD risk factors were identified and ranked by importance. The authors suggest that SCD in miners be considered an occupationally conditioned state.

  17. Impact of Injury Mechanisms on Patterns and Management of Facial Fractures.

    PubMed

    Greathouse, S Travis; Adkinson, Joshua M; Garza, Ramon; Gilstrap, Jarom; Miller, Nathan F; Eid, Sherrine M; Murphy, Robert X

    2015-07-01

    Mechanisms causing facial fractures have evolved over time and may be predictive of the types of injuries sustained. The objective of this study is to examine the impact of mechanisms of injury on the type and management of facial fractures at our Level 1 Trauma Center. The authors performed an Institutional Review Board-approved review of our network's trauma registry from 2006 to 2010, documenting age, sex, mechanism, Injury Severity Score, Glasgow Coma Scale, facial fracture patterns (nasal, maxillary/malar, orbital, mandible), and reconstructions. Mechanism rates were compared using a Pearson χ2 test. The database identified 23,318 patients, including 1686 patients with facial fractures and a subset of 1505 patients sustaining 2094 fractures by motor vehicle collision (MVC), fall, or assault. Nasal fractures were the most common injuries sustained by all mechanisms. MVCs were most likely to cause nasal and malar/maxillary fractures (P < 0.01). Falls were the least likely and assaults the most likely to cause mandible fractures (P < 0.001), the most common injury leading to surgical intervention (P < 0.001). Although not statistically significant, fractures sustained in MVCs were the most likely overall to undergo surgical intervention. Age, number of fractures, and alcohol level were statistically significant variables associated with operative management. Age and number of fractures sustained were associated with operative intervention. Although there is a statistically significant correlation between mechanism of injury and type of facial fracture sustained, none of the mechanisms evaluated herein are statistically associated with surgical intervention. Clinical Question/Level of Evidence: Therapeutic, III.
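
    The Pearson χ² comparison used above can be sketched for a 2×2 table; the counts below are hypothetical, not the registry's data:

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    total = sum(row)
    stat = 0.0
    for i, r in enumerate(table):
        for j, obs in enumerate(r):
            exp = row[i] * col[j] / total   # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# Hypothetical counts: mandible fracture vs. other, by assault vs. fall.
table = [[120, 380],   # assault: mandible, other
         [ 40, 460]]   # fall:    mandible, other
stat = chi_square(table)
print(stat > 3.841)  # 3.841 = chi-square critical value, df = 1, alpha = .05
```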

  18. Jets and Metastability in Quantum Mechanics and Quantum Field Theory

    NASA Astrophysics Data System (ADS)

    Farhi, David

    I give a high-level overview of the state of particle physics in the introduction, accessible without any background in the field. I discuss improvements of theoretical and statistical methods used for collider physics. These include telescoping jets, a statistical method which was claimed to allow jet searches to increase their sensitivity by considering several interpretations of each event. We find that indeed multiple interpretations extend the power of searches, for both simple counting experiments and powerful multivariate fitting experiments, at least for h → bb̄ at the LHC. Then I propose a method for automation of background calculations using SCET by appropriating the technology of Monte Carlo generators such as MadGraph. In the third chapter I change gears and discuss the future of the universe. It has long been known that our pocket of the standard model is unstable; there is a lower-energy configuration in a remote part of the configuration space, to which our universe will, eventually, decay. While the timescales involved are on the order of 10^400 years (depending on how exactly one counts) and thus of no immediate worry, I discuss the shortcomings of the standard methods and propose a more physically motivated derivation for the decay rate. I then make various observations about the structure of decays in quantum field theory.

  19. Sequential Monte Carlo tracking of the marginal artery by multiple cue fusion and random forest regression.

    PubMed

    Cherry, Kevin M; Peplinski, Brandon; Kim, Lauren; Wang, Shijun; Lu, Le; Zhang, Weidong; Liu, Jianfei; Wei, Zhuoshi; Summers, Ronald M

    2015-01-01

    Given the potential importance of marginal artery localization in automated registration in computed tomography colonography (CTC), we have devised a semi-automated method of marginal vessel detection employing sequential Monte Carlo tracking (also known as particle filtering tracking) by multiple cue fusion based on intensity, vesselness, organ detection, and minimum spanning tree information for poorly enhanced vessel segments. We then employed a random forest algorithm for intelligent cue fusion and decision making which achieved high sensitivity and robustness. After applying a vessel pruning procedure to the tracking results, we achieved statistically significantly improved precision compared to a baseline Hessian detection method (2.7% versus 75.2%, p<0.001). This method also showed statistically significantly improved recall rate compared to a 2-cue baseline method using fewer vessel cues (30.7% versus 67.7%, p<0.001). These results demonstrate that marginal artery localization on CTC is feasible by combining a discriminative classifier (i.e., random forest) with a sequential Monte Carlo tracking mechanism. In so doing, we present the effective application of an anatomical probability map to vessel pruning as well as a supplementary spatial coordinate system for colonic segmentation and registration when this task has been confounded by colon lumen collapse. Published by Elsevier B.V.
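
    The full multi-cue tracker is beyond a short example, but the sequential Monte Carlo (particle filtering) mechanism it builds on can be sketched in one dimension with a single intensity-like cue; all model parameters below are illustrative assumptions:

```python
import math, random
random.seed(1)

N = 500                                        # number of particles
true_x = 0.0
particles = [random.gauss(0, 1) for _ in range(N)]

def pick(weights, total):
    """Draw one index with probability proportional to its weight."""
    r, acc = random.uniform(0, total), 0.0
    for i, w in enumerate(weights):
        acc += w
        if acc >= r:
            return i
    return len(weights) - 1

estimates = []
for step in range(30):
    true_x += 1.0 + random.gauss(0, 0.2)       # hidden state drifts
    z = true_x + random.gauss(0, 0.5)          # noisy observation (one "cue")
    # Predict: propagate each particle through the motion model.
    particles = [p + 1.0 + random.gauss(0, 0.2) for p in particles]
    # Weight: Gaussian likelihood of the observation given each particle.
    weights = [math.exp(-0.5 * ((z - p) / 0.5) ** 2) for p in particles]
    # Resample: multinomial resampling proportional to weight.
    total = sum(weights)
    particles = [particles[pick(weights, total)] for _ in range(N)]
    estimates.append(sum(particles) / N)

print(estimates[-1], true_x)
```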

  20. Ultrasonic evaluation of the physical and mechanical properties of granites.

    PubMed

    Vasconcelos, G; Lourenço, P B; Alves, C A S; Pamplona, J

    2008-09-01

    Masonry is the oldest building material that has survived until today, being used all over the world and being present in the most impressive historical structures as evidence of the spirit of enterprise of ancient cultures. Conservation, rehabilitation and strengthening of the built heritage and protection of human lives are clear demands of modern societies. In this process, the use of nondestructive methods has become much more common in the diagnosis of the structural integrity of masonry elements. With respect to the evaluation of the stone condition, the ultrasonic pulse velocity is a simple and economical tool. Thus, the central issue of the present paper concerns the evaluation of the suitability of the ultrasonic pulse velocity method for describing the mechanical and physical properties of granites (range size between 0.1-4.0 mm and 0.3-16.5 mm) and for the assessment of their weathering state. The mechanical properties encompass the compressive and tensile strength and modulus of elasticity, and the physical properties include the density and porosity. For this purpose, measurements of the longitudinal ultrasonic pulse velocity with distinct natural frequencies of the transducers were carried out on specimens with different sizes and shapes. A discussion of the factors that induce variations in the ultrasonic velocity is also provided. Additionally, statistical correlations between ultrasonic pulse velocity and mechanical and physical properties of granites are presented and discussed. The major output of the work is the confirmation that ultrasonic pulse velocity can be effectively used as a simple and economical nondestructive method for a preliminary prediction of mechanical and physical properties, as well as a tool for the assessment of the weathering changes of granites that occur during their service life. This is of much interest due to the usual difficulties in removing specimens for mechanical characterization.
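
    Statistical correlations of this kind are typically summarized by a least-squares fit and a Pearson correlation coefficient; a minimal sketch with hypothetical ultrasonic-pulse-velocity/strength pairs (not the paper's measurements):

```python
def linear_fit(x, y):
    """Least-squares line y = a*x + b and Pearson correlation coefficient r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    a = sxy / sxx
    return a, my - a * mx, sxy / (sxx * syy) ** 0.5

# Hypothetical (UPV in m/s, compressive strength in MPa) pairs:
upv      = [3200, 3500, 3900, 4300, 4700, 5100]
strength = [  55,   70,   90,  110,  135,  150]
a, b, r = linear_fit(upv, strength)
print(round(r, 3))
```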

  1. Effect of Resin-modified Glass Ionomer Cement Dispensing/Mixing Methods on Mechanical Properties.

    PubMed

    Sulaiman, T A; Abdulmajeed, A A; Altitinchi, A; Ahmed, S N; Donovan, T E

    2018-03-23

    Resin-modified glass ionomer cements (RMGIs) are often used for luting indirect restorations. Hand-mixing traditional cements demands significant time and may be technique sensitive. Efforts have been made by manufacturers to introduce the same cement using different dispensing/mixing methods. It is not known what effects these changes may have on the mechanical properties of the dental cement. The purpose of this study was to evaluate the mechanical properties (diametral tensile strength [DTS], compressive strength [CS], and fracture toughness [FT]) of RMGIs with different dispensing/mixing systems. The RMGI specimens (n=14)-RelyX Luting (hand mix), RelyX Luting Plus (clicker-hand mix), RelyX Luting Plus (automix) (3M ESPE), GC Fuji PLUS (capsule-automix), and GC FujiCEM 2 (automix) (GC)-were prepared for each mechanical test and examined after thermocycling (n=7/subgroup) for 20,000 cycles, using the following tests: DTS, CS (ISO 9917-1), and FT (ISO 6872; single-edge V-notched beam method). Specimens were mounted and loaded in a universal testing machine until failure occurred. Two-/one-way analysis of variance followed by the Tukey honestly significant difference post hoc test was used to analyze data for statistical significance (p < 0.05). The interaction effect of dispensing/mixing method and thermocycling was significant only for the CS test of the GC group (p < 0.05). The different dispensing/mixing methods had no effect on the DTS of the tested cements. The CS of GC Fuji PLUS was significantly higher than that of the automix version (p < 0.05). The FT decreased significantly when switching from RelyX Luting (hand mix) to RelyX Luting Plus (clicker-hand mix) and to RelyX Luting Plus (automix) (p < 0.05). Except in the case of the DTS of the GC group and the CS of GC Fuji PLUS, thermocycling significantly reduced the mechanical properties of the RMGI cements (p < 0.05).
Introducing alternative dispensing/mixing methods for mixing RMGIs to reduce time and technique sensitivity may affect mechanical properties and is brand dependent.
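
    The one-way ANOVA used in such comparisons reduces to an F statistic; a minimal sketch with hypothetical compressive-strength readings (MPa) for three dispensing/mixing methods, not the study's data:

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA over a list of samples."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group and within-group sums of squares.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical readings for three mixing methods:
hand    = [98, 102, 95, 101, 99]
clicker = [90, 88, 93, 91, 89]
automix = [85, 87, 84, 88, 86]
F = one_way_anova_F([hand, clicker, automix])
print(F > 3.89)  # 3.89 = F critical value, df = (2, 12), alpha = .05
```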

  2. [Review of research design and statistical methods in Chinese Journal of Cardiology].

    PubMed

    Zhang, Li-jun; Yu, Jin-ming

    2009-07-01

    To evaluate the research design and the use of statistical methods in the Chinese Journal of Cardiology, we reviewed the research design and statistical methods in all of the original papers published in the journal from December 2007 to November 2008. The most frequently used research designs were cross-sectional design (34%), prospective design (21%) and experimental design (25%). Of all the articles, 49 (25%) used wrong statistical methods, 29 (15%) lacked some sort of statistical analysis, and 23 (12%) had inconsistencies in the description of methods. There were significant differences between different statistical methods (P < 0.001). The rates of correct use of multifactor analysis were low, and repeated-measures data were not analyzed with repeated-measures methods. Many problems exist in the Chinese Journal of Cardiology. Better research design and correct use of statistical methods are still needed, and stricter review by statisticians and epidemiologists is also required to improve the quality of the literature.

  3. BEAT: Bioinformatics Exon Array Tool to store, analyze and visualize Affymetrix GeneChip Human Exon Array data from disease experiments

    PubMed Central

    2012-01-01

    Background It is known from recent studies that more than 90% of human multi-exon genes are subject to Alternative Splicing (AS), a key molecular mechanism in which multiple transcripts may be generated from a single gene. It is widely recognized that a breakdown in AS mechanisms plays an important role in cellular differentiation and pathologies. Polymerase Chain Reactions, microarrays and sequencing technologies have been applied to the study of transcript diversity arising from alternative expression. Last generation Affymetrix GeneChip Human Exon 1.0 ST Arrays offer a more detailed view of the gene expression profile providing information on the AS patterns. The exon array technology, with more than five million data points, can detect approximately one million exons, and it allows performing analyses at both gene and exon level. In this paper we describe BEAT, an integrated user-friendly bioinformatics framework to store, analyze and visualize exon arrays datasets. It combines a data warehouse approach with some rigorous statistical methods for assessing the AS of genes involved in diseases. Meta statistics are proposed as a novel approach to explore the analysis results. BEAT is available at http://beat.ba.itb.cnr.it. Results BEAT is a web tool which allows uploading and analyzing exon array datasets using standard statistical methods and an easy-to-use graphical web front-end. BEAT has been tested on a dataset with 173 samples and tuned using new datasets of exon array experiments from 28 colorectal cancer and 26 renal cell cancer samples produced at the Medical Genetics Unit of IRCCS Casa Sollievo della Sofferenza. To highlight all possible AS events, alternative names, accession Ids, Gene Ontology terms and biochemical pathways annotations are integrated with exon and gene level expression plots. 
The user can customize the results choosing custom thresholds for the statistical parameters and exploiting the available clinical data of the samples for a multivariate AS analysis. Conclusions Despite exon array chips being widely used for transcriptomics studies, there is a lack of analysis tools offering advanced statistical features and requiring no programming knowledge. BEAT provides a user-friendly platform for a comprehensive study of AS events in human diseases, displaying the analysis results with easily interpretable and interactive tables and graphics. PMID:22536968

  4. Harnessing the complexity of gene expression data from cancer: from single gene to structural pathway methods

    PubMed Central

    2012-01-01

    High-dimensional gene expression data provide a rich source of information because they capture the expression level of genes in dynamic states that reflect the biological functioning of a cell. For this reason, such data are suitable to reveal systems related properties inside a cell, e.g., in order to elucidate molecular mechanisms of complex diseases like breast or prostate cancer. However, this is not only strongly dependent on the sample size and the correlation structure of a data set, but also on the statistical hypotheses tested. Many different approaches have been developed over the years to analyze gene expression data to (I) identify changes in single genes, (II) identify changes in gene sets or pathways, and (III) identify changes in the correlation structure in pathways. In this paper, we review statistical methods for all three types of approaches, including subtypes, in the context of cancer data and provide links to software implementations and tools and address also the general problem of multiple hypotheses testing. Further, we provide recommendations for the selection of such analysis methods. Reviewers This article was reviewed by Arcady Mushegian, Byung-Soo Kim and Joel Bader. PMID:23227854
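
    The multiple-hypotheses problem mentioned above is commonly handled with the Benjamini-Hochberg step-up procedure; a minimal sketch with hypothetical per-gene p-values:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Return indices of hypotheses rejected at FDR level alpha (BH step-up)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:   # step-up threshold rank*alpha/m
            k = rank
    return sorted(order[:k])

# Hypothetical per-gene p-values:
p = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.2, 0.5, 0.74, 0.9]
print(benjamini_hochberg(p))
```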

  5. Analysis of alterations in white matter integrity of adult patients with comitant exotropia.

    PubMed

    Li, Dan; Li, Shenghong; Zeng, Xianjun

    2018-05-01

    Objective: This study was performed to investigate structural abnormalities of the white matter in patients with comitant exotropia using the tract-based spatial statistics (TBSS) method. Methods: Diffusion tensor imaging data from magnetic resonance images of the brain were collected from 20 patients with comitant exotropia and 20 age- and sex-matched healthy controls. The FMRIB Software Library was used to compute the diffusion measures, including fractional anisotropy (FA), mean diffusivity (MD), axial diffusivity (AD), and radial diffusivity (RD). These measures were obtained using voxel-wise statistics with threshold-free cluster enhancement. Results: The FA values in the right inferior fronto-occipital fasciculus (IFO) and right inferior longitudinal fasciculus were significantly higher and the RD values in the bilateral IFO, forceps minor, left anterior corona radiata, and left anterior thalamic radiation were significantly lower in the comitant exotropia group than in the healthy controls. No significant differences in the MD or AD values were found between the two groups. Conclusions: Alterations in FA and RD values may indicate the underlying neuropathologic mechanism of comitant exotropia. The TBSS method can be a useful tool to investigate neuronal tract participation in patients with this disease.
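
    The four diffusion measures named above are standard functions of the diffusion tensor's eigenvalues; a sketch with illustrative white-matter eigenvalues (in 10^-3 mm^2/s):

```python
import math

def dti_scalars(l1, l2, l3):
    """FA, MD, AD, RD from the three diffusion-tensor eigenvalues (l1 >= l2 >= l3)."""
    md = (l1 + l2 + l3) / 3.0                 # mean diffusivity
    ad = l1                                    # axial diffusivity
    rd = (l2 + l3) / 2.0                       # radial diffusivity
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    fa = math.sqrt(1.5 * num / den)            # fractional anisotropy, in [0, 1]
    return fa, md, ad, rd

# Illustrative eigenvalues, typical of anisotropic white matter:
fa, md, ad, rd = dti_scalars(1.7, 0.4, 0.3)
print(round(fa, 3), round(md, 3))
```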

  6. Efficient exploration of pan-cancer networks by generalized covariance selection and interactive web content

    PubMed Central

    Kling, Teresia; Johansson, Patrik; Sanchez, José; Marinescu, Voichita D.; Jörnsten, Rebecka; Nelander, Sven

    2015-01-01

    Statistical network modeling techniques are increasingly important tools to analyze cancer genomics data. However, current tools and resources are not designed to work across multiple diagnoses and technical platforms, thus limiting their applicability to comprehensive pan-cancer datasets such as The Cancer Genome Atlas (TCGA). To address this, we describe a new data driven modeling method, based on generalized Sparse Inverse Covariance Selection (SICS). The method integrates genetic, epigenetic and transcriptional data from multiple cancers, to define links that are present in multiple cancers, a subset of cancers, or a single cancer. It is shown to be statistically robust and effective at detecting direct pathway links in data from TCGA. To facilitate interpretation of the results, we introduce a publicly accessible tool (cancerlandscapes.org), in which the derived networks are explored as interactive web content, linked to several pathway and pharmacological databases. To evaluate the performance of the method, we constructed a model for eight TCGA cancers, using data from 3900 patients. The model rediscovered known mechanisms and contained interesting predictions. Possible applications include prediction of regulatory relationships, comparison of network modules across multiple forms of cancer and identification of drug targets. PMID:25953855
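
    Covariance selection rests on the fact that zeros in the precision (inverse covariance) matrix correspond to absent direct links. A minimal sketch for three variables, with a hypothetical covariance chosen so that variables 1 and 3 are linked only through variable 2 (the 3×3 inversion is hand-rolled to keep the example self-contained):

```python
def inv3(m):
    """Inverse of a 3x3 matrix via the adjugate (illustration only)."""
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    adj = [[e*i - f*h, c*h - b*i, b*f - c*e],
           [f*g - d*i, a*i - c*g, c*d - a*f],
           [d*h - e*g, b*g - a*h, a*e - b*d]]
    return [[x / det for x in row] for row in adj]

def partial_corr(cov):
    """Partial correlations from the precision (inverse covariance) matrix."""
    p = inv3(cov)
    n = len(cov)
    return [[1.0 if i == j else -p[i][j] / (p[i][i] * p[j][j]) ** 0.5
             for j in range(n)] for i in range(n)]

# Hypothetical covariance: 1-2 and 2-3 directly linked; 1-3 correlated
# only through 2, so its partial correlation vanishes.
cov = [[1.0, 0.5, 0.25],
       [0.5, 1.0, 0.5],
       [0.25, 0.5, 1.0]]
pc = partial_corr(cov)
print(pc[0][2])  # essentially zero: no direct 1-3 link
```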

  7. Grid indentation analysis of mechanical properties of composite electrodes in Li-ion batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasconcelos, Luize Scalco de; Xu, Rong; Li, Jianlin

    We report that electrodes in commercial rechargeable batteries are microscopically heterogeneous materials. The constituent components, including active materials, polymeric binders, and porous conductive matrix, often have large variation in their mechanical properties, making the mechanical characterization of composite electrodes a challenging task. In a model system of LiNi0.5Mn0.3Co0.2O2 cathode, we employ the instrumented grid indentation to determine the elastic modulus and hardness of the constituent phases. The approach relies on a large array of nanoindentation experiments and statistical analysis of the resulting data provided that the maximum indentation depth is carefully chosen. The statistically extracted properties of the active particles and the surrounding medium are in good agreement with the tests of targeted indentation at selected sites. Lastly, the combinatory technique of grid indentation and statistical deconvolution represents a fast and reliable route to quantify the mechanical properties of composite electrodes that feed the parametric input for the mechanics models.

  8. Grid indentation analysis of mechanical properties of composite electrodes in Li-ion batteries

    DOE PAGES

    Vasconcelos, Luize Scalco de; Xu, Rong; Li, Jianlin; ...

    2016-03-09

    We report that electrodes in commercial rechargeable batteries are microscopically heterogeneous materials. The constituent components, including active materials, polymeric binders, and porous conductive matrix, often have large variation in their mechanical properties, making the mechanical characterization of composite electrodes a challenging task. In a model system of LiNi0.5Mn0.3Co0.2O2 cathode, we employ the instrumented grid indentation to determine the elastic modulus and hardness of the constituent phases. The approach relies on a large array of nanoindentation experiments and statistical analysis of the resulting data provided that the maximum indentation depth is carefully chosen. The statistically extracted properties of the active particles and the surrounding medium are in good agreement with the tests of targeted indentation at selected sites. Lastly, the combinatory technique of grid indentation and statistical deconvolution represents a fast and reliable route to quantify the mechanical properties of composite electrodes that feed the parametric input for the mechanics models.
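
    The statistical deconvolution step can be sketched as a two-component Gaussian mixture fitted by expectation-maximization to the pooled indentation moduli. The data below are synthetic, with assumed phase moduli of roughly 5 and 150 GPa rather than the paper's measurements:

```python
import math, random
random.seed(2)

# Synthetic "grid indentation" moduli (GPa): a compliant matrix phase and
# stiff active particles -- illustrative values, not the paper's data.
data = ([random.gauss(5, 1) for _ in range(300)] +
        [random.gauss(150, 20) for _ in range(200)])

def normpdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Fit a two-component Gaussian mixture by expectation-maximization.
w, mu, sd = [0.5, 0.5], [min(data), max(data)], [30.0, 30.0]
for _ in range(50):
    # E-step: responsibility of component 0 for each observation.
    r0 = [w[0] * normpdf(x, mu[0], sd[0]) /
          (w[0] * normpdf(x, mu[0], sd[0]) + w[1] * normpdf(x, mu[1], sd[1]))
          for x in data]
    # M-step: re-estimate weight, mean and spread of each component.
    for k, resp in enumerate([r0, [1.0 - r for r in r0]]):
        s = sum(resp)
        w[k] = s / len(data)
        mu[k] = sum(r * x for r, x in zip(resp, data)) / s
        sd[k] = max(1e-3, math.sqrt(
            sum(r * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / s))

print(sorted(round(m, 1) for m in mu))  # recovered phase moduli
```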

  9. Debating Curricular Strategies for Teaching Statistics and Research Methods: What Does the Current Evidence Suggest?

    ERIC Educational Resources Information Center

    Barron, Kenneth E.; Apple, Kevin J.

    2014-01-01

    Coursework in statistics and research methods is a core requirement in most undergraduate psychology programs. However, is there an optimal way to structure and sequence methodology courses to facilitate student learning? For example, should statistics be required before research methods, should research methods be required before statistics, or…

  10. Development of a Research Methods and Statistics Concept Inventory

    ERIC Educational Resources Information Center

    Veilleux, Jennifer C.; Chapman, Kate M.

    2017-01-01

    Research methods and statistics are core courses in the undergraduate psychology major. To assess learning outcomes, it would be useful to have a measure that assesses research methods and statistical literacy beyond course grades. In two studies, we developed and provided initial validation results for a research methods and statistical knowledge…

  11. Understanding common statistical methods, Part I: descriptive methods, probability, and continuous data.

    PubMed

    Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A

    2011-01-01

    Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.

  12. Turking Statistics: Student-Generated Surveys Increase Student Engagement and Performance

    ERIC Educational Resources Information Center

    Whitley, Cameron T.; Dietz, Thomas

    2018-01-01

    Thirty years ago, Hubert M. Blalock Jr. published an article in "Teaching Sociology" about the importance of teaching statistics. We honor Blalock's legacy by assessing how using Amazon Mechanical Turk (MTurk) in statistics classes can enhance student learning and increase statistical literacy among social science graduate students. In…

  13. Determining Functional Reliability of Pyrotechnic Mechanical Devices

    NASA Technical Reports Server (NTRS)

    Bement, Laurence J.; Multhaup, Herbert A.

    1997-01-01

    This paper describes a new approach for evaluating mechanical performance and predicting the mechanical functional reliability of pyrotechnic devices. Not included are other possible failure modes, such as the initiation of the pyrotechnic energy source. The generally accepted go/no-go statistical approach, which requires hundreds or thousands of consecutive successful tests on identical components for reliability predictions, routinely ignores the physics of failure. The approach described in this paper begins with measuring, understanding and controlling mechanical performance variables. Then, the energy required to accomplish the function is compared to that delivered by the pyrotechnic energy source to determine the mechanical functional margin. Finally, the data collected in establishing functional margin are analyzed to predict mechanical functional reliability, using small-sample statistics. A careful application of this approach can provide considerable cost improvements and understanding over go/no-go statistics. Performance and the effects of variables can be defined, and reliability predictions can be made by evaluating 20 or fewer units. The application of this approach to a pin puller used on a successful NASA mission is provided as an example.
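
    The margin-based reliability idea can be sketched numerically: compare the distribution of delivered energy to that of required energy, then estimate P(margin > 0) under a normal assumption. All numbers below are illustrative, not the paper's test data:

```python
import math

# Hypothetical small-sample test data (illustrative only): energy delivered
# by the pyrotechnic source and energy required by the mechanism, in joules.
delivered = [12.1, 11.8, 12.4, 12.0, 11.9, 12.3, 12.2, 11.7]
required  = [ 7.2,  6.9,  7.5,  7.1,  7.3,  6.8,  7.4,  7.0]

def mean_sd(xs):
    m = sum(xs) / len(xs)
    sd = math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))
    return m, sd

md, sd_d = mean_sd(delivered)
mr, sd_r = mean_sd(required)

# Functional margin: delivered minus required; variances add for
# independent quantities.
margin = md - mr
sd_m = math.sqrt(sd_d ** 2 + sd_r ** 2)

# Normal-theory estimate of P(margin > 0), i.e. functional reliability.
reliability = 0.5 * (1 + math.erf(margin / (sd_m * math.sqrt(2))))
print(round(margin, 2), reliability > 0.999)
```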

  14. Characterization and identification of ubiquitin conjugation sites with E3 ligase recognition specificities.

    PubMed

    Nguyen, Van-Nui; Huang, Kai-Yao; Huang, Chien-Hsun; Chang, Tzu-Hao; Bretaña, Neil; Lai, K; Weng, Julia; Lee, Tzong-Yi

    2015-01-01

In eukaryotes, ubiquitin conjugation is an important mechanism underlying proteasome-mediated degradation of proteins and, as such, plays an essential role in the regulation of many cellular processes. In the ubiquitin-proteasome pathway, E3 ligases play important roles by recognizing a specific protein substrate and catalyzing the attachment of ubiquitin to a lysine (K) residue. As more and more experimental data on ubiquitin conjugation sites become available, it becomes possible to develop prediction models that can be scaled to big data. However, no method has focused on investigating ubiquitinated substrate specificities. Herein, we present an approach that exploits an iterative statistical method to identify ubiquitin conjugation sites with substrate site specificities. In this investigation, a total of 6259 experimentally validated ubiquitinated proteins were obtained from dbPTM. After filtering out homologous fragments with 40% sequence identity, the training data set contained 2658 ubiquitination sites (positive data) and 5532 non-ubiquitinated sites (negative data). Because of the difficulty of characterizing the substrate site specificities of E3 ligases by conventional sequence logo analysis, a recursive statistical method was applied to obtain significant conserved motifs. The profile hidden Markov model (profile HMM) was adopted to construct the predictive models learned from the identified substrate motifs. Five-fold cross validation was then used to evaluate the predictive model, achieving sensitivity, specificity, and accuracy of 73.07%, 65.46%, and 67.93%, respectively. Additionally, an independent testing set, completely blind to the training data of the predictive model, was used to demonstrate that the proposed method could provide a promising accuracy (76.13%) and outperform other ubiquitination site prediction tools.
A case study demonstrated the effectiveness of the characterized substrate motifs for identifying ubiquitination sites. The proposed method presents a practical means of preliminary analysis and greatly diminishes the total number of potential targets required for further experimental confirmation. This method may help unravel their mechanisms and roles in E3 recognition and ubiquitin-mediated protein degradation.
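As an aside on the evaluation quoted above, the reported sensitivity, specificity, and accuracy follow directly from binary confusion-matrix counts. A minimal sketch (not the authors' code; the toy labels below are invented for illustration):

```python
# Minimal sketch: computing sensitivity, specificity, and accuracy
# from binary confusion-matrix counts (1 = ubiquitination site).

def confusion_metrics(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    sensitivity = tp / (tp + fn)      # true-positive rate
    specificity = tn / (tn + fp)      # true-negative rate
    accuracy = (tp + tn) / len(y_true)
    return sensitivity, specificity, accuracy

# Invented toy labels, purely for illustration.
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1]
sens, spec, acc = confusion_metrics(y_true, y_pred)
print(sens, spec, acc)  # 0.75 0.666... 0.7
```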

  15. CAMERRA: An analysis tool for the computation of conformational dynamics by evaluating residue-residue associations.

    PubMed

    Johnson, Quentin R; Lindsay, Richard J; Shen, Tongye

    2018-02-21

A computational method which extracts the dominant motions from an ensemble of biomolecular conformations via a correlation analysis of residue-residue contacts is presented. The algorithm first renders the structural information into contact matrices, then constructs the collective modes based on the correlated dynamics of a selected set of dynamic contacts. Associated programs can bridge the results for further visualization using graphics software. The aim of this method is to provide an analysis of conformations of biopolymers from the contact viewpoint. It may assist in systematically uncovering the conformational switching mechanisms of proteins and biopolymer systems in general by statistical analysis of simulation snapshots. In contrast to conventional correlation analyses of Cartesian coordinates (such as distance covariance analysis and Cartesian principal component analysis), this program also provides an alternative way to locate essential collective motions in general. Herein, we detail the algorithm in a stepwise manner and comment on the importance of the method as applied to decoding allosteric mechanisms. © 2018 Wiley Periodicals, Inc.
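The pipeline sketched in the abstract (snapshots → contact matrices → collective modes from correlated contacts) can be illustrated with a toy principal component analysis over fluctuating contacts. This is a generic sketch on synthetic coordinates, not the CAMERRA implementation:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "trajectory": 200 frames of 10 residues in 3D.
n_frames, n_res, cutoff = 200, 10, 1.5
coords = rng.normal(size=(n_frames, n_res, 3))

# Binary residue-residue contact matrix per frame.
dists = np.linalg.norm(coords[:, :, None, :] - coords[:, None, :, :], axis=-1)
contacts = (dists < cutoff).astype(float)

# Keep each residue pair once (upper triangle) and flatten per frame.
iu = np.triu_indices(n_res, k=1)
x = contacts[:, iu[0], iu[1]]              # shape (n_frames, n_pairs)

# Collective modes = principal components of the contact fluctuations.
cov = np.cov(x, rowvar=False)
evals, evecs = np.linalg.eigh(cov)
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]
print(evals[:3])   # variance captured by the top contact modes
```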

  16. Introducing 3D U-statistic method for separating anomaly from background in exploration geochemical data with associated software development

    NASA Astrophysics Data System (ADS)

    Ghannadpour, Seyyed Saeed; Hezarkhani, Ardeshir

    2016-03-01

The U-statistic method is one of the most important structural methods for separating anomaly from background. It accounts for the locations of samples and carries out the statistical analysis of the data without judging from a geochemical point of view, attempting to separate subpopulations and determine anomalous areas. In the present study, to use the U-statistic method under three-dimensional (3D) conditions, the U-statistic is applied to the grades of two ideal test examples, taking the sample Z values (elevation) into account. This is the first time that the method has been applied in a 3D setting. To evaluate the performance of the 3D U-statistic method and to compare the U-statistic with a non-structural method, threshold assessment based on the median and standard deviation (the MSD method) is applied to the two test examples. Results show that the samples indicated as anomalous by the U-statistic method are more regular and less dispersed than those indicated by the MSD method, so that, based on the locations of the anomalous samples, their denser clusters can be identified as promising zones. Moreover, at a threshold of U = 0, the total misclassification error of the U-statistic method is much smaller than that of the x̄ + n × s criterion. Finally, a 3D model of the two test examples, separating anomaly from background using the 3D U-statistic method, is provided. The source code of a software program, developed in the MATLAB programming language to perform the calculations of the 3D U-spatial statistic method, is additionally provided. This software is compatible with all geochemical varieties and can be used in similar exploration projects.
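For contrast with the structural U-statistic, the non-structural MSD criterion mentioned above reduces to a one-line threshold rule. A minimal sketch, assuming a plain list of sample grades (the data and the choice n = 1 are illustrative, not from the paper):

```python
import statistics

def msd_anomalies(grades, n=2.0):
    """Flag samples whose grade exceeds mean + n * standard deviation
    (the x-bar + n*s criterion the abstract compares against)."""
    mean = statistics.mean(grades)
    s = statistics.stdev(grades)
    threshold = mean + n * s
    return threshold, [i for i, g in enumerate(grades) if g > threshold]

# Invented grades: mostly background with two high-grade samples.
grades = [1.0, 1.2, 0.9, 1.1, 1.0, 5.5, 1.05, 0.95, 6.0, 1.1]
thr, idx = msd_anomalies(grades, n=1.0)
print(thr, idx)  # the two high-grade samples, indices 5 and 8
```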

  17. Quantum mechanical fragment methods based on partitioning atoms or partitioning coordinates.

    PubMed

    Wang, Bo; Yang, Ke R; Xu, Xuefei; Isegawa, Miho; Leverentz, Hannah R; Truhlar, Donald G

    2014-09-16

    Conspectus The development of more efficient and more accurate ways to represent reactive potential energy surfaces is a requirement for extending the simulation of large systems to more complex systems, longer-time dynamical processes, and more complete statistical mechanical sampling. One way to treat large systems is by direct dynamics fragment methods. Another way is by fitting system-specific analytic potential energy functions with methods adapted to large systems. Here we consider both approaches. First we consider three fragment methods that allow a given monomer to appear in more than one fragment. The first two approaches are the electrostatically embedded many-body (EE-MB) expansion and the electrostatically embedded many-body expansion of the correlation energy (EE-MB-CE), which we have shown to yield quite accurate results even when one restricts the calculations to include only electrostatically embedded dimers. The third fragment method is the electrostatically embedded molecular tailoring approach (EE-MTA), which is more flexible than EE-MB and EE-MB-CE. We show that electrostatic embedding greatly improves the accuracy of these approaches compared with the original unembedded approaches. Quantum mechanical fragment methods share with combined quantum mechanical/molecular mechanical (QM/MM) methods the need to treat a quantum mechanical fragment in the presence of the rest of the system, which is especially challenging for those parts of the rest of the system that are close to the boundary of the quantum mechanical fragment. This is a delicate matter even for fragments that are not covalently bonded to the rest of the system, but it becomes even more difficult when the boundary of the quantum mechanical fragment cuts a bond. We have developed a suite of methods for more realistically treating interactions across such boundaries. 
These methods include redistributing and balancing the external partial atomic charges and the use of tuned fluorine atoms for capping dangling bonds, and we have shown that they can greatly improve the accuracy. Finally we present a new approach that goes beyond QM/MM by combining the convenience of molecular mechanics with the accuracy of fitting a potential function to electronic structure calculations on a specific system. To make the latter practical for systems with a large number of degrees of freedom, we developed a method to interpolate between local internal-coordinate fits to the potential energy. A key issue for the application to large systems is that rather than assigning the atoms or monomers to fragments, we assign the internal coordinates to reaction, secondary, and tertiary sets. Thus, we make a partition in coordinate space rather than atom space. Fits to the local dependence of the potential energy on tertiary coordinates are arrayed along a preselected reaction coordinate at a sequence of geometries called anchor points; the potential energy function is called an anchor points reactive potential. Electrostatically embedded fragment methods and the anchor points reactive potential, because they are based on treating an entire system by quantum mechanical electronic structure methods but are affordable for large and complex systems, have the potential to open new areas for accurate simulations where combined QM/MM methods are inadequate.
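The many-body expansion underlying the EE-MB approach can be sketched without the electrostatic embedding. In this toy sketch an invented energy function with a weak three-body term stands in for the electronic structure calculations; the residual after two-body truncation is exactly the neglected three-body contribution:

```python
import itertools, math

# Two-body many-body expansion: E ≈ Σ_i E_i + Σ_{i<j} (E_ij − E_i − E_j).
# A toy energy function replaces quantum-mechanical monomer/dimer energies.

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def energy(monomers):
    """Toy 'exact' energy: pairwise attraction plus a weak 3-body term."""
    e = sum(-1.0 / dist(a, b) for a, b in itertools.combinations(monomers, 2))
    e += sum(0.05 / (dist(a, b) * dist(b, c) * dist(a, c))
             for a, b, c in itertools.combinations(monomers, 3))
    return e

def mbe2(monomers):
    """Truncated two-body expansion from monomer and dimer energies only."""
    mono = [energy([m]) for m in monomers]     # zero for this toy model
    e = sum(mono)
    for i, j in itertools.combinations(range(len(monomers)), 2):
        e += energy([monomers[i], monomers[j]]) - mono[i] - mono[j]
    return e

monomers = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.1, 0.0)]
print(energy(monomers), mbe2(monomers))  # gap = neglected 3-body term
```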

  18. A comparison of synchronized intermittent mandatory ventilation and pressure-regulated volume control ventilation in elderly patients with acute exacerbations of COPD and respiratory failure

    PubMed Central

    Chang, Suchi; Shi, Jindong; Fu, Cuiping; Wu, Xu; Li, Shanqun

    2016-01-01

    Background COPD is the third leading cause of death worldwide. Acute exacerbations of COPD may cause respiratory failure, requiring intensive care unit admission and mechanical ventilation. Intensive care unit patients with acute exacerbations of COPD requiring mechanical ventilation have higher mortality rates than other hospitalized patients. Although mechanical ventilation is the most effective intervention for these conditions, invasive ventilation techniques have yielded variable effects. Objective We evaluated pressure-regulated volume control (PRVC) ventilation treatment efficacy and preventive effects on pulmonary barotrauma in elderly COPD patients with respiratory failure. Patients and methods Thirty-nine intubated patients were divided into experimental and control groups and treated with the PRVC and synchronized intermittent mandatory ventilation – volume control methods, respectively. Vital signs, respiratory mechanics, and arterial blood gas analyses were monitored for 2–4 hours and 48 hours. Results Both groups showed rapidly improved pH, partial pressure of oxygen (PaO2), and PaO2 per fraction of inspired O2 levels and lower partial pressure of carbon dioxide (PaCO2) levels. The pH and PaCO2 levels at 2–4 hours were lower and higher, respectively, in the test group than those in the control group (P<0.05 for both); after 48 hours, blood gas analyses showed no statistical difference in any marker (P>0.05). Vital signs during 2–4 hours and 48 hours of treatment showed no statistical difference in either group (P>0.05). The level of peak inspiratory pressure in the experimental group after mechanical ventilation for 2–4 hours and 48 hours was significantly lower than that in the control group (P<0.05), while other variables were not significantly different between groups (P>0.05). 
Conclusion Among elderly COPD patients with respiratory failure, application of PRVC resulted in rapid improvement in arterial blood gas analyses while maintaining a low peak inspiratory pressure. PRVC can reduce pulmonary barotrauma risk, making it a safer protective ventilation mode than synchronized intermittent mandatory ventilation – volume control. PMID:27274223

  19. Mechanical Properties Analysis of 4340 Steel Specimen Heat Treated in Oven and Quenching in Three Different Fluids

    NASA Astrophysics Data System (ADS)

    Fakir, Rachid; Barka, Noureddine; Brousseau, Jean

    2018-03-01

This paper proposes a statistical approach to analyzing the mechanical properties of a standard cylindrical test specimen of 4340 steel, 6 mm in diameter, heat treated and quenched in three different fluids. Samples were evaluated in standard tensile tests to assess their characteristic quantities: hardness, modulus of elasticity, yield strength, tensile strength, and ultimate deformation. The proposed approach is built up stepwise through (a) a presentation of the experimental device, (b) a presentation of the experimental plan and the results of the mechanical tests, (c) an analysis of variance (ANOVA) and a representation of the output responses using the response surface method (RSM), and (d) an analysis of the results and discussion. The feasibility and effectiveness of the proposed approach lead to a precise and reliable model capable of predicting the variation of mechanical properties as a function of the tempering temperature, the tempering time, and the cooling capacity of the quenching medium.

  20. Molecular representation of molar domain (volume), evolution equations, and linear constitutive relations for volume transport.

    PubMed

    Eu, Byung Chan

    2008-09-07

In the traditional theories of irreversible thermodynamics and fluid mechanics, the specific volume and molar volume have been used interchangeably for pure fluids, but in this work we show that they should be distinguished from each other and given distinctive statistical mechanical representations. In this paper, we present a general formula for the statistical mechanical representation of molecular domain (volume or space) by using the Voronoi volume and its mean value, which may be regarded as the molar domain (volume), and also the statistical mechanical representation of volume flux. By using their statistical mechanical formulas, the evolution equations of volume transport are derived from the generalized Boltzmann equation of fluids. Approximate solutions of the evolution equations of volume transport provide kinetic theory formulas for the molecular domain, the constitutive equations for molar domain (volume) and volume flux, and the dissipation of energy associated with volume transport. Together with the constitutive equation for the mean velocity of the fluid obtained in a previous paper, the evolution equations for volume transport not only shed fresh light on, and give insight into, irreversible phenomena in fluids but also can be applied to study fluid flow problems in a manner hitherto unavailable in fluid dynamics and irreversible thermodynamics. Their roles in generalized hydrodynamics will be considered in the sequel.

  1. Arrays of suspended silicon nanowires defined by ion beam implantation: mechanical coupling and combination with CMOS technology.

    PubMed

    Llobet, J; Rius, G; Chuquitarqui, A; Borrisé, X; Koops, R; van Veghel, M; Perez-Murano, F

    2018-04-02

We present the fabrication, operation, and CMOS integration of arrays of suspended silicon nanowires (SiNWs). The functional structures are obtained by a top-down fabrication approach consisting of a resistless process based on focused ion beam irradiation, causing local gallium implantation and silicon amorphization, plus selective silicon etching by tetramethylammonium hydroxide, and a thermal annealing process in a boron-rich atmosphere. The last step enables the electrical functionality of the irradiated material. Doubly clamped silicon beams are fabricated by this method. The electrical readout of their mechanical response can be addressed by a frequency down-mixing detection technique thanks to an enhanced piezoresistive transduction mechanism. Three specific aspects are discussed: (i) the engineering of mechanically coupled SiNWs, by making use of the nanometer-scale overhang that is inherently generated with this fabrication process, (ii) the statistical distribution of patterned lateral dimensions when fabricating large arrays of identical devices, and (iii) the compatibility of the patterning methodology with CMOS circuits. Our results suggest that the application of this method to the integration of large arrays of suspended SiNWs with CMOS circuitry is interesting in view of applications such as advanced radio frequency band-pass filters and ultra-high-sensitivity mass sensors.

  2. Arrays of suspended silicon nanowires defined by ion beam implantation: mechanical coupling and combination with CMOS technology

    NASA Astrophysics Data System (ADS)

    Llobet, J.; Rius, G.; Chuquitarqui, A.; Borrisé, X.; Koops, R.; van Veghel, M.; Perez-Murano, F.

    2018-04-01

We present the fabrication, operation, and CMOS integration of arrays of suspended silicon nanowires (SiNWs). The functional structures are obtained by a top-down fabrication approach consisting of a resistless process based on focused ion beam irradiation, causing local gallium implantation and silicon amorphization, plus selective silicon etching by tetramethylammonium hydroxide, and a thermal annealing process in a boron-rich atmosphere. The last step enables the electrical functionality of the irradiated material. Doubly clamped silicon beams are fabricated by this method. The electrical readout of their mechanical response can be addressed by a frequency down-mixing detection technique thanks to an enhanced piezoresistive transduction mechanism. Three specific aspects are discussed: (i) the engineering of mechanically coupled SiNWs, by making use of the nanometer-scale overhang that is inherently generated with this fabrication process, (ii) the statistical distribution of patterned lateral dimensions when fabricating large arrays of identical devices, and (iii) the compatibility of the patterning methodology with CMOS circuits. Our results suggest that the application of this method to the integration of large arrays of suspended SiNWs with CMOS circuitry is interesting in view of applications such as advanced radio frequency band-pass filters and ultra-high-sensitivity mass sensors.

  3. Evolution of social versus individual learning in a subdivided population revisited: comparative analysis of three coexistence mechanisms using the inclusive-fitness method.

    PubMed

    Kobayashi, Yutaka; Ohtsuki, Hisashi

    2014-03-01

Learning abilities are categorized into social learning (learning from others) and individual learning (learning on one's own). Despite the typically higher cost of individual learning, there are mechanisms that allow stable coexistence of both learning modes in a single population. In this paper, we investigate by means of mathematical modeling how the effect of spatial structure on the evolutionary outcomes of pure social and individual learning strategies depends on the mechanism for coexistence. We model a spatially structured population based on the infinite-island framework and consider three scenarios that differ in their coexistence mechanisms. Using the inclusive-fitness method, we derive the equilibrium frequency of social learners and the genetic load of social learning (defined as the average fecundity reduction caused by the presence of social learning) in terms of summary statistics, such as relatedness, for each of the three scenarios, and compare the results. This comparative analysis not only reconciles previous models that made contradictory predictions as to the effect of spatial structure on the equilibrium frequency of social learners but also derives a simple mathematical rule that determines the sign of the genetic load (i.e. whether or not social learning contributes to the mean fecundity of the population). Copyright © 2013 Elsevier Inc. All rights reserved.

  4. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis

    PubMed Central

    Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

Background Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. Objectives This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. Methods We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. Results There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System) software are only 55% and 15%, respectively. Conclusion The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education.
Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent. PMID:26053876

  5. Singular structure of Mueller matrices images of biological crystal networks for diagnostic human tissues pathological changes

    NASA Astrophysics Data System (ADS)

    Sakhnovskiy, M. Y.; Ushenko, V. A.

    2013-09-01

The conversion of laser radiation by the optically anisotropic crystals of biological networks is singular in the sense that the mechanisms of orientation anisotropy and phase (birefringence) anisotropy act simultaneously in forming the polarization-inhomogeneous field of scattered radiation. This work aims to develop a method for polarization selection among the anisotropy mechanisms of polycrystalline blood-plasma networks. Relationships were found between the statistical, correlation, and fractal parameters of polarization-inhomogeneous images of blood plasma and the linear dichroism and linear birefringence of the polycrystalline albumin and globulin networks. Criteria were identified for differentiating and diagnosing polarization-inhomogeneous images of plasma samples from a control (donor) group and from a group of patients with malignant changes of breast tissue.

  6. Torsion of DNA modeled as a heterogeneous fluctuating rod

    NASA Astrophysics Data System (ADS)

    Argudo, David; Purohit, Prashant K.

    2014-01-01

    We discuss the statistical mechanics of a heterogeneous elastic rod with bending, twisting and stretching. Our model goes beyond earlier works where only homogeneous rods were considered in the limit of high forces and long lengths. Our methods allow us to consider shorter fluctuating rods for which boundary conditions can play an important role. We use our theory to study structural transitions in torsionally constrained DNA where there is coexistence of states with different effective properties. In particular, we examine whether a newly discovered left-handed DNA conformation called L-DNA is a mixture of two known states. We also use our model to investigate the mechanical effects of the binding of small molecules to DNA. For both these applications we make experimentally falsifiable predictions.

  7. Inverse probability weighting and doubly robust methods in correcting the effects of non-response in the reimbursed medication and self-reported turnout estimates in the ATH survey.

    PubMed

    Härkänen, Tommi; Kaikkonen, Risto; Virtala, Esa; Koskinen, Seppo

    2014-11-06

Our aims were to assess the nonresponse rates in a questionnaire survey with respect to administrative register data and to correct the bias statistically. The Finnish Regional Health and Well-being Study (ATH) in 2010 was based on a national sample and several regional samples. Missing-data analysis was based on socio-demographic register data covering the whole sample. Inverse probability weighting (IPW) and doubly robust (DR) methods were estimated using a logistic regression model, which was selected using the Bayesian information criterion. The crude, weighted, and true self-reported turnout in the 2008 municipal election and prevalences of entitlements to specially reimbursed medication, as well as the crude and weighted body mass index (BMI) means, were compared. The IPW method appeared to remove a relatively large proportion of the bias compared to the crude prevalence estimates of the turnout and the entitlements to specially reimbursed medication. Several demographic factors were shown to be associated with missing data, but few interactions were found. Our results suggest that the IPW method can improve the accuracy of results of a population survey, and the model selection provides insight into the structure of missing data. However, health-related missing-data mechanisms are beyond the scope of statistical methods, which mainly rely on socio-demographic information to correct the results.
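The IPW correction described above can be illustrated numerically. In the study, the response probabilities come from a fitted logistic regression; in this sketch they are simply assumed known, and the data are invented:

```python
# Sketch of the inverse probability weighting (IPW) idea behind the
# nonresponse correction: each respondent is weighted by the inverse of
# its estimated response probability (Hajek-style weighted mean).

def ipw_mean(responses, outcomes, p_response):
    num = sum(y / p for r, y, p in zip(responses, outcomes, p_response) if r)
    den = sum(1.0 / p for r, y, p in zip(responses, outcomes, p_response) if r)
    return num / den

# Two strata: outcome 1 responds with probability 0.8, outcome 0 with 0.4,
# so respondents over-represent the outcome-1 stratum.
responses  = [1, 1, 1, 1, 1, 1]
outcomes   = [1, 1, 1, 1, 0, 0]
p_response = [0.8, 0.8, 0.8, 0.8, 0.4, 0.4]
crude = sum(outcomes) / len(outcomes)              # biased toward responders
adjusted = ipw_mean(responses, outcomes, p_response)
print(round(crude, 3), round(adjusted, 3))  # 0.667 0.5
```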

  8. Neurophysiological Markers of Statistical Learning in Music and Language: Hierarchy, Entropy, and Uncertainty.

    PubMed

    Daikoku, Tatsuya

    2018-06-19

Statistical learning (SL) is a method of learning based on the transitional probabilities embedded in sequential phenomena such as music and language. It has been considered an implicit and domain-general mechanism that is innate in the human brain and that functions independently of intention to learn and awareness of what has been learned. SL is an interdisciplinary notion that incorporates information technology, artificial intelligence, musicology, and linguistics, as well as psychology and neuroscience. A body of recent studies suggests that SL can be reflected in neurophysiological responses based on the framework of information theory. This paper reviews a range of work on SL in adults and children that suggests overlapping and independent neural correlates in music and language, and that indicates impairments of SL. Furthermore, this article discusses the relationships between the order of transitional probabilities (TPs) (i.e., the hierarchy of local statistics) and entropy (i.e., global statistics) with regard to SL strategies in the human brain; argues for the importance of information-theoretical approaches to understanding domain-general, higher-order, and global SL covering both real-world music and language; and proposes promising approaches for applications in therapy and pedagogy from various perspectives of psychology, neuroscience, computational studies, musicology, and linguistics.

  9. Interpreting support vector machine models for multivariate group wise analysis in neuroimaging

    PubMed Central

    Gaonkar, Bilwaj; Shinohara, Russell T; Davatzikos, Christos

    2015-01-01

Machine learning based classification algorithms like support vector machines (SVMs) have shown great promise for turning high-dimensional neuroimaging data into clinically useful decision criteria. However, tracing imaging-based patterns that contribute significantly to classifier decisions remains an open problem. This is an issue of critical importance in imaging studies seeking to determine which anatomical or physiological imaging features contribute to the classifier's decision, thereby allowing users to critically evaluate the findings of such machine learning methods and to understand disease mechanisms. The majority of published work addresses the question of statistical inference for support vector classification using permutation tests based on SVM weight vectors. Such permutation testing ignores the SVM margin, which is critical in SVM theory. In this work we emphasize the use of a statistic that explicitly accounts for the SVM margin and show that the null distributions associated with this statistic are asymptotically normal. Further, our experiments show that this statistic is much less conservative than weight-based permutation tests and yet specific enough to tease out multivariate patterns in the data. Thus, we can better understand the multivariate patterns that the SVM uses for neuroimaging-based classification. PMID:26210913
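The weight-based permutation test that the abstract argues is overly conservative can be sketched generically. Here a least-squares linear classifier stands in for the SVM and the data are synthetic; this is not the margin-based statistic the paper proposes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 40 subjects, 5 features; only feature 0 differs between groups.
n, d = 40, 5
y = np.repeat([1.0, -1.0], n // 2)
X = rng.normal(size=(n, d))
X[:, 0] += 0.9 * y

def weights(X, y):
    """Weight vector of a least-squares linear classifier (SVM stand-in)."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

w_obs = weights(X, y)

# Null distribution: recompute the weights under label permutations.
n_perm = 500
null = np.array([weights(X, rng.permutation(y)) for _ in range(n_perm)])

# Two-sided permutation p-value per feature.
p = (np.abs(null) >= np.abs(w_obs)).mean(axis=0)
print(p)  # small for feature 0, roughly uniform for the noise features
```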

  10. Statistical variances of diffusional properties from ab initio molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    He, Xingfeng; Zhu, Yizhou; Epstein, Alexander; Mo, Yifei

    2018-12-01

    Ab initio molecular dynamics (AIMD) simulation is widely employed in studying diffusion mechanisms and in quantifying diffusional properties of materials. However, AIMD simulations are often limited to a few hundred atoms and a short, sub-nanosecond physical timescale, which leads to models that include only a limited number of diffusion events. As a result, the diffusional properties obtained from AIMD simulations are often plagued by poor statistics. In this paper, we re-examine the process to estimate diffusivity and ionic conductivity from the AIMD simulations and establish the procedure to minimize the fitting errors. In addition, we propose methods for quantifying the statistical variance of the diffusivity and ionic conductivity from the number of diffusion events observed during the AIMD simulation. Since an adequate number of diffusion events must be sampled, AIMD simulations should be sufficiently long and can only be performed on materials with reasonably fast diffusion. We chart the ranges of materials and physical conditions that can be accessible by AIMD simulations in studying diffusional properties. Our work provides the foundation for quantifying the statistical confidence levels of diffusion results from AIMD simulations and for correctly employing this powerful technique.
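As background to the fitting-error discussion above, diffusivity is typically extracted from the slope of the time-origin-averaged mean squared displacement via the Einstein relation MSD(t) = 2dDt. A sketch on a synthetic random walk standing in for an AIMD trajectory (units and parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 3D random walk: per-step displacement variance sigma^2 = 1
# per dimension gives a true diffusivity D = sigma^2 / 2 = 0.5 (dt = 1).
n_steps, dim, sigma = 20000, 3, 1.0
traj = np.cumsum(rng.normal(scale=sigma, size=(n_steps, dim)), axis=0)

def msd(traj, max_lag):
    """Time-origin-averaged mean squared displacement, lags 1..max_lag."""
    return np.array([np.mean(np.sum((traj[lag:] - traj[:-lag]) ** 2, axis=1))
                     for lag in range(1, max_lag + 1)])

lags = np.arange(1, 101)
m = msd(traj, 100)

# Einstein relation in 3D: MSD(t) = 6 D t; fit the slope through the origin.
D = np.sum(lags * m) / np.sum(lags ** 2) / (2 * dim)
print(D)  # close to the true value 0.5
```

The statistical variance the abstract quantifies shows up here as the scatter of D around 0.5 when the trajectory (i.e., the number of diffusion events) is short.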

  11. Material Phase Causality or a Dynamics-Statistical Interpretation of Quantum Mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koprinkov, I. G.

    2010-11-25

The internal phase dynamics of a quantum system interacting with an electromagnetic field is revealed in detail. Theoretical and experimental evidence of a causal relation between the phase of the wave function and the dynamics of the quantum system is presented systematically for the first time. A dynamics-statistical interpretation of quantum mechanics is introduced.

  12. Integrating Statistical Mechanics with Experimental Data from the Rotational-Vibrational Spectrum of HCl into the Physical Chemistry Laboratory

    ERIC Educational Resources Information Center

    Findley, Bret R.; Mylon, Steven E.

    2008-01-01

    We introduce a computer exercise that bridges spectroscopy and thermodynamics using statistical mechanics and the experimental data taken from the commonly used laboratory exercise involving the rotational-vibrational spectrum of HCl. Based on the results from the analysis of their HCl spectrum, students calculate bulk thermodynamic properties…
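The statistical-mechanics step such an exercise involves can be sketched as a direct sum over rotational levels. The rotational constant B = 10.59 cm^-1 for H35Cl is a standard textbook value assumed here, not taken from the article:

```python
import math

# Rotational partition function of HCl from its rotational constant.
k_B = 1.380649e-23      # J/K
h = 6.62607015e-34      # J s
c = 2.99792458e10       # speed of light in cm/s (B is in cm^-1)
B = 10.59               # cm^-1, assumed textbook value for H35Cl
T = 298.15              # K

def q_rot(T, B, j_max=200):
    """q = sum_J (2J+1) exp(-B h c J(J+1) / kT), truncated at j_max."""
    beta = h * c * B / (k_B * T)
    return sum((2 * J + 1) * math.exp(-beta * J * (J + 1))
               for J in range(j_max + 1))

q = q_rot(T, B)
q_classical = k_B * T / (h * c * B)   # high-temperature limit for comparison
print(q, q_classical)  # both near 20 at room temperature
```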

  13. Non-equilibrium dog-flea model

    NASA Astrophysics Data System (ADS)

    Ackerson, Bruce J.

    2017-11-01

    We develop the open dog-flea model to serve as a check of proposed non-equilibrium theories of statistical mechanics. The model is developed in detail and then applied to four recent models for non-equilibrium statistical mechanics. Comparing the dog-flea solution with these models allows their claims to be checked against a concrete, exactly solvable example.
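
    For readers unfamiliar with the model, here is a minimal Monte Carlo sketch of the classic (closed) dog-flea model; the open variant of the paper additionally exchanges fleas with a reservoir. N fleas sit on two dogs, and at each step one flea, chosen uniformly at random, jumps to the other dog; the population relaxes toward N/2 and then fluctuates about it.

```python
import random

def simulate(n_fleas=100, n_steps=20000, seed=1):
    """Closed dog-flea (Ehrenfest) model: track the flea count on dog A."""
    rng = random.Random(seed)
    on_a = n_fleas               # start with every flea on dog A
    history = []
    for _ in range(n_steps):
        # A uniformly chosen flea sits on dog A with probability on_a/n_fleas.
        if rng.random() < on_a / n_fleas:
            on_a -= 1            # that flea jumps A -> B
        else:
            on_a += 1            # that flea jumps B -> A
        history.append(on_a)
    return history

hist = simulate()
late = hist[len(hist) // 2:]                 # discard the relaxation transient
mean_late = sum(late) / len(late)            # fluctuates around n_fleas / 2
```

    The early part of `hist` shows the deterministic-looking relaxation; the late part shows equilibrium fluctuations of order sqrt(N), which is exactly the regime where the model discriminates between competing non-equilibrium theories.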

  14. Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems

    NASA Astrophysics Data System (ADS)

    Gogolin, Christian; Eisert, Jens

    2016-05-01

    We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.

  15. Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems.

    PubMed

    Gogolin, Christian; Eisert, Jens

    2016-05-01

    We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.

  16. Domain generality vs. modality specificity: The paradox of statistical learning

    PubMed Central

    Frost, Ram; Armstrong, Blair C.; Siegelman, Noam; Christiansen, Morten H.

    2015-01-01

    Statistical learning is typically considered to be a domain-general mechanism by which cognitive systems discover the underlying distributional properties of the input. Recent studies examining whether there are commonalities in the learning of distributional information across different domains or modalities consistently reveal, however, modality and stimulus specificity. An important question is, therefore, how and why a hypothesized domain-general learning mechanism systematically produces such effects. We offer a theoretical framework according to which statistical learning is not a unitary mechanism, but a set of domain-general computational principles that operate in different modalities and are therefore subject to the specific constraints characteristic of their respective brain regions. This framework offers testable predictions and we discuss its computational and neurobiological plausibility. PMID:25631249

  17. Evaluation of the mechanical properties and surface topography of as-received, immersed and as-retrieved orthodontic archwires

    PubMed Central

    POP, SILVIA IZABELLA; DUDESCU, MIRCEA; MERIE, VIOLETA VALENTINA; PACURAR, MARIANA; BRATU, CRISTINA DANA

    2017-01-01

    Background and aims This experimental study compares the key mechanical properties of as-received orthodontic archwires with those of archwires immersed in fluorinated solution, as-retrieved archwires and archwires used intra-orally. Methods A total of 270 archwires were tested using tensile and three-point bending tests. The archwires were made of stainless steel, nickel-titanium, beta-titanium and physiognomic-coated nickel-titanium, and were subjected to three types of treatment: immersion in fluorinated solution, immersion in carbonated drinks and intra-oral use. Results Immersion caused variations in the activation and deactivation forces of all archwires. In terms of bending characteristics, the most affected archwires were those used intra-orally. Conclusions An alteration of the mechanical properties of orthodontic archwires by immersion in fluorinated solutions and soft drinks could not be statistically demonstrated. PMID:28781528

  18. Mode-selective control of thermal Brownian vibration of a micro-resonator (Generation of a thermal non-equilibrium state by mechanical feedback control)

    NASA Astrophysics Data System (ADS)

    Kawamura, Y.; Kanegae, R.

    2017-09-01

    Recently, there have been various attempts to damp the vibration amplitude of the Brownian motion of a microresonator below the thermal vibration amplitude, with the goal of reaching the quantum ground vibration level. To further this approach, it is essential to clarify whether or not coupling exists between the different vibration modes of the resonator. In this paper, mode-selective control of thermal Brownian vibration is demonstrated. The first and second vibration modes of a micro-cantilever undergoing random Brownian motion are cooled selectively and independently below the thermal vibration amplitude determined by statistical thermodynamic theory, using a mechanical feedback control method. This experimental result shows that a thermal non-equilibrium state was generated by mechanical feedback control.
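
    The standard result behind this kind of feedback ("cold damping") cooling can be illustrated with a short calculation. The parameters below are illustrative choices, not those of the paper: velocity feedback adds a damping rate g on top of the intrinsic rate gamma without adding force noise, so the mode's effective temperature drops from T to T*gamma/(gamma + g), and its RMS thermal amplitude shrinks accordingly.

```python
import math

# Equipartition for one mode: m * w0^2 * <x^2> = kB * T_eff.
kB = 1.380649e-23          # J/K
T = 300.0                  # bath temperature, K
m = 1.0e-12                # modal mass, kg (assumed)
w0 = 2 * math.pi * 1.0e4   # mode angular frequency, rad/s (assumed)
gamma = 10.0               # intrinsic damping rate, 1/s (assumed)
g = 90.0                   # feedback damping rate, 1/s (assumed)

# Cold-damping result: feedback damping without added noise cools the mode.
T_eff = T * gamma / (gamma + g)                     # 30 K for these numbers

x_rms_thermal = math.sqrt(kB * T / (m * w0**2))     # RMS amplitude, no feedback
x_rms_cooled = math.sqrt(kB * T_eff / (m * w0**2))  # RMS amplitude, cooled mode
```

    Because the feedback force can be synthesized per mode (e.g. by bandpass filtering the detector signal around each resonance), each mode can be assigned its own g, which is what makes the selective, independent cooling reported above possible.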

  19. A comparative study to evaluate the effects of ligation methods on friction in sliding mechanics using 0.022" slot brackets in dry state: An In-vitro study

    PubMed Central

    Vinay, K; Venkatesh, M J; Nayak, Rabindra S; Pasha, Azam; Rajesh, M; Kumar, Pradeep

    2014-01-01

    Background: Friction between archwires and brackets is assuming greater importance with the increased use of sliding mechanics in orthodontics, as friction impedes the desired tooth movement. This study was conducted to compare and evaluate the effect of ligation on friction in sliding mechanics using 0.022" slot brackets in the dry state. Materials & Methods: Forty-eight combinations of brackets, archwires and ligation techniques were tested in order to identify the combination that offers the least friction during sliding mechanics. An Instron 4467 machine was used to evaluate static and kinetic friction force values, and the results were subjected to statistical analysis and an ANOVA test. Results: The results showed that 0.022" metal brackets, stainless steel wires and Slick modules provided the optimum frictional resistance for sliding mechanics. Frictional forces of 0.019" x 0.025" stainless steel archwires were higher than those of 0.016" x 0.022" archwires owing to the increase in dimension. Self-ligating brackets offered the least friction, followed by mini twin, variable force, regular stainless steel, ceramic-with-metal-insert and ceramic brackets. Stainless steel ligatures offered less resistance than Slick and grey modules, and TMA wires recorded the maximum friction. Conclusion: Stainless steel archwires of 0.019" x 0.025" dimension are preferred during sliding mechanics; these archwires with variable force brackets ligated with Slick modules offer decreased friction and are a cost-effective combination. How to cite the article: Vinay K, Venkatesh MJ, Nayak RS, Pasha A, Rajesh M, Kumar P. A comparative study to evaluate the effects of ligation methods on friction in sliding mechanics using 0.022" slot brackets in dry state: An In-vitro study. J Int Oral Health 2014;6(2):76-83. PMID:24876706
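
    The ANOVA step used in this study can be sketched by hand. The friction-force values below are synthetic, illustrative numbers, not the study's measurements; the F statistic is the ratio of between-group to within-group mean squares.

```python
import numpy as np

# One-way ANOVA F statistic on synthetic friction-force data (cN).
rng = np.random.default_rng(4)
groups = [
    rng.normal(120.0, 10.0, 10),   # e.g. conventional ligature (assumed)
    rng.normal( 80.0, 10.0, 10),   # e.g. self-ligating bracket (assumed)
    rng.normal(100.0, 10.0, 10),   # e.g. low-friction module (assumed)
]

k = len(groups)                    # number of groups
n = sum(len(g) for g in groups)    # total observations
grand = np.concatenate(groups).mean()

# Between-group and within-group sums of squares.
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)

# F = MS_between / MS_within, with k-1 and n-k degrees of freedom.
F = (ss_between / (k - 1)) / (ss_within / (n - k))
```

    A value of F well above the F(k-1, n-k) critical value indicates that the mean friction forces of the bracket/ligation combinations genuinely differ.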

  20. Evaluation of Non-Ozone-Depleting-Chemical Cleaning Methods for Space Mechanisms Using a Vacuum Spiral Orbit Rolling Contact Tribometer

    NASA Technical Reports Server (NTRS)

    Jansen, Mark J.; Jones, William R., Jr.; Wheeler, Donald R.; Keller, Dennis J.

    2000-01-01

    Because CFC 113, an ozone depleting chemical (ODC), can no longer be produced, alternative bearing cleaning methods must be studied. The objective of this work was to study the effect of the new cleaning methods on lubricant lifetime using a vacuum bearing simulator (spiral orbit rolling contact tribometer). Four alternative cleaning methods were studied: ultra-violet (UV) ozone, aqueous levigated alumina slurry (ALAS), super critical fluid (SCF) CO2 and aqueous Brulin 815GD. Baseline tests were done using CFC 113. Test conditions were the following: a vacuum of at least 1.3 x 10(exp -6) Pa, 440C steel components, a rotational speed of 10 RPM, a lubricant charge of between 60-75 micrograms, a perfluoropolyalkylether lubricant (Z-25), and a load of 200N (44.6 lbs., a mean Hertzian stress of 1.5 GPa). Normalized lubricant lifetime was determined by dividing the total number of ball orbits by the amount of lubricant. The failure condition was a friction coefficient of 0.38. Post-test XPS analysis was also performed, showing slight variations in post-cleaning surface chemistry. Statistical analysis of the resultant data was conducted and it was determined that the data sets were most directly comparable when subjected to a natural log transformation. The natural log life (NL-Life) data for each cleaning method were reasonably normally (statistically) distributed and yielded standard deviations that were not significantly different among the five cleaning methods investigated. This made comparison of their NL-Life means very straightforward using a Bonferroni multiple comparison of means procedure. This procedure showed that the ALAS, UV-ozone and CFC 113 methods were not statistically significantly different from one another with respect to mean NL-Life. It also found that the SCF CO2 method yielded a significantly higher mean NL-Life than the mean NL-Lives of the ALAS, UV-ozone and CFC 113 methods. 
It also determined that the aqueous Brulin 815GD method yielded a mean NL-Life that was statistically significantly higher than the mean NL-Lives of each of the other four methods. Baseline tests using CFC 113-cleaned parts yielded a mean NL-Life of 3.62 orbits/microgram. ALAS and UV-ozone yielded similar mean NL-Lives (3.31 and 3.33 orbits/microgram, respectively). SCF CO2 gave a mean NL-Life of 4.08 orbits/microgram, and aqueous Brulin 815GD yielded the longest mean NL-Life (4.66 orbits/microgram).
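
    The analysis pattern described above, log-transform the lifetimes, then compare group means pairwise at a Bonferroni-corrected threshold, can be sketched as follows. The group labels echo the study, but the data are synthetic, and a permutation test stands in for the t-based comparison actually used.

```python
import numpy as np

rng = np.random.default_rng(3)

def perm_test(x, y, n_perm=2000, rng=rng):
    """Two-sided permutation p-value for a difference in group means."""
    obs = abs(x.mean() - y.mean())
    pooled = np.concatenate([x, y])
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if abs(pooled[:x.size].mean() - pooled[x.size:].mean()) >= obs:
            hits += 1
    return hits / n_perm

# Synthetic lifetimes (orbits per microgram); lognormal, as the study's
# natural-log transformation suggests. Means/spreads are illustrative.
lives = {
    "ALAS":   np.exp(rng.normal(1.2, 0.1, 10)),
    "CFC113": np.exp(rng.normal(1.3, 0.1, 10)),
    "Brulin": np.exp(rng.normal(1.7, 0.1, 10)),
}
logs = {k: np.log(v) for k, v in lives.items()}   # NL-Life values

names = sorted(logs)
pairs = [(a, b) for i, a in enumerate(names) for b in names[i + 1:]]
alpha = 0.05 / len(pairs)          # Bonferroni-corrected threshold

significant = {(a, b): perm_test(logs[a], logs[b]) < alpha for a, b in pairs}
```

    With this correction, only pairs whose mean NL-Lives differ by well more than the within-group scatter are flagged, which mirrors how the study could call Brulin 815GD significantly better while finding ALAS, UV-ozone and CFC 113 indistinguishable.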
