Sample records for statistical physics approach

  1. The Practicality of Statistical Physics Handout Based on KKNI and the Constructivist Approach

    NASA Astrophysics Data System (ADS)

    Sari, S. Y.; Afrizon, R.

    2018-04-01

    Evaluation of the statistical physics lectures shows that: 1) the performance of lecturers, the social climate, students' competence, and the soft skills needed at work fall into the 'sufficient' category, 2) students find statistical physics lectures difficult to follow because the material is abstract, 3) 40.72% of students need reinforcement in the form of repetition, practice questions, and structured tasks, and 4) the depth of the statistical physics material needs to be increased gradually and in a structured way. This indicates that learning materials aligned with the Indonesian National Qualification Framework, or Kerangka Kualifikasi Nasional Indonesia (KKNI), combined with an appropriate learning approach, are needed to help lecturers and students in lectures. The author has designed statistical physics handouts that meet the 'very valid' criterion (90.89%) according to expert judgment. In addition, the practicality of the designed handouts also needs to be considered so that they are easy to use, interesting, and efficient in lectures. The purpose of this research is to determine the practicality of a statistical physics handout based on KKNI and a constructivist approach. This research is part of a research-and-development study following the 4-D model developed by Thiagarajan and has reached the development-testing portion of the Develop stage. Data were collected with a questionnaire distributed to lecturers and students and analyzed using descriptive techniques expressed as percentages. The analysis of the questionnaire shows that the statistical physics handout meets the 'very practical' criterion. The conclusion of this study is that statistical physics handouts based on KKNI and the constructivist approach are practical for use in lectures.

  2. An Integrated, Statistical Molecular Approach to the Physical Chemistry Curriculum

    ERIC Educational Resources Information Center

    Cartier, Stephen F.

    2009-01-01

    As an alternative to the "thermodynamics first" or "quantum first" approaches to the physical chemistry curriculum, the statistical definition of entropy and the Boltzmann distribution are introduced in the first days of the course and the entire two-semester curriculum is then developed from these concepts. Once the tools of statistical mechanics…

  3. Reconstructing Macroeconomics Based on Statistical Physics

    NASA Astrophysics Data System (ADS)

    Aoki, Masanao; Yoshikawa, Hiroshi

    We believe that the time has come to integrate the new approach based on statistical physics, or econophysics, into macroeconomics. Toward this goal, there must be more dialogue between physicists and economists. In this paper, we argue that there is no reason why the methods of statistical physics, so successful in many fields of natural science, cannot be usefully applied to macroeconomics, which is meant to analyze the macroeconomy comprising a large number of economic agents. It is, in fact, odd to regard the macroeconomy as a homothetic enlargement of the representative micro agent. We trust in the bright future of the new approach to macroeconomics based on statistical physics.

  4. A Bayesian approach for parameter estimation and prediction using a computationally intensive model

    DOE PAGES

    Higdon, Dave; McDonnell, Jordan D.; Schunck, Nicolas; ...

    2015-02-05

    Bayesian methods have been successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model $\eta(\theta)$, where θ denotes the uncertain, best input setting. Hence the statistical model is of the form $y=\eta(\theta)+\epsilon$, where $\epsilon$ accounts for measurement, and possibly other, error sources. When nonlinearity is present in $\eta(\cdot)$, the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and nonstandard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. Although generally applicable, MCMC requires thousands (or even millions) of evaluations of the physics model $\eta(\cdot)$. This requirement is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we present an approach adapted from Bayesian model calibration. This approach combines output from an ensemble of computational model runs with physical measurements, within a statistical formulation, to carry out inference. A key component of this approach is a statistical response surface, or emulator, estimated from the ensemble of model runs. We demonstrate this approach with a case study in estimating parameters for a density functional theory model, using experimental mass/binding energy measurements from a collection of atomic nuclei. Lastly, we also demonstrate how this approach produces uncertainties in predictions for recent mass measurements obtained at Argonne National Laboratory.
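    The calibration scheme described in this abstract can be illustrated with a toy example. The sketch below is an assumption-laden illustration, not the authors' code: it fakes the "expensive" physics model with a cheap function, fits a polynomial surrogate to a small ensemble of runs in place of a full statistical emulator, and runs a Metropolis sampler for θ against a single synthetic measurement y = η(θ) + ε.

    ```python
    # Minimal illustrative sketch (not the authors' code) of emulator-based Bayesian
    # calibration for y = eta(theta) + eps.  A cheap function stands in for the costly
    # physics code, a polynomial surrogate fit to a small ensemble of runs stands in
    # for the statistical emulator, and a Metropolis sampler draws from the posterior.
    import numpy as np

    rng = np.random.default_rng(0)

    def expensive_model(theta):          # stand-in for a costly physics code eta(theta)
        return np.sin(theta) + 0.5 * theta

    # 1. Ensemble of model runs at a few design points -> fit a surrogate (emulator).
    design = np.linspace(-2.0, 2.0, 8)
    runs = expensive_model(design)
    coeffs = np.polyfit(design, runs, deg=3)      # cheap polynomial emulator
    def emulator(theta):
        return np.polyval(coeffs, theta)

    # 2. Synthetic physical measurement y with noise eps ~ N(0, sigma^2).
    theta_true, sigma = 0.7, 0.05
    y_obs = expensive_model(theta_true) + rng.normal(0.0, sigma)

    def log_post(theta):
        if abs(theta) > 2.0:                       # flat prior on [-2, 2]
            return -np.inf
        return -0.5 * ((y_obs - emulator(theta)) / sigma) ** 2

    # 3. Metropolis MCMC using only the emulator (no further expensive model runs).
    theta, samples = 0.0, []
    for _ in range(20000):
        prop = theta + rng.normal(0.0, 0.2)
        if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
            theta = prop
        samples.append(theta)

    print("posterior mean of theta:", np.mean(samples[5000:]))
    ```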

  5. Spatio-temporal analysis of aftershock sequences in terms of Non Extensive Statistical Physics.

    NASA Astrophysics Data System (ADS)

    Chochlaki, Kalliopi; Vallianatos, Filippos

    2017-04-01

    Earth's seismicity is considered an extremely complicated process in which long-range interactions and fracturing exist (Vallianatos et al., 2016). For this reason, in order to analyze it, we use an innovative methodological approach introduced by Tsallis (Tsallis, 1988; 2009), named Non-Extensive Statistical Physics. This approach introduces a generalization of Boltzmann-Gibbs statistical mechanics and is based on the definition of the Tsallis entropy Sq, whose maximization leads to the so-called q-exponential function, the probability distribution function that maximizes Sq. In the present work, we utilize the concept of Non-Extensive Statistical Physics in order to analyze the spatiotemporal properties of several aftershock series. Marekova (Marekova, 2014) suggested that the probability densities of the inter-event distances between successive aftershocks follow a beta distribution. Using the same data set, we analyze the inter-event distance distribution of several aftershock sequences in different geographic regions by calculating the non-extensive parameters that determine the behavior of the system and by fitting the q-exponential function, which expresses the degree of non-extensivity of the investigated system. Furthermore, the inter-event time distribution of the aftershocks as well as the frequency-magnitude distribution has been analyzed. The results support the applicability of Non-Extensive Statistical Physics ideas to aftershock sequences, where strong correlations exist along with memory effects. References: C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1988) 479-487, doi:10.1007/BF01016429. C. Tsallis, Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World, 2009, doi:10.1007/978-0-387-85359-8. E. Marekova, Analysis of the spatial distribution between successive earthquakes in aftershocks series, Annals of Geophysics, 57, 5, doi:10.4401/ag-6556, 2014. F. Vallianatos, G. Papadakis, G. Michas, Generalized statistical mechanics approaches to earthquakes and tectonics, Proc. R. Soc. A, 472, 20160497, 2016.
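    For reference, the quantities named in this abstract have the following standard forms (Tsallis, 1988); the fitted q-values for the aftershock sequences studied here are not implied:

    ```latex
    S_q \;=\; k\,\frac{1-\sum_i p_i^{\,q}}{q-1},
    \qquad
    \lim_{q\to 1} S_q \;=\; -k\sum_i p_i \ln p_i,
    \qquad
    e_q(x) \;=\; \bigl[\,1+(1-q)\,x\,\bigr]_{+}^{\frac{1}{1-q}} .
    ```

    Maximizing S_q under suitable constraints yields distributions of the q-exponential form p(x) ∝ e_q(−β_q x), which reduces to the ordinary exponential as q → 1.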

  6. Introduction of Entropy via the Boltzmann Distribution in Undergraduate Physical Chemistry: A Molecular Approach

    ERIC Educational Resources Information Center

    Kozliak, Evguenii I.

    2004-01-01

    A molecular approach for introducing entropy in an undergraduate physical chemistry course and incorporating the features of Davies' treatment that meets the needs of the students but ignores the complexities of statistics and upgrades the qualitative, intuitive approach of Lambert for general chemistry to a semiquantitative treatment using Boltzmann…

  7. A Statistical-Physics Approach to Language Acquisition and Language Change

    NASA Astrophysics Data System (ADS)

    Cassandro, Marzio; Collet, Pierre; Galves, Antonio; Galves, Charlotte

    1999-02-01

    The aim of this paper is to explain why Statistical Physics can help in understanding two related linguistic questions. The first question is how to model first language acquisition by a child. The second question is how language change proceeds in time. Our approach is based on a Gibbsian model for the interface between syntax and prosody. We also present a simulated annealing model of language acquisition, which extends the Triggering Learning Algorithm recently introduced in the linguistic literature.

  8. A description of Seismicity based on Non-extensive Statistical Physics: An introduction to Non-extensive Statistical Seismology.

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos

    2015-04-01

    Despite the extreme complexity that characterizes the earthquake generation process, simple phenomenology seems to apply to the collective properties of seismicity. The best known is the Gutenberg-Richter relation. Short- and long-term clustering, power-law scaling and scale-invariance have been exhibited in the spatio-temporal evolution of seismicity, providing evidence for earthquakes as a nonlinear dynamic process. Regarding the physics of "many" earthquakes and how this can be derived from first principles, one may wonder: how can the collective properties of the set formed by all earthquakes in a given region be derived, and how does the structure of seismicity depend on its elementary constituents - the earthquakes? What are these properties? The physics of many earthquakes has to be studied with a different approach than the physics of one earthquake, making the use of statistical physics necessary to understand the collective properties of earthquakes. Then a natural question arises. What type of statistical physics is appropriate to commonly describe effects from the microscale and crack-opening level to the level of large earthquakes? An answer to the previous question could be non-extensive statistical physics, introduced by Tsallis (1988), as the appropriate methodological tool to describe entities with (multi)fractal distributions of their elements and where long-range interactions or intermittency are important, as in fracturing phenomena and earthquakes. In the present work, we review some fundamental properties of earthquake physics and how these are derived by means of non-extensive statistical physics. The aim is to understand aspects of the underlying physics that lead to the evolution of the earthquake phenomenon, introducing the new topic of non-extensive statistical seismology. This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project. References: F. Vallianatos, "A non-extensive approach to risk assessment", Nat. Hazards Earth Syst. Sci., 9, 211-216, 2009. F. Vallianatos and P. Sammonds, "Is plate tectonics a case of non-extensive thermodynamics?", Physica A: Statistical Mechanics and its Applications, 389(21), 4989-4993, 2010. F. Vallianatos, G. Michas, G. Papadakis and P. Sammonds, "A non-extensive statistical physics view to the spatiotemporal properties of the June 1995, Aigion earthquake (M6.2) aftershock sequence (West Corinth rift, Greece)", Acta Geophysica, 60(3), 758-768, 2012. F. Vallianatos and L. Telesca, "Statistical mechanics in earth physics and natural hazards" (editorial), Acta Geophysica, 60(3), 499-501, 2012. F. Vallianatos, G. Michas, G. Papadakis and A. Tzanis, "Evidence of non-extensivity in the seismicity observed during the 2011-2012 unrest at the Santorini volcanic complex, Greece", Nat. Hazards Earth Syst. Sci., 13, 177-185, 2013. F. Vallianatos and P. Sammonds, "Evidence of non-extensive statistical physics of the lithospheric instability approaching the 2004 Sumatran-Andaman and 2011 Honshu mega-earthquakes", Tectonophysics, 590, 52-58, 2013. G. Papadakis, F. Vallianatos and P. Sammonds, "Evidence of Nonextensive Statistical Physics behavior of the Hellenic Subduction Zone seismicity", Tectonophysics, 608, 1037-1048, 2013. G. Michas, F. Vallianatos and P. Sammonds, "Non-extensivity and long-range correlations in the earthquake activity at the West Corinth rift (Greece)", Nonlin. Processes Geophys., 20, 713-724, 2013.

  9. Effects of preprocessing Landsat MSS data on derived features

    NASA Technical Reports Server (NTRS)

    Parris, T. M.; Cicone, R. C.

    1983-01-01

    Important to the use of multitemporal Landsat MSS data for earth resources monitoring, such as agricultural inventories, is the ability to minimize the effects of varying atmospheric and satellite viewing conditions while extracting physically meaningful features from the data. In general, the approaches to the preprocessing problem have been derived from either physical or statistical models. This paper compares three proposed algorithms: XSTAR haze correction, Color Normalization, and Multiple Acquisition Mean Level Adjustment. These techniques represent physical, statistical, and hybrid physical-statistical models, respectively. The comparisons are made in the context of three feature extraction techniques: the Tasseled Cap, the Cate Color Cube, and the Normalized Difference.

  10. Surveying Turkish high school and university students' attitudes and approaches to physics problem solving

    NASA Astrophysics Data System (ADS)

    Balta, Nuri; Mason, Andrew J.; Singh, Chandralekha

    2016-06-01

    Students' attitudes and approaches to physics problem solving can impact how well they learn physics and how successful they are in solving physics problems. Prior research in the U.S. using a validated Attitude and Approaches to Problem Solving (AAPS) survey suggests that there are major differences between students in introductory physics and astronomy courses and physics experts in terms of their attitudes and approaches to physics problem solving. Here we discuss the validation, administration, and analysis of data for the Turkish version of the AAPS survey for high school and university students in Turkey. After the validation and administration of the Turkish version of the survey, the data were analyzed by grouping them by grade level, school type, and gender. While there are no statistically significant differences between the averages of the various groups on the survey, overall the university students in Turkey were more expertlike than vocational high school students. On an item-by-item basis, there are statistically significant differences between the averages of the groups on many items. For example, on average, the university students demonstrated less expertlike attitudes about the role of equations and formulas in problem solving, in solving difficult problems, and in knowing when the solution is not correct, whereas they displayed more expertlike attitudes and approaches on items related to metacognition in physics problem solving. A principal component analysis on the data yields item clusters into which the student responses on various survey items can be grouped. A comparison of the responses of the Turkish and American university students enrolled in algebra-based introductory physics courses shows that on more than half of the items, the responses of these two groups were statistically significantly different, with the U.S. students on average responding to the items in a more expertlike manner.

  11. New approach in the quantum statistical parton distribution

    NASA Astrophysics Data System (ADS)

    Sohaily, Sozha; Vaziri (Khamedi), Mohammad

    2017-12-01

    An attempt to find simple parton distribution functions (PDFs) based on a quantum statistical approach is presented. The PDFs described by the statistical model have very interesting physical properties which help to understand the structure of partons. The longitudinal portion of the distribution functions is given by applying the maximum entropy principle. An interesting and simple approach to determine the statistical variables exactly, without fitting and fixing parameters, is surveyed. Analytic expressions for the x-dependent PDFs are obtained in the whole x region [0, 1], and the computed distributions are consistent with experimental observations. The agreement with experimental data provides robust confirmation of the simple statistical model presented here.

  12. Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework.

    PubMed

    Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana

    2014-06-01

    Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd.

  13. Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework†

    PubMed Central

    Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana

    2014-01-01

    Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd. PMID:25505370

  14. Statistical physics of hard combinatorial optimization: Vertex cover problem

    NASA Astrophysics Data System (ADS)

    Zhao, Jin-Hua; Zhou, Hai-Jun

    2014-07-01

    Typical-case computational complexity is a research topic at the boundary of computer science, applied mathematics, and statistical physics. In the last twenty years, the replica-symmetry-breaking mean field theory of spin glasses and the associated message-passing algorithms have greatly deepened our understanding of typical-case computational complexity. In this paper, we use the vertex cover problem, a basic nondeterministic-polynomial (NP)-complete combinatorial optimization problem of wide application, as an example to introduce the statistical physics methods and algorithms. We do not go into the technical details but emphasize mainly the intuitive physical meanings of the message-passing equations. A reader unfamiliar with the field should be able to understand, to a large extent, the physics behind the mean field approaches and to adapt the mean field methods to solving other optimization problems.

  15. Nonequilibrium statistical mechanics Brussels-Austin style

    NASA Astrophysics Data System (ADS)

    Bishop, Robert C.

    The fundamental problem on which Ilya Prigogine and the Brussels-Austin Group have focused can be stated briefly as follows. Our observations indicate that there is an arrow of time in our experience of the world (e.g., decay of unstable radioactive atoms like uranium, or the mixing of cream in coffee). Most of the fundamental equations of physics are time reversible, however, presenting an apparent conflict between our theoretical descriptions and experimental observations. Many have thought that the observed arrow of time was either an artifact of our observations or due to very special initial conditions. An alternative approach, followed by the Brussels-Austin Group, is to consider the observed direction of time to be a basic physical phenomenon due to the dynamics of physical systems. This essay focuses mainly on recent developments in the Brussels-Austin Group after the mid-1980s. The fundamental concerns are the same as in their earlier approaches (subdynamics, similarity transformations), but the contemporary approach utilizes rigged Hilbert space (whereas the older approaches used Hilbert space). While the emphasis on nonequilibrium statistical mechanics remains the same, their more recent approach addresses the physical features of large Poincaré systems, nonlinear dynamics and the mathematical tools necessary to analyze them.

  16. A Statistical Approach For Modeling Tropical Cyclones. Synthetic Hurricanes Generator Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pasqualini, Donatella

    This manuscript briefly describes a statistical approach to generate synthetic tropical cyclone tracks to be used in risk evaluations. The Synthetic Hurricane Generator (SynHurG) model allows modeling hurricane risk in the United States, supporting decision makers and implementations of adaptation strategies to extreme weather. In the literature there are mainly two approaches to model hurricane hazard for risk prediction: deterministic-statistical approaches, where the storm key physical parameters are calculated using physical complex climate models and the tracks are usually determined statistically from historical data; and statistical approaches, where both variables and tracks are estimated stochastically using historical records. SynHurG falls in the second category, adopting a pure stochastic approach.

  17. Statistical Irreversible Thermodynamics in the Framework of Zubarev's Nonequilibrium Statistical Operator Method

    NASA Astrophysics Data System (ADS)

    Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.

    2018-01-01

    We describe the formalism of statistical irreversible thermodynamics constructed based on Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The first can be constructed either based on a heuristic approach or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.

  18. Diffusion-Based Density-Equalizing Maps: an Interdisciplinary Approach to Visualizing Homicide Rates and Other Georeferenced Statistical Data

    NASA Astrophysics Data System (ADS)

    Mazzitello, Karina I.; Candia, Julián

    2012-12-01

    In every country, public and private agencies allocate extensive funding to collect large-scale statistical data, which in turn are studied and analyzed in order to determine local, regional, national, and international policies regarding all aspects relevant to the welfare of society. One important aspect of that process is the visualization of statistical data with embedded geographical information, which most often relies on archaic methods such as maps colored according to graded scales. In this work, we apply nonstandard visualization techniques based on physical principles. We illustrate the method with recent statistics on homicide rates in Brazil and their correlation to other publicly available data. This physics-based approach provides a novel tool that can be used by interdisciplinary teams investigating statistics and model projections in a variety of fields such as economics and gross domestic product research, public health and epidemiology, sociodemographics, political science, business and marketing, and many others.

  19. A statistical physics perspective on alignment-independent protein sequence comparison.

    PubMed

    Chattopadhyay, Amit K; Nasiev, Diar; Flower, Darren R

    2015-08-01

    Within bioinformatics, the textual alignment of amino acid sequences has long dominated the determination of similarity between proteins, with all that implies for shared structure, function and evolutionary descent. Despite the relative success of modern-day sequence alignment algorithms, so-called alignment-free approaches offer a complementary means of determining and expressing similarity, with potential benefits in certain key applications, such as regression analysis of protein structure-function studies, where alignment-based similarity has performed poorly. Here, we offer a fresh, statistical physics-based perspective focusing on the question of alignment-free comparison, in the process adapting results from 'first passage probability distribution' to summarize statistics of ensemble averaged amino acid propensity values. In this article, we introduce and elaborate this approach. © The Author 2015. Published by Oxford University Press.

  20. Learning physics: A comparative analysis between instructional design methods

    NASA Astrophysics Data System (ADS)

    Mathew, Easow

    The purpose of this research was to determine if there were differences in academic performance between students who participated in traditional versus collaborative problem-based learning (PBL) instructional design approaches to physics curricula. This study utilized a quantitative quasi-experimental design methodology to determine the significance of differences in pre- and posttest introductory physics exam performance between students who participated in traditional (i.e., control group) versus collaborative problem solving (PBL) instructional design (i.e., experimental group) approaches to physics curricula over a college semester in 2008. There were 42 student participants (N = 42) enrolled in an introductory physics course at the research site in the Spring 2008 semester who agreed to participate in this study after reading and signing informed consent documents. A total of 22 participants were assigned to the experimental group (n = 22), which participated in a PBL-based teaching methodology along with traditional lecture methods. The other 20 students were assigned to the control group (n = 20), which participated in the traditional lecture teaching methodology. Both courses were taught by experienced professors with qualifications at the doctoral level. The results indicated statistically significant differences (p < .01) in academic performance between students who participated in traditional (i.e., lower physics posttest scores and smaller differences between pre- and posttest scores) versus collaborative (i.e., higher physics posttest scores and larger differences between pre- and posttest scores) instructional design approaches to physics curricula. Despite some slight differences in control group and experimental group demographic characteristics (gender, ethnicity, and age), there was a statistically significant (p = .04) difference in the control group, where female average academic improvement was much higher (~63%) than male average academic improvement, which may indicate that traditional teaching methods are more effective for female students; no significant difference between male and female participants was noted in the experimental group. There was a statistically significant negative relationship (r = -.61, p = .01) between age and physics pretest scores in the control group. No statistical analyses yielded significantly different average academic performance values in either group as delineated by ethnicity.

  1. A New Approach to Monte Carlo Simulations in Statistical Physics

    NASA Astrophysics Data System (ADS)

    Landau, David P.

    2002-08-01

    Monte Carlo simulations [1] have become a powerful tool for the study of diverse problems in statistical/condensed matter physics. Standard methods sample the probability distribution for the states of the system, most often in the canonical ensemble, and over the past several decades enormous improvements have been made in performance. Nonetheless, difficulties arise near phase transitions, due to critical slowing down near second-order transitions and metastability near first-order transitions, and these complications limit the applicability of the method. We shall describe a new Monte Carlo approach [2] that uses a random walk in energy space to determine the density of states directly. Once the density of states is known, all thermodynamic properties can be calculated. This approach can be extended to multi-dimensional parameter spaces and should be effective for systems with complex energy landscapes, e.g., spin glasses, protein folding models, etc. Generalizations should produce a broadly applicable optimization tool. 1. A Guide to Monte Carlo Simulations in Statistical Physics, D. P. Landau and K. Binder (Cambridge U. Press, Cambridge, 2000). 2. Fugao Wang and D. P. Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E 64, 056101 (2001).
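    A minimal sketch of the random walk in energy space described here (the Wang-Landau scheme of Ref. [2]) is given below, assuming a 4x4 Ising model with periodic boundaries and single-spin flips. It illustrates the idea rather than reproducing the authors' implementation; a production code would also fix the normalization of ln g(E) and use a stricter flatness/convergence schedule.

    ```python
    # Sketch of Wang-Landau sampling: a random walk in energy space that builds up
    # the density of states g(E) for a small 2D Ising model (assumed 4x4, periodic).
    import numpy as np

    rng = np.random.default_rng(1)
    L = 4
    spins = rng.choice([-1, 1], size=(L, L))

    def energy(s):
        # nearest-neighbour Ising energy with periodic boundaries, J = 1
        return -np.sum(s * np.roll(s, 1, 0)) - np.sum(s * np.roll(s, 1, 1))

    log_g, hist = {}, {}          # running estimate of ln g(E) and visit histogram
    ln_f = 1.0                    # modification factor, reduced as the walk proceeds
    E = energy(spins)

    while ln_f > 1e-2:
        for _ in range(10000):
            i, j = rng.integers(L, size=2)
            dE = 2 * spins[i, j] * (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                                    + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            E_new = E + dE
            # accept with probability min(1, g(E)/g(E_new)) using current estimates
            if np.log(rng.uniform()) < log_g.get(E, 0.0) - log_g.get(E_new, 0.0):
                spins[i, j] *= -1
                E = E_new
            log_g[E] = log_g.get(E, 0.0) + ln_f   # update ln g at the current energy
            hist[E] = hist.get(E, 0) + 1
        counts = np.array(list(hist.values()))
        if counts.min() > 0.8 * counts.mean():    # crude histogram-flatness check
            ln_f /= 2.0                           # reduce the modification factor
            hist = {}

    print({e: round(lg, 2) for e, lg in sorted(log_g.items())})
    ```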

  2. An Integrated Approach to Thermodynamics in the Introductory Physics Course.

    ERIC Educational Resources Information Center

    Alonso, Marcelo; Finn, Edward J.

    1995-01-01

    Presents an approach to combine the empirical approach of classical thermodynamics with the structural approach of statistical mechanics. Topics covered include dynamical foundation of the first law; mechanical work, heat, radiation, and the first law; thermal equilibrium; thermal processes; thermodynamic probability; entropy; the second law;…

  3. Statistics-related and reliability-physics-related failure processes in electronics devices and products

    NASA Astrophysics Data System (ADS)

    Suhir, E.

    2014-05-01

    The well-known and widely used experimental reliability "passport" of a mass-manufactured electronic or photonic product — the bathtub curve — reflects the combined contribution of the statistics-related and reliability-physics (physics-of-failure)-related processes. As time progresses, the first process results in a decreasing failure rate, while the second process, associated with material aging and degradation, leads to an increasing failure rate. An attempt has been made in this analysis to assess the level of the reliability-physics-related aging process from the available bathtub curve (diagram). It is assumed that the products of interest underwent burn-in testing and therefore the obtained bathtub curve does not contain the infant mortality portion. It has also been assumed that the two random processes in question are statistically independent, and that the failure rate of the physical process can be obtained by deducting the theoretically assessed statistical failure rate from the bathtub curve ordinates. In the numerical example carried out, the Rayleigh distribution for the statistical failure rate was used, for the sake of a relatively simple illustration. The developed methodology can be used in reliability physics evaluations when there is a need to better understand the roles of the statistics-related and reliability-physics-related irreversible random processes in reliability evaluations. Future work should include investigations into how powerful and flexible methods and approaches of statistical mechanics can be effectively employed, in addition to reliability physics techniques, to model the operational reliability of electronic and photonic products.
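    Under the independence assumption stated above, the two failure rates add, so the aging (physics-of-failure) contribution can in principle be recovered as a difference of ordinates. The Rayleigh density is quoted below in its standard form for reference; the exact parametrization used in the cited analysis may differ.

    ```latex
    \lambda_{\text{physics}}(t) \;=\; \lambda_{\text{bathtub}}(t) \;-\; \lambda_{\text{statistical}}(t),
    \qquad
    f_{\text{Rayleigh}}(t) \;=\; \frac{t}{\sigma^{2}}\,
    \exp\!\left(-\frac{t^{2}}{2\sigma^{2}}\right), \quad t \ge 0 .
    ```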

  4. Treatment of Chemical Equilibrium without Using Thermodynamics or Statistical Mechanics.

    ERIC Educational Resources Information Center

    Nelson, P. G.

    1986-01-01

    Discusses the conventional approaches to teaching about chemical equilibrium in advanced physical chemistry courses. Presents an alternative approach to the treatment of this concept by using Boltzmann's distribution law. Lists five advantages to using this method as compared with the other approaches. (TW)

  5. Quantum Field Theory Approach to Condensed Matter Physics

    NASA Astrophysics Data System (ADS)

    Marino, Eduardo C.

    2017-09-01

    Preface; Part I. Condensed Matter Physics: 1. Independent electrons and static crystals; 2. Vibrating crystals; 3. Interacting electrons; 4. Interactions in action; Part II. Quantum Field Theory: 5. Functional formulation of quantum field theory; 6. Quantum fields in action; 7. Symmetries: explicit or secret; 8. Classical topological excitations; 9. Quantum topological excitations; 10. Duality, bosonization and generalized statistics; 11. Statistical transmutation; 12. Pseudo quantum electrodynamics; Part III. Quantum Field Theory Approach to Condensed Matter Systems: 13. Quantum field theory methods in condensed matter; 14. Metals, Fermi liquids, Mott and Anderson insulators; 15. The dynamics of polarons; 16. Polyacetylene; 17. The Kondo effect; 18. Quantum magnets in 1D: Fermionization, bosonization, Coulomb gases and 'all that'; 19. Quantum magnets in 2D: nonlinear sigma model, CP1 and 'all that'; 20. The spin-fermion system: a quantum field theory approach; 21. The spin glass; 22. Quantum field theory approach to superfluidity; 23. Quantum field theory approach to superconductivity; 24. The cuprate high-temperature superconductors; 25. The pnictides: iron based superconductors; 26. The quantum Hall effect; 27. Graphene; 28. Silicene and transition metal dichalcogenides; 29. Topological insulators; 30. Non-abelian statistics and quantum computation; References; Index.

  6. Attempting to physically explain space-time correlation of extremes

    NASA Astrophysics Data System (ADS)

    Bernardara, Pietro; Gailhard, Joel

    2010-05-01

    Spatial and temporal clustering of hydro-meteorological extreme events is a well-established observation. Moreover, the statistical parameters characterizing their local frequencies of occurrence show clear spatial patterns. Thus, in order to robustly assess hydro-meteorological hazard, statistical models need to be able to take into account spatial and temporal dependencies. Statistical models considering long-term correlation for quantifying and qualifying temporal and spatial dependencies are available, such as the multifractal approach. Furthermore, the development of regional frequency analysis techniques allows estimating the frequency of occurrence of extreme events while taking into account spatial patterns in the behaviour of the extreme quantiles. However, in order to understand the origin of spatio-temporal clustering, an attempt should be made to find a physical explanation. Here, some statistical evidence of spatio-temporal correlation and of spatial patterns in extreme behaviour is given for a large database of more than 400 rainfall and discharge series in France. In particular, the spatial distribution of multifractal and Generalized Pareto distribution parameters shows evident correlation patterns in the behaviour of the frequency of occurrence of extremes. It is then shown that the identification of atmospheric circulation patterns (weather types) can physically explain the temporal clustering of extreme rainfall events (seasonality) and the spatial pattern of the frequency of occurrence. Moreover, coupling this information with the hydrological modelling of a watershed (as in the Schadex approach), an explanation of the spatio-temporal distribution of extreme discharge can also be provided. We finally show that a hydro-meteorological approach (such as the Schadex approach) can explain and take into account space and time dependencies of hydro-meteorological extreme events.

  7. The Role of Probability in Developing Learners' Models of Simulation Approaches to Inference

    ERIC Educational Resources Information Center

    Lee, Hollylynne S.; Doerr, Helen M.; Tran, Dung; Lovett, Jennifer N.

    2016-01-01

    Repeated sampling approaches to inference that rely on simulations have recently gained prominence in statistics education, and probabilistic concepts are at the core of this approach. In this approach, learners need to develop a mapping among the problem situation, a physical enactment, computer representations, and the underlying randomization…

  8. Evidence of the non-extensive character of Earth's ambient noise.

    NASA Astrophysics Data System (ADS)

    Koutalonis, Ioannis; Vallianatos, Filippos

    2017-04-01

    Investigation of the dynamical features of ambient seismic noise is one of the important scientific and practical research challenges. At the same time, there is growing interest in an approach to studying Earth physics based on the science of complex systems and non-extensive statistical mechanics, which is a generalization of Boltzmann-Gibbs statistical physics (Vallianatos et al., 2016). This seems to be a promising framework for studying complex systems exhibiting phenomena such as long-range interactions and memory effects. In this work we use non-extensive statistical mechanics and signal analysis methods to explore the nature of ambient noise as measured at the stations of the HSNC in the South Aegean (Chatzopoulos et al., 2016). In the present work we analyzed the de-trended increment time series of ambient seismic noise, X(t), in time windows of 20 minutes to 10 seconds within "calm time zones" where human-induced noise is at a minimum. Following the non-extensive statistical physics approach, the probability distribution function of the increments of ambient noise is investigated. Analysis of the probability density function (PDF) p(X), normalized to zero mean and unit variance, shows that the fluctuations of Earth's ambient noise follow a q-Gaussian distribution as defined in the frame of non-extensive statistical mechanics, indicating the possible existence of memory effects in Earth's ambient noise. References: F. Vallianatos, G. Papadakis, G. Michas, Generalized statistical mechanics approaches to earthquakes and tectonics, Proc. R. Soc. A, 472, 20160497, 2016. G. Chatzopoulos, I. Papadopoulos, F. Vallianatos, The Hellenic Seismological Network of Crete (HSNC): Validation and results of the 2013 aftershock, Advances in Geosciences, 41, 65-72, 2016.
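    The style of analysis described here can be mimicked on synthetic data. The hedged sketch below draws heavy-tailed synthetic "noise", normalizes its increments to zero mean and unit variance, and fits a q-Gaussian, p(x) proportional to [1 - (1 - q) beta x^2]_+^{1/(1-q)}, to their histogram; the HSNC station records, window lengths, and fitted values of the abstract are not implied.

    ```python
    # Hedged sketch: fit a q-Gaussian to normalized increments of a synthetic signal.
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(3)
    signal = np.cumsum(rng.standard_t(df=5, size=200000))   # synthetic "ambient noise"

    x = np.diff(signal)                                      # de-trended increments
    x = (x - x.mean()) / x.std()                             # zero mean, unit variance

    def q_gaussian(x, q, beta, a):
        core = 1.0 - (1.0 - q) * beta * x**2
        return a * np.clip(core, 0.0, None) ** (1.0 / (1.0 - q))

    hist, edges = np.histogram(x, bins=200, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    (q, beta, a), _ = curve_fit(q_gaussian, centers, hist, p0=(1.3, 1.0, 0.5),
                                bounds=([1.01, 1e-3, 1e-3], [3.0, 10.0, 10.0]))
    print(f"fitted q = {q:.2f}  (q -> 1 recovers an ordinary Gaussian)")
    ```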

  9. Daily Physical Activity in Total Hip Arthroplasty Patients Undergoing Different Surgical Approaches: A Cohort Study.

    PubMed

    Engdal, Monika; Foss, Olav A; Taraldsen, Kristin; Husby, Vigdis S; Winther, Siri B

    2017-07-01

    Muscle weakness due to trauma from the surgical approach is anticipated to affect the ability of the patient to undertake daily physical activity early after total hip arthroplasty (THA). The objective of this study was to compare daily physical activity on days 1 to 4 after discharge, in patients following THA performed by 1 of 3 surgical approaches. A cohort study included 60 hip osteoarthritis patients, scheduled for THA, allocated to direct lateral approach, posterior approach, or anterior approach. Daily physical activity was measured by an accelerometer, with upright time per 24 hours as primary outcome and walking time, number of steps, and number of upright events per 24 hours as secondary outcomes. There were no statistically significant group differences in any of the measures of daily physical activity (P > 0.290) or between days of follow-up (P > 0.155). Overall, the median participant had 3.50 hours (interquartile range, 2.85-4.81 hours) of upright time, and participants showed wide variation in all outcomes of daily physical activity. There were no differences in daily physical activity between THA patients undergoing different surgical approaches. The surgical approach may not be a limiting factor for daily physical activity early after surgery in a fast-track treatment course.

  10. Path integral molecular dynamics for exact quantum statistics of multi-electronic-state systems.

    PubMed

    Liu, Xinzijian; Liu, Jian

    2018-03-14

    An exact approach to compute physical properties for general multi-electronic-state (MES) systems in thermal equilibrium is presented. The approach is extended from our recent progress on path integral molecular dynamics (PIMD), Liu et al. [J. Chem. Phys. 145, 024103 (2016)] and Zhang et al. [J. Chem. Phys. 147, 034109 (2017)], for quantum statistical mechanics when a single potential energy surface is involved. We first define an effective potential function that is numerically favorable for MES-PIMD and then derive corresponding estimators in MES-PIMD for evaluating various physical properties. Its application to several representative one-dimensional and multi-dimensional models demonstrates that MES-PIMD in principle offers a practical tool in either of the diabatic and adiabatic representations for studying exact quantum statistics of complex/large MES systems when the Born-Oppenheimer approximation, Condon approximation, and harmonic bath approximation are broken.

  11. Path integral molecular dynamics for exact quantum statistics of multi-electronic-state systems

    NASA Astrophysics Data System (ADS)

    Liu, Xinzijian; Liu, Jian

    2018-03-01

    An exact approach to compute physical properties for general multi-electronic-state (MES) systems in thermal equilibrium is presented. The approach is extended from our recent progress on path integral molecular dynamics (PIMD), Liu et al. [J. Chem. Phys. 145, 024103 (2016)] and Zhang et al. [J. Chem. Phys. 147, 034109 (2017)], for quantum statistical mechanics when a single potential energy surface is involved. We first define an effective potential function that is numerically favorable for MES-PIMD and then derive corresponding estimators in MES-PIMD for evaluating various physical properties. Its application to several representative one-dimensional and multi-dimensional models demonstrates that MES-PIMD in principle offers a practical tool in either of the diabatic and adiabatic representations for studying exact quantum statistics of complex/large MES systems when the Born-Oppenheimer approximation, Condon approximation, and harmonic bath approximation are broken.

  12. Automated sampling assessment for molecular simulations using the effective sample size

    PubMed Central

    Zhang, Xin; Bhatt, Divesh; Zuckerman, Daniel M.

    2010-01-01

    To quantify the progress in the development of algorithms and forcefields used in molecular simulations, a general method for the assessment of the sampling quality is needed. Statistical mechanics principles suggest the populations of physical states characterize equilibrium sampling in a fundamental way. We therefore develop an approach for analyzing the variances in state populations, which quantifies the degree of sampling in terms of the effective sample size (ESS). The ESS estimates the number of statistically independent configurations contained in a simulated ensemble. The method is applicable to both traditional dynamics simulations as well as more modern (e.g., multi–canonical) approaches. Our procedure is tested in a variety of systems from toy models to atomistic protein simulations. We also introduce a simple automated procedure to obtain approximate physical states from dynamic trajectories: this allows sample–size estimation in systems for which physical states are not known in advance. PMID:21221418
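    The population-variance idea can be illustrated with a toy sketch (not the published algorithm): run several independent but internally correlated trajectories, record the fraction of frames assigned to one "physical state", and compare the observed spread of that fraction to the binomial variance expected from independent samples.

    ```python
    # Toy illustration of an effective-sample-size (ESS) estimate from the variance
    # of a state's population across independent repeats: ESS ~ p(1-p) / Var(p_hat).
    import numpy as np

    rng = np.random.default_rng(4)

    def state_population(trajectory, threshold=0.0):
        """Fraction of frames assigned to state A (order parameter below threshold)."""
        return np.mean(trajectory < threshold)

    def make_trajectory(n=5000, rho=0.95):
        """Fake correlated trajectory: an AR(1) process visiting two 'states'."""
        x = np.zeros(n)
        for t in range(1, n):
            x[t] = rho * x[t - 1] + rng.normal()
        return x

    p_hats = np.array([state_population(make_trajectory()) for _ in range(30)])
    p = p_hats.mean()
    ess = p * (1.0 - p) / p_hats.var(ddof=1)   # frames behaving as independent samples
    print(f"mean population {p:.2f}, ESS per trajectory ~ {ess:.0f} (of 5000 frames)")
    ```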

  13. A Complex Approach to UXO Discrimination: Combining Advanced EMI Forward Models and Statistical Signal Processing

    DTIC Science & Technology

    2012-01-01

    ...detection and discrimination at live-UXO sites. Namely, under this project first we developed and implemented advanced, physically complete forward EMI models such as the... Shubitidze of Sky Research and Dartmouth College conceived, implemented, and tested most of the approaches presented in this report. He developed...

  14. Consistency of extreme flood estimation approaches

    NASA Astrophysics Data System (ADS)

    Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf

    2017-04-01

    Estimates of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodical procedures for estimating low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in the different methods. In this study, the following three approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested for two different Swiss catchments. The results and some intermediate variables are used for assessing the potential strengths and weaknesses of each method, as well as for evaluating the consistency of these methods.
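    As a concrete illustration of the first, purely statistical branch, the short sketch below fits a GEV distribution to synthetic annual maximum discharges and reads off a low-probability quantile; the real Swiss series, the SCHADEX simulations, and the PMF estimation are outside its scope.

    ```python
    # Sketch of ordinary extreme value statistics: GEV fit to synthetic annual maxima.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(5)
    annual_maxima = rng.gumbel(loc=300.0, scale=80.0, size=60)   # 60 "years", m^3/s

    shape, loc, scale = genextreme.fit(annual_maxima)
    q1000 = genextreme.ppf(1.0 - 1.0 / 1000.0, shape, loc=loc, scale=scale)
    print(f"estimated 1000-year flood ~ {q1000:.0f} m^3/s")
    ```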

  15. Statistical physics of human beings in games: Controlled experiments

    NASA Astrophysics Data System (ADS)

    Liang, Yuan; Huang, Ji-Ping

    2014-07-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems.

  16. Bayesian statistics in radionuclide metrology: measurement of a decaying source

    NASA Astrophysics Data System (ADS)

    Bochud, François O.; Bailat, Claude J.; Laedermann, Jean-Pascal

    2007-08-01

    The most intuitive way of defining a probability is perhaps through the frequency at which it appears when a large number of trials are realized in identical conditions. The probability derived from the obtained histogram characterizes the so-called frequentist or conventional statistical approach. In this sense, probability is defined as a physical property of the observed system. By contrast, in Bayesian statistics, a probability is not a physical property or a directly observable quantity, but a degree of belief or an element of inference. The goal of this paper is to show how Bayesian statistics can be used in radionuclide metrology and what its advantages and disadvantages are compared with conventional statistics. This is performed through the example of an yttrium-90 source typically encountered in environmental surveillance measurement. Because of the very low activity of this kind of source and the small half-life of the radionuclide, this measurement takes several days, during which the source decays significantly. Several methods are proposed to compute simultaneously the number of unstable nuclei at a given reference time, the decay constant and the background. Asymptotically, all approaches give the same result. However, Bayesian statistics produces coherent estimates and confidence intervals in a much smaller number of measurements. Apart from the conceptual understanding of statistics, the main difficulty that could deter radionuclide metrologists from using Bayesian statistics is the complexity of the computation.
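    A toy version of the Bayesian treatment described here is sketched below: hourly counts from a decaying source plus background, a Poisson likelihood, flat priors, and a posterior evaluated on a grid. The decay constant is taken as known (half-life of roughly 64 h for yttrium-90) to keep the grid two-dimensional; all rates are made-up illustration values, not the paper's data.

    ```python
    # Toy Bayesian inference for a decaying source: grid posterior over the initial
    # source rate A0 and the background b, given Poisson counts in hourly bins.
    import numpy as np

    rng = np.random.default_rng(6)
    lam = np.log(2) / 64.0                   # decay constant, 1/h (half-life ~ 64 h)
    t = np.arange(120)                       # 120 hourly measurements
    true_a0, true_b = 5.0, 2.0               # source counts/h at t=0, background
    counts = rng.poisson(true_a0 * np.exp(-lam * t) + true_b)

    a0_grid = np.linspace(0.1, 15.0, 200)    # flat priors over a plausible range
    b_grid = np.linspace(0.1, 6.0, 200)
    A0, B = np.meshgrid(a0_grid, b_grid, indexing="ij")

    # log Poisson likelihood summed over time bins (constant term dropped)
    rate = A0[..., None] * np.exp(-lam * t) + B[..., None]
    loglike = np.sum(counts * np.log(rate) - rate, axis=-1)
    post = np.exp(loglike - loglike.max())
    post /= post.sum()

    a0_mean = np.sum(post * A0)
    b_mean = np.sum(post * B)
    print(f"posterior means: A0 ~ {a0_mean:.2f} counts/h, background ~ {b_mean:.2f}")
    ```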

  17. Statistical physics of vehicular traffic and some related systems

    NASA Astrophysics Data System (ADS)

    Chowdhury, Debashish; Santen, Ludger; Schadschneider, Andreas

    2000-05-01

    In the so-called “microscopic” models of vehicular traffic, attention is paid explicitly to each individual vehicle, each of which is represented by a “particle”; the nature of the “interactions” among these particles is determined by the way the vehicles influence each other's movement. Therefore, vehicular traffic, modeled as a system of interacting “particles” driven far from equilibrium, offers the possibility to study various fundamental aspects of truly nonequilibrium systems which are of current interest in statistical physics. Analytical as well as numerical techniques of statistical physics are being used to study these models to understand the rich variety of physical phenomena exhibited by vehicular traffic. Some of these phenomena, observed in vehicular traffic under different circumstances, include transitions from one dynamical phase to another, criticality and self-organized criticality, metastability and hysteresis, phase segregation, etc. In this critical review, written from the perspective of statistical physics, we explain the guiding principles behind all the main theoretical approaches. But we present detailed discussions on the results obtained mainly from the so-called “particle-hopping” models, particularly emphasizing those which have been formulated in recent years using the language of cellular automata.
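    The best-known particle-hopping cellular automaton of the kind reviewed here is the Nagel-Schreckenberg model; the minimal sketch below implements its four update rules (accelerate, brake to the headway, random braking, move) on a ring road with made-up parameter values.

    ```python
    # Minimal Nagel-Schreckenberg cellular automaton on a ring road.
    import numpy as np

    rng = np.random.default_rng(7)
    L, n_cars, v_max, p_slow = 100, 25, 5, 0.3

    pos = np.sort(rng.choice(L, size=n_cars, replace=False))
    vel = np.zeros(n_cars, dtype=int)

    for step in range(200):
        gaps = (np.roll(pos, -1) - pos - 1) % L        # empty cells ahead of each car
        vel = np.minimum(vel + 1, v_max)               # 1. acceleration
        vel = np.minimum(vel, gaps)                    # 2. brake to avoid collision
        vel = np.where(rng.random(n_cars) < p_slow,    # 3. random braking
                       np.maximum(vel - 1, 0), vel)
        pos = (pos + vel) % L                          # 4. movement on the ring road

    print(f"mean flow ~ {vel.mean() * n_cars / L:.3f} cars per cell per step")
    ```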

  18. What High School Physics Teachers Teach: Results from the 2012-13 Nationwide Survey of High School Physics Teachers. Focus On

    ERIC Educational Resources Information Center

    Tyler, John; White, Susan

    2014-01-01

    During the 2012-13 academic year, the Statistical Research Center (SRC) collected data from a representative national sample of over 3,500 public and private high schools across the U.S. to inquire about physics availability and offerings. This report describes their findings. SRC takes two different approaches to describe the characteristics of…

  19. Rational design of vaccine targets and strategies for HIV: a crossroad of statistical physics, biology, and medicine.

    PubMed

    Chakraborty, Arup K; Barton, John P

    2017-03-01

    Vaccination has saved more lives than any other medical procedure. Pathogens have now evolved that have not succumbed to vaccination using the empirical paradigms pioneered by Pasteur and Jenner. Vaccine design strategies that are based on a mechanistic understanding of the pertinent immunology and virology are required to confront and eliminate these scourges. In this perspective, we describe just a few examples of work aimed to achieve this goal by bringing together approaches from statistical physics with biology and clinical research.

  20. Computing physical properties with quantum Monte Carlo methods with statistical fluctuations independent of system size.

    PubMed

    Assaraf, Roland

    2014-12-01

    We show that the recently proposed correlated sampling without reweighting procedure extends the locality (asymptotic independence of the system size) of a physical property to the statistical fluctuations of its estimator. This makes the approach potentially vastly more efficient for computing space-localized properties in large systems compared with standard correlated methods. A proof is given for a large collection of noninteracting fragments. Calculations on hydrogen chains suggest that this behavior holds not only for systems displaying short-range correlations, but also for systems with long-range correlations.

  1. Rational design of vaccine targets and strategies for HIV: a crossroad of statistical physics, biology, and medicine

    NASA Astrophysics Data System (ADS)

    Chakraborty, Arup K.; Barton, John P.

    2017-03-01

    Vaccination has saved more lives than any other medical procedure. Pathogens have now evolved that have not succumbed to vaccination using the empirical paradigms pioneered by Pasteur and Jenner. Vaccine design strategies that are based on a mechanistic understanding of the pertinent immunology and virology are required to confront and eliminate these scourges. In this perspective, we describe just a few examples of work aimed to achieve this goal by bringing together approaches from statistical physics with biology and clinical research.

  2. Approaching Bose-Einstein Condensation

    ERIC Educational Resources Information Center

    Ferrari, Loris

    2011-01-01

    Bose-Einstein condensation (BEC) is discussed at the level of an advanced course of statistical thermodynamics, clarifying some formal and physical aspects that are usually not covered by the standard pedagogical literature. The non-conventional approach adopted starts by showing that the continuum limit, in certain cases, cancels out the crucial…

  3. A cloud and radiation model-based algorithm for rainfall retrieval from SSM/I multispectral microwave measurements

    NASA Technical Reports Server (NTRS)

    Xiang, Xuwu; Smith, Eric A.; Tripoli, Gregory J.

    1992-01-01

    A hybrid statistical-physical retrieval scheme is explored which combines a statistical approach with an approach based on the development of cloud-radiation models designed to simulate precipitating atmospheres. The algorithm employs the detailed microphysical information from a cloud model as input to a radiative transfer model which generates a cloud-radiation model database. Statistical procedures are then invoked to objectively generate an initial guess composite profile data set from the database. The retrieval algorithm has been tested for a tropical typhoon case using Special Sensor Microwave/Imager (SSM/I) data and has shown satisfactory results.

  4. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    PubMed

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis is employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied on single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrate a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
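
    As a minimal illustration of the information-theoretic quantities mentioned in this record, the sketch below computes the Shannon entropy of a methylation-level distribution and the Jensen-Shannon distance between two such distributions; the distributions are hypothetical and the Ising-model machinery of the paper is not reproduced.

    ```python
    import numpy as np

    def shannon_entropy(p):
        """Shannon entropy (bits) of a discrete probability distribution p."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def js_distance(p, q):
        """Jensen-Shannon distance: square root of the JS divergence (bits)."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        m = 0.5 * (p + q)
        jsd = shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))
        return np.sqrt(jsd)

    # hypothetical methylation-level distributions (fractions of reads at levels 0, 1/3, 2/3, 1)
    normal = [0.70, 0.10, 0.05, 0.15]
    tumour = [0.30, 0.25, 0.20, 0.25]
    print("entropy normal: %.3f bits" % shannon_entropy(normal))
    print("entropy tumour: %.3f bits" % shannon_entropy(tumour))
    print("JS distance   : %.3f" % js_distance(normal, tumour))
    ```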

  5. Physics teaching in the medical schools of Taiwan.

    PubMed

    Hsu, Jiann-wien; Hsu, Roy

    2012-02-01

    We describe and analyze the statistics of general physics and laboratory courses in the medical schools of Taiwan. We explore the development of the general physics curriculum for medical students of Taiwan. Also, an approach to designing a general physics course in combination with its application to medical sciences is proposed. We hope this preliminary study can provide a useful reference for physics colleagues in the medical schools of Taiwan to revolutionize the dynamics of teaching physics to the medical students of Taiwan.

  6. Observing fermionic statistics with photons in arbitrary processes

    PubMed Central

    Matthews, Jonathan C. F.; Poulios, Konstantinos; Meinecke, Jasmin D. A.; Politi, Alberto; Peruzzo, Alberto; Ismail, Nur; Wörhoff, Kerstin; Thompson, Mark G.; O'Brien, Jeremy L.

    2013-01-01

    Quantum mechanics defines two classes of particles, bosons and fermions, whose exchange statistics fundamentally dictate quantum dynamics. Here we develop a scheme that uses entanglement to directly observe the correlated detection statistics of any number of fermions in any physical process. This approach relies on sending each of the entangled particles through identical copies of the process; by controlling a single phase parameter in the entangled state, the correlated detection statistics can be continuously tuned between bosonic and fermionic statistics. We implement this scheme via two entangled photons shared across the polarisation modes of a single photonic chip to directly mimic the fermion, boson and intermediate behaviour of two particles undergoing a continuous time quantum walk. The ability to simulate fermions with photons is likely to have applications for verifying boson scattering and for observing particle correlations in analogue simulation using any physical platform that can prepare the entangled state prescribed here. PMID:23531788

  7. A Humanistic Approach to Emotional Risk Management.

    ERIC Educational Resources Information Center

    Rubendall, Robert L.

    Adventure programs attempt to control or limit injuries in high-risk programming. This risk management has concentrated on the physical safety of participants at the expense of emotional and developmental security. In the zeal for accident-free statistics, a highly controlled, directive approach is created that treats individuals according to a…

  8. Identifying Student Resources in Reasoning about Entropy and the Approach to Thermal Equilibrium

    ERIC Educational Resources Information Center

    Loverude, Michael

    2015-01-01

    As part of an ongoing project to examine student learning in upper-division courses in thermal and statistical physics, we have examined student reasoning about entropy and the second law of thermodynamics. We have examined reasoning in terms of heat transfer, entropy maximization, and statistical treatments of multiplicity and probability. In…

  9. A novel approach for introducing cloud spatial structure into cloud radiative transfer parameterizations

    NASA Astrophysics Data System (ADS)

    Huang, Dong; Liu, Yangang

    2014-12-01

    Subgrid-scale variability is one of the main reasons why parameterizations are needed in large-scale models. Although some parameterizations started to address the issue of subgrid variability by introducing a subgrid probability distribution function for relevant quantities, the spatial structure has been typically ignored and thus the subgrid-scale interactions cannot be accounted for physically. Here we present a new statistical-physics-like approach whereby the spatial autocorrelation function can be used to physically capture the net effects of subgrid cloud interaction with radiation. The new approach is able to faithfully reproduce the Monte Carlo 3D simulation results with several orders less computational cost, allowing for more realistic representation of cloud radiation interactions in large-scale models.
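
    A minimal sketch of the key ingredient, the spatial autocorrelation function of a two-dimensional field, computed via the Wiener-Khinchin relation (power spectrum of the field transformed back to lag space). The synthetic smoothed-noise field stands in for a cloud water field and is not from the paper.

    ```python
    import numpy as np

    def autocorrelation_2d(field):
        """Normalized spatial autocorrelation of a 2D field via FFT (periodic boundaries)."""
        f = field - field.mean()
        power = np.abs(np.fft.fft2(f)) ** 2
        corr = np.fft.ifft2(power).real
        return corr / corr[0, 0]

    # hypothetical cloud field: smooth random blobs on a 128x128 grid
    rng = np.random.default_rng(1)
    noise = rng.normal(size=(128, 128))
    kx = np.fft.fftfreq(128)[:, None]
    ky = np.fft.fftfreq(128)[None, :]
    smooth = np.fft.ifft2(np.fft.fft2(noise) * np.exp(-(kx**2 + ky**2) / 0.01)).real

    acf = autocorrelation_2d(smooth)
    print("correlation at lags (0,0), (1,0), (5,0):", acf[0, 0], acf[1, 0], acf[5, 0])
    ```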

  10. Combining Statistics and Physics to Improve Climate Downscaling

    NASA Astrophysics Data System (ADS)

    Gutmann, E. D.; Eidhammer, T.; Arnold, J.; Nowak, K.; Clark, M. P.

    2017-12-01

    Getting useful information from climate models is an ongoing problem that has plagued climate science and hydrologic prediction for decades. While it is possible to develop statistical corrections for climate models that mimic current climate almost perfectly, this does not necessarily guarantee that future changes are portrayed correctly. In contrast, convection permitting regional climate models (RCMs) have begun to provide an excellent representation of the regional climate system purely from first principles, providing greater confidence in their change signal. However, the computational cost of such RCMs prohibits the generation of ensembles of simulations or long time periods, thus limiting their applicability for hydrologic applications. Here we discuss a new approach combining statistical corrections with physical relationships for a modest computational cost. We have developed the Intermediate Complexity Atmospheric Research model (ICAR) to provide a climate and weather downscaling option that is based primarily on physics for a fraction of the computational requirements of a traditional regional climate model. ICAR also enables the incorporation of statistical adjustments directly within the model. We demonstrate that applying even simple corrections to precipitation while the model is running can improve the simulation of land atmosphere feedbacks in ICAR. For example, by incorporating statistical corrections earlier in the modeling chain, we permit the model physics to better represent the effect of mountain snowpack on air temperature changes.

  11. Generalized statistical mechanics approaches to earthquakes and tectonics.

    PubMed

    Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios

    2016-12-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.

  12. Generalized statistical mechanics approaches to earthquakes and tectonics

    PubMed Central

    Papadakis, Giorgos; Michas, Georgios

    2016-01-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes. PMID:28119548

  13. A Statistical Approach to Characterize and Detect degradation Within the Barabasi-Albert Network

    DTIC Science & Technology

    2016-09-01

    The Barabási-Albert network exhibits the scale-free property and is highly desired for modeling social, physical, and biological phenomena [6, 70, 16, 107]. It was noted that there has also been considerable effort in applying various types of graph entropies in the field of network physics.

  14. Gambling as a teaching aid in the introductory physics laboratory

    NASA Astrophysics Data System (ADS)

    Horodynski-Matsushigue, L. B.; Pascholati, P. R.; Vanin, V. R.; Dias, J. F.; Yoneama, M.-L.; Siqueira, P. T. D.; Amaku, M.; Duarte, J. L. M.

    1998-07-01

    Dice throwing is used to illustrate relevant concepts of the statistical theory of uncertainties, in particular the meaning of a limiting distribution, the standard deviation, and the standard deviation of the mean. It is an important part in a sequence of especially programmed laboratory activities, developed for freshmen, at the Institute of Physics of the University of São Paulo. It is shown how this activity is employed within a constructive teaching approach, which aims at a growing understanding of the measuring processes and of the fundamentals of correct statistical handling of experimental data.
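
    A minimal version of the exercise described above: simulate repeated throws of several dice and compare the sample mean, the standard deviation of a single throw, and the standard deviation of the mean with their theoretical values. The numbers of throws and dice are arbitrary illustrative choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_throws, n_dice = 400, 5          # hypothetical class exercise: 400 throws of 5 dice

    sums = rng.integers(1, 7, size=(n_throws, n_dice)).sum(axis=1)
    mean = sums.mean()
    std = sums.std(ddof=1)                    # standard deviation of a single throw
    sdom = std / np.sqrt(n_throws)            # standard deviation of the mean

    print(f"mean of {n_dice}-dice sum  : {mean:.2f} (theory 17.5)")
    print(f"standard deviation        : {std:.2f} (theory {np.sqrt(n_dice * 35 / 12):.2f})")
    print(f"std. deviation of the mean: {sdom:.3f}")
    ```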

  15. From Mechanical Motion to Brownian Motion, Thermodynamics and Particle Transport Theory

    ERIC Educational Resources Information Center

    Bringuier, E.

    2008-01-01

    The motion of a particle in a medium is dealt with either as a problem of mechanics or as a transport process in non-equilibrium statistical physics. The two kinds of approach are often unrelated as they are taught in different textbooks. The aim of this paper is to highlight the link between the mechanical and statistical treatments of particle…

  16. Combination of statistical and physically based methods to assess shallow slide susceptibility at the basin scale

    NASA Astrophysics Data System (ADS)

    Oliveira, Sérgio C.; Zêzere, José L.; Lajas, Sara; Melo, Raquel

    2017-07-01

    Approaches used to assess shallow slide susceptibility at the basin scale are conceptually different depending on the use of statistical or physically based methods. The former are based on the assumption that the same causes are more likely to produce the same effects, whereas the latter are based on the comparison between forces which tend to promote movement along the slope and the counteracting forces that are resistant to motion. Within this general framework, this work tests two hypotheses: (i) although conceptually and methodologically distinct, the statistical and deterministic methods generate similar shallow slide susceptibility results regarding the model's predictive capacity and spatial agreement; and (ii) the combination of shallow slide susceptibility maps obtained with statistical and physically based methods, for the same study area, generate a more reliable susceptibility model for shallow slide occurrence. These hypotheses were tested at a small test site (13.9 km2) located north of Lisbon (Portugal), using a statistical method (the information value method, IV) and a physically based method (the infinite slope method, IS). The landslide susceptibility maps produced with the statistical and deterministic methods were combined into a new landslide susceptibility map. The latter was based on a set of integration rules defined by the cross tabulation of the susceptibility classes of both maps and analysis of the corresponding contingency tables. The results demonstrate a higher predictive capacity of the new shallow slide susceptibility map, which combines the independent results obtained with statistical and physically based models. Moreover, the combination of the two models allowed the identification of areas where the results of the information value and the infinite slope methods are contradictory. Thus, these areas were classified as uncertain and deserve additional investigation at a more detailed scale.
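
    The two ingredients can be sketched in a few lines. The infinite-slope factor of safety below uses a standard textbook form of the equation, and the information value is the log ratio of the landslide density in a predictor class to the overall landslide density; all input values are hypothetical and the actual parameterization used in the study may differ.

    ```python
    import numpy as np

    def infinite_slope_fs(c, phi_deg, gamma, gamma_w, z, beta_deg, m):
        """Infinite-slope factor of safety (textbook form, illustrative only).
        c: cohesion [kPa], phi: friction angle [deg], gamma: soil unit weight [kN/m3],
        gamma_w: water unit weight, z: soil depth [m], beta: slope angle [deg], m: saturation ratio."""
        beta, phi = np.radians(beta_deg), np.radians(phi_deg)
        num = c + (gamma - m * gamma_w) * z * np.cos(beta) ** 2 * np.tan(phi)
        den = gamma * z * np.sin(beta) * np.cos(beta)
        return num / den

    def information_value(slides_in_class, pixels_in_class, slides_total, pixels_total):
        """Information value of a predictor class: log ratio of class to prior landslide density."""
        return np.log((slides_in_class / pixels_in_class) / (slides_total / pixels_total))

    print("FS:", round(infinite_slope_fs(c=5.0, phi_deg=32, gamma=18.0, gamma_w=9.81,
                                         z=1.5, beta_deg=25, m=0.5), 2))
    print("IV:", round(information_value(120, 4000, 600, 139000), 2))
    ```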

  17. Variational Approach in the Theory of Liquid-Crystal State

    NASA Astrophysics Data System (ADS)

    Gevorkyan, E. V.

    2018-03-01

    The variational calculus of Leonhard Euler is a foundation of modern mathematics and theoretical physics. The efficiency of the variational approach in the statistical theory of the liquid-crystal state, and more generally in condensed-state theory, is shown. In particular, the developed approach allows effective pair interactions to be introduced correctly and simple models of liquid crystals to be optimized with the help of realistic intermolecular potentials.

  18. Learning physical descriptors for materials science by compressed sensing

    NASA Astrophysics Data System (ADS)

    Ghiringhelli, Luca M.; Vybiral, Jan; Ahmetcik, Emre; Ouyang, Runhai; Levchenko, Sergey V.; Draxl, Claudia; Scheffler, Matthias

    2017-02-01

    The availability of big data in materials science offers new routes for analyzing materials properties and functions and achieving scientific understanding. Finding structure in these data that is not directly visible by standard tools and exploitation of the scientific information requires new and dedicated methodology based on approaches from statistical learning, compressed sensing, and other recent methods from applied mathematics, computer science, statistics, signal processing, and information science. In this paper, we explain and demonstrate a compressed-sensing based methodology for feature selection, specifically for discovering physical descriptors, i.e., physical parameters that describe the material and its properties of interest, and associated equations that explicitly and quantitatively describe those relevant properties. As showcase application and proof of concept, we describe how to build a physical model for the quantitative prediction of the crystal structure of binary compound semiconductors.

  19. A novel approach for introducing cloud spatial structure into cloud radiative transfer parameterizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Dong; Liu, Yangang

    2014-12-18

    Subgrid-scale variability is one of the main reasons why parameterizations are needed in large-scale models. Although some parameterizations started to address the issue of subgrid variability by introducing a subgrid probability distribution function for relevant quantities, the spatial structure has been typically ignored and thus the subgrid-scale interactions cannot be accounted for physically. Here we present a new statistical-physics-like approach whereby the spatial autocorrelation function can be used to physically capture the net effects of subgrid cloud interaction with radiation. The new approach is able to faithfully reproduce the Monte Carlo 3D simulation results with several orders less computational cost, allowing for more realistic representation of cloud radiation interactions in large-scale models.

  20. Nonclassical light revealed by the joint statistics of simultaneous measurements.

    PubMed

    Luis, Alfredo

    2016-04-15

    Nonclassicality cannot be a single-observable property, since the statistics of any quantum observable is compatible with classical physics. We develop a general procedure to reveal nonclassical behavior of light states from the joint statistics arising in the practical measurement of multiple observables. Beside embracing previous approaches, this protocol can disclose nonclassical features for standard examples of classical-like behavior, such as SU(2) and Glauber coherent states. When combined with other criteria, this would imply that every light state is nonclassical.

  1. The Value of Physical Examination: A New Conceptual Framework.

    PubMed

    Zaman, Junaid; Verghese, Abraham; Elder, Andrew

    2016-12-01

    The physical examination defines medical practice, yet its role is being questioned increasingly, with statistical comparisons of diagnostic accuracy often the sole metric used against newer technologies. We set out to highlight seven ways in which the physical examination has value beyond diagnostic accuracy to reaffirm its place in the core skills of a physician and guide future research, teaching, and curriculum design. We show that this more comprehensive view of the physical examination's "utility", beyond that of reaching a diagnosis, can be beneficial to both doctor and patient.

  2. Stochastic analysis of experimentally determined physical parameters of HPMC:NiCl₂ polymer composites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thejas, Urs G.; Somashekar, R., E-mail: rs@physics.uni-mysore.ac.in; Sangappa, Y.

    A stochastic approach to explaining the variation of physical parameters in polymer composites is discussed in this study. We give a statistical model that derives the characteristic variation of physical parameters as a function of dopant concentration. Results of X-ray diffraction and conductivity studies have been used to validate this function, which can be extended to any of the physical parameters and polymer composites. For this study we considered polymer composites of HPMC doped with various concentrations of nickel chloride.

  3. The mean time-limited crash rate of stock price

    NASA Astrophysics Data System (ADS)

    Li, Yun-Xian; Li, Jiang-Cheng; Yang, Ai-Jun; Tang, Nian-Sheng

    2017-05-01

    In this article we investigate the occurrence of stock market crashes over an economic cycle. A Bayesian approach, the Heston model and a statistical-physics method are combined. Specifically, the Heston model and an effective potential are employed to describe the dynamic changes of the stock price. The Bayesian approach is utilized to estimate the Heston model's unknown parameters. The statistical-physics method is used to investigate the occurrence of a stock market crash by calculating the mean time-limited crash rate. Real financial data from the Shanghai Composite Index are analyzed with the proposed methods. The mean time-limited crash rate of the stock price is used to describe the occurrence of a stock market crash over an economic cycle. Both monotonic and nonmonotonic behavior of the mean time-limited crash rate versus the stock volatility is observed for various cross-correlation coefficients between volatility and price. A minimum in the occurrence of stock market crashes, matching an optimal volatility, is also found.
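
    A rough sketch of the price/volatility dynamics referred to above: an Euler discretization of the Heston stochastic volatility model with correlated Brownian increments. Parameter values are illustrative only, and the effective potential, the Bayesian estimation and the crash-rate calculation of the paper are not reproduced here.

    ```python
    import numpy as np

    def simulate_heston(s0=100.0, v0=0.04, mu=0.05, kappa=2.0, theta=0.04,
                        xi=0.3, rho=-0.5, dt=1 / 252, n_steps=252, seed=0):
        """Euler scheme for the Heston model:
           dS = mu*S*dt + sqrt(v)*S*dW1,  dv = kappa*(theta - v)*dt + xi*sqrt(v)*dW2,
           with corr(dW1, dW2) = rho. Parameter values are illustrative only."""
        rng = np.random.default_rng(seed)
        s, v = np.empty(n_steps + 1), np.empty(n_steps + 1)
        s[0], v[0] = s0, v0
        for t in range(n_steps):
            z1 = rng.standard_normal()
            z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal()
            v[t + 1] = max(v[t] + kappa * (theta - v[t]) * dt + xi * np.sqrt(v[t] * dt) * z2, 0.0)
            s[t + 1] = s[t] * (1 + mu * dt + np.sqrt(v[t] * dt) * z1)
        return s, v

    prices, variances = simulate_heston()
    print("final price %.2f, mean variance %.4f" % (prices[-1], variances.mean()))
    ```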

  4. Vibroacoustic optimization using a statistical energy analysis model

    NASA Astrophysics Data System (ADS)

    Culla, Antonio; D'Ambrogio, Walter; Fregolent, Annalisa; Milana, Silvia

    2016-08-01

    In this paper, an optimization technique for medium-high frequency dynamic problems based on Statistical Energy Analysis (SEA) method is presented. Using a SEA model, the subsystem energies are controlled by internal loss factors (ILF) and coupling loss factors (CLF), which in turn depend on the physical parameters of the subsystems. A preliminary sensitivity analysis of subsystem energy to CLF's is performed to select CLF's that are most effective on subsystem energies. Since the injected power depends not only on the external loads but on the physical parameters of the subsystems as well, it must be taken into account under certain conditions. This is accomplished in the optimization procedure, where approximate relationships between CLF's, injected power and physical parameters are derived. The approach is applied on a typical aeronautical structure: the cabin of a helicopter.

  5. Novel approaches to the study of particle dark matter in astrophysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Argüelles, C. R., E-mail: carlos.arguelles@icranet.org; Ruffini, R., E-mail: ruffini@icra.it; Rueda, J. A., E-mail: jorge.rueda@icra.it

    A deep understanding of the role of dark matter in the different astrophysical scenarios of the local Universe, such as galaxies, represents a crucial step toward describing in a more consistent way the role of dark matter in cosmology. This kind of study requires the interconnection between particle physics within and beyond the Standard Model, and fundamental physics such as thermodynamics and statistics, within a fully relativistic treatment of gravity. After giving a comprehensive summary of the different types of dark matter and their role in astrophysics, we discuss the recent efforts in describing the distribution of dark matter in the center and halo of galaxies from first principles such as gravitational interactions, quantum statistics and particle physics, and its implications for the observations.

  6. Statistical approaches to account for missing values in accelerometer data: Applications to modeling physical activity.

    PubMed

    Yue Xu, Selene; Nelson, Sandahl; Kerr, Jacqueline; Godbole, Suneeta; Patterson, Ruth; Merchant, Gina; Abramson, Ian; Staudenmayer, John; Natarajan, Loki

    2018-04-01

    Physical inactivity is a recognized risk factor for many chronic diseases. Accelerometers are increasingly used as an objective means to measure daily physical activity. One challenge in using these devices is missing data due to device nonwear. We used a well-characterized cohort of 333 overweight postmenopausal breast cancer survivors to examine missing data patterns of accelerometer outputs over the day. Based on these observed missingness patterns, we created pseudo-simulated datasets with realistic missing data patterns. We developed statistical methods to design imputation and variance weighting algorithms to account for missing data effects when fitting regression models. Bias and precision of each method were evaluated and compared. Our results indicated that not accounting for missing data in the analysis yielded unstable estimates in the regression analysis. Incorporating variance weights and/or subject-level imputation improved precision by >50%, compared to ignoring missing data. We recommend that these simple, easy-to-implement statistical tools be used to improve analysis of accelerometer data.
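
    A toy version of the subject-level imputation and wear-time weighting idea, assuming minute-level activity counts with nonwear coded as NaN; the data, weighting scheme and column names are invented for illustration and are simpler than the algorithms developed in the paper.

    ```python
    import numpy as np
    import pandas as pd

    # hypothetical minute-level activity counts for 3 subjects, with nonwear gaps (NaN)
    rng = np.random.default_rng(3)
    minutes = pd.date_range("2018-04-01 08:00", periods=200, freq="min")
    data = pd.DataFrame({
        "subject": np.repeat([1, 2, 3], 200),
        "time": np.tile(minutes, 3),
        "counts": rng.poisson(300, 600).astype(float),
    })
    data.loc[rng.random(600) < 0.15, "counts"] = np.nan   # ~15% nonwear

    # subject-level imputation: fill each subject's missing minutes with that subject's mean
    data["counts_imputed"] = data.groupby("subject")["counts"].transform(
        lambda s: s.fillna(s.mean()))

    # simple weight: subjects with more observed wear time get more weight downstream
    weights = data.groupby("subject")["counts"].apply(lambda s: s.notna().mean())
    print(data["counts"].isna().sum(), "missing minutes imputed")
    print("per-subject wear-time weights:\n", weights)
    ```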

  7. On entropy, financial markets and minority games

    NASA Astrophysics Data System (ADS)

    Zapart, Christopher A.

    2009-04-01

    The paper builds upon an earlier statistical analysis of financial time series with Shannon information entropy, published in [L. Molgedey, W. Ebeling, Local order, entropy and predictability of financial time series, European Physical Journal B-Condensed Matter and Complex Systems 15/4 (2000) 733-737]. A novel generic procedure is proposed for making multistep-ahead predictions of time series by building a statistical model of entropy. The approach is first demonstrated on the chaotic Mackey-Glass time series and later applied to Japanese Yen/US dollar intraday currency data. The paper also reinterprets Minority Games [E. Moro, The minority game: An introductory guide, Advances in Condensed Matter and Statistical Physics (2004)] within the context of physical entropy, and uses models derived from minority game theory as a tool for measuring the entropy of a model in response to time series. This entropy conditional upon a model is subsequently used in place of information-theoretic entropy in the proposed multistep prediction algorithm.

  8. Non-equilibrium statistical mechanics theory for the large scales of geophysical flows

    NASA Astrophysics Data System (ADS)

    Eric, S.; Bouchet, F.

    2010-12-01

    The aim of any theory of turbulence is to understand the statistical properties of the velocity field. As a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject, from the statistical mechanics basis of the theory up to applications to Jupiter’s troposphere and ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations will be discussed. We also present recent results for non-equilibrium situations, for which forces and dissipation are in a statistical balance. As an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed. F. Bouchet and E. Simonnet, Random Changes of Flow Topology in Two-Dimensional and Geophysical Turbulence, Physical Review Letters 102 (2009), no. 9, 094504-+. F. Bouchet and J. Sommeria, Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures, Journal of Fluid Mechanics 464 (2002), 165-207. A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO. F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, submitted to Physics Reports. Non-equilibrium phase transitions for the 2D Navier-Stokes equations with stochastic forces (time series and probability density functions (PDFs) of the modulus of the largest scale Fourier component, showing bistability between dipole and unidirectional flows). This bistability is predicted by statistical mechanics.

  9. Theoretical approaches to the steady-state statistical physics of interacting dissipative units

    NASA Astrophysics Data System (ADS)

    Bertin, Eric

    2017-02-01

    The aim of this review is to provide a concise overview of some of the generic approaches that have been developed to deal with the statistical description of large systems of interacting dissipative ‘units’. The latter notion includes, e.g. inelastic grains, active or self-propelled particles, bubbles in a foam, low-dimensional dynamical systems like driven oscillators, or even spatially extended modes like Fourier modes of the velocity field in a fluid. We first review methods based on the statistical properties of a single unit, starting with elementary mean-field approximations, either static or dynamic, that describe a unit embedded in a ‘self-consistent’ environment. We then discuss how this basic mean-field approach can be extended to account for spatial dependences, in the form of space-dependent mean-field Fokker-Planck equations, for example. We also briefly review the use of kinetic theory in the framework of the Boltzmann equation, which is an appropriate description for dilute systems. We then turn to descriptions in terms of the full N-body distribution, starting from exact solutions of one-dimensional models, using a matrix-product ansatz method when correlations are present. Since exactly solvable models are scarce, we also present some approximation methods which can be used to determine the N-body distribution in a large system of dissipative units. These methods include the Edwards approach for dense granular matter and the approximate treatment of multiparticle Langevin equations with colored noise, which models systems of self-propelled particles. Throughout this review, emphasis is put on methodological aspects of the statistical modeling and on formal similarities between different physical problems, rather than on the specific behavior of a given system.

  10. Fermi liquid, clustering, and structure factor in dilute warm nuclear matter

    NASA Astrophysics Data System (ADS)

    Röpke, G.; Voskresensky, D. N.; Kryukov, I. A.; Blaschke, D.

    2018-02-01

    Properties of nuclear systems at subsaturation densities can be obtained from different approaches. We demonstrate the use of the density autocorrelation function which is related to the isothermal compressibility and, after integration, to the equation of state. This way we connect the Landau Fermi liquid theory well elaborated in nuclear physics with the approaches to dilute nuclear matter describing cluster formation. A quantum statistical approach is presented, based on the cluster decomposition of the polarization function. The fundamental quantity to be calculated is the dynamic structure factor. Comparing with the Landau Fermi liquid theory which is reproduced in lowest approximation, the account of bound state formation and continuum correlations gives the correct low-density result as described by the second virial coefficient and by the mass action law (nuclear statistical equilibrium). Going to higher densities, the inclusion of medium effects is more involved compared with other quantum statistical approaches, but the relation to the Landau Fermi liquid theory gives a promising approach to describe not only thermodynamic but also collective excitations and non-equilibrium properties of nuclear systems in a wide region of the phase diagram.

  11. Determining Functional Reliability of Pyrotechnic Mechanical Devices

    NASA Technical Reports Server (NTRS)

    Bement, Laurence J.; Multhaup, Herbert A.

    1997-01-01

    This paper describes a new approach for evaluating mechanical performance and predicting the mechanical functional reliability of pyrotechnic devices. Not included are other possible failure modes, such as the initiation of the pyrotechnic energy source. The requirement of hundreds or thousands of consecutive, successful tests on identical components for reliability predictions, using the generally accepted go/no-go statistical approach, routinely ignores the physics of failure. The approach described in this paper begins with measuring, understanding and controlling mechanical performance variables. Then, the energy required to accomplish the function is compared to that delivered by the pyrotechnic energy source to determine mechanical functional margin. Finally, the data collected in establishing functional margin is analyzed to predict mechanical functional reliability, using small-sample statistics. A careful application of this approach can provide considerable cost improvements and understanding over that of go/no-go statistics. Performance and the effects of variables can be defined, and reliability predictions can be made by evaluating 20 or fewer units. The application of this approach to a pin puller used on a successful NASA mission is provided as an example.
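
    A hedged sketch of the margin idea: compare the energy delivered by the source with the energy required by the mechanism, express the margin in combined standard deviations, and convert it into a normal-theory failure probability from a small sample. All numbers are fabricated for illustration and the paper's actual statistical treatment may differ.

    ```python
    import numpy as np
    from scipy import stats

    # hypothetical small-sample test data (energies in joules)
    energy_required  = np.array([8.1, 8.4, 7.9, 8.6, 8.2, 8.0, 8.3])        # to do the job
    energy_delivered = np.array([14.9, 15.6, 15.1, 16.0, 15.3, 15.8, 15.2])  # by the source

    margin = energy_delivered.mean() - energy_required.mean()
    sigma = np.sqrt(energy_delivered.var(ddof=1) + energy_required.var(ddof=1))

    # margin expressed in combined standard deviations, and a normal-theory failure probability
    k = margin / sigma
    p_fail = stats.norm.sf(k)
    print(f"functional margin = {margin:.2f} J, k = {k:.1f} sigma, P(failure) ~ {p_fail:.2e}")
    ```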

  12. Inverse statistical physics of protein sequences: a key issues review.

    PubMed

    Cocco, Simona; Feinauer, Christoph; Figliuzzi, Matteo; Monasson, Rémi; Weigt, Martin

    2018-03-01

    In the course of evolution, proteins undergo important changes in their amino acid sequences, while their three-dimensional folded structure and their biological function remain remarkably conserved. Thanks to modern sequencing techniques, sequence data accumulate at unprecedented pace. This provides large sets of so-called homologous, i.e. evolutionarily related protein sequences, to which methods of inverse statistical physics can be applied. Using sequence data as the basis for the inference of Boltzmann distributions from samples of microscopic configurations or observables, it is possible to extract information about evolutionary constraints and thus protein function and structure. Here we give an overview over some biologically important questions, and how statistical-mechanics inspired modeling approaches can help to answer them. Finally, we discuss some open questions, which we expect to be addressed over the next years.

  13. Inverse statistical physics of protein sequences: a key issues review

    NASA Astrophysics Data System (ADS)

    Cocco, Simona; Feinauer, Christoph; Figliuzzi, Matteo; Monasson, Rémi; Weigt, Martin

    2018-03-01

    In the course of evolution, proteins undergo important changes in their amino acid sequences, while their three-dimensional folded structure and their biological function remain remarkably conserved. Thanks to modern sequencing techniques, sequence data accumulate at unprecedented pace. This provides large sets of so-called homologous, i.e. evolutionarily related protein sequences, to which methods of inverse statistical physics can be applied. Using sequence data as the basis for the inference of Boltzmann distributions from samples of microscopic configurations or observables, it is possible to extract information about evolutionary constraints and thus protein function and structure. Here we give an overview over some biologically important questions, and how statistical-mechanics inspired modeling approaches can help to answer them. Finally, we discuss some open questions, which we expect to be addressed over the next years.

  14. Renormalization Group Tutorial

    NASA Technical Reports Server (NTRS)

    Bell, Thomas L.

    2004-01-01

    Complex physical systems sometimes have statistical behavior characterized by power-law dependence on the parameters of the system and spatial variability with no particular characteristic scale as the parameters approach critical values. The renormalization group (RG) approach was developed in the fields of statistical mechanics and quantum field theory to derive quantitative predictions of such behavior in cases where conventional methods of analysis fail. Techniques based on these ideas have since been extended to treat problems in many different fields, and in particular, the behavior of turbulent fluids. This lecture will describe a relatively simple but nontrivial example of the RG approach applied to the diffusion of photons out of a stellar medium when the photons have wavelengths near that of an emission line of atoms in the medium.
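
    For a concrete flavor of a single RG step (a standard textbook example rather than the stellar-diffusion problem of the lecture), the sketch below iterates the real-space decimation recursion for the zero-field one-dimensional Ising chain, tanh K' = tanh^2 K, and shows the coupling flowing to the trivial fixed point.

    ```python
    import numpy as np

    def decimate(K):
        """One real-space RG step for the zero-field 1D Ising chain:
        summing out every other spin gives tanh K' = tanh(K)**2,
        equivalently K' = 0.5 * ln(cosh(2K))."""
        return np.arctanh(np.tanh(K) ** 2)

    K = 1.5   # dimensionless coupling J/(kT), illustrative starting value
    for step in range(8):
        print(f"step {step}: K = {K:.6f}")
        K = decimate(K)
    # the coupling flows to the trivial fixed point K* = 0: no finite-T transition in 1D
    ```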

  15. Statistical significance of trace evidence matches using independent physicochemical measurements

    NASA Astrophysics Data System (ADS)

    Almirall, Jose R.; Cole, Michael; Furton, Kenneth G.; Gettinby, George

    1997-02-01

    A statistical approach to the significance of glass evidence is proposed using independent physicochemical measurements and chemometrics. Traditional interpretation of the significance of trace evidence matches or exclusions relies on qualitative descriptors such as 'indistinguishable from,' 'consistent with,' 'similar to' etc. By performing physical and chemical measurements which are independent of one another, the significance of object exclusions or matches can be evaluated statistically. One of the problems with this approach is that the human brain is excellent at recognizing and classifying patterns and shapes but performs less well when that object is represented by a numerical list of attributes. Chemometrics can be employed to group similar objects using clustering algorithms and provide statistical significance in a quantitative manner. This approach is enhanced when population databases exist or can be created and the data in question can be evaluated given these databases. Since the selection of the variables used and their pre-processing can greatly influence the outcome, several different methods could be employed in order to obtain a more complete picture of the information contained in the data. Presently, we report on the analysis of glass samples using refractive index measurements and the quantitative analysis of the concentrations of the metals: Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti and Zr. The extension of this general approach to fiber and paint comparisons also is discussed. This statistical approach should not replace the current interpretative approaches to trace evidence matches or exclusions but rather yields an additional quantitative measure. The lack of sufficient general population databases containing the needed physicochemical measurements and the potential for confusion arising from statistical analysis currently hamper this approach and ways of overcoming these obstacles are presented.
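
    The clustering step can be illustrated as follows: standardize refractive index and elemental concentrations so they contribute comparably, then group fragments by hierarchical clustering. All measurement values below are invented, and the chemometric procedures used in the paper are more elaborate.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # hypothetical glass fragments: refractive index plus elemental concentrations (ppm)
    # columns: RI, Mg, Al, Ca, Fe
    fragments = np.array([
        [1.5181, 23800, 5200, 61000, 310],   # questioned fragment
        [1.5182, 23950, 5150, 60800, 320],   # fragment from suspect's clothing
        [1.5230, 15200, 8900, 55000, 980],   # unrelated control
        [1.5229, 15050, 9050, 55400, 1010],  # unrelated control
    ])

    # standardize each variable so RI and trace elements contribute comparably
    z = (fragments - fragments.mean(axis=0)) / fragments.std(axis=0, ddof=1)

    tree = linkage(z, method="ward")
    groups = fcluster(tree, t=2, criterion="maxclust")
    print("cluster assignment per fragment:", groups)   # expected grouping: [1 1 2 2]
    ```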

  16. Unit of Analysis: Impact of Silverman and Solmon's Article on Field-Based Intervention Research in Physical Education in the U.S.A.

    ERIC Educational Resources Information Center

    Li, Weidong; Chen, Yung-Ju; Xiang, Ping; Xie, Xiuge; Li, Yilin

    2017-01-01

    Purpose: The purposes of this study were to: (a) examine the impact of the Silverman and Solmon article (1998) on how researchers handle the unit of analysis issue in their field-based intervention research in physical education in the United States and summarize statistical approaches that have been used to analyze the data, and (b) provide…

  17. Understanding Short-Term Nonmigrating Tidal Variability in the Ionospheric Dynamo Region from SABER Using Information Theory and Bayesian Statistics

    NASA Astrophysics Data System (ADS)

    Kumari, K.; Oberheide, J.

    2017-12-01

    Nonmigrating tidal diagnostics of SABER temperature observations in the ionospheric dynamo region reveal a large amount of variability on time-scales of a few days to weeks. In this paper, we discuss the physical reasons for the observed short-term tidal variability using a novel approach based on Information theory and Bayesian statistics. We diagnose short-term tidal variability as a function of season, QBO, ENSO, and solar cycle and other drivers using time dependent probability density functions, Shannon entropy and Kullback-Leibler divergence. The statistical significance of the approach and its predictive capability is exemplified using SABER tidal diagnostics with emphasis on the responses to the QBO and solar cycle. Implications for F-region plasma density will be discussed.

  18. Statistical physics approach to earthquake occurrence and forecasting

    NASA Astrophysics Data System (ADS)

    de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio

    2016-04-01

    There is striking evidence that the dynamics of the Earth crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods as scaling laws, universality, fractal dimension, renormalization group, to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow to discriminate among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different levels of prediction. In this review we also briefly discuss how the statistical mechanics approach can be applied to non-tectonic earthquakes and to other natural stochastic processes, such as volcanic eruptions and solar flares.

  19. Networking—a statistical physics perspective

    NASA Astrophysics Data System (ADS)

    Yeung, Chi Ho; Saad, David

    2013-03-01

    Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly more complex, the ever increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve for gaining better understanding of the properties of networking systems at the macroscopic level, as well as for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches as they have been developed specifically to deal with nonlinear large-scale systems. This review aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications.

  20. A statistical dynamics approach to the study of human health data: Resolving population scale diurnal variation in laboratory data

    NASA Astrophysics Data System (ADS)

    Albers, D. J.; Hripcsak, George

    2010-02-01

    Statistical physics and information theory is applied to the clinical chemistry measurements present in a patient database containing 2.5 million patients' data over a 20-year period. Despite the seemingly naive approach of aggregating all patients over all times (with respect to particular clinical chemistry measurements), both a diurnal signal in the decay of the time-delayed mutual information and the presence of two sub-populations with differing health are detected. This provides a proof in principle that the highly fragmented data in electronic health records has potential for being useful in defining disease and human phenotypes.
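
    A minimal sketch of the central diagnostic, the time-delayed mutual information, estimated from a two-dimensional histogram. A synthetic signal with a 24-sample period stands in for a diurnal laboratory-value rhythm; the binning choices and the signal itself are illustrative only.

    ```python
    import numpy as np

    def delayed_mutual_information(x, lag, bins=16):
        """Mutual information (nats) between x(t) and x(t + lag), from a 2D histogram."""
        a, b = x[:-lag], x[lag:]
        joint, _, _ = np.histogram2d(a, b, bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

    # hypothetical signal with a 24-sample periodicity standing in for a diurnal rhythm
    t = np.arange(5000)
    x = np.sin(2 * np.pi * t / 24) + 0.5 * np.random.default_rng(7).normal(size=t.size)
    for lag in (1, 6, 12, 24):
        print(f"lag {lag:3d}: MI = {delayed_mutual_information(x, lag):.3f} nats")
    ```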

  1. A Ground Flash Fraction Retrieval Algorithm for GLM

    NASA Technical Reports Server (NTRS)

    Koshak, William J.

    2010-01-01

    A Bayesian inversion method is introduced for retrieving the fraction of ground flashes in a set of N lightning observed by a satellite lightning imager (such as the Geostationary Lightning Mapper, GLM). An exponential model is applied as a physically reasonable constraint to describe the measured lightning optical parameter distributions. Population statistics (i.e., the mean and variance) are invoked to add additional constraints to the retrieval process. The Maximum A Posteriori (MAP) solution is employed. The approach is tested by performing simulated retrievals, and retrieval error statistics are provided. The approach is feasible for N greater than 2000, and retrieval errors decrease as N is increased.

  2. Model-Based Anomaly Detection for a Transparent Optical Transmission System

    NASA Astrophysics Data System (ADS)

    Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.

    In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.

  3. Explaining Gibbsean phase space to second year students

    NASA Astrophysics Data System (ADS)

    Vesely, Franz J.

    2005-03-01

    A new approach to teaching introductory statistical physics is presented. We recommend making extensive use of the fact that even systems with a very few degrees of freedom may display chaotic behaviour. This permits a didactic 'bottom-up' approach, starting out with toy systems whose phase space may be depicted on a screen or blackboard, then proceeding to ever higher dimensions in Gibbsean phase space.

  4. Graphene growth process modeling: a physical-statistical approach

    NASA Astrophysics Data System (ADS)

    Wu, Jian; Huang, Qiang

    2014-09-01

    As a zero-band-gap semiconductor, graphene is an attractive material for a wide variety of applications such as optoelectronics. Among various techniques developed for graphene synthesis, chemical vapor deposition on copper foils shows high potential for producing few-layer and large-area graphene. Since fabrication of high-quality graphene sheets requires the understanding of growth mechanisms, and methods of characterization and control of grain size of graphene flakes, analytical modeling of the graphene growth process is essential for controlled fabrication. The graphene growth process starts with randomly nucleated islands that gradually develop into complex shapes, grow in size, and eventually connect together to cover the copper foil. To model this complex process, we develop a physical-statistical approach under the assumption of self-similarity during graphene growth. The growth kinetics is uncovered by separating island shapes from area growth rate. We propose to characterize the area growth velocity using a confined exponential model, which not only has a clear physical explanation, but also fits the real data well. For the shape modeling, we develop a parametric shape model which can be well explained by the angular-dependent growth rate. This work can provide useful information for the control and optimization of the graphene growth process on Cu foil.
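
    A small sketch of the area-growth part of the model: fit a confined exponential A(t) = A_max(1 - exp(-kt)) to island coverage versus growth time with a least-squares routine. The data points and parameter values are fabricated for illustration; the functional form follows the abstract's description, but the paper's exact formulation may differ.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def confined_exponential(t, a_max, k):
        """Island area growth saturating at the available substrate area (illustrative form)."""
        return a_max * (1.0 - np.exp(-k * t))

    # hypothetical measured coverage of a graphene island versus growth time (min, um^2)
    t_obs = np.array([0.5, 1, 2, 4, 6, 8, 12, 16, 20])
    a_obs = np.array([3.1, 6.0, 10.9, 18.2, 23.0, 26.1, 29.4, 30.8, 31.4])

    popt, _ = curve_fit(confined_exponential, t_obs, a_obs, p0=(30.0, 0.1))
    print("fitted saturation area %.1f um^2, rate constant %.3f 1/min" % tuple(popt))
    ```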

  5. Robust Statistical Detection of Power-Law Cross-Correlation.

    PubMed

    Blythe, Duncan A J; Nikulin, Vadim V; Müller, Klaus-Robert

    2016-06-02

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.

  6. Robust Statistical Detection of Power-Law Cross-Correlation

    PubMed Central

    Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert

    2016-01-01

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram. PMID:27250630

  7. A Complex Network Approach to Stylometry

    PubMed Central

    Amancio, Diego Raphael

    2015-01-01

    Statistical methods have been widely employed to study the fundamental properties of language. In recent years, methods from complex and dynamical systems proved useful to create several language models. Despite the large amount of studies devoted to represent texts with physical models, only a limited number of studies have shown how the properties of the underlying physical systems can be employed to improve the performance of natural language processing tasks. In this paper, I address this problem by devising complex networks methods that are able to improve the performance of current statistical methods. Using a fuzzy classification strategy, I show that the topological properties extracted from texts complement the traditional textual description. In several cases, the performance obtained with hybrid approaches outperformed the results obtained when only traditional or networked methods were used. Because the proposed model is generic, the framework devised here could be straightforwardly used to study similar textual applications where the topology plays a pivotal role in the description of the interacting agents. PMID:26313921

  8. Simulating Metabolism with Statistical Thermodynamics

    PubMed Central

    Cannon, William R.

    2014-01-01

    New methods are needed for large scale modeling of metabolism that predict metabolite levels and characterize the thermodynamics of individual reactions and pathways. Current approaches use either kinetic simulations, which are difficult to extend to large networks of reactions because of the need for rate constants, or flux-based methods, which have a large number of feasible solutions because they are unconstrained by the law of mass action. This report presents an alternative modeling approach based on statistical thermodynamics. The principles of this approach are demonstrated using a simple set of coupled reactions, and then the system is characterized with respect to the changes in energy, entropy, free energy, and entropy production. Finally, the physical and biochemical insights that this approach can provide for metabolism are demonstrated by application to the tricarboxylic acid (TCA) cycle of Escherichia coli. The reaction and pathway thermodynamics are evaluated and predictions are made regarding changes in concentration of TCA cycle intermediates due to 10- and 100-fold changes in the ratio of NAD+:NADH concentrations. Finally, the assumptions and caveats regarding the use of statistical thermodynamics to model non-equilibrium reactions are discussed. PMID:25089525

  9. Simulating metabolism with statistical thermodynamics.

    PubMed

    Cannon, William R

    2014-01-01

    New methods are needed for large scale modeling of metabolism that predict metabolite levels and characterize the thermodynamics of individual reactions and pathways. Current approaches use either kinetic simulations, which are difficult to extend to large networks of reactions because of the need for rate constants, or flux-based methods, which have a large number of feasible solutions because they are unconstrained by the law of mass action. This report presents an alternative modeling approach based on statistical thermodynamics. The principles of this approach are demonstrated using a simple set of coupled reactions, and then the system is characterized with respect to the changes in energy, entropy, free energy, and entropy production. Finally, the physical and biochemical insights that this approach can provide for metabolism are demonstrated by application to the tricarboxylic acid (TCA) cycle of Escherichia coli. The reaction and pathway thermodynamics are evaluated and predictions are made regarding changes in concentration of TCA cycle intermediates due to 10- and 100-fold changes in the ratio of NAD+:NADH concentrations. Finally, the assumptions and caveats regarding the use of statistical thermodynamics to model non-equilibrium reactions are discussed.

  10. An order statistics approach to the halo model for galaxies

    NASA Astrophysics Data System (ADS)

    Paul, Niladri; Paranjape, Aseem; Sheth, Ravi K.

    2017-04-01

    We use the halo model to explore the implications of assuming that galaxy luminosities in groups are randomly drawn from an underlying luminosity function. We show that even the simplest of such order statistics models - one in which this luminosity function p(L) is universal - naturally produces a number of features associated with previous analyses based on the 'central plus Poisson satellites' hypothesis. These include the monotonic relation of mean central luminosity with halo mass, the lognormal distribution around this mean and the tight relation between the central and satellite mass scales. In stark contrast to observations of galaxy clustering, however, this model predicts no luminosity dependence of large-scale clustering. We then show that an extended version of this model, based on the order statistics of a halo mass dependent luminosity function p(L|m), is in much better agreement with the clustering data as well as satellite luminosities, but systematically underpredicts central luminosities. This brings into focus the idea that central galaxies constitute a distinct population that is affected by different physical processes than are the satellites. We model this physical difference as a statistical brightening of the central luminosities, over and above the order statistics prediction. The magnitude gap between the brightest and second brightest group galaxy is predicted as a by-product, and is also in good agreement with observations. We propose that this order statistics framework provides a useful language in which to compare the halo model for galaxies with more physically motivated galaxy formation models.
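
    As a toy illustration of the order statistics idea (not the paper's calibrated model), the sketch below draws galaxy luminosities for each halo from a single universal distribution and identifies the central as the brightest draw; the lognormal p(L) and the power-law occupation number are illustrative assumptions.

```python
# Sketch: universal-p(L) order statistics toy model. For each halo, draw N(m)
# luminosities from the same distribution; the "central" is the brightest draw.
# The occupation law N(m) and the lognormal p(L) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
halo_mass = np.logspace(12, 15, 400)                             # Msun/h, illustrative
n_gal = np.maximum(1, (halo_mass / 1e12) ** 0.7).astype(int)     # toy occupation number

central = []
for n in n_gal:
    lum = rng.lognormal(mean=np.log(1e10), sigma=0.8, size=n)    # toy universal p(L)
    central.append(lum.max())                                    # brightest = central
central = np.array(central)

# Mean central luminosity rises monotonically with halo mass simply because
# richer haloes get more draws from the same p(L).
for lo, hi in [(1e12, 1e13), (1e13, 1e14), (1e14, 1e15)]:
    sel = (halo_mass >= lo) & (halo_mass < hi)
    print(f"{lo:.0e}-{hi:.0e} Msun/h: <L_central> = {central[sel].mean():.3e}")
```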

  11. Understanding immunology: fun at an intersection of the physical, life, and clinical sciences

    NASA Astrophysics Data System (ADS)

    Chakraborty, Arup K.

    2014-10-01

    Understanding how the immune system works is a grand challenge in science with myriad direct implications for improving human health. The immune system protects us from infectious pathogens and cancer, and maintains a harmonious steady state with essential microbiota in our gut. Vaccination, the medical procedure that has saved more lives than any other, involves manipulating the immune system. Unfortunately, the immune system can also go awry to cause autoimmune diseases. Immune responses are the product of stochastic collective dynamic processes involving many interacting components. These processes span multiple scales of length and time. Thus, statistical mechanics has much to contribute to immunology, and the oeuvre of biological physics will be further enriched if the number of physical scientists interested in immunology continues to increase. I describe how I got interested in immunology and provide a glimpse of my experiences working on immunology using approaches from statistical mechanics and collaborating closely with immunologists.

  12. Statistical physics of community ecology: a cavity solution to MacArthur’s consumer resource model

    NASA Astrophysics Data System (ADS)

    Advani, Madhu; Bunin, Guy; Mehta, Pankaj

    2018-03-01

    A central question in ecology is to understand the ecological processes that shape community structure. Niche-based theories have emphasized the important role played by competition for maintaining species diversity. Many of these insights have been derived using MacArthur’s consumer resource model (MCRM) or its generalizations. Most theoretical work on the MCRM has focused on small ecosystems with a few species and resources. However, theoretical insights derived from small ecosystems may not scale up to large ecosystems with many resources and species, because large systems with many interacting components often display new emergent behaviors that cannot be understood or deduced from analyzing smaller systems. To address these shortcomings, we develop a statistical physics inspired cavity method to analyze the MCRM when both the number of species and the number of resources are large. Unlike previous work in this limit, our theory addresses resource dynamics and resource depletion and demonstrates that species generically and consistently perturb their environments and significantly modify available ecological niches. We show how our cavity approach naturally generalizes niche theory to large ecosystems by accounting for the effect of collective phenomena on species invasion and ecological stability. Our theory suggests that such phenomena are a generic feature of large, natural ecosystems and must be taken into account when analyzing and interpreting community structure. It also highlights the important role that statistical-physics inspired approaches can play in furthering our understanding of ecology.
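
    As a minimal numerical sketch of one common form of MacArthur's consumer resource dynamics with many species and resources (logistic resource growth, linear consumption), the code below forward-integrates the equations with random placeholder parameters; this brute-force simulation is for intuition only and is not the cavity calculation itself.

```python
# Sketch: forward integration of one common form of MacArthur's consumer
# resource model for many species and resources. Parameters are random
# placeholders; this is not the cavity-method analysis.
import numpy as np

rng = np.random.default_rng(1)
S, M = 80, 60                       # species, resources
c = rng.uniform(0, 1, (S, M)) / M   # consumption preferences (illustrative)
K = rng.uniform(5, 10, M)           # resource carrying capacities
m = rng.uniform(0.5, 1.0, S)        # species maintenance costs

N = np.ones(S)                      # species abundances
R = K.copy()                        # resource abundances
dt = 0.01
for _ in range(100_000):
    dN = N * (c @ R - m)                        # per-capita growth times abundance
    dR = R * (K - R) - (N @ c) * R              # logistic growth minus consumption
    N = np.maximum(N + dt * dN, 0.0)
    R = np.maximum(R + dt * dR, 0.0)

print("surviving species:", int((N > 1e-6).sum()), "of", S)
print("strongly depleted resources (R < K/2):", int((R < K / 2).sum()), "of", M)
```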

  13. Using entropy to cut complex time series

    NASA Astrophysics Data System (ADS)

    Mertens, David; Poncela Casasnovas, Julia; Spring, Bonnie; Amaral, L. A. N.

    2013-03-01

    Using techniques from statistical physics, physicists have modeled and analyzed human phenomena varying from academic citation rates to disease spreading to vehicular traffic jams. The last decade's explosion of digital information and the growing ubiquity of smartphones have led to a wealth of human self-reported data. This wealth of data comes at a cost, including non-uniform sampling and statistically significant but physically insignificant correlations. In this talk I present our work using entropy to identify stationary sub-sequences of self-reported human weight from a weight management web site. Our entropic approach, inspired by the infomap network community detection algorithm, is far less biased by rare fluctuations than more traditional time series segmentation techniques. Supported by the Howard Hughes Medical Institute.
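
    As a toy illustration of entropy-based segmentation in general (not the authors' infomap-inspired algorithm), the sketch below scans candidate split points of a synthetic "self-reported weight" series and picks the split that minimizes the sample-weighted entropy of the two segments; the bin choice and data are illustrative assumptions.

```python
# Toy illustration: pick the split point minimizing the sample-weighted entropy
# of the two resulting segments. A generic change-point heuristic, not the
# authors' infomap-inspired method.
import numpy as np

def entropy(x, edges):
    counts, _ = np.histogram(x, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log(p)).sum()

def best_split(series, edges, min_len=20):
    n = len(series)
    costs = []
    for k in range(min_len, n - min_len):
        left, right = series[:k], series[k:]
        cost = (len(left) * entropy(left, edges) + len(right) * entropy(right, edges)) / n
        costs.append((cost, k))
    return min(costs)

rng = np.random.default_rng(2)
# Synthetic weight record: stationary for 300 days, then a slow downward drift.
w = np.concatenate([80 + rng.normal(0, 0.5, 300),
                    80 - 0.01 * np.arange(200) + rng.normal(0, 0.5, 200)])
edges = np.histogram_bin_edges(w, bins=20)     # common bins for both segments
cost, k = best_split(w, edges)
print(f"selected split index k = {k} (weighted entropy {cost:.3f})")
```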

  14. Current algebra, statistical mechanics and quantum models

    NASA Astrophysics Data System (ADS)

    Vilela Mendes, R.

    2017-11-01

    Results obtained in the past for free boson systems at zero and nonzero temperatures are revisited to clarify the physical meaning of current algebra reducible functionals which are associated to systems with density fluctuations, leading to observable effects on phase transitions. To use current algebra as a tool for the formulation of quantum statistical mechanics amounts to the construction of unitary representations of diffeomorphism groups. Two mathematically equivalent procedures exist for this purpose. One searches for quasi-invariant measures on configuration spaces, the other for a cyclic vector in Hilbert space. Here, one argues that the second approach is closer to the physical intuition when modelling complex systems. An example of application of the current algebra methodology to the pairing phenomenon in two-dimensional fermion systems is discussed.

  15. Statistical physics approach to quantifying differences in myelinated nerve fibers

    PubMed Central

    Comin, César H.; Santos, João R.; Corradini, Dario; Morrison, Will; Curme, Chester; Rosene, Douglas L.; Gabrielli, Andrea; da F. Costa, Luciano; Stanley, H. Eugene

    2014-01-01

    We present a new method to quantify differences in myelinated nerve fibers. These differences range from morphologic characteristics of individual fibers to differences in macroscopic properties of collections of fibers. Our method uses statistical physics tools to improve on traditional measures, such as fiber size and packing density. As a case study, we analyze cross–sectional electron micrographs from the fornix of young and old rhesus monkeys using a semi-automatic detection algorithm to identify and characterize myelinated axons. We then apply a feature selection approach to identify the features that best distinguish between the young and old age groups, achieving a maximum accuracy of 94% when assigning samples to their age groups. This analysis shows that the best discrimination is obtained using the combination of two features: the fraction of occupied axon area and the effective local density. The latter is a modified calculation of axon density, which reflects how closely axons are packed. Our feature analysis approach can be applied to characterize differences that result from biological processes such as aging, damage from trauma or disease or developmental differences, as well as differences between anatomical regions such as the fornix and the cingulum bundle or corpus callosum. PMID:24676146
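
    As a minimal sketch of the kind of feature-pair search described (not the paper's exact pipeline), the code below scores every pair of candidate features by cross-validated classification accuracy for two groups; the feature names, synthetic data, and classifier choice are illustrative assumptions.

```python
# Sketch: exhaustive search over feature pairs, scoring each pair by
# cross-validated accuracy for a two-group (e.g. young vs. old) problem.
# Feature names, data, and classifier are illustrative.
from itertools import combinations
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
feature_names = ["fiber_size", "packing_density",
                 "occupied_area_fraction", "effective_local_density"]
n = 60
X = rng.normal(size=(n, len(feature_names)))
y = rng.integers(0, 2, size=n)      # 0 = young, 1 = old (synthetic labels)
X[y == 1, 2:] += 1.0                # make the last two features informative

score, pair = max(
    (np.mean(cross_val_score(LogisticRegression(), X[:, list(p)], y, cv=5)), p)
    for p in combinations(range(len(feature_names)), 2)
)
print("best pair:", [feature_names[i] for i in pair], f"accuracy ~ {score:.2f}")
```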

  16. Statistical physics of media processes: Mediaphysics

    NASA Astrophysics Data System (ADS)

    Kuznetsov, Dmitri V.; Mandel, Igor

    2007-04-01

    The processes of mass communications in complicated social or sociobiological systems such as marketing, economics, politics, and animal populations are considered as the subject of a special scientific subbranch, “mediaphysics”, in its relation to sociophysics. A new statistical physics approach to analyze these phenomena is proposed. A keystone of the approach is an analysis of population distribution between two or many alternatives: brands, political affiliations, or opinions. Relative distances between a state of a “person's mind” and the alternatives are measures of propensity to buy (to affiliate, or to have a certain opinion). The distribution of population by those relative distances is time-dependent and affected by external (economic, social, marketing, natural) and internal (influential propagation of opinions, “word of mouth”, etc.) factors, considered as fields. Specifically, the interaction and opinion-influence field can be generalized to incorporate important elements of Ising-spin-based sociophysical models and kinetic-equation ones. The distributions were described by a Schrödinger-type equation in terms of Green's functions. The developed approach has been applied to a real mass-media efficiency problem for a large company and generally demonstrated very good results despite low initial correlations between the factors and the target variable.

  17. Statistical physics approach to quantifying differences in myelinated nerve fibers

    NASA Astrophysics Data System (ADS)

    Comin, César H.; Santos, João R.; Corradini, Dario; Morrison, Will; Curme, Chester; Rosene, Douglas L.; Gabrielli, Andrea; da F. Costa, Luciano; Stanley, H. Eugene

    2014-03-01

    We present a new method to quantify differences in myelinated nerve fibers. These differences range from morphologic characteristics of individual fibers to differences in macroscopic properties of collections of fibers. Our method uses statistical physics tools to improve on traditional measures, such as fiber size and packing density. As a case study, we analyze cross-sectional electron micrographs from the fornix of young and old rhesus monkeys using a semi-automatic detection algorithm to identify and characterize myelinated axons. We then apply a feature selection approach to identify the features that best distinguish between the young and old age groups, achieving a maximum accuracy of 94% when assigning samples to their age groups. This analysis shows that the best discrimination is obtained using the combination of two features: the fraction of occupied axon area and the effective local density. The latter is a modified calculation of axon density, which reflects how closely axons are packed. Our feature analysis approach can be applied to characterize differences that result from biological processes such as aging, damage from trauma or disease or developmental differences, as well as differences between anatomical regions such as the fornix and the cingulum bundle or corpus callosum.

  18. Physical and Hydrological Meaning of the Spectral Information from Hydrodynamic Signals at Karst Springs

    NASA Astrophysics Data System (ADS)

    Dufoyer, A.; Lecoq, N.; Massei, N.; Marechal, J. C.

    2017-12-01

    Physics-based modeling of karst systems remains almost impossible without enough accurate information about the inner physical characteristics. Usually, the only available hydrodynamic information is the flow rate at the karst outlet. Numerous works in the past decades have used and proven the usefulness of time-series analysis and spectral techniques applied to spring flow, precipitations or even physico-chemical parameters, for interpreting karst hydrological functioning. However, identifying or interpreting the physical features of karst systems that control statistical or spectral characteristics of spring flow variations is still challenging, not to say sometimes controversial. The main objective of this work is to determine how the statistical and spectral characteristics of the hydrodynamic signal at karst springs can be related to inner physical and hydraulic properties. In order to address this issue, we undertake an empirical approach based on the use of both distributed and physics-based models, and on synthetic system responses. The first step of the research is to conduct a sensitivity analysis of time-series/spectral methods to karst hydraulic and physical properties. For this purpose, forward modeling of flow through several simple, constrained and synthetic cases in response to precipitation is undertaken. It allows us to quantify how the statistical and spectral characteristics of flow at the outlet are sensitive to changes (i) in conduit geometries, and (ii) in hydraulic parameters of the system (matrix/conduit exchange rate, matrix hydraulic conductivity and storativity). The flow differential equations resolved by MARTHE, a computer code developed by the BRGM, allow karst conduit modeling. From signal processing on simulated spring responses, we hope to determine, using Fourier series and multi-resolution analysis, whether specific frequencies are systematically modified. We also hope to quantify which parameters are the most influential using auto-correlation analysis: first results seem to show higher variations due to conduit conductivity than those due to the matrix/conduit exchange rate. Future steps will use another computer code, based on a double-continuum approach and allowing turbulent conduit flow, to model a natural system.
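
    As a minimal sketch of the signal processing applied to a simulated spring response, the code below computes the autocorrelation and FFT power spectrum of a synthetic discharge series; the linear-reservoir recharge-to-discharge model and its recession coefficient are illustrative stand-ins for the MARTHE simulations described above.

```python
# Sketch: autocorrelation and power spectrum of a synthetic spring discharge.
# The "karst" here is just a linear reservoir driven by random rainfall, an
# illustrative stand-in for the distributed, physics-based simulations.
import numpy as np

rng = np.random.default_rng(4)
n = 4096
rain = rng.exponential(1.0, n) * (rng.random(n) < 0.2)   # intermittent rainfall
k = 0.05                                                 # recession coefficient (assumed)
q = np.zeros(n)
for t in range(1, n):
    q[t] = (1 - k) * q[t - 1] + k * rain[t]              # linear-reservoir discharge

def autocorr(x, max_lag=200):
    x = x - x.mean()
    c = np.correlate(x, x, mode="full")[len(x) - 1:]
    return c[:max_lag] / c[0]

spectrum = np.abs(np.fft.rfft(q - q.mean())) ** 2
freqs = np.fft.rfftfreq(n, d=1.0)                        # cycles per time step
print("lag-1 autocorrelation:", round(autocorr(q)[1], 3))
print("frequency of spectral peak:", freqs[1:][np.argmax(spectrum[1:])])
```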

  19. Global Precipitation Measurement (GPM) Ground Validation (GV) Science Implementation Plan

    NASA Technical Reports Server (NTRS)

    Petersen, Walter A.; Hou, Arthur Y.

    2008-01-01

    For pre-launch algorithm development and post-launch product evaluation, Global Precipitation Measurement (GPM) Ground Validation (GV) goes beyond direct comparisons of surface rain rates between ground and satellite measurements to provide the means for improving retrieval algorithms and model applications. Three approaches to GPM GV include direct statistical validation (at the surface), precipitation physics validation (in a vertical column), and integrated science validation (4-dimensional). These three approaches support five themes: core satellite error characterization; constellation satellite validation; development of physical models of snow, cloud water, and mixed phase; development of cloud-resolving model (CRM) and land-surface models to bridge observations and algorithms; and development of coupled CRM-land surface modeling for basin-scale water budget studies and natural hazard prediction. This presentation describes the implementation of these approaches.

  20. Promoting Physical Activity Among Native American Youth: a Systematic Review of the Methodology and Current Evidence of Physical Activity Interventions and Community-wide Initiatives.

    PubMed

    Fleischhacker, Sheila; Roberts, Erica; Camplain, Ricky; Evenson, Kelly R; Gittelsohn, Joel

    2016-12-01

    Promoting physical activity using environmental, policy, and systems approaches could potentially address persistent health disparities faced by American Indian and Alaska Native children and adolescents. To address research gaps and help inform tribally led community changes that promote physical activity, this review examined the methodology and current evidence of physical activity interventions and community-wide initiatives among Native youth. A keyword-guided search was conducted in multiple databases to identify peer-reviewed research articles that reported on physical activity among Native youth. Ultimately, 20 unique interventions (described in 76 articles) and 13 unique community-wide initiatives (described in 16 articles) met the study criteria. Four interventions noted positive changes in knowledge and attitude relating to physical activity but none of the interventions examined reported statistically significant improvements on weight-related outcomes. Only six interventions reported implementing environmental, policy, and system approaches relating to promoting physical activity and generally only shared anecdotal information about the approaches tried. Using community-based participatory research or tribally driven research models strengthened the tribal-research partnerships and improved the cultural and contextual sensitivity of the intervention or community-wide initiative. Few interventions or community-wide initiatives examined multi-level, multi-sector interventions to promote physical activity among Native youth, families, and communities. More research is needed to measure and monitor physical activity within this understudied, high risk group. Future research could also focus on the unique authority and opportunity of tribal leaders and other key stakeholders to use environmental, policy, and systems approaches to raise a healthier generation of Native youth.

  1. Promoting physical activity among Native American youth: A systematic review of the methodology and current evidence of physical activity interventions and community-wide initiatives

    PubMed Central

    Roberts, Erica; Camplain, Ricky; Evenson, Kelly R.; Gittelsohn, Joel

    2015-01-01

    Promoting physical activity using environmental, policy, and systems approaches could potentially address persistent health disparities faced by American Indian and Alaska Native children and adolescents. To address research gaps and help inform tribally-led community changes that promote physical activity, this review examined the methodology and current evidence of physical activity interventions and community-wide initiatives among Native youth. A keyword guided search was conducted in multiple databases to identify peer-reviewed research articles that reported on physical activity among Native youth. Ultimately, 20 unique interventions (described in 76 articles) and 13 unique community-wide initiatives (described in 16 articles) met the study criteria. Four interventions noted positive changes in knowledge and attitude relating to physical activity but none of the interventions examined reported statistically significant improvements on weight-related outcomes. Only six interventions reported implementing environmental, policy, and system approaches relating to promoting physical activity and generally only shared anecdotal information about the approaches tried. Using community-based participatory research or tribally-driven research models strengthened the tribal-research partnerships and improved the cultural and contextual sensitivity of the intervention or community-wide initiative. Few interventions or community-wide initiatives examined multi-level, multi-sector interventions to promote physical activity among Native youth, families and communities. More research is needed to measure and monitor physical activity within this understudied, high risk group. Future research could also focus on the unique authority and opportunity of tribal leaders and other key stakeholders to use environmental, policy, and systems approaches to raise a healthier generation of Native youth. PMID:27294756

  2. A distance learning model in a physical therapy curriculum.

    PubMed

    English, T; Harrison, A L; Hart, A L

    1998-01-01

    In response to the rural health initiative established in 1991, the University of Kentucky has developed an innovative distance learning program of physical therapy instruction that combines classroom lecture and discussion via compressed video technology with laboratory experiences. The authors describe the process of planning, implementing, and evaluating a specific distance learning course in pathomechanics for the professional-level master's-degree physical therapy students at the University of Kentucky. This presentation may serve as a model for teaching distance learning. Descriptions of optimal approaches to preclass preparation, scheduling, course delivery, use of audiovisual aids, use of handout material, and video production are given. Special activities that may enhance or deter the achievement of the learning objectives are outlined, and a problem-solving approach to common problems encountered is presented. An approach to evaluating and comparing course outcomes for the distance learner is presented. For this particular course, there was no statistically significant difference in the outcome measures utilized to compare the distance learners with the on-site learners.

  3. Modelling 1-minute directional observations of the global irradiance.

    NASA Astrophysics Data System (ADS)

    Thejll, Peter; Pagh Nielsen, Kristian; Andersen, Elsa; Furbo, Simon

    2016-04-01

    Direct and diffuse irradiances from the sky have been collected at 1-minute intervals for about a year from the experimental station at the Technical University of Denmark for the IEA project "Solar Resource Assessment and Forecasting". These data were gathered by pyrheliometers tracking the Sun, as well as with apertured pyranometers gathering 1/8th and 1/16th of the light from the sky in 45 degree azimuthal ranges pointed around the compass. The data are gathered in order to develop detailed models of the potentially available solar energy and its variations at high temporal resolution, in order to gain a more detailed understanding of the solar resource. This is important for a better understanding of the sub-grid scale cloud variation that cannot be resolved with climate and weather models. It is also important for optimizing the operation of active solar energy systems such as photovoltaic plants and thermal solar collector arrays, and for passive solar energy and lighting in buildings. We present regression-based modelling of the observed data, and focus, here, on the statistical properties of the model fits. Using models based on the one hand on what is found in the literature and on physical expectations, and on the other hand on purely statistical models, we find solutions that can explain up to 90% of the variance in global radiation. The models leaning on physical insights include terms for the direct solar radiation, a term for the circum-solar radiation, a diffuse term and a term for the horizon brightening/darkening. The purely statistical model is found using data- and formula-validation approaches picking model expressions from a general catalogue of possible formulae. The method allows nesting of expressions, and the results found are dependent on and heavily constrained by the cross-validation carried out on statistically independent testing and training data-sets. Slightly better fits, in terms of variance explained, are found using the purely statistical fitting/searching approach. We describe the methods applied and the results found, and discuss the different potentials of the physics-based and statistics-only model searches.
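
    As a minimal sketch of the regression framework described, the code below fits a physically motivated design matrix (direct, circumsolar, diffuse, and horizon terms) by least squares and scores it on held-out data; the regressors, coefficients, and toy solar geometry are hypothetical placeholders, not the study's actual terms.

```python
# Sketch: least-squares fit of global irradiance on physically motivated terms,
# scored on a held-out subset. Regressors and values are illustrative.
import numpy as np

rng = np.random.default_rng(5)
n = 1440                                          # one day of 1-minute data
zenith = np.linspace(np.pi / 2, 0, n // 2)
zenith = np.concatenate([zenith, zenith[::-1]])   # toy solar geometry
direct = np.cos(zenith)                           # direct-beam term
circum = direct ** 2                              # crude circumsolar proxy
diffuse = np.ones(n)                              # isotropic diffuse term
horizon = np.sin(zenith)                          # horizon brightening proxy

X = np.column_stack([direct, circum, diffuse, horizon])
true_coef = np.array([800.0, 120.0, 60.0, 30.0])  # assumed "truth" in W/m^2
y = X @ true_coef + rng.normal(0, 20, n)          # synthetic global irradiance

train, test = slice(0, n // 2), slice(n // 2, n)  # independent training/testing halves
coef, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
resid = y[test] - X[test] @ coef
print("fitted coefficients:", np.round(coef, 1))
print("held-out R^2:", round(1 - resid.var() / y[test].var(), 3))
```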

  4. [Effects of an educational program for the reduction of physical restraint use by caregivers in geriatric hospitals].

    PubMed

    Choi, Keumbong; Kim, Jinsun

    2009-12-01

    The purposes of this study were to develop an educational program to reduce the use of physical restraints for caregivers in geriatric hospitals and to evaluate the effects of the program on caregivers' knowledge, attitude and nursing practice related to the use of physical restraints. A quasi-experimental study with a non-equivalent control group pretest-posttest design was used. Participants were recruited from two geriatric hospitals. Eighteen caregivers were assigned to the experimental group and 20 to the control group. The data were collected prior to the intervention and at 6 weeks after the intervention through the use of self-administered questionnaires. Descriptive statistics, the chi-square test, Fisher's exact probability test, and the Mann-Whitney U test were used to analyze the data. After the intervention, knowledge about physical restraints increased significantly in the experimental group compared to the control group. However, there were no statistically significant differences between the groups for attitude and nursing practice involving physical restraints. Findings indicate that it is necessary to apply knowledge acquired through educational programs to nursing practice to reduce the use of physical restraints. User friendly guidelines for physical restraints, administrative support of institutions, and multidisciplinary approaches are required to achieve this goal.

  5. Stochastic modeling of sunshine number data

    NASA Astrophysics Data System (ADS)

    Brabec, Marek; Paulescu, Marius; Badescu, Viorel

    2013-11-01

    In this paper, we will present a unified statistical modeling framework for estimation and forecasting of sunshine number (SSN) data. The sunshine number has been proposed earlier to describe sunshine time series in qualitative terms (Theor Appl Climatol 72 (2002) 127-136) and, since then, it has been shown to be useful not only for theoretical purposes but also for practical considerations, e.g. those related to the development of photovoltaic energy production. Statistical modeling and prediction of SSN as a binary time series has been a challenging problem, however. Our statistical model for SSN time series is based on an underlying stochastic process formulation of Markov chain type. We will show how its transition probabilities can be efficiently estimated within a logistic regression framework. In fact, our logistic Markovian model can be relatively easily fitted via a maximum likelihood approach. This is optimal in many respects, and it also enables us to use formalized statistical inference theory to obtain not only the point estimates of transition probabilities and their functions of interest, but also the related uncertainties, as well as to test various hypotheses of practical interest. It is straightforward to deal with non-homogeneous transition probabilities in this framework. Very importantly from both physical and practical points of view, the logistic Markov model class allows us to test hypotheses about how SSN depends on various external covariates (e.g. elevation angle, solar time, etc.) and about details of the dynamic model (order and functional shape of the Markov kernel, etc.). Therefore, using the generalized additive model (GAM) approach, we can fit and compare models of varying complexity while keeping a physical interpretation of the statistical model and its parts. After introducing the Markovian model and the general approach for identifying its parameters, we will illustrate its use and performance on high resolution SSN data from the Solar Radiation Monitoring Station of the West University of Timisoara.
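
    As a minimal sketch of a first-order logistic Markov model for a binary sunshine series, the code below lets the previous state and an external covariate enter a logistic regression for the transition probability; the synthetic data and the "elevation" covariate are illustrative assumptions, not the Timisoara measurements.

```python
# Sketch: first-order logistic Markov model for a binary SSN series, estimating
# P(SSN_t = 1 | SSN_{t-1}, covariate) by logistic regression. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
n = 5000
elev = np.clip(np.sin(np.linspace(0, 40 * np.pi, n)), 0, None)   # toy elevation angle
ssn = np.zeros(n, dtype=int)
for t in range(1, n):                                            # synthetic persistent series
    p = 1 / (1 + np.exp(-(2.0 * ssn[t - 1] + 1.5 * elev[t] - 1.2)))
    ssn[t] = rng.random() < p

X = np.column_stack([ssn[:-1], elev[1:]])                        # lagged state + covariate
y = ssn[1:]
model = LogisticRegression().fit(X, y)

# Estimated transition probabilities at a mid-range elevation of 0.5:
for prev in (0, 1):
    p1 = model.predict_proba([[prev, 0.5]])[0, 1]
    print(f"P(sunny | previous state = {prev}, elev = 0.5) = {p1:.2f}")
```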

  6. Stochastic modeling of sunshine number data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brabec, Marek, E-mail: mbrabec@cs.cas.cz; Paulescu, Marius; Badescu, Viorel

    2013-11-13

    In this paper, we will present a unified statistical modeling framework for estimation and forecasting of sunshine number (SSN) data. The sunshine number has been proposed earlier to describe sunshine time series in qualitative terms (Theor Appl Climatol 72 (2002) 127-136) and, since then, it has been shown to be useful not only for theoretical purposes but also for practical considerations, e.g. those related to the development of photovoltaic energy production. Statistical modeling and prediction of SSN as a binary time series has been a challenging problem, however. Our statistical model for SSN time series is based on an underlying stochastic process formulation of Markov chain type. We will show how its transition probabilities can be efficiently estimated within a logistic regression framework. In fact, our logistic Markovian model can be relatively easily fitted via a maximum likelihood approach. This is optimal in many respects, and it also enables us to use formalized statistical inference theory to obtain not only the point estimates of transition probabilities and their functions of interest, but also the related uncertainties, as well as to test various hypotheses of practical interest. It is straightforward to deal with non-homogeneous transition probabilities in this framework. Very importantly from both physical and practical points of view, the logistic Markov model class allows us to test hypotheses about how SSN depends on various external covariates (e.g. elevation angle, solar time, etc.) and about details of the dynamic model (order and functional shape of the Markov kernel, etc.). Therefore, using the generalized additive model (GAM) approach, we can fit and compare models of varying complexity while keeping a physical interpretation of the statistical model and its parts. After introducing the Markovian model and the general approach for identifying its parameters, we will illustrate its use and performance on high resolution SSN data from the Solar Radiation Monitoring Station of the West University of Timisoara.

  7. Ensemble of Thermostatically Controlled Loads: Statistical Physics Approach.

    PubMed

    Chertkov, Michael; Chernyak, Vladimir

    2017-08-17

    Thermostatically controlled loads, e.g., air conditioners and heaters, are by far the most widespread consumers of electricity. Normally the devices are calibrated to provide the so-called bang-bang control - changing from on to off, and vice versa, depending on temperature. We considered aggregation of a large group of similar devices into a statistical ensemble, where the devices operate following the same dynamics, subject to stochastic perturbations and randomized, Poisson on/off switching policy. Using theoretical and computational tools of statistical physics, we analyzed how the ensemble relaxes to a stationary distribution and established a relationship between the relaxation and the statistics of the probability flux associated with devices' cycling in the mixed (discrete, switch on/off, and continuous temperature) phase space. This allowed us to derive the spectrum of the non-equilibrium (detailed balance broken) statistical system and uncover how switching policy affects oscillatory trends and the speed of the relaxation. Relaxation of the ensemble is of practical interest because it describes how the ensemble recovers from significant perturbations, e.g., forced temporary switching off aimed at utilizing the flexibility of the ensemble to provide "demand response" services to change consumption temporarily to balance a larger power grid. We discuss how the statistical analysis can guide further development of the emerging demand response technology.
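
    As a minimal Monte Carlo sketch of such an ensemble (not the authors' analytical treatment), the code below simulates many loads with first-order thermal dynamics plus noise and a randomized Poisson on/off switching policy, and tracks how the on-fraction relaxes after a forced synchronization; all parameter values are illustrative.

```python
# Sketch: Monte Carlo of a TCL ensemble. Each device has first-order thermal
# dynamics plus noise and switches on/off at random (Poisson) times. After a
# forced "demand response" event (all devices off), the on-fraction relaxes
# back toward its stationary value. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(7)
n_dev, steps, dt = 5000, 2000, 0.01
tau, t_out, cool = 1.0, 32.0, 10.0        # thermal time constant, outdoor temp, cooling power
switch_rate = 0.5                         # Poisson on/off switching rate (per unit time)

temp = rng.normal(24, 1, n_dev)
on = np.zeros(n_dev, dtype=bool)          # start with every device switched off

frac_on = []
for _ in range(steps):
    drift = (t_out - temp) / tau - cool * on
    temp += dt * drift + 0.1 * np.sqrt(dt) * rng.normal(size=n_dev)
    flip = rng.random(n_dev) < switch_rate * dt          # randomized Poisson switching
    on[flip] = ~on[flip]
    frac_on.append(on.mean())

print("on-fraction: start", round(frac_on[0], 3),
      "-> end", round(frac_on[-1], 3), "(relaxes toward its stationary value)")
```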

  8. Ensemble of Thermostatically Controlled Loads: Statistical Physics Approach

    DOE PAGES

    Chertkov, Michael; Chernyak, Vladimir

    2017-01-17

    Thermostatically Controlled Loads (TCL), e.g. air-conditioners and heaters, are by far the most widespread consumers of electricity. Normally the devices are calibrated to provide the so-called bang-bang control of temperature, changing from on to off, and vice versa, depending on temperature. Aggregation of a large group of similar devices into a statistical ensemble is considered, where the devices operate following the same dynamics subject to stochastic perturbations and a randomized, Poisson on/off switching policy. We analyze, using theoretical and computational tools of statistical physics, how the ensemble relaxes to a stationary distribution and establish a relation between the relaxation and the statistics of the probability flux associated with devices' cycling in the mixed (discrete, switch on/off, and continuous, temperature) phase space. This allowed us to derive and analyze the spectrum of the non-equilibrium (detailed balance broken) statistical system and uncover how the switching policy affects oscillatory trends and the speed of the relaxation. Relaxation of the ensemble is of practical interest because it describes how the ensemble recovers from significant perturbations, e.g. forceful temporary switching off aimed at utilizing the flexibility of the ensemble in providing "demand response" services, relieving consumption temporarily to balance a larger power grid. We discuss how the statistical analysis can guide further development of the emerging demand response technology.

  9. Ensemble of Thermostatically Controlled Loads: Statistical Physics Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chertkov, Michael; Chernyak, Vladimir

    Thermostatically Controlled Loads (TCL), e.g. air-conditioners and heaters, are by far the most widespread consumers of electricity. Normally the devices are calibrated to provide the so-called bang-bang control of temperature, changing from on to off, and vice versa, depending on temperature. Aggregation of a large group of similar devices into a statistical ensemble is considered, where the devices operate following the same dynamics subject to stochastic perturbations and a randomized, Poisson on/off switching policy. We analyze, using theoretical and computational tools of statistical physics, how the ensemble relaxes to a stationary distribution and establish a relation between the relaxation and the statistics of the probability flux associated with devices' cycling in the mixed (discrete, switch on/off, and continuous, temperature) phase space. This allowed us to derive and analyze the spectrum of the non-equilibrium (detailed balance broken) statistical system and uncover how the switching policy affects oscillatory trends and the speed of the relaxation. Relaxation of the ensemble is of practical interest because it describes how the ensemble recovers from significant perturbations, e.g. forceful temporary switching off aimed at utilizing the flexibility of the ensemble in providing "demand response" services, relieving consumption temporarily to balance a larger power grid. We discuss how the statistical analysis can guide further development of the emerging demand response technology.

  10. A Localized Ensemble Kalman Smoother

    NASA Technical Reports Server (NTRS)

    Butala, Mark D.

    2012-01-01

    Numerous geophysical inverse problems prove difficult because the available measurements are indirectly related to the underlying unknown dynamic state and the physics governing the system may involve imperfect models or unobserved parameters. Data assimilation addresses these difficulties by combining the measurements and physical knowledge. The main challenge in such problems usually involves their high dimensionality, and standard statistical methods prove computationally intractable. This paper develops a new high-dimensional Monte Carlo approach, the localized ensemble Kalman smoother, and addresses its theoretical convergence.
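
    As a minimal sketch of the basic building block such a smoother localizes and extends, the code below performs a stochastic (perturbed-observation) ensemble Kalman analysis step on a toy state; the observation operator, error covariances, and ensemble sizes are illustrative assumptions, and localization and the smoother recursion are omitted.

```python
# Sketch: stochastic (perturbed-observation) ensemble Kalman analysis step.
# Localization and the smoother recursion are omitted; values are illustrative.
import numpy as np

rng = np.random.default_rng(8)
n_state, n_obs, n_ens = 40, 10, 50
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), np.arange(0, n_state, 4)] = 1.0     # observe every 4th variable
R = 0.5 * np.eye(n_obs)                                 # observation error covariance

truth = np.sin(np.linspace(0, 2 * np.pi, n_state))
y = H @ truth + rng.multivariate_normal(np.zeros(n_obs), R)

X = truth[:, None] + 1.0 + rng.normal(0, 1.0, (n_state, n_ens))   # biased forecast ensemble
A = X - X.mean(axis=1, keepdims=True)
P = A @ A.T / (n_ens - 1)                               # ensemble forecast covariance

K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)            # Kalman gain
Y_pert = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
Xa = X + K @ (Y_pert - H @ X)                           # analysis ensemble

print("forecast RMSE:", round(np.sqrt(((X.mean(1) - truth) ** 2).mean()), 3))
print("analysis RMSE:", round(np.sqrt(((Xa.mean(1) - truth) ** 2).mean()), 3))
```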

  11. An interdisciplinary school project using a Nintendo Wii controller for measuring car speed

    NASA Astrophysics Data System (ADS)

    Hansen, Nils Kristian; Mitchell, James Robert

    2013-03-01

    This work examines the feasibility of employing a Nintendo Wii game controller for measuring car speed in an interdisciplinary school project. It discusses the physical characteristics of the controller and of vehicle headlights. It suggests how an experiment may be linked to topics in mathematics, statistics, physics and computer science. An algorithm for calculating speed from repeated recordings of car headlights is provided. Finally the results of repeated experiments with an approaching car are provided.

  12. Manual Physical Therapy Following Immobilization for Stable Ankle Fracture: A Case Series.

    PubMed

    Painter, Elizabeth E; Deyle, Gail D; Allen, Christopher; Petersen, Evan J; Croy, Theodore; Rivera, Kenneth P

    2015-09-01

    Case series. Ankle fractures commonly result in persistent pain, stiffness, and functional impairments. There is insufficient evidence to favor any particular rehabilitation approach after ankle fracture. The purpose of this case series was to describe an impairment-based manual physical therapy approach to treating patients with conservatively managed ankle fractures. Patients with stable ankle fractures postimmobilization were treated with manual physical therapy and exercise targeted at associated impairments in the lower limb. The primary outcome measure was the Lower Extremity Functional Scale. Secondary outcome measures included the ankle lunge test, numeric pain-rating scale, and global rating of change. Outcome measures were collected at baseline (performed within 7 days of immobilization removal) and at 4 and 12 weeks postbaseline. Eleven patients (mean age, 39.6 years; range, 18-64 years; 2 male), after ankle fracture-related immobilization (mean duration, 48 days; range, 21-75 days), were treated for an average of 6.6 sessions (range, 3-10 sessions) over a mean of 46.1 days (range, 13-81 days). Compared to baseline, statistically significant and clinically meaningful improvements were observed in Lower Extremity Functional Scale score (P = .001; mean change, 21.9 points; 95% confidence interval: 10.4, 33.4) and in the ankle lunge test (P = .001; mean change, 7.8 cm; 95% confidence interval: 3.9, 11.7) at 4 weeks. These changes persisted at 12 weeks. Statistically significant and clinically meaningful improvements in self-reported function and ankle range of motion were observed at 4 and 12 weeks following treatment with impairment-based manual physical therapy. All patients tolerated treatment well. Results suggest that this approach may have efficacy in this population. Therapy, level 4.

  13. A hybrid hydrologically complemented warning model for shallow landslides induced by extreme rainfall in Korean Mountain

    NASA Astrophysics Data System (ADS)

    Singh Pradhan, Ananta Man; Kang, Hyo-Sub; Kim, Yun-Tae

    2016-04-01

    This study uses a physically based approach to evaluate the factor of safety of hillslopes under different hydrological conditions on Mt. Umyeon, south of Seoul. The hydrological conditions were determined using the rainfall intensity and duration of a Korea-wide landslide inventory. The quantile regression statistical method was used to ascertain different probability warning levels on the basis of rainfall thresholds. Physically based models are easily interpreted and have high predictive capabilities, but rely on spatially explicit and accurate parameterization, which is commonly not possible. Statistical probabilistic methods can include other causative factors that influence slope stability, such as forest, soil and geology, but rely on good landslide inventories of the site. In this study a hybrid approach is described that combines the physically based landslide susceptibility for the different hydrological conditions. A presence-only maximum entropy model was used to build the hybrid and to analyze the relation of landslides to the conditioning factors. About 80% of the landslides were listed among the unstable sites identified by the proposed model, demonstrating its effectiveness and accuracy in determining unstable areas and areas that require evacuation. These cumulative rainfall thresholds provide a valuable reference to guide disaster prevention authorities in the issuance of warning levels, with the potential to reduce losses and save lives.
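
    As a minimal sketch of deriving a rainfall intensity-duration threshold by quantile regression in log-log space, the code below fits I = a * D^b at a low quantile so that most triggering events lie above the line; the synthetic triggering data and the choice of the 5% quantile are illustrative assumptions.

```python
# Sketch: rainfall intensity-duration (I-D) threshold via quantile regression in
# log-log space. Data and the 5% quantile are illustrative.
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(9)
duration = 10 ** rng.uniform(0, 2, 300)                               # hours
intensity = 20 * duration ** -0.6 * 10 ** rng.normal(0.2, 0.2, 300)   # mm/h, synthetic triggers

X = sm.add_constant(np.log10(duration))
res = QuantReg(np.log10(intensity), X).fit(q=0.05)                    # low-quantile threshold
log_a, b = res.params
print(f"threshold: I = {10 ** log_a:.1f} * D^{b:.2f}  (I in mm/h, D in h)")
```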

  14. An application of statistical mechanics for representing equilibrium perimeter distributions of tropical convective clouds

    NASA Astrophysics Data System (ADS)

    Garrett, T. J.; Alva, S.; Glenn, I. B.; Krueger, S. K.

    2015-12-01

    There are two possible approaches for parameterizing sub-grid cloud dynamics in a coarser grid model. The most common is to use a fine-scale model to explicitly resolve the mechanistic details of clouds to the best extent possible, and then to parameterize the resulting cloud state for the coarser grid. A second is to invoke physical intuition and some very general theoretical principles from equilibrium statistical mechanics. This approach avoids any requirement to resolve time-dependent processes in order to arrive at a suitable solution. The second approach is widely used elsewhere in the atmospheric sciences: for example the Planck function for blackbody radiation is derived this way, where no mention is made of the complexities of modeling a large ensemble of time-dependent radiation-dipole interactions in order to obtain the "grid-scale" spectrum of thermal emission by the blackbody as a whole. We find that this statistical approach may be equally suitable for modeling convective clouds. Specifically, we make the physical argument that the dissipation of buoyant energy in convective clouds is done through mixing across a cloud perimeter. From thermodynamic reasoning, one might then anticipate that vertically stacked isentropic surfaces are characterized by a power law dlnN/dlnP = -1, where N(P) is the number of clouds of perimeter P. In a Giga-LES simulation of convective clouds within a 100 km square domain we find that such a power law does appear to characterize simulated cloud perimeters along isentropes, provided a sufficiently large sample of clouds. The suggestion is that it may be possible to parameterize certain important aspects of cloud state without appealing to computationally expensive dynamic simulations.
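
    As a minimal sketch of checking such a power law from a sample of cloud perimeters, the code below bins perimeters logarithmically and estimates the slope of log N versus log P; interpreting N(P) as a number density per unit perimeter is an assumption here, and the synthetic sample is drawn from a 1/P law by construction.

```python
# Sketch: estimate the slope d ln N / d ln P from a sample of cloud perimeters,
# treating N(P) as the number of clouds per unit perimeter. The synthetic sample
# follows a 1/P law by construction, so the recovered slope should be near -1.
import numpy as np

rng = np.random.default_rng(10)
p_min, p_max = 1.0, 1000.0
perim = p_min * (p_max / p_min) ** rng.random(20000)   # inverse-CDF sample of a 1/P law

edges = np.logspace(np.log10(p_min), np.log10(p_max), 25)
counts, _ = np.histogram(perim, bins=edges)
centers = np.sqrt(edges[:-1] * edges[1:])
density = counts / np.diff(edges)                      # clouds per unit perimeter
ok = counts > 0
slope = np.polyfit(np.log(centers[ok]), np.log(density[ok]), 1)[0]
print("estimated d ln N / d ln P =", round(slope, 2))
```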

  15. Finding Bounded Rational Equilibria. Part 1; Iterative Focusing

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2004-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality characterizing all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. It has recently been shown that the same information theoretic mathematical structure, known as Probability Collectives (PC) underlies both issues. This relationship between statistical physics and game theory allows techniques and insights from the one field to be applied to the other. In particular, PC provides a formal model-independent definition of the degree of rationality of a player and of bounded rationality equilibria. This pair of papers extends previous work on PC by introducing new computational approaches to effectively find bounded rationality equilibria of common-interest (team) games.

  16. Emergent dynamic structures and statistical law in spherical lattice gas automata.

    PubMed

    Yao, Zhenwei

    2017-12-01

    Various lattice gas automata have been proposed in the past decades to simulate physics and address a host of problems on collective dynamics arising in diverse fields. In this work, we employ the lattice gas model defined on the sphere to investigate the curvature-driven dynamic structures and analyze the statistical behaviors in equilibrium. Under the simple propagation and collision rules, we show that the uniform collective movement of the particles on the sphere is geometrically frustrated, leading to several nonequilibrium dynamic structures not found in the planar lattice, such as the emergent bubble and vortex structures. With the accumulation of the collision effect, the system ultimately reaches equilibrium in the sense that the distribution of the coarse-grained speed approaches the two-dimensional Maxwell-Boltzmann distribution despite the population fluctuations in the coarse-grained cells. The emergent regularity in the statistical behavior of the system is rationalized by mapping our system to a generalized random walk model. This work demonstrates the capability of the spherical lattice gas automaton in revealing the lattice-guided dynamic structures and simulating the equilibrium physics. It suggests the promising possibility of using lattice gas automata defined on various curved surfaces to explore geometrically driven nonequilibrium physics.

  17. Emergent dynamic structures and statistical law in spherical lattice gas automata

    NASA Astrophysics Data System (ADS)

    Yao, Zhenwei

    2017-12-01

    Various lattice gas automata have been proposed in the past decades to simulate physics and address a host of problems on collective dynamics arising in diverse fields. In this work, we employ the lattice gas model defined on the sphere to investigate the curvature-driven dynamic structures and analyze the statistical behaviors in equilibrium. Under the simple propagation and collision rules, we show that the uniform collective movement of the particles on the sphere is geometrically frustrated, leading to several nonequilibrium dynamic structures not found in the planar lattice, such as the emergent bubble and vortex structures. With the accumulation of the collision effect, the system ultimately reaches equilibrium in the sense that the distribution of the coarse-grained speed approaches the two-dimensional Maxwell-Boltzmann distribution despite the population fluctuations in the coarse-grained cells. The emergent regularity in the statistical behavior of the system is rationalized by mapping our system to a generalized random walk model. This work demonstrates the capability of the spherical lattice gas automaton in revealing the lattice-guided dynamic structures and simulating the equilibrium physics. It suggests the promising possibility of using lattice gas automata defined on various curved surfaces to explore geometrically driven nonequilibrium physics.

  18. Teachers' approaches to teaching physics

    NASA Astrophysics Data System (ADS)

    2012-12-01

    Benjamin Franklin said, "Tell me, and I forget. Teach me, and I remember. Involve me, and I learn." He would not be surprised to learn that research in physics pedagogy has consistently shown that the traditional lecture is the least effective teaching method for teaching physics. We asked high school physics teachers which teaching activities they used in their classrooms. While almost all teachers still lecture sometimes, two-thirds use something other than lecture most of the time. The five most often-used activities are shown in the table below. In the January issue, we will look at the 2013 Nationwide Survey of High School Physics teachers. Susan White is Research Manager in the Statistical Research Center at the American Institute of Physics; she directs the Nationwide Survey of High School Physics Teachers. If you have any questions, please contact Susan at swhite@aip.org.

  19. Mapping cognitive structures of community college students engaged in basic electrostatics laboratories

    NASA Astrophysics Data System (ADS)

    Haggerty, Dennis Charles

    Community college students need to be abstract thinkers in order to be successful in the introductory Physics curriculum. The purpose of this dissertation is to map the abstract thinking of community college Physics students. The laboratory environment was used as a vehicle for the mapping. Three laboratory experiments were encountered. One laboratory was based on the classic Piagetian task, the centripetal motion (CM) problem. The other two laboratories were introductory electrostatic Physics experiments, Resistance (RES) and Capacitance (CAP). The students performed all laboratories using the thinking-aloud technique. The researcher collected their verbal protocols using audiotapes. The audiotaped data was quantified by comparing it to a scoring matrix based on the Piagetian logical operators (Inhelder & Piaget, 1958) for abstract thinking. The students received scores for each laboratory experiment. These scores were compared to a reliable test of intellectual functioning, the Shipley Institute of Living Scale (SILS). Spearman rank correlation coefficients (SRCC) were obtained for SILS versus CM; SILS versus RES; and SILS versus CAP. Statistically significant results were obtained for SILS versus CM and SILS versus RES at the p < 0.05 level. When an outlier to the data was considered and suppressed, the SILS versus CAP was also statistically significant at the p < 0.05 level. The scoring matrix permits a bridge from the qualitative Piagetian level of cognitive development to a quantified, mapped level of cognitive development. The ability to quantify student abstract thinking in Physics education provides a means to adjust an instructional approach. This approach could lead to a proper state of Physics education.

  20. Probabilistic models for reactive behaviour in heterogeneous condensed phase media

    NASA Astrophysics Data System (ADS)

    Baer, M. R.; Gartling, D. K.; DesJardin, P. E.

    2012-02-01

    This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.

  1. Exploring the relationship between the engineering and physical sciences and the health and life sciences by advanced bibliometric methods.

    PubMed

    Waltman, Ludo; van Raan, Anthony F J; Smart, Sue

    2014-01-01

    We investigate the extent to which advances in the health and life sciences (HLS) are dependent on research in the engineering and physical sciences (EPS), particularly physics, chemistry, mathematics, and engineering. The analysis combines two different bibliometric approaches. The first approach to analyze the 'EPS-HLS interface' is based on term map visualizations of HLS research fields. We consider 16 clinical fields and five life science fields. On the basis of expert judgment, EPS research in these fields is studied by identifying EPS-related terms in the term maps. In the second approach, a large-scale citation-based network analysis is applied to publications from all fields of science. We work with about 22,000 clusters of publications, each representing a topic in the scientific literature. Citation relations are used to identify topics at the EPS-HLS interface. The two approaches complement each other. The advantages of working with textual data compensate for the limitations of working with citation relations and the other way around. An important advantage of working with textual data is in the in-depth qualitative insights it provides. Working with citation relations, on the other hand, yields many relevant quantitative statistics. We find that EPS research contributes to HLS developments mainly in the following five ways: new materials and their properties; chemical methods for analysis and molecular synthesis; imaging of parts of the body as well as of biomaterial surfaces; medical engineering mainly related to imaging, radiation therapy, signal processing technology, and other medical instrumentation; mathematical and statistical methods for data analysis. In our analysis, about 10% of all EPS and HLS publications are classified as being at the EPS-HLS interface. This percentage has remained more or less constant during the past decade.

  2. Exploring the Relationship between the Engineering and Physical Sciences and the Health and Life Sciences by Advanced Bibliometric Methods

    PubMed Central

    Waltman, Ludo; van Raan, Anthony F. J.; Smart, Sue

    2014-01-01

    We investigate the extent to which advances in the health and life sciences (HLS) are dependent on research in the engineering and physical sciences (EPS), particularly physics, chemistry, mathematics, and engineering. The analysis combines two different bibliometric approaches. The first approach to analyze the ‘EPS-HLS interface’ is based on term map visualizations of HLS research fields. We consider 16 clinical fields and five life science fields. On the basis of expert judgment, EPS research in these fields is studied by identifying EPS-related terms in the term maps. In the second approach, a large-scale citation-based network analysis is applied to publications from all fields of science. We work with about 22,000 clusters of publications, each representing a topic in the scientific literature. Citation relations are used to identify topics at the EPS-HLS interface. The two approaches complement each other. The advantages of working with textual data compensate for the limitations of working with citation relations and the other way around. An important advantage of working with textual data is in the in-depth qualitative insights it provides. Working with citation relations, on the other hand, yields many relevant quantitative statistics. We find that EPS research contributes to HLS developments mainly in the following five ways: new materials and their properties; chemical methods for analysis and molecular synthesis; imaging of parts of the body as well as of biomaterial surfaces; medical engineering mainly related to imaging, radiation therapy, signal processing technology, and other medical instrumentation; mathematical and statistical methods for data analysis. In our analysis, about 10% of all EPS and HLS publications are classified as being at the EPS-HLS interface. This percentage has remained more or less constant during the past decade. PMID:25360616

  3. The Statistical Segment Length of DNA: Opportunities for Biomechanical Modeling in Polymer Physics and Next-Generation Genomics.

    PubMed

    Dorfman, Kevin D

    2018-02-01

    The development of bright bisintercalating dyes for deoxyribonucleic acid (DNA) in the 1990s, most notably YOYO-1, revolutionized the field of polymer physics in the ensuing years. These dyes, in conjunction with modern molecular biology techniques, permit the facile observation of polymer dynamics via fluorescence microscopy and thus direct tests of different theories of polymer dynamics. At the same time, they have played a key role in advancing an emerging next-generation method known as genome mapping in nanochannels. The effect of intercalation on the bending energy of DNA as embodied by a change in its statistical segment length (or, alternatively, its persistence length) has been the subject of significant controversy. The precise value of the statistical segment length is critical for the proper interpretation of polymer physics experiments and controls the phenomena underlying the aforementioned genomics technology. In this perspective, we briefly review the model of DNA as a wormlike chain and a trio of methods (light scattering, optical or magnetic tweezers, and atomic force microscopy (AFM)) that have been used to determine the statistical segment length of DNA. We then outline the disagreement in the literature over the role of bisintercalation on the bending energy of DNA, and how a multiscale biomechanical approach could provide an important model for this scientifically and technologically relevant problem.

  4. Evidence of nonextensive statistical physics behavior in the watershed distribution in active tectonic areas: examples from Greece

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos; Kouli, Maria

    2013-08-01

    The Digital Elevation Model (DEM) of the island of Crete, with a resolution of approximately 20 meters, was used to delineate watersheds by computing the flow direction and using it in the Watershed function. The Watershed function uses a raster of flow direction to determine the contributing area. The routine Geographic Information Systems procedure was applied, and the watersheds as well as the stream network (using a threshold of 2000 cells, i.e. the minimum number of cells that constitute a stream) were extracted from the hydrologically corrected (free of sinks) DEM. A few thousand watersheds were delineated, and their areal extent was calculated. From these, 300 watersheds were finally selected for further analysis, as watersheds of extremely small area were excluded in order to avoid possible artifacts. Our analysis approach is based on the basic principles of complexity theory and the Tsallis entropy introduced in the frame of non-extensive statistical physics. This concept has been successfully used for the analysis of a variety of complex dynamic systems, including natural hazards, where fractality and long-range interactions are important. The analysis indicates that the statistical distribution of watershed areas can be successfully described by the theoretical estimations of non-extensive statistical physics, implying the complexity that characterizes their occurrence.
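
    The record does not give its fitting procedure; the sketch below is a generic illustration of fitting a Tsallis q-exponential survival function to a set of areas with SciPy. The watershed areas are synthetic, and the function form, starting values, and bounds are assumptions made for the example.

      import numpy as np
      from scipy.optimize import curve_fit

      def q_exponential_survival(a, q, a0):
          """Tsallis q-exponential survival function P(>a) = exp_q(-a/a0), for q > 1."""
          return (1.0 + (q - 1.0) * a / a0) ** (-1.0 / (q - 1.0))

      # Hypothetical watershed areas (km^2); in practice these come from the DEM analysis.
      rng = np.random.default_rng(0)
      areas = np.sort(rng.pareto(1.5, 300) * 5.0)

      # Empirical survival function and a least-squares fit of (q, a0).
      surv = 1.0 - np.arange(len(areas)) / len(areas)
      (q_fit, a0_fit), _ = curve_fit(q_exponential_survival, areas, surv,
                                     p0=(1.3, 10.0), bounds=([1.001, 0.1], [3.0, 1e4]))
      print(f"fitted q = {q_fit:.2f}, a0 = {a0_fit:.2f}")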

  5. The economic burden of physical inactivity: a systematic review and critical appraisal.

    PubMed

    Ding, Ding; Kolbe-Alexander, Tracy; Nguyen, Binh; Katzmarzyk, Peter T; Pratt, Michael; Lawson, Kenny D

    2017-10-01

    To summarise the literature on the economic burden of physical inactivity in populations, with emphases on appraising the methodologies and providing recommendations for future studies. Systematic review following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines (PROSPERO registration number CRD42016047705). Electronic databases for peer-reviewed and grey literature were systematically searched, followed by reference searching and consultation with experts. Studies that examined the economic consequences of physical inactivity in a population/population-based sample, with clearly stated methodologies and at least an abstract/summary written in English. Of the 40 eligible studies, 27 focused on direct healthcare costs only, 13 also estimated indirect costs and one study additionally estimated household costs. For direct costs, 23 studies used a population attributable fraction (PAF) approach with estimated healthcare costs attributable to physical inactivity ranging from 0.3% to 4.6% of national healthcare expenditure; 17 studies used an econometric approach, which tended to yield higher estimates than those using a PAF approach. For indirect costs, 10 studies used a human capital approach, two used a friction cost approach and one used a value of a statistical life approach. Overall, estimates varied substantially, even within the same country, depending on analytical approaches, time frame and other methodological considerations. Estimating the economic burden of physical inactivity is an area of increasing importance that requires further development. There is a marked lack of consistency in methodological approaches and transparency of reporting. Future studies could benefit from cross-disciplinary collaborations involving economists and physical activity experts, taking a societal perspective and following best practices in conducting and reporting analysis, including accounting for potential confounding, reverse causality and comorbidity, applying discounting and sensitivity analysis, and reporting assumptions, limitations and justifications for approaches taken. We have adapted the Consolidated Health Economic Evaluation Reporting Standards checklist as a guide for future estimates of the economic burden of physical inactivity and other risk factors. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
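
    For readers unfamiliar with the population attributable fraction (PAF) approach mentioned in the review, the sketch below applies Levin's formula with purely hypothetical inputs (prevalence of inactivity, relative risk, and disease expenditure); none of the numbers come from the studies reviewed.

      def population_attributable_fraction(prevalence, relative_risk):
          """Levin's formula: PAF = p*(RR - 1) / (1 + p*(RR - 1))."""
          return prevalence * (relative_risk - 1.0) / (1.0 + prevalence * (relative_risk - 1.0))

      # Hypothetical inputs: 30% of adults inactive, RR of 1.2 for a given disease,
      # and 10 billion (currency units) spent annually treating that disease.
      paf = population_attributable_fraction(prevalence=0.30, relative_risk=1.2)
      attributable_cost = paf * 10e9
      print(f"PAF = {paf:.3f}, cost attributable to inactivity = {attributable_cost:.2e}")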

  6. Thermal machines beyond the weak coupling regime

    NASA Astrophysics Data System (ADS)

    Gallego, R.; Riera, A.; Eisert, J.

    2014-12-01

    How much work can be extracted from a heat bath using a thermal machine? The study of this question has a very long history in statistical physics in the weak-coupling limit, when applied to macroscopic systems. However, the assumption that thermal heat baths remain uncorrelated with associated physical systems is less reasonable on the nano-scale and in the quantum setting. In this work, we establish a framework of work extraction in the presence of quantum correlations. We show in a mathematically rigorous and quantitative fashion that quantum correlations and entanglement emerge as limitations to work extraction compared to what would be allowed by the second law of thermodynamics. At the heart of the approach are operations that capture the naturally non-equilibrium dynamics encountered when putting physical systems into contact with each other. We discuss various limits that relate to known results and put our work into the context of approaches to finite-time quantum thermodynamics.

  7. Stochastic Spatial Models in Ecology: A Statistical Physics Approach

    NASA Astrophysics Data System (ADS)

    Pigolotti, Simone; Cencini, Massimo; Molina, Daniel; Muñoz, Miguel A.

    2018-07-01

    Ecosystems display a complex spatial organization. Ecologists have long tried to characterize them by looking at how different measures of biodiversity change across spatial scales. Ecological neutral theory has provided simple predictions accounting for general empirical patterns in communities of competing species. However, while neutral theory in well-mixed ecosystems is mathematically well understood, spatial models still present several open problems, limiting the quantitative understanding of spatial biodiversity. In this review, we discuss the state of the art in spatial neutral theory. We emphasize the connection between spatial ecological models and the physics of non-equilibrium phase transitions and how concepts developed in statistical physics translate in population dynamics, and vice versa. We focus on non-trivial scaling laws arising at the critical dimension D = 2 of spatial neutral models, and their relevance for biological populations inhabiting two-dimensional environments. We conclude by discussing models incorporating non-neutral effects in the form of spatial and temporal disorder, and analyze how their predictions deviate from those of purely neutral theories.

  8. Stochastic Spatial Models in Ecology: A Statistical Physics Approach

    NASA Astrophysics Data System (ADS)

    Pigolotti, Simone; Cencini, Massimo; Molina, Daniel; Muñoz, Miguel A.

    2017-11-01

    Ecosystems display a complex spatial organization. Ecologists have long tried to characterize them by looking at how different measures of biodiversity change across spatial scales. Ecological neutral theory has provided simple predictions accounting for general empirical patterns in communities of competing species. However, while neutral theory in well-mixed ecosystems is mathematically well understood, spatial models still present several open problems, limiting the quantitative understanding of spatial biodiversity. In this review, we discuss the state of the art in spatial neutral theory. We emphasize the connection between spatial ecological models and the physics of non-equilibrium phase transitions and how concepts developed in statistical physics translate in population dynamics, and vice versa. We focus on non-trivial scaling laws arising at the critical dimension D = 2 of spatial neutral models, and their relevance for biological populations inhabiting two-dimensional environments. We conclude by discussing models incorporating non-neutral effects in the form of spatial and temporal disorder, and analyze how their predictions deviate from those of purely neutral theories.

  9. Statistical Physics of Adaptation

    DTIC Science & Technology

    2016-08-23

    Statistical Physics of Adaptation. Nikolay Perunov, Robert A. Marsland, and Jeremy L. England, Department of Physics, Physics of Living Systems Group. Subject areas: Biological Physics, Complex Systems, Statistical Physics. Fragments preserved in the record: "It has long been understood that nonequilibrium driving can..." and "...equilibrium may appear to have been specially selected for physical properties connected to their ability to absorb work from the particular driving environment."

  10. Assessing first-order emulator inference for physical parameters in nonlinear mechanistic models

    USGS Publications Warehouse

    Hooten, Mevin B.; Leeds, William B.; Fiechter, Jerome; Wikle, Christopher K.

    2011-01-01

    We present an approach for estimating physical parameters in nonlinear models that relies on an approximation to the mechanistic model itself for computational efficiency. The proposed methodology is validated and applied in two different modeling scenarios: (a) Simulation and (b) lower trophic level ocean ecosystem model. The approach we develop relies on the ability to predict right singular vectors (resulting from a decomposition of computer model experimental output) based on the computer model input and an experimental set of parameters. Critically, we model the right singular vectors in terms of the model parameters via a nonlinear statistical model. Specifically, we focus our attention on first-order models of these right singular vectors rather than the second-order (covariance) structure.
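
    The following sketch illustrates the general idea of a first-order emulator (decompose an ensemble of model outputs, then regress the per-run coefficients of the leading modes onto the input parameters); it is not the authors' implementation, and the toy model eta, the grid, and the parameter ranges are invented for the example. Depending on how the output matrix is oriented, the regressed quantities correspond to the right or left singular vectors.

      import numpy as np

      def eta(theta, x):
          # stand-in "mechanistic" model evaluated on a grid x
          return np.sin(theta[0] * x) * np.exp(-theta[1] * x)

      rng = np.random.default_rng(1)
      x = np.linspace(0.0, 5.0, 200)
      thetas = rng.uniform([0.5, 0.1], [2.0, 1.0], size=(40, 2))
      Y = np.array([eta(t, x) for t in thetas])            # (n_runs, n_grid) ensemble output

      # Low-rank decomposition of the centered ensemble; keep the leading k modes.
      U, S, Vt = np.linalg.svd(Y - Y.mean(axis=0), full_matrices=False)
      k = 3
      weights = U[:, :k] * S[:k]                           # per-run coefficients on the k modes

      # First-order (linear) statistical model for the coefficients as a function of theta.
      X = np.column_stack([np.ones(len(thetas)), thetas])
      beta, *_ = np.linalg.lstsq(X, weights, rcond=None)

      # Emulate the model at a new parameter setting without running eta.
      theta_new = np.array([1.2, 0.4])
      w_new = np.array([1.0, *theta_new]) @ beta
      y_emulated = Y.mean(axis=0) + w_new @ Vt[:k]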

  11. Modern morphometry: new perspectives in physical anthropology.

    PubMed

    Mantini, Simone; Ripani, Maurizio

    2009-06-01

    Over the past one hundred years, physical anthropology has had recourse to increasingly efficient methods, which have provided much new information regarding human evolution and biology. Apart from the molecular approach, the introduction of new computer-assisted techniques gave rise to a new concept of morphometry. Computed tomography and 3D imaging allowed anatomical descriptions of external and inner structures, overcoming the problems encountered with traditional morphometric methods. Furthermore, the support of geometric morphometrics allowed the creation of geometric models to investigate morphological variation in terms of evolution, ontogeny and variability. The integration of these new tools gave rise to virtual anthropology and to a new image of the anthropologist, in which anatomical, biological, mathematical, statistical and data processing information are fused in a multidisciplinary approach.

  12. A new statistical approach to climate change detection and attribution

    NASA Astrophysics Data System (ADS)

    Ribes, Aurélien; Zwiers, Francis W.; Azaïs, Jean-Marc; Naveau, Philippe

    2017-01-01

    We propose here a new statistical approach to climate change detection and attribution that is based on additive decomposition and simple hypothesis testing. Most current statistical methods for detection and attribution rely on linear regression models where the observations are regressed onto expected response patterns to different external forcings. These methods do not use physical information provided by climate models regarding the expected response magnitudes to constrain the estimated responses to the forcings. Climate modelling uncertainty is difficult to take into account with regression based methods and is almost never treated explicitly. As an alternative to this approach, our statistical model is only based on the additivity assumption; the proposed method does not regress observations onto expected response patterns. We introduce estimation and testing procedures based on likelihood maximization, and show that climate modelling uncertainty can easily be accounted for. Some discussion is provided on how to practically estimate the climate modelling uncertainty based on an ensemble of opportunity. Our approach is based on the "models are statistically indistinguishable from the truth" paradigm, where the difference between any given model and the truth has the same distribution as the difference between any pair of models, but other choices might also be considered. The properties of this approach are illustrated and discussed based on synthetic data. Lastly, the method is applied to the linear trend in global mean temperature over the period 1951-2010. Consistent with the last IPCC assessment report, we find that most of the observed warming over this period (+0.65 K) is attributable to anthropogenic forcings (+0.67 ± 0.12 K, 90% confidence range), with a very limited contribution from natural forcings (-0.01 ± 0.02 K).

  13. Student laboratory reports: an approach to improving feedback and quality

    NASA Astrophysics Data System (ADS)

    Ellingsen, Pål Gunnar; Støvneng, Jon Andreas

    2018-05-01

    We present an ongoing effort in improving the quality of laboratory reports written by first and second year physics students. The effort involves a new approach where students are given the opportunity to submit reports at intermediate deadlines, receive feedback, and then resubmit for the final deadline. In combination with a differential grading system, instead of pass/fail, the improved feedback results in higher quality reports. Improvement in the quality of the reports is visible through the grade statistics.

  14. Statistical representation of a spray as a point process

    NASA Astrophysics Data System (ADS)

    Subramaniam, S.

    2000-10-01

    The statistical representation of a spray as a finite point process is investigated. One objective is to develop a better understanding of how single-point statistical information contained in descriptions such as the droplet distribution function (ddf), relates to the probability density functions (pdfs) associated with the droplets themselves. Single-point statistical information contained in the droplet distribution function (ddf) is shown to be related to a sequence of single surrogate-droplet pdfs, which are in general different from the physical single-droplet pdfs. It is shown that the ddf contains less information than the fundamental single-point statistical representation of the spray, which is also described. The analysis shows which events associated with the ensemble of spray droplets can be characterized by the ddf, and which cannot. The implications of these findings for the ddf approach to spray modeling are discussed. The results of this study also have important consequences for the initialization and evolution of direct numerical simulations (DNS) of multiphase flows, which are usually initialized on the basis of single-point statistics such as the droplet number density in physical space. If multiphase DNS are initialized in this way, this implies that even the initial representation contains certain implicit assumptions concerning the complete ensemble of realizations, which are invalid for general multiphase flows. Also the evolution of a DNS initialized in this manner is shown to be valid only if an as yet unproven commutation hypothesis holds true. Therefore, it is questionable to what extent DNS that are initialized in this manner constitute a direct simulation of the physical droplets. Implications of these findings for large eddy simulations of multiphase flows are also discussed.

  15. A methodological approach to short-term tracking of youth physical fitness: the Oporto Growth, Health and Performance Study.

    PubMed

    Souza, Michele; Eisenmann, Joey; Chaves, Raquel; Santos, Daniel; Pereira, Sara; Forjaz, Cláudia; Maia, José

    2016-10-01

    In this paper, three different statistical approaches were used to investigate short-term tracking of cardiorespiratory and performance-related physical fitness among adolescents. Data were obtained from the Oporto Growth, Health and Performance Study and comprised 1203 adolescents (549 girls) divided into two age cohorts (10-12 and 12-14 years) followed for three consecutive years, with annual assessment. Cardiorespiratory fitness was assessed with 1-mile run/walk test; 50-yard dash, standing long jump, handgrip, and shuttle run test were used to rate performance-related physical fitness. Tracking was expressed in three different ways: auto-correlations, multilevel modelling with crude and adjusted model (for biological maturation, body mass index, and physical activity), and Cohen's Kappa (κ) computed in IBM SPSS 20.0, HLM 7.01 and Longitudinal Data Analysis software, respectively. Tracking of physical fitness components was (1) moderate-to-high when described by auto-correlations; (2) low-to-moderate when crude and adjusted models were used; and (3) low according to Cohen's Kappa (κ). These results demonstrate that when describing tracking, different methods should be considered since they provide distinct and more comprehensive views about physical fitness stability patterns.
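
    Two of the three tracking summaries used in the paper (inter-age auto-correlation and maintenance of rank tertiles summarized by Cohen's kappa) can be illustrated in a few lines; the data below are synthetic stand-ins for repeated fitness measurements, and scikit-learn's cohen_kappa_score is assumed to be available. The multilevel modelling step is not reproduced here.

      import numpy as np
      from sklearn.metrics import cohen_kappa_score

      rng = np.random.default_rng(2)

      # Hypothetical 1-mile run/walk times (min) for 100 adolescents at two annual assessments.
      time1 = rng.normal(9.0, 1.5, 100)
      time2 = 0.7 * time1 + rng.normal(2.7, 1.0, 100)    # partially stable trait

      # Tracking as an inter-age (auto)correlation.
      r = np.corrcoef(time1, time2)[0, 1]

      # Tracking as maintenance of tertile membership, summarized with Cohen's kappa.
      tertiles1 = np.digitize(time1, np.quantile(time1, [1/3, 2/3]))
      tertiles2 = np.digitize(time2, np.quantile(time2, [1/3, 2/3]))
      kappa = cohen_kappa_score(tertiles1, tertiles2)

      print(f"auto-correlation r = {r:.2f}, Cohen's kappa = {kappa:.2f}")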

  16. Fundamental properties of fracture and seismicity in a non extensive statistical physics framework.

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos

    2010-05-01

    A fundamental challenge in many scientific disciplines concerns upscaling, that is, determining the regularities and laws of evolution at some large scale from those known at a lower scale. Earthquake physics is no exception, with the challenge of understanding the transition from the laboratory scale to the scale of fault networks and large earthquakes. In this context, statistical physics has a remarkably successful record in addressing the upscaling problem in physics. It is natural then to consider that the physics of many earthquakes has to be studied with a different approach than the physics of one earthquake, and in this sense we can consider the use of statistical physics not only appropriate but necessary to understand the collective properties of earthquakes [see Corral 2004, 2005a,b,c]. A significant attempt is given in a series of works [Main 1996; Rundle et al., 1997; Main et al., 2000; Main and Al-Kindy, 2002; Rundle et al., 2003; Vallianatos and Triantis, 2008a] that uses classical statistical physics to describe seismicity. Then a natural question arises: what type of statistical physics is appropriate to commonly describe effects from the fracture level to the seismicity scale? The application of non-extensive statistical physics offers a consistent theoretical framework, based on a generalization of entropy, to analyze the behavior of natural systems with fractal or multi-fractal distribution of their elements. Such natural systems, where long-range interactions or intermittency are important, lead to power-law behavior. We note that this is consistent with a classical thermodynamic approach to natural systems that rapidly attain equilibrium, leading to exponential-law behavior. In the frame of the non-extensive statistical physics approach, the probability function p(X) is calculated using the maximum entropy formulation of Tsallis entropy, which involves the introduction of at least two constraints (Tsallis et al., 1998). The first one is the classical normalization of p(X). The second one is based on the definition of the expectation value, which has to be generalized to the "q-expectation value", according to the generalization of the entropy [Abe and Suzuki, 2003]. In order to calculate p(X) we apply the technique of Lagrange multipliers, maximizing an appropriate functional and leading to maximization of the Tsallis entropy under the constraints on the normalization and the q-expectation value. It is well known that the Gutenberg-Richter (G-R) power-law distribution has to be modified for large seismic moments because of energy conservation and geometrical reasons. Several models have been proposed, either in terms of a second power law with a larger b value beyond a crossover magnitude, or based on a magnitude cut-off using an exponential taper. In the present work we point out that the non-extensivity viewpoint is applicable to seismic processes. In the frame of a non-extensive approach based on Tsallis entropy we construct a generalized expression of the Gutenberg-Richter law (GGR) [Vallianatos, 2008]. The existence of a lower and/or upper bound to magnitude is discussed and the conditions under which GGR leads to the classical GR law are analysed. For the lowest earthquake sizes (i.e., energy levels) the correlations between the different elements involved in the evolution of an earthquake are short-ranged and GR can be deduced on the basis of the maximum entropy principle using BG statistics.
As the size (i.e., energy) increases, long-range correlations become much more important, implying the necessity of using Tsallis entropy as an appropriate generalization of BG entropy. The power-law behaviour is derived as a special case, leading to b-values that are functions of the non-extensivity parameter q. Furthermore, a theoretical analysis of the similarities presented by stress-stimulated electric and acoustic emissions and earthquakes is discussed, not only in the frame of GGR but also taking into account a universality in the description of the interevent-time distribution. Its particular form can be well expressed in the frame of a non-extensive approach. This formulation is very different from the exponential distribution expected for simple random Poisson processes and indicates the existence of a nontrivial universal mechanism in the generation process. All the aforementioned similarities between stress-stimulated electrical and acoustic emissions and seismicity suggest a connection with fracture phenomena at much larger scales, implying that a basic general mechanism is "actively hidden" behind all these phenomena [Vallianatos and Triantis, 2008b]. Examples from S. Aegean seismicity are given. Acknowledgements: This work is partially supported by the "NEXT EARTH" project FP7-PEOPLE, 2009-2011. References: Abe S. and Suzuki N., J. Geophys. Res. 108 (B2), 2113, 2003. Corral A., Phys. Rev. Lett. 92, 108501, 2004. Corral A., Nonlinear Proc. Geophys. 12, 89, 2005a. Corral A., Phys. Rev. E 71, 017101, 2005b. Corral A., Phys. Rev. Lett. 95, 028501, 2005c. Main I. G., Rev. Geophys., 34, 433, 1996. Main I. G., O'Brien G. and Henderson R., J. Geophys. Res., 105, 6105, 2000. Main I. G. and Al-Kindy F. H., Geophys. Res. Lett., 29, 7, 2002. Rundle J. B., Gross S., Klein W., Ferguson C. and Turcotte D., Tectonophysics, 277, 147-164, 1997. Rundle J. B., Turcotte D. L., Shcherbakov R., Klein W. and Sammis C., Rev. Geophys. 41, 1019, 2003. Tsallis C., J. Stat. Phys. 52, 479, 1988; see also http://tsallis.cat.cbpf.br/biblio.htm for an updated bibliography. Vallianatos, F., 2nd IASME/WSEAS International Conference on Geology and Seismology (GES08), Cambridge, U.K., 2008. Vallianatos F. and Triantis D., Physica A, 387, 4940-4946, 2008a.
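
    The maximum-entropy step described in this record can be restated compactly as follows (a standard summary of the Tsallis construction, not the authors' exact derivation; beta_q denotes the Lagrange multiplier associated with the q-expectation constraint):

      S_q = k \,\frac{1 - \int p(X)^q \, dX}{q - 1}, \qquad \int p(X)\,dX = 1, \qquad
      \langle X \rangle_q = \frac{\int X\, p(X)^q\, dX}{\int p(X)^q\, dX}

    Maximizing S_q under these two constraints yields the q-exponential

      p(X) \propto \left[1 - (1-q)\,\beta_q X\right]^{1/(1-q)} \equiv \exp_q(-\beta_q X),

    which reduces to the Boltzmann-Gibbs form p(X) \propto e^{-\beta X} as q -> 1 and gives the power-law (Gutenberg-Richter-like) tails discussed above for q > 1.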

  17. Finding Bounded Rational Equilibria. Part 2; Alternative Lagrangians and Uncountable Move Spaces

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2004-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality characterizing all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. It has recently been shown that the same information theoretic mathematical structure, known as Probability Collectives (PC), underlies both issues. This relationship between statistical physics and game theory allows techniques and insights from the one field to be applied to the other. In particular, PC provides a formal model-independent definition of the degree of rationality of a player and of bounded rationality equilibria. This pair of papers extends previous work on PC by introducing new computational approaches to effectively find bounded rationality equilibria of common-interest (team) games.

  18. Understanding amyloid aggregation by statistical analysis of atomic force microscopy images

    NASA Astrophysics Data System (ADS)

    Adamcik, Jozef; Jung, Jin-Mi; Flakowski, Jérôme; de Los Rios, Paolo; Dietler, Giovanni; Mezzenga, Raffaele

    2010-06-01

    The aggregation of proteins is central to many aspects of daily life, including food processing, blood coagulation, eye cataract formation disease and prion-related neurodegenerative infections. However, the physical mechanisms responsible for amyloidosis-the irreversible fibril formation of various proteins that is linked to disorders such as Alzheimer's, Creutzfeldt-Jakob and Huntington's diseases-have not yet been fully elucidated. Here, we show that different stages of amyloid aggregation can be examined by performing a statistical polymer physics analysis of single-molecule atomic force microscopy images of heat-denatured β-lactoglobulin fibrils. The atomic force microscopy analysis, supported by theoretical arguments, reveals that the fibrils have a multistranded helical shape with twisted ribbon-like structures. Our results also indicate a possible general model for amyloid fibril assembly and illustrate the potential of this approach for investigating fibrillar systems.

  19. Neuronal couplings between retinal ganglion cells inferred by efficient inverse statistical physics methods

    PubMed Central

    Cocco, Simona; Leibler, Stanislas; Monasson, Rémi

    2009-01-01

    Complexity of neural systems often makes impracticable explicit measurements of all interactions between their constituents. Inverse statistical physics approaches, which infer effective couplings between neurons from their spiking activity, have been so far hindered by their computational complexity. Here, we present 2 complementary, computationally efficient inverse algorithms based on the Ising and “leaky integrate-and-fire” models. We apply those algorithms to reanalyze multielectrode recordings in the salamander retina in darkness and under random visual stimulus. We find strong positive couplings between nearby ganglion cells common to both stimuli, whereas long-range couplings appear under random stimulus only. The uncertainty on the inferred couplings due to limitations in the recordings (duration, small area covered on the retina) is discussed. Our methods will allow real-time evaluation of couplings for large assemblies of neurons. PMID:19666487
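
    The paper's two efficient algorithms are not reproduced here; as a point of reference, the sketch below shows the textbook naive mean-field inversion, which estimates Ising couplings from the inverse correlation matrix of binarized activity. The spike data are synthetic and uncorrelated, so the inferred couplings should be close to zero.

      import numpy as np

      rng = np.random.default_rng(3)

      # Hypothetical binarized activity: 20 neurons in 5000 time bins (+1 spike, -1 silence).
      spikes = np.where(rng.random((5000, 20)) < 0.1, 1.0, -1.0)

      # Connected correlation matrix of the binary variables.
      C = np.cov(spikes, rowvar=False)

      # Naive mean-field inverse Ising estimate: couplings from the inverse correlation matrix.
      J = -np.linalg.inv(C)
      np.fill_diagonal(J, 0.0)          # self-couplings are not defined in the Ising model

      print("strongest inferred coupling:", np.abs(J).max())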

  20. Discrete approach to stochastic parametrization and dimension reduction in nonlinear dynamics.

    PubMed

    Chorin, Alexandre J; Lu, Fei

    2015-08-11

    Many physical systems are described by nonlinear differential equations that are too complicated to solve in full. A natural way to proceed is to divide the variables into those that are of direct interest and those that are not, formulate solvable approximate equations for the variables of greater interest, and use data and statistical methods to account for the impact of the other variables. In the present paper we consider time-dependent problems and introduce a fully discrete solution method, which simplifies both the analysis of the data and the numerical algorithms. The resulting time series are identified by a NARMAX (nonlinear autoregression moving average with exogenous input) representation familiar from engineering practice. The connections with the Mori-Zwanzig formalism of statistical physics are discussed, as well as an application to the Lorenz 96 system.
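
    A minimal sketch of the kind of discrete, NARMAX-style identification described above: lagged linear and quadratic terms are fit by least squares to a toy scalar series standing in for a resolved variable. The exogenous-input and moving-average parts of a full NARMAX fit are omitted, and the toy recursion is invented for the example.

      import numpy as np

      rng = np.random.default_rng(4)

      # Toy time series standing in for one resolved variable of a reduced model.
      n = 2000
      x = np.zeros(n)
      for t in range(2, n):
          x[t] = 0.8 * x[t-1] - 0.2 * x[t-2] + 0.05 * x[t-1]**2 + 0.1 * rng.normal()

      # NARX-style regression: predict x[t] from lagged linear and quadratic terms.
      lags = np.column_stack([x[1:-1], x[:-2], x[1:-1]**2, x[:-2]**2])
      target = x[2:]
      design = np.column_stack([np.ones(len(target)), lags])
      coef, *_ = np.linalg.lstsq(design, target, rcond=None)
      residual = target - design @ coef   # a moving-average term would model these residuals
      print("identified coefficients:", np.round(coef, 3))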

  1. Book Review:

    NASA Astrophysics Data System (ADS)

    Vespignani, A.

    2004-09-01

    Networks have been recently recognized as playing a central role in understanding a wide range of systems spanning diverse scientific domains such as physics and biology, economics, computer science and information technology. Specific examples run from the structure of the Internet and the World Wide Web to the interconnections of finance agents and ecological food webs. These networked systems are generally made by many components whose microscopic interactions give rise to global structures characterized by emergent collective behaviour and complex topological properties. In this context the statistical physics approach finds a natural application since it attempts to explain the various large-scale statistical properties of networks in terms of local interactions governing the dynamical evolution of the constituent elements of the system. It is not by chance then that many of the seminal papers in the field have been published in the physics literature, and have nevertheless made a considerable impact on other disciplines. Indeed, a truly interdisciplinary approach is required in order to understand each specific system of interest, leading to a very interesting cross-fertilization between different scientific areas defining the emergence of a new research field sometimes called network science. The book of Dorogovtsev and Mendes is the first comprehensive monograph on this new scientific field. It provides a thorough presentation of the forefront research activities in the area of complex networks, with an extensive sampling of the disciplines involved and the kinds of problems that form the subject of inquiry. The book starts with a short introduction to graphs and network theory that introduces the tools and mathematical background needed for the rest of the book. The following part is devoted to an extensive presentation of the empirical analysis of real-world networks. While for obvious reasons of space the authors cannot analyse in every detail all the various examples, they provide the reader with a general vista that makes clear the relevance of network science to a wide range of natural and man-made systems. Two chapters are then committed to the detailed exposition of the statistical physics approach to equilibrium and non-equilibrium networks. The authors are two leading players in the area of network theory and offer a very careful and complete presentation of the statistical physics theory of evolving networks. Finally, in the last two chapters, the authors focus on various consequences of network topology for dynamical and physical phenomena occurring in these kinds of structures. The book is completed by a very extensive bibliography and some useful appendices containing some technical points arising in the mathematical discussion and data analysis. The book's mathematical level is fairly advanced and allows a coherent and unified framework for the study of networked structure. The book is targeted at mathematicians, physicists and social scientists alike. It will be appreciated by everybody working in the network area, and especially by any researcher or student entering the field that would like to have a reference text on the latest developments in network science.

  2. Symplectic approach to calculation of magnetic field line trajectories in physical space with realistic magnetic geometry in divertor tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Punjabi, Alkesh; Ali, Halima

    A new approach to integration of magnetic field lines in divertor tokamaks is proposed. In this approach, an analytic equilibrium generating function (EGF) is constructed in natural canonical coordinates (ψ,θ) from experimental data from a Grad-Shafranov equilibrium solver for a tokamak. ψ is the toroidal magnetic flux and θ is the poloidal angle. Natural canonical coordinates (ψ,θ,φ) can be transformed to physical position (R,Z,φ) using a canonical transformation. (R,Z,φ) are cylindrical coordinates. Another canonical transformation is used to construct a symplectic map for integration of magnetic field lines. Trajectories of field lines calculated from this symplectic map in natural canonical coordinates can be transformed to trajectories in real physical space. Unlike in magnetic coordinates [O. Kerwin, A. Punjabi, and H. Ali, Phys. Plasmas 15, 072504 (2008)], the symplectic map in natural canonical coordinates can integrate trajectories across the separatrix surface, and at the same time, give trajectories in physical space. Unlike symplectic maps in physical coordinates (x,y) or (R,Z), the continuous analog of a symplectic map in natural canonical coordinates does not distort trajectories in toroidal planes intervening the discrete map. This approach is applied to the DIII-D tokamak [J. L. Luxon and L. E. Davis, Fusion Technol. 8, 441 (1985)]. The EGF for the DIII-D gives quite an accurate representation of equilibrium magnetic surfaces close to the separatrix surface. This new approach is applied to demonstrate the sensitivity of stochastic broadening using a set of perturbations that generically approximate the size of the field errors and statistical topological noise expected in a poloidally diverted tokamak. Plans for future application of this approach are discussed.

  3. Symplectic approach to calculation of magnetic field line trajectories in physical space with realistic magnetic geometry in divertor tokamaks

    NASA Astrophysics Data System (ADS)

    Punjabi, Alkesh; Ali, Halima

    2008-12-01

    A new approach to integration of magnetic field lines in divertor tokamaks is proposed. In this approach, an analytic equilibrium generating function (EGF) is constructed in natural canonical coordinates (ψ,θ) from experimental data from a Grad-Shafranov equilibrium solver for a tokamak. ψ is the toroidal magnetic flux and θ is the poloidal angle. Natural canonical coordinates (ψ,θ,φ) can be transformed to physical position (R,Z,φ) using a canonical transformation. (R,Z,φ) are cylindrical coordinates. Another canonical transformation is used to construct a symplectic map for integration of magnetic field lines. Trajectories of field lines calculated from this symplectic map in natural canonical coordinates can be transformed to trajectories in real physical space. Unlike in magnetic coordinates [O. Kerwin, A. Punjabi, and H. Ali, Phys. Plasmas 15, 072504 (2008)], the symplectic map in natural canonical coordinates can integrate trajectories across the separatrix surface, and at the same time, give trajectories in physical space. Unlike symplectic maps in physical coordinates (x,y) or (R,Z), the continuous analog of a symplectic map in natural canonical coordinates does not distort trajectories in toroidal planes intervening the discrete map. This approach is applied to the DIII-D tokamak [J. L. Luxon and L. E. Davis, Fusion Technol. 8, 441 (1985)]. The EGF for the DIII-D gives quite an accurate representation of equilibrium magnetic surfaces close to the separatrix surface. This new approach is applied to demonstrate the sensitivity of stochastic broadening using a set of perturbations that generically approximate the size of the field errors and statistical topological noise expected in a poloidally diverted tokamak. Plans for future application of this approach are discussed.
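
    The EGF-based map of the paper is not reconstructed here; as a generic illustration of symplectic (area-preserving) field-line integration in canonical coordinates, the sketch below iterates the Chirikov standard map, with the kick strength K standing in loosely for the size of field errors or noise.

      import numpy as np

      def standard_map(theta, p, K, n_steps):
          """Iterate the area-preserving (symplectic) Chirikov standard map."""
          traj = np.empty((n_steps, 2))
          for i in range(n_steps):
              p = p + K * np.sin(theta)               # "kick" from the perturbation
              theta = (theta + p) % (2.0 * np.pi)     # free "drift"
              traj[i] = theta, p
          return traj

      # Small K keeps most invariant surfaces intact; larger K broadens the stochastic layer.
      trajectory = standard_map(theta=0.5, p=0.3, K=0.9, n_steps=1000)
      print(trajectory[:3])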

  4. Dependency properties of the amorphous alloy Co58Ni10Fe5Si11B16 on technological parameters of spinning

    NASA Astrophysics Data System (ADS)

    Frolov, A. M.; Tkachev, V. V.; Fedorets, A. N.; Pustovalov, E. V.; Kraynova, G. S.; Dolzhikov, S. V.; Ilin, N. V.; Tsesarskaya, A. K.

    2017-09-01

    Tapes of the amorphous alloy are rapidly quenched onto a rotating drum. Their structure and their mechanical and physical properties are studied as functions of the spinning parameters. An approach is proposed for classifying the obtained bands based on the statistics of the microrelief of their surfaces.

  5. A Pedagogical Approach to the Boltzmann Factor through Experiments and Simulations

    ERIC Educational Resources Information Center

    Battaglia, O. R.; Bonura, A.; Sperandeo-Mineo, R. M.

    2009-01-01

    The Boltzmann factor is the basis of a huge amount of thermodynamic and statistical physics, both classical and quantum. It governs the behaviour of all systems in nature that are exchanging energy with their environment. To understand why the expression has this specific form involves a deep mathematical analysis, whose flow of logic is hard to…
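
    A complementary simulation in the spirit of the article: a short Metropolis Monte Carlo run on a two-level system, showing the occupation ratio approaching the Boltzmann factor exp(-dE/kT). The energy gap and temperature are arbitrary illustrative values (k_B = 1).

      import numpy as np

      rng = np.random.default_rng(5)

      # Two-level system: energies 0 and dE, in contact with a bath at temperature T (k_B = 1).
      dE, T = 1.0, 0.7
      state, counts = 0, np.zeros(2)

      for _ in range(200_000):
          proposal = 1 - state
          delta = (proposal - state) * dE
          # Metropolis rule: accept with probability min(1, exp(-delta/T)).
          if delta <= 0 or rng.random() < np.exp(-delta / T):
              state = proposal
          counts[state] += 1

      print("sampled ratio N1/N0 =", counts[1] / counts[0])
      print("Boltzmann factor exp(-dE/T) =", np.exp(-dE / T))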

  6. Algorithmic complexity of real financial markets

    NASA Astrophysics Data System (ADS)

    Mansilla, R.

    2001-12-01

    A new approach to understanding the complex behavior of financial market indexes, using tools from thermodynamics and statistical physics, is developed. Physical complexity, a quantity rooted in Kolmogorov-Chaitin theory, is applied to binary sequences built up from real time series of financial market indexes. The study is based on NASDAQ and Mexican IPC data. Different behaviors of this quantity are shown for intervals of the series preceding crashes and for intervals in which no financial turbulence is observed. The connection between our results and the efficient market hypothesis is discussed.

  7. Objective determination of image end-members in spectral mixture analysis of AVIRIS data

    NASA Technical Reports Server (NTRS)

    Tompkins, Stefanie; Mustard, John F.; Pieters, Carle M.; Forsyth, Donald W.

    1993-01-01

    Spectral mixture analysis has been shown to be a powerful, multifaceted tool for analysis of multi- and hyper-spectral data. Applications of AVIRIS data have ranged from mapping soils and bedrock to ecosystem studies. During the first phase of the approach, a set of end-members are selected from an image cube (image end-members) that best account for its spectral variance within a constrained, linear least squares mixing model. These image end-members are usually selected using a priori knowledge and successive trial and error solutions to refine the total number and physical location of the end-members. However, in many situations a more objective method of determining these essential components is desired. We approach the problem of image end-member determination objectively by using the inherent variance of the data. Unlike purely statistical methods such as factor analysis, this approach derives solutions that conform to a physically realistic model.
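
    Once image end-members have been determined, the unmixing step itself is a constrained linear least-squares problem; the sketch below solves it with non-negative least squares and a soft sum-to-one constraint. The end-member spectra and the mixed pixel are synthetic, and the constraint weight w is an assumption of the example.

      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(6)

      # Hypothetical end-member spectra (3 end-members, 50 bands) and a mixed pixel.
      E = np.abs(rng.normal(0.4, 0.1, (50, 3)))
      true_frac = np.array([0.6, 0.3, 0.1])
      pixel = E @ true_frac + rng.normal(0.0, 0.005, 50)

      # Non-negative least squares with a soft sum-to-one constraint (appended weighted row).
      w = 10.0
      A = np.vstack([E, w * np.ones((1, 3))])
      b = np.append(pixel, w * 1.0)
      fractions, rnorm = nnls(A, b)
      print("estimated abundances:", np.round(fractions, 3))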

  8. The applications of statistical quantification techniques in nanomechanics and nanoelectronics.

    PubMed

    Mai, Wenjie; Deng, Xinwei

    2010-10-08

    Although nanoscience and nanotechnology have been developing for approximately two decades and have achieved numerous breakthroughs, the experimental results from nanomaterials, with a higher noise level and poorer repeatability than those from bulk materials, still remain a practical issue and challenge many techniques for the quantification of nanomaterials. This work proposes a physical-statistical modeling approach and a global fitting statistical method that use all the available discrete data or quasi-continuous curves to quantify a few targeted physical parameters, which can provide more accurate, efficient and reliable parameter estimates, and give reasonable physical explanations. In the resonance method for measuring the elastic modulus of ZnO nanowires (Zhou et al 2006 Solid State Commun. 139 222-6), our statistical technique gives E = 128.33 GPa instead of the original E = 108 GPa, and unveils a negative bias adjustment f(0). The causes are suggested to be systematic bias in measuring the length of the nanowires. In the electronic measurement of the resistivity of a Mo nanowire (Zach et al 2000 Science 290 2120-3), the proposed new method automatically identified the importance of accounting for the Ohmic contact resistance in the model of the Ohmic behavior in nanoelectronics experiments. The 95% confidence interval of the resistivity in the proposed one-step procedure is determined to be (3.57 ± 0.0274) × 10^-5 ohm cm, which should be a more reliable and precise estimate. The statistical quantification technique should find wide applications in obtaining better estimates in the presence of the various systematic errors and bias effects that become more significant at the nanoscale.

  9. Brief communication: Skeletal biology past and present: Are we moving in the right direction?

    PubMed

    Hens, Samantha M; Godde, Kanya

    2008-10-01

    In 1982, Spencer's edited volume A History of American Physical Anthropology: 1930-1980 allowed numerous authors to document the state of our science, including a critical examination of skeletal biology. Some authors argued that the first 50 years of skeletal biology were characterized by the descriptive-historical approach with little regard for processual problems and that technological and statistical analyses were not rooted in theory. In an effort to determine whether Spencer's landmark volume impacted the field of skeletal biology, a content analysis was carried out for the American Journal of Physical Anthropology from 1980 to 2004. The percentage of skeletal biology articles is similar to that of previous decades. Analytical articles averaged only 32% and are defined by three criteria: statistical analysis, hypothesis testing, and broader explanatory context. However, when these criteria were scored individually, nearly 80% of papers attempted a broader theoretical explanation, 44% tested hypotheses, and 67% used advanced statistics, suggesting that the skeletal biology papers in the journal have an analytical emphasis. Considerable fluctuation exists between subfields; trends toward a more analytical approach are witnessed in the subfields of age/sex/stature/demography, skeletal maturation, anatomy, and nonhuman primate studies, which also increased in frequency, while paleontology and pathology were largely descriptive. Comparisons to the International Journal of Osteoarchaeology indicate that there are statistically significant differences between the two journals in terms of analytical criteria. These data indicate a positive shift in theoretical thinking, i.e., an attempt by most to explain processes rather than present a simple description of events.

  10. An investigation into the effectiveness of problem-based learning in a physical chemistry laboratory course

    NASA Astrophysics Data System (ADS)

    Gürses, Ahmet; Açıkyıldız, Metin; Doğar, Çetin; Sözbilir, Mustafa

    2007-04-01

    The aim of this study was to investigate the effectiveness of a problem-based learning (PBL) approach in a physical chemistry laboratory course. The parameters investigated were students’ attitudes towards a chemistry laboratory course, scientific process skills of students and their academic achievement. The design of the study was one group pre-test post-test. Four experiments, covering the topics adsorption, viscosity, surface tension and conductivity were performed using a PBL approach in the fall semester of the 2003/04 academic year at Kazim Karabekir Education Faculty of Atatürk University. Each experiment was done over a three week period. A total of 40 students, 18 male and 22 female, participated in the study. Students took the Physical Chemistry Laboratory Concept Test (PCLCT), Attitudes towards Chemistry Laboratory (ATCL) questionnaire and Science Process Skills Test (SPST) as pre and post-tests. In addition, the effectiveness of the PBL approach was also determined through four different scales; Scales Specific to Students’ Views of PBL. A statistically significant difference between the students’ academic achievement and scientific process skills at p

  11. Gridded Calibration of Ensemble Wind Vector Forecasts Using Ensemble Model Output Statistics

    NASA Astrophysics Data System (ADS)

    Lazarus, S. M.; Holman, B. P.; Splitt, M. E.

    2017-12-01

    A computationally efficient method is developed that performs gridded post processing of ensemble wind vector forecasts. An expansive set of idealized WRF model simulations are generated to provide physically consistent high resolution winds over a coastal domain characterized by an intricate land / water mask. Ensemble model output statistics (EMOS) is used to calibrate the ensemble wind vector forecasts at observation locations. The local EMOS predictive parameters (mean and variance) are then spread throughout the grid utilizing flow-dependent statistical relationships extracted from the downscaled WRF winds. Using data withdrawal and 28 east central Florida stations, the method is applied to one year of 24 h wind forecasts from the Global Ensemble Forecast System (GEFS). Compared to the raw GEFS, the approach improves both the deterministic and probabilistic forecast skill. Analysis of multivariate rank histograms indicate the post processed forecasts are calibrated. Two downscaling case studies are presented, a quiescent easterly flow event and a frontal passage. Strengths and weaknesses of the approach are presented and discussed.
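
    The sketch below shows EMOS in its simplest scalar form: a Gaussian predictive distribution whose mean is affine in the ensemble mean and whose variance is affine in the ensemble variance, fit by maximum likelihood. The paper's gridded, wind-vector implementation (and any CRPS-based fitting) is more involved; the data here are synthetic.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      rng = np.random.default_rng(7)

      # Synthetic training set: 500 forecast cases, 20-member ensemble of one wind component (m/s).
      truth = rng.normal(5.0, 3.0, 500)
      ens = truth[:, None] + rng.normal(1.0, 2.0, (500, 20))     # biased, mis-dispersive ensemble
      ens_mean, ens_var = ens.mean(axis=1), ens.var(axis=1)

      def neg_log_lik(params):
          a, b, c, d = params
          mu = a + b * ens_mean
          sigma2 = np.maximum(c + d * ens_var, 1e-6)
          return -norm.logpdf(truth, loc=mu, scale=np.sqrt(sigma2)).sum()

      fit = minimize(neg_log_lik, x0=[0.0, 1.0, 1.0, 1.0], method="Nelder-Mead")
      print("EMOS coefficients (a, b, c, d):", np.round(fit.x, 3))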

  12. Statistical and engineering methods for model enhancement

    NASA Astrophysics Data System (ADS)

    Chang, Chia-Jung

    Models which describe the performance of physical process are essential for quality prediction, experimental planning, process control and optimization. Engineering models developed based on the underlying physics/mechanics of the process such as analytic models or finite element models are widely used to capture the deterministic trend of the process. However, there usually exists stochastic randomness in the system which may introduce the discrepancy between physics-based model predictions and observations in reality. Alternatively, statistical models can be used to develop models to obtain predictions purely based on the data generated from the process. However, such models tend to perform poorly when predictions are made away from the observed data points. This dissertation contributes to model enhancement research by integrating physics-based model and statistical model to mitigate the individual drawbacks and provide models with better accuracy by combining the strengths of both models. The proposed model enhancement methodologies including the following two streams: (1) data-driven enhancement approach and (2) engineering-driven enhancement approach. Through these efforts, more adequate models are obtained, which leads to better performance in system forecasting, process monitoring and decision optimization. Among different data-driven enhancement approaches, Gaussian Process (GP) model provides a powerful methodology for calibrating a physical model in the presence of model uncertainties. However, if the data contain systematic experimental errors, the GP model can lead to an unnecessarily complex adjustment of the physical model. In Chapter 2, we proposed a novel enhancement procedure, named as “Minimal Adjustment”, which brings the physical model closer to the data by making minimal changes to it. This is achieved by approximating the GP model by a linear regression model and then applying a simultaneous variable selection of the model and experimental bias terms. Two real examples and simulations are presented to demonstrate the advantages of the proposed approach. Different from enhancing the model based on data-driven perspective, an alternative approach is to focus on adjusting the model by incorporating the additional domain or engineering knowledge when available. This often leads to models that are very simple and easy to interpret. The concepts of engineering-driven enhancement are carried out through two applications to demonstrate the proposed methodologies. In the first application where polymer composite quality is focused, nanoparticle dispersion has been identified as a crucial factor affecting the mechanical properties. Transmission Electron Microscopy (TEM) images are commonly used to represent nanoparticle dispersion without further quantifications on its characteristics. In Chapter 3, we developed the engineering-driven nonhomogeneous Poisson random field modeling strategy to characterize nanoparticle dispersion status of nanocomposite polymer, which quantitatively represents the nanomaterial quality presented through image data. The model parameters are estimated through the Bayesian MCMC technique to overcome the challenge of limited amount of accessible data due to the time consuming sampling schemes. The second application is to calibrate the engineering-driven force models of laser-assisted micro milling (LAMM) process statistically, which facilitates a systematic understanding and optimization of targeted processes. 
In Chapter 4, the force prediction interval has been derived by incorporating the variability in the runout parameters as well as the variability in the measured cutting forces. The experimental results indicate that the model predicts the cutting force profile with good accuracy using a 95% confidence interval. To conclude, this dissertation is the research drawing attention to model enhancement, which has considerable impacts on modeling, design, and optimization of various processes and systems. The fundamental methodologies of model enhancement are developed and further applied to various applications. These research activities developed engineering compliant models for adequate system predictions based on observational data with complex variable relationships and uncertainty, which facilitate process planning, monitoring, and real-time control.
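
    One recurring theme of the dissertation, combining a physics-based model with a statistical adjustment, can be illustrated with a Gaussian-process discrepancy model: fit a GP to the residuals between a simple physics model and observations, then add the fitted discrepancy back as a data-driven correction. The physics model, data, and kernel choices below are invented for the example and rely on scikit-learn.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(8)

      # Physics-based model (deliberately imperfect) and noisy observations of the true process.
      def physics_model(x):
          return 2.0 * x

      x_obs = np.linspace(0.0, 5.0, 30)[:, None]
      y_obs = 2.0 * x_obs.ravel() + 0.5 * np.sin(2.0 * x_obs.ravel()) + rng.normal(0, 0.05, 30)

      # Data-driven enhancement: model the physics-model discrepancy with a Gaussian process.
      residuals = y_obs - physics_model(x_obs.ravel())
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(1e-2))
      gp.fit(x_obs, residuals)

      x_new = np.array([[2.2], [4.7]])
      enhanced = physics_model(x_new.ravel()) + gp.predict(x_new)
      print("enhanced predictions:", np.round(enhanced, 3))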

  13. Middle school science curriculum design and 8th grade student achievement in Massachusetts public schools

    NASA Astrophysics Data System (ADS)

    Clifford, Betsey A.

    The Massachusetts Department of Elementary and Secondary Education (DESE) released proposed Science and Technology/Engineering standards in 2013 outlining the concepts that should be taught at each grade level. Previously, standards were in grade spans and each district determined the method of implementation. There are two different methods used to teach middle school science: integrated and discipline-based. In the proposed standards, the Massachusetts DESE uses grade-by-grade standards with an integrated approach. It was not known whether there is a statistically significant difference in student achievement on the 8th grade science MCAS assessment for students taught with an integrated or discipline-based approach. The results on the 8th grade science MCAS test from six public school districts from 2010 to 2013 were collected and analyzed. The methodology used was quantitative. Results of an ANOVA showed that there was no statistically significant difference in overall student achievement between the two curriculum models. Furthermore, there was no statistically significant difference for the various domains: Earth and Space Science, Life Science, Physical Science, and Technology/Engineering. This information is useful for districts hesitant to make the change from a discipline-based approach to an integrated approach. More research should be conducted on this topic with a larger sample size to better support the results.

  14. Simplified Approach to Predicting Rough Surface Transition

    NASA Technical Reports Server (NTRS)

    Boyle, Robert J.; Stripf, Matthias

    2009-01-01

    Turbine vane heat transfer predictions are given for smooth and rough vanes where the experimental data show transition moving forward on the vane as the surface roughness physical height increases. Consistent with smooth vane heat transfer, the transition moves forward for a fixed roughness height as the Reynolds number increases. Comparisons are presented with published experimental data. Some of the data are for a regular roughness geometry with a range of roughness heights, Reynolds numbers, and inlet turbulence intensities. The approach taken in this analysis is to treat the roughness in a statistical sense, consistent with what would be obtained from blades measured after exposure to actual engine environments. An approach is given to determine the equivalent sand grain roughness from the statistics of the regular geometry. This approach is guided by the experimental data. A roughness transition criterion is developed, and comparisons are made with experimental data over the entire range of experimental test conditions. Additional comparisons are made with experimental heat transfer data, where the roughness geometries are both regular as well as statistical. Using the developed analysis, heat transfer calculations are presented for the second stage vane of a high pressure turbine at hypothetical engine conditions.

  15. Statistical physics of crime: a review.

    PubMed

    D'Orsogna, Maria R; Perc, Matjaž

    2015-03-01

    Containing the spread of crime in urban societies remains a major challenge. Empirical evidence suggests that, if left unchecked, crimes may be recurrent and proliferate. On the other hand, eradicating a culture of crime may be difficult, especially under extreme social circumstances that impair the creation of a shared sense of social responsibility. Although our understanding of the mechanisms that drive the emergence and diffusion of crime is still incomplete, recent research highlights applied mathematics and methods of statistical physics as valuable theoretical resources that may help us better understand criminal activity. We review different approaches aimed at modeling and improving our understanding of crime, focusing on the nucleation of crime hotspots using partial differential equations, self-exciting point process and agent-based modeling, adversarial evolutionary games, and the network science behind the formation of gangs and large-scale organized crime. We emphasize that statistical physics of crime can relevantly inform the design of successful crime prevention strategies, as well as improve the accuracy of expectations about how different policing interventions should impact malicious human activity that deviates from social norms. We also outline possible directions for future research, related to the effects of social and coevolving networks and to the hierarchical growth of criminal structures due to self-organization. Copyright © 2014 Elsevier B.V. All rights reserved.
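
    Of the modeling approaches listed in the review, the self-exciting point process is the easiest to illustrate briefly; the sketch below evaluates the conditional intensity of a univariate Hawkes process with an exponential kernel. The event times and parameters are hypothetical.

      import numpy as np

      def hawkes_intensity(t, event_times, mu, alpha, beta):
          """Conditional intensity lambda(t) = mu + sum_i alpha * exp(-beta * (t - t_i))."""
          past = event_times[event_times < t]
          return mu + alpha * np.exp(-beta * (t - past)).sum()

      # Hypothetical burglary times (days) in one neighbourhood.
      events = np.array([1.0, 1.5, 1.7, 6.0, 6.2])
      for t in (2.0, 5.0, 7.0):
          print(f"lambda({t}) = {hawkes_intensity(t, events, mu=0.2, alpha=0.8, beta=1.5):.3f}")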

  16. A Hybrid Physics-Based Data-Driven Approach for Point-Particle Force Modeling

    NASA Astrophysics Data System (ADS)

    Moore, Chandler; Akiki, Georges; Balachandar, S.

    2017-11-01

    This study improves upon the physics-based pairwise interaction extended point-particle (PIEP) model. The PIEP model leverages a physical framework to predict fluid-mediated interactions between solid particles. While the PIEP model is a powerful tool, its pairwise assumption leads to increased error in flows with high particle volume fractions. To reduce this error, a regression algorithm is used to model the differences between the current PIEP model's predictions and the results of direct numerical simulations (DNS) for an array of monodisperse solid particles subjected to various flow conditions. The resulting statistical model and the physical PIEP model are superimposed to construct a hybrid, physics-based data-driven PIEP model. It must be noted that the performance of a pure data-driven approach without the model-form provided by the physical PIEP model is substantially inferior. The hybrid model's predictive capabilities are analyzed using more DNS. In every case tested, the hybrid PIEP model's predictions are more accurate than those of the physical PIEP model. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship Program under Grant No. DGE-1315138 and the U.S. DOE, NNSA, ASC Program, as a Cooperative Agreement under Contract No. DE-NA0002378.
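
    A schematic of the hybrid construction described above: a physics-based force prediction plus a regression model trained on the residuals between "DNS" and the physics-based model. The force functions, features, and data below are stand-ins invented for the example; they are not the actual PIEP closure or DNS results.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(9)

      # Stand-in "physics-based" force model and "DNS" ground truth (both hypothetical).
      def physics_force(features):
          reynolds, volume_fraction = features[:, 0], features[:, 1]
          return 0.3 * reynolds * (1.0 - volume_fraction)

      features = np.column_stack([rng.uniform(10, 100, 500), rng.uniform(0.05, 0.4, 500)])
      dns_force = physics_force(features) * (1.0 + 0.5 * features[:, 1]) + rng.normal(0, 0.2, 500)

      # Data-driven correction: learn the residual between DNS and the physics-based model.
      residual_model = RandomForestRegressor(n_estimators=200, random_state=0)
      residual_model.fit(features, dns_force - physics_force(features))

      # Hybrid prediction = physics-based prediction + learned correction.
      hybrid = physics_force(features) + residual_model.predict(features)
      print("RMS error, physics only:", np.sqrt(np.mean((physics_force(features) - dns_force) ** 2)))
      print("RMS error, hybrid      :", np.sqrt(np.mean((hybrid - dns_force) ** 2)))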

  17. Equilibrium statistical-thermal models in high-energy physics

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel Nasser

    2014-05-01

    We review some recent highlights from the applications of statistical-thermal models to different experimental measurements and lattice QCD thermodynamics that have been made during the last decade. We start with a short review of the historical milestones on the path of constructing statistical-thermal models for heavy-ion physics. We discovered that Heinz Koppe formulated, in 1948, an almost complete recipe for the statistical-thermal models. In 1950, Enrico Fermi generalized this statistical approach, in which he started with a general cross-section formula and inserted into it the simplifying assumptions about the matrix element of the interaction process that likely reflect many features of the high-energy reactions dominated by density in the phase space of final states. In 1964, Hagedorn systematically analyzed the high-energy phenomena using all tools of statistical physics and introduced the concept of limiting temperature based on the statistical bootstrap model. It turns out that many-particle systems can quite often be studied with the help of statistical-thermal methods. The analysis of yield multiplicities in high-energy collisions gives overwhelming evidence for chemical equilibrium in the final state. The strange particles might be an exception, as they are suppressed at lower beam energies; however, their relative yields fulfill statistical equilibrium as well. We review the equilibrium statistical-thermal models for particle production, fluctuations and collective flow in heavy-ion experiments. We also review their reproduction of the lattice QCD thermodynamics at vanishing and finite chemical potential. During the last decade, five conditions have been suggested to describe the universal behavior of the chemical freeze-out parameters. The higher order moments of multiplicity have been discussed. They offer deep insights into particle production and into critical fluctuations. Therefore, we use them to describe the freeze-out parameters and suggest the location of the QCD critical endpoint. Various extensions have been proposed in order to take into consideration possible deviations from the ideal hadron gas. We highlight various types of interactions, dissipative properties and location dependences (spatial rapidity). Furthermore, we review three models combining hadronic with partonic phases: the quasi-particle model, the linear sigma model with Polyakov potentials and the compressible bag model.

  18. Trends and associated uncertainty in the global mean temperature record

    NASA Astrophysics Data System (ADS)

    Poppick, A. N.; Moyer, E. J.; Stein, M.

    2016-12-01

    Physical models suggest that the Earth's mean temperature warms in response to changing CO2 concentrations (and hence increased radiative forcing); given physical uncertainties in this relationship, the historical temperature record is a source of empirical information about global warming. A persistent thread in many analyses of the historical temperature record, however, is the reliance on methods that appear to deemphasize both physical and statistical assumptions. Examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for natural variability in nonparametric rather than parametric ways. We show here that methods that deemphasize assumptions can limit the scope of analysis and can lead to misleading inferences, particularly in the setting considered where the data record is relatively short and the scale of temporal correlation is relatively long. A proposed model that is simple but physically informed provides a more reliable estimate of trends and allows a broader array of questions to be addressed. In accounting for uncertainty, we also illustrate how parametric statistical models that are attuned to the important characteristics of natural variability can be more reliable than ostensibly more flexible approaches.
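
    A minimal sketch of the kind of simple, physically informed parametric model advocated here is a regression of temperature anomalies on radiative forcing with autocorrelated (AR(1)) errors. The forcing series, coefficients, and data below are synthetic placeholders, not the authors' data or code.

```python
# Sketch: trend estimation with radiative forcing as covariate and AR(1) errors,
# using statsmodels' GLSAR. All arrays below are synthetic placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_years = 150
forcing = np.linspace(0.0, 2.5, n_years)           # stand-in radiative forcing (W/m^2)
noise = np.zeros(n_years)
for t in range(1, n_years):                         # AR(1) natural variability
    noise[t] = 0.6 * noise[t - 1] + rng.normal(scale=0.1)
temperature = 0.5 * forcing + noise                 # synthetic "observed" anomalies

X = sm.add_constant(forcing)
model = sm.GLSAR(temperature, X, rho=1)             # AR(1) error structure
result = model.iterative_fit(maxiter=10)
print(result.params)   # intercept and sensitivity to forcing (K per W/m^2)
print(result.bse)      # standard errors that account for the autocorrelation
```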

  19. Resistive switching phenomena: A review of statistical physics approaches

    DOE PAGES

    Lee, Jae Sung; Lee, Shinbuhm; Noh, Tae Won

    2015-08-31

    Here we report that resistive switching (RS) phenomena are reversible changes in the metastable resistance state induced by external electric fields. After discovery ~50 years ago, RS phenomena have attracted great attention due to their potential application in next-generation electrical devices. Considerable research has been performed to understand the physical mechanisms of RS and explore the feasibility and limits of such devices. There have also been several reviews on RS that attempt to explain the microscopic origins of how regions that were originally insulators can change into conductors. However, little attention has been paid to the most important factor in determining resistance: how conducting local regions are interconnected. Here, we provide an overview of the underlying physics behind connectivity changes in highly conductive regions under an electric field. We first classify RS phenomena according to their characteristic current–voltage curves: unipolar, bipolar, and threshold switchings. Second, we outline the microscopic origins of RS in oxides, focusing on the roles of oxygen vacancies: the effect of concentration, the mechanisms of channel formation and rupture, and the driving forces of oxygen vacancies. Third, we review RS studies from the perspective of statistical physics to understand connectivity change in RS phenomena. We discuss percolation model approaches and the theory for the scaling behaviors of numerous transport properties observed in RS. Fourth, we review various switching-type conversion phenomena in RS: bipolar-unipolar, memory-threshold, figure-of-eight, and counter-figure-of-eight conversions. Finally, we review several related technological issues, such as improvement in high resistance fluctuations, sneak-path problems, and multilevel switching problems.
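
    The percolation viewpoint mentioned above can be illustrated with a toy site-percolation check: a device is regarded as switched once conducting sites form a cluster spanning the electrodes. The sketch below only illustrates the connectivity idea; it is not a quantitative model from the review.

```python
# Toy site-percolation check: fill a 2D lattice with conducting sites at
# probability p and test whether a conducting cluster spans top to bottom.
# This illustrates the connectivity idea only, not a quantitative RS model.
import numpy as np
from scipy.ndimage import label

def spans(p, size=100, seed=0):
    rng = np.random.default_rng(seed)
    grid = rng.random((size, size)) < p            # conducting regions
    labels, _ = label(grid)                        # 4-connected clusters
    top = set(labels[0][labels[0] > 0])
    bottom = set(labels[-1][labels[-1] > 0])
    return bool(top & bottom)                      # same cluster touches both edges

for p in (0.4, 0.55, 0.7):
    print(p, spans(p))   # spanning becomes likely above the 2D site threshold (~0.593)
```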

  20. Thermalization and prethermalization in isolated quantum systems: a theoretical overview

    NASA Astrophysics Data System (ADS)

    Mori, Takashi; Ikeda, Tatsuhiko N.; Kaminishi, Eriko; Ueda, Masahito

    2018-06-01

    The approach to thermal equilibrium, or thermalization, in isolated quantum systems is among the most fundamental problems in statistical physics. Recent theoretical studies have revealed that thermalization in isolated quantum systems has several remarkable features, which emerge from quantum entanglement and are quite distinct from those in classical systems. Experimentally, well isolated and highly controllable ultracold quantum gases offer an ideal testbed to study the nonequilibrium dynamics in isolated quantum systems, promoting intensive recent theoretical endeavors on this fundamental subject. Besides thermalization, many isolated quantum systems show intriguing behavior in relaxation processes, especially prethermalization. Prethermalization occurs when there is a clear separation of relevant time scales and has several different physical origins depending on individual systems. In this review, we overview theoretical approaches to the problems of thermalization and prethermalization.

  1. Sculpting bespoke mountains: Determining free energies with basis expansions

    NASA Astrophysics Data System (ADS)

    Whitmer, Jonathan K.; Fluitt, Aaron M.; Antony, Lucas; Qin, Jian; McGovern, Michael; de Pablo, Juan J.

    2015-07-01

    The intriguing behavior of a wide variety of physical systems, ranging from amorphous solids or glasses to proteins, is a direct manifestation of underlying free energy landscapes riddled with local minima separated by large barriers. Exploring such landscapes has arguably become one of the great challenges of statistical physics. A new method is proposed here for uniform sampling of rugged free energy surfaces. The method, which relies on special Green's functions to approximate the Dirac delta function, improves significantly on existing simulation techniques by providing a boundary-agnostic approach that is capable of mapping complex features in multidimensional free energy surfaces. The usefulness of the proposed approach is established in the context of a simple model glass former and model proteins, demonstrating improved convergence and accuracy over existing methods.

  2. Fluctuations in energy loss and their implications for dosimetry and radiobiology

    NASA Technical Reports Server (NTRS)

    Baily, N. A.; Steigerwalt, J. E.

    1972-01-01

    Serious consideration of the physics of energy deposition indicates that a fundamental change in the interpretation of absorbed dose is required at least for considerations of effects in biological systems. In addition, theoretical approaches to radiobiology and microdosimetry seem to require statistical considerations incorporating frequency distributions of the magnitude of the event sizes within the volume of interest.

  3. Statistical physics of vaccination

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei

    2016-12-01

    Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination-one of the most important preventive measures of modern times-is of great interest both theoretically and empirically. And in contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.
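
    As a minimal illustration of the coupled behaviour-disease systems discussed in this review, consider a mean-field SIR model in which the vaccination uptake rate responds to current prevalence. The functional form of the feedback and all parameter values below are illustrative assumptions, not a model taken from the paper.

```python
# Mean-field SIR model with prevalence-dependent vaccination uptake.
# The feedback form (uptake proportional to perceived risk) is illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma = 0.3, 0.1        # transmission and recovery rates (per day)
kappa = 0.5                   # sensitivity of vaccination uptake to prevalence

def rhs(t, y):
    s, i, r, v = y
    uptake = kappa * i * s    # vaccination rate grows with perceived risk
    ds = -beta * s * i - uptake
    di = beta * s * i - gamma * i
    dr = gamma * i
    dv = uptake
    return [ds, di, dr, dv]

sol = solve_ivp(rhs, (0, 300), [0.99, 0.01, 0.0, 0.0])
print("final epidemic size:", sol.y[2, -1], "vaccinated fraction:", sol.y[3, -1])
```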

  4. The number comb for a soil physical properties dynamic measurement

    NASA Astrophysics Data System (ADS)

    Olechko, K.; Patiño, P.; Tarquis, A. M.

    2012-04-01

    We propose the prime-number distribution extracted from soil digital multiscale images and from time series of selected physical properties as a precise indicator of the spatial and temporal dynamics under soil management changes. With this new indicator, soil dynamics can be studied as a critical phenomenon, where each phase transition is estimated and modeled by the graph-partitioning-induced phase transition. The critical point of the prime-number distribution was correlated with the onset of physical degradation of Andosols, Vertisols and saline soils under unsustainable soil management in the Michoacan, Guanajuato and Veracruz States of Mexico. Data banks covering long time periods (between 10 and 28 years) were statistically compared using RISK 5.0 software and our own algorithms. Our approach enables us to distill free-form natural laws of soil physical-property dynamics directly from experimental data. The original approaches of Richter (1987) and Schmidt and Lipson (2009) were very useful for designing the algorithms to identify Hamiltonians, Lagrangians and other laws of geometric and momentum conservation, especially for the erosion case.

  5. Electrical Conductivity of Charged Particle Systems and Zubarev's Nonequilibrium Statistical Operator Method

    NASA Astrophysics Data System (ADS)

    Röpke, G.

    2018-01-01

    One of the fundamental problems in physics that are not yet rigorously solved is the statistical mechanics of nonequilibrium processes. An important contribution to describing irreversible behavior starting from reversible Hamiltonian dynamics was given by D. N. Zubarev, who invented the method of the nonequilibrium statistical operator. We discuss this approach, in particular, the extended von Neumann equation, and as an example consider the electrical conductivity of a system of charged particles. We consider the selection of the set of relevant observables. We show the relation between kinetic theory and linear response theory. Using thermodynamic Green's functions, we present a systematic treatment of correlation functions, but the convergence needs investigation. We compare different expressions for the conductivity and list open questions.
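
    For reference, the static electrical conductivity in linear response theory is commonly written as a Kubo-type current-current correlation (one common convention; details vary between formulations):

```latex
% Kubo formula for the dc electrical conductivity (V: volume, beta = 1/k_B T,
% J_x: total electric current operator in the x direction; one common convention)
\sigma_{xx} \;=\; \lim_{\varepsilon\to 0^+}\,
  \frac{1}{V}\int_0^\infty dt\, e^{-\varepsilon t}
  \int_0^\beta d\lambda\,
  \big\langle J_x(-i\hbar\lambda)\, J_x(t) \big\rangle
```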

  6. Statistical strategy for anisotropic adventitia modelling in IVUS.

    PubMed

    Gil, Debora; Hernández, Aura; Rodriguez, Oriol; Mauri, Josepa; Radeva, Petia

    2006-06-01

    Vessel plaque assessment by analysis of intravascular ultrasound sequences is a useful tool for cardiac disease diagnosis and intervention. Manual detection of luminal (inner) and media-adventitia (external) vessel borders is the main activity of physicians in the process of lumen narrowing (plaque) quantification. The difficulty of defining vessel border descriptors, as well as shades, artifacts, and blurred signal response due to ultrasound physical properties, hampers automated adventitia segmentation. In order to efficiently approach such a complex problem, we propose blending advanced anisotropic filtering operators and statistical classification techniques into a vessel border modelling strategy. Our systematic statistical analysis shows that the reported adventitia detection achieves an accuracy in the range of interobserver variability regardless of plaque nature, vessel geometry, and incomplete vessel borders.

  7. Transnational Quantum: Quantum Physics in India through the Lens of Satyendranath Bose

    NASA Astrophysics Data System (ADS)

    Banerjee, Somaditya

    2016-08-01

    This paper traces the social and cultural dimensions of quantum physics in colonial India where Satyendranath Bose worked. By focusing on Bose's approach towards the quantum and his collaboration with Albert Einstein, I argue that his physics displayed both the localities of doing science in early twentieth century India as well as a cosmopolitan dimension. He transformed the fundamental new concept of the light quantum developed by Einstein in 1905 within the social and political context of colonial India. This cross-pollination of the local with the global is termed here the locally rooted cosmopolitan nature of Bose's science. The production of new knowledge through quantum statistics by Bose shows the co-constructed nature of physics and the transnational nature of the quantum.

  8. A Finite-Volume "Shaving" Method for Interfacing NASA/DAO's Physical Space Statistical Analysis System to the Finite-Volume GCM with a Lagrangian Control-Volume Vertical Coordinate

    NASA Technical Reports Server (NTRS)

    Lin, Shian-Jiann; DaSilva, Arlindo; Atlas, Robert (Technical Monitor)

    2001-01-01

    Toward the development of a finite-volume Data Assimilation System (fvDAS), a consistent finite-volume methodology is developed for interfacing the NASA/DAO's Physical Space Statistical Analysis System (PSAS) to the joint NASA/NCAR finite volume CCM3 (fvCCM3). To take advantage of the Lagrangian control-volume vertical coordinate of the fvCCM3, a novel "shaving" method is applied to the lowest few model layers to reflect the surface pressure changes as implied by the final analysis. Analysis increments (from PSAS) to the upper air variables are then consistently put onto the Lagrangian layers as adjustments to the volume-mean quantities during the analysis cycle. This approach is demonstrated to be superior to the conventional method of using independently computed "tendency terms" for surface pressure and upper air prognostic variables.

  9. Statistical Physics Approaches to RNA Editing

    NASA Astrophysics Data System (ADS)

    Bundschuh, Ralf

    2012-02-01

    The central dogma of molecular biology states that DNA is transcribed base by base into RNA, which is in turn translated into proteins. However, some organisms edit their RNA before translation by inserting, deleting, or substituting individual bases or short stretches of bases. In many instances the mechanisms by which an organism recognizes the positions at which to edit, or by which it performs the actual editing, are unknown. One model system that stands out for its very high editing rate (on average, one out of every 25 bases is edited) is the Myxomycetes, a class of slime molds. In this talk we will show how computational methods and concepts from statistical physics can be used to analyze DNA and protein sequence data to predict editing sites in these slime molds and to guide experiments that identified previously unknown types of editing as well as the complete set of editing events in the slime mold Physarum polycephalum.

  10. Conceptual developments of non-equilibrium statistical mechanics in the early days of Japan

    NASA Astrophysics Data System (ADS)

    Ichiyanagi, Masakazu

    1995-11-01

    This paper reviews the research in nonequilibrium statistical mechanics carried out in Japan in the period between 1930 and 1960. Nearly thirty years have passed since the discovery of the exact formula for the electrical conductivity. With the rise of the linear response theory, whose methods and results are quickly grasped by anyone, its rationale was pushed aside, and even at a stage where the formulation was still incomplete some authors hurried to make physical applications. Such an attitude robbed the theory of most of its interest for the average physicist, who approaches an understanding of a basic concept not through abstract and logical analysis but by simply accumulating technical experience with it. The purpose of this review is to rescue the linear response theory from being labeled a mere mathematical tool and to show that it has considerable physical content. Many key papers, originally written in Japanese, are reproduced.

  11. Operational health physics training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1992-06-01

    The initial four sections treat basic information concerning atomic structure and other useful physical quantities, natural radioactivity, the properties of α, β, γ, x rays and neutrons, and the concepts and units of radiation dosimetry (including SI units). Section 5 deals with biological effects and the risks associated with radiation exposure. Background radiation and man-made sources are discussed next. The basic recommendations of the ICRP concerning dose limitations: justification, optimization (ALARA concepts and applications) and dose limits are covered in Section seven. Section eight is an expanded version of shielding, and the internal dosimetry discussion has been extensively revised to reflect the concepts contained in the MIRD methodology and ICRP 30. The remaining sections discuss the operational health physics approach to monitoring radiation. Individual sections include radiation detection principles, instrument operation and counting statistics, health physics instruments and personnel monitoring devices. The last five sections deal with the nature of, operation principles of, health physics aspects of, and monitoring approaches to air sampling, reactors, nuclear safety, gloveboxes and hot cells, accelerators and x ray sources. Decontamination, waste disposal and transportation of radionuclides are added topics. Several appendices containing constants, symbols, selected mathematical topics, and the Chart of the Nuclides, and an index have been included.

  12. Assessing tribal youth physical activity and programming using a community-based participatory research approach.

    PubMed

    Perry, Cynthia; Hoffman, Barbara

    2010-01-01

    American Indian youth experience a greater prevalence of obesity compared with the general U.S. population. One avenue to reverse the trend toward increasing obesity prevalence is through promoting physical activity. The goal of this project was to understand tribal youths' current patterns of physical activity behavior and their beliefs and preferences about physical activity. This assessment used a community-based participatory research approach. Sample included 35 Native youth aged 8-18. A Community Advisory Board was created that specifically developed an exercise survey for this assessment to explore physical activity patterns, preferences, and determinants. Twenty-six youth completed the survey. Descriptive statistics were analyzed, exploring differences by age group. Nine youth participated in 2 focus groups. Qualitative data were analyzed with thematic analysis. Youth distinguished between sports and exercise, with each possessing different determinants. Common motivators were friends, coach, and school, and barriers were lack of programs and school or work. None of the youth reported meeting the recommended 60 min of strenuous exercise daily. This tribal academic partnership responded to a tribal concern by developing an exercise survey and conducting focus groups that addressed tribal-specific questions. The results are informing program development.

  13. State-Transition Structures in Physics and in Computation

    NASA Astrophysics Data System (ADS)

    Petri, C. A.

    1982-12-01

    In order to establish close connections between physical and computational processes, it is assumed that the concepts of “state” and of “transition” are acceptable both to physicists and to computer scientists, at least in an informal way. The aim of this paper is to propose formal definitions of state and transition elements on the basis of very low level physical concepts in such a way that (1) all physically possible computations can be described as embedded in physical processes; (2) the computational aspects of physical processes can be described on a well-defined level of abstraction; (3) the gulf between the continuous models of physics and the discrete models of computer science can be bridged by simple mathematical constructs which may be given a physical interpretation; (4) a combinatorial, nonstatistical definition of “information” can be given on low levels of abstraction which may serve as a basis to derive higher-level concepts of information, e.g., by a statistical or probabilistic approach. Conceivable practical consequences are discussed.

  14. Regional projection of climate impact indices over the Mediterranean region

    NASA Astrophysics Data System (ADS)

    Casanueva, Ana; Frías, M. Dolores; Herrera, Sixto; Bedia, Joaquín; San Martín, Daniel; Gutiérrez, José Manuel; Zaninovic, Ksenija

    2014-05-01

    Climate Impact Indices (CIIs) are being increasingly used in different socioeconomic sectors to transfer information about climate change impacts and risks to stakeholders. CIIs are typically based on different weather variables such as temperature, wind speed, precipitation or humidity and comprise, in a single index, the relevant meteorological information for the particular impact sector (in this study, wildfires and tourism). This dependence on several climate variables poses important limitations to the application of statistical downscaling techniques, since physical consistency among variables is required in most cases to obtain reliable local projections. The present study assesses the suitability of the "direct" downscaling approach, in which the downscaling method is applied directly to the CII. In particular, for illustrative purposes, we consider two popular indices used in the wildfire and tourism sectors, the Fire Weather Index (FWI) and the Physiological Equivalent Temperature (PET), respectively. As an example, two case studies are analysed over two representative Mediterranean regions of interest for the EU CLIM-RUN project: continental Spain for the FWI and Croatia for the PET. Results obtained with this "direct" downscaling approach are similar to those found by applying the statistical downscaling to the individual meteorological drivers prior to the index calculation ("component" downscaling); thus, a wider range of statistical downscaling methods could be used. As an illustration, future changes in both indices are projected by applying two direct statistical downscaling methods, analogs and linear regression, to the ECHAM5 model. Larger differences were found between the two direct statistical downscaling approaches than between the direct and the component approaches with a single downscaling method. While these examples focus on particular indices and Mediterranean regions of interest for CLIM-RUN stakeholders, the same study could be extended to other indices and regions.

  15. Statistics of dislocation pinning at localized obstacles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dutta, A.; Bhattacharya, M., E-mail: mishreyee@vecc.gov.in; Barat, P.

    2014-10-14

    Pinning of dislocations at nanosized obstacles like precipitates, voids, and bubbles is a crucial mechanism in the context of phenomena like hardening and creep. The interaction between such an obstacle and a dislocation is often studied at a fundamental level by means of analytical tools, atomistic simulations, and finite element methods. Nevertheless, the information extracted from such studies cannot be utilized to its maximum extent on account of insufficient information about the underlying statistics of this process comprising a large number of dislocations and obstacles in a system. Here, we propose a new statistical approach, where the statistics of pinning of dislocations by idealized spherical obstacles is explored by taking into account the generalized size-distribution of the obstacles along with the dislocation density within a three-dimensional framework. Starting with a minimal set of material parameters, the framework employs the method of geometrical statistics with a few simple assumptions compatible with the real physical scenario. The application of this approach, in combination with the knowledge of fundamental dislocation-obstacle interactions, has successfully been demonstrated for dislocation pinning at nanovoids in neutron irradiated type 316-stainless steel in regard to the non-conservative motion of dislocations. An interesting phenomenon of transition from rare pinning to multiple pinning regimes with increasing irradiation temperature is revealed.

  16. Dynamic modelling of n-of-1 data: powerful and flexible data analytics applied to individualised studies.

    PubMed

    Vieira, Rute; McDonald, Suzanne; Araújo-Soares, Vera; Sniehotta, Falko F; Henderson, Robin

    2017-09-01

    N-of-1 studies are based on repeated observations within an individual or unit over time and are acknowledged as an important research method for generating scientific evidence about the health or behaviour of an individual. Statistical analyses of n-of-1 data require accurate modelling of the outcome while accounting for its distribution, time-related trend and error structures (e.g., autocorrelation) as well as reporting readily usable contextualised effect sizes for decision-making. A number of statistical approaches have been documented but no consensus exists on which method is most appropriate for which type of n-of-1 design. We discuss the statistical considerations for analysing n-of-1 studies and briefly review some currently used methodologies. We describe dynamic regression modelling as a flexible and powerful approach, adaptable to different types of outcomes and capable of dealing with the different challenges inherent to n-of-1 statistical modelling. Dynamic modelling borrows ideas from longitudinal and event history methodologies which explicitly incorporate the role of time and the influence of past on future. We also present an illustrative example of the use of dynamic regression on monitoring physical activity during the retirement transition. Dynamic modelling has the potential to expand researchers' access to robust and user-friendly statistical methods for individualised studies.
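
    A minimal sketch of a dynamic regression for a single-case (n-of-1) series regresses the outcome on a time trend, an intervention indicator, and its own lagged value, so that the past explicitly influences the future. Variable names and data below are placeholders, not the authors' analysis.

```python
# Sketch: dynamic regression for an n-of-1 physical activity series.
# The outcome depends on a trend, an intervention phase, and its own lag.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 120
phase = (np.arange(n) >= 60).astype(float)          # 0 = baseline, 1 = intervention
steps = np.zeros(n)
for t in range(1, n):                               # autoregressive daily step counts
    steps[t] = 2000 + 0.5 * steps[t - 1] + 800 * phase[t] + rng.normal(scale=300)

df = pd.DataFrame({"steps": steps, "phase": phase, "day": np.arange(n)})
df["steps_lag1"] = df["steps"].shift(1)
fit = smf.ols("steps ~ day + phase + steps_lag1", data=df.dropna()).fit()
print(fit.params)   # effect of the intervention, net of trend and carry-over
```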

  17. The influence of HOPE VI neighborhood revitalization on neighborhood-based physical activity: A mixed-methods approach.

    PubMed

    Dulin-Keita, Akilah; Clay, Olivio; Whittaker, Shannon; Hannon, Lonnie; Adams, Ingrid K; Rogers, Michelle; Gans, Kim

    2015-08-01

    This study uses a mixed methods approach to 1) identify surrounding residents' perceived expectations for Housing Opportunities for People Everywhere (HOPE VI) policy on physical activity outcomes and to 2) quantitatively examine the odds of neighborhood-based physical activity pre-/post-HOPE VI in a low socioeconomic status, predominantly African American community in Birmingham, Alabama. To address aim one, we used group concept mapping which is a structured approach for data collection and analyses that produces pictures/maps of ideas. Fifty-eight residents developed statements about potential influences of HOPE VI on neighborhood-based physical activity. In the quantitative study, we examined whether these potential influences increased the odds of neighborhood walking/jogging. We computed block entry logistic regression models with a larger cohort of residents at baseline (n = 184) and six-months (n = 142, 77% retention; n = 120 for all informative variables). We examined perceived neighborhood disorder (perceived neighborhood disorder scale), walkability and aesthetics (Neighborhood Environment Walkability Scale) and HOPE VI-related community safety and safety for physical activity as predictors. During concept mapping, residents generated statements that clustered into three distinct concepts, "Increased Leisure Physical Activity," "Safe Play Areas," and "Generating Health Promoting Resources." The quantitative analyses indicated that changes in neighborhood walkability increased the odds of neighborhood-based physical activity (p = 0.04). When HOPE VI-related safety for physical activity was entered into the model, it was associated with increased odds of physical activity (p = 0.04). Walkability was no longer statistically significant. These results suggest that housing policies that create walkable neighborhoods and that improve perceptions of safety for physical activity may increase neighborhood-based physical activity. However, the longer term impacts of neighborhood-level policies on physical activity require more longitudinal evidence to determine whether increased participation in physical activity is sustained. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Cost-effectiveness of a classification-based system for sub-acute and chronic low back pain.

    PubMed

    Apeldoorn, Adri T; Bosmans, Judith E; Ostelo, Raymond W; de Vet, Henrica C W; van Tulder, Maurits W

    2012-07-01

    Identifying relevant subgroups in patients with low back pain (LBP) is considered important to guide physical therapy practice and to improve outcomes. The aim of the present study was to assess the cost-effectiveness of a modified version of Delitto's classification-based treatment approach compared with usual physical therapy care in patients with sub-acute and chronic LBP with 1 year follow-up. All patients were classified using the modified version of Delitto's classification-based system and then randomly assigned to receive either classification-based treatment or usual physical therapy care. The main clinical outcomes measured were: global perceived effect, intensity of pain, functional disability and quality of life. Costs were measured from a societal perspective. Multiple imputations were used for missing data. Uncertainty surrounding cost differences and incremental cost-effectiveness ratios was estimated using bootstrapping. Cost-effectiveness planes and cost-effectiveness acceptability curves were estimated. In total, 156 patients were included. The outcome analyses showed a significantly better outcome on global perceived effect favoring the classification-based approach, and no differences between the groups on pain, disability and quality-adjusted life-years. Mean total societal costs for the classification-based group were €2,287, and for the usual physical therapy care group €2,020. The difference was €266 (95% CI -720 to 1,612) and not statistically significant. Cost-effectiveness analyses showed that the classification-based approach was not cost-effective in comparison with usual physical therapy care for any clinical outcome measure. The classification-based treatment approach as used in this study was not cost-effective in comparison with usual physical therapy care in a population of patients with sub-acute and chronic LBP.
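
    The bootstrap procedure for cost-effectiveness uncertainty described above can be sketched as follows: resample patients with replacement, recompute the cost and effect differences, and summarize the resulting cloud on the cost-effectiveness plane. The data and the willingness-to-pay threshold below are synthetic placeholders.

```python
# Sketch: bootstrapping incremental costs and effects for two treatment groups.
# Synthetic data stand in for the trial's cost and outcome measurements.
import numpy as np

rng = np.random.default_rng(3)
cost_a, eff_a = rng.gamma(2.0, 1150, 80), rng.normal(0.70, 0.15, 80)   # classification-based
cost_b, eff_b = rng.gamma(2.0, 1010, 76), rng.normal(0.68, 0.15, 76)   # usual care

pairs = []
for _ in range(5000):
    ia = rng.integers(0, len(cost_a), len(cost_a))   # resample patients with replacement
    ib = rng.integers(0, len(cost_b), len(cost_b))
    d_cost = cost_a[ia].mean() - cost_b[ib].mean()
    d_eff = eff_a[ia].mean() - eff_b[ib].mean()
    pairs.append((d_cost, d_eff))
pairs = np.array(pairs)

print("cost difference 95% CI:", np.percentile(pairs[:, 0], [2.5, 97.5]))
# Probability of cost-effectiveness at an (illustrative) willingness-to-pay threshold:
wtp = 20000
print("P(cost-effective):", np.mean(pairs[:, 0] < wtp * pairs[:, 1]))
```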

  19. ``Physical Concepts in Cell Biology,'' an upper level interdisciplinary course in cell biophysics/mathematical biology

    NASA Astrophysics Data System (ADS)

    Vavylonis, Dimitrios

    2009-03-01

    I will describe my experience in developing an interdisciplinary biophysics course addressed to students at the upper undergraduate and graduate level, in collaboration with colleagues in physics and biology. The students had a background in physics, biology and engineering, and for many the course was their first exposure to interdisciplinary topics. The course did not depend on a formal knowledge of equilibrium statistical mechanics. Instead, the approach was based on dynamics. I used diffusion as a universal ``long time'' law to illustrate scaling concepts. The importance of statistics and proper counting of states/paths was introduced by calculating the maximum accuracy with which bacteria can measure the concentration of diffuse chemicals. The use of quantitative concepts and methods was introduced through specific biological examples, focusing on model organisms and extremes at the cell level. Examples included microtubule dynamic instability, the search and capture model, molecular motor cooperativity in muscle cells, mitotic spindle oscillations in C. elegans, polymerization forces and propulsion of pathogenic bacteria, Brownian ratchets, bacterial cell division and MinD oscillations.
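
    Two of the quantitative anchors mentioned in the course description can be written compactly: diffusive spreading as a universal long-time law, and a Berg-Purcell-type estimate of how accurately a small sensor of size a can measure a concentration by averaging over a time tau. Both are given here in standard scaling form as assumed background, not as the course's materials.

```latex
% Diffusive spreading in d dimensions, and the Berg-Purcell scaling for the
% relative accuracy of concentration measurement by a sensor of linear size a
% averaging over time tau (D: diffusivity, c: concentration)
\langle x^2(t) \rangle \;=\; 2 d\, D\, t ,
\qquad
\frac{\delta c}{c} \;\sim\; \frac{1}{\sqrt{D\, a\, c\, \tau}}
```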

  20. Entropy, a Unifying Concept: from Physics to Cognitive Psychology

    NASA Astrophysics Data System (ADS)

    Tsallis, Constantino; Tsallis, Alexandra C.

    Together with classical, relativistic and quantum mechanics, as well as Maxwell electromagnetism, Boltzmann-Gibbs (BG) statistical mechanics constitutes one of the main theories of contemporary physics. This theory primarily concerns inanimate matter, and at its generic foundation we find nonlinear dynamical systems satisfying the ergodic hypothesis. This hypothesis is typically guaranteed for systems whose maximal Lyapunov exponent is positive. What happens when this crucial quantity is zero instead? We suggest here that, in what concerns thermostatistical properties, we typically enter what in some sense may be considered as a new world — the world of living systems — . The need emerges, at least for many systems, for generalizing the basis of BG statistical mechanics, namely the Boltzmann-Gibbs-von Neumann-Shannon entropic functional form, which connects the macroscopic thermodynamic quantity with the occurrence probabilities of microscopic configurations. This unifying approach is briefly reviewed here, and its widespread applications — from physics to cognitive psychology — are overviewed. Special attention is dedicated to the learning/memorizing process in humans and computers. The present observations might be related to the gestalt theory of visual perceptions and the actor-network theory.
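
    For concreteness, the Boltzmann-Gibbs functional referred to above and its q-generalization associated with nonextensive statistical mechanics are usually written as

```latex
% Boltzmann-Gibbs entropy and its q-generalisation (k: Boltzmann constant,
% p_i: probability of microscopic configuration i; S_q -> S_BG as q -> 1)
S_{BG} = -k \sum_i p_i \ln p_i ,
\qquad
S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1}
```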

  1. On the limitations of standard statistical modeling in biological systems: a full Bayesian approach for biology.

    PubMed

    Gomez-Ramirez, Jaime; Sanz, Ricardo

    2013-09-01

    One of the most important scientific challenges today is the quantitative and predictive understanding of biological function. Classical mathematical and computational approaches have been enormously successful in modeling inert matter, but they may be inadequate to address inherent features of biological systems. We address the conceptual and methodological obstacles that lie in the inverse problem in biological systems modeling. We introduce a full Bayesian approach (FBA), a theoretical framework to study biological function, in which probability distributions are conditional on biophysical information that physically resides in the biological system that is studied by the scientist. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Relating triggering processes in lab experiments with earthquakes.

    NASA Astrophysics Data System (ADS)

    Baro Urbea, J.; Davidsen, J.; Kwiatek, G.; Charalampidou, E. M.; Goebel, T.; Stanchits, S. A.; Vives, E.; Dresen, G.

    2016-12-01

    Statistical relations such as the Gutenberg-Richter law, the Omori-Utsu law and the productivity of aftershocks were first observed in seismology, but are also common to other physical phenomena exhibiting avalanche dynamics, such as solar flares, rock fracture, structural phase transitions and even stock market transactions. All these examples exhibit spatio-temporal correlations that can be explained as triggering processes: instead of being activated as a response to external driving or fluctuations, some events are a consequence of previous activity. Although different plausible explanations have been suggested for each system, the reason for the ubiquity of such statistical laws remains unknown. However, the case of rock fracture may exhibit a physical connection with seismology. It has been suggested that some features of seismology have a microscopic origin and are reproducible over a vast range of scales. This hypothesis has motivated mechanical experiments to generate artificial catalogues of earthquakes at a laboratory scale -so-called labquakes- and under controlled conditions. Microscopic fractures in lab tests release elastic waves that are recorded as ultrasonic (kHz-MHz) acoustic emission (AE) events by means of piezoelectric transducers. Here, we analyse the statistics of labquakes recorded during the failure of small samples of natural rocks and artificial porous materials under different controlled compression regimes. Temporal and spatio-temporal correlations are identified in certain cases. Specifically, we distinguish between the background and triggered events, revealing some differences in their statistical properties. We fit the data to statistical models of seismicity. As a particular case, we explore the branching-process approach simplified in the Epidemic Type Aftershock Sequence (ETAS) model. We evaluate the empirical spatio-temporal kernel of the model and investigate the physical origins of triggering. Our analysis of the focal mechanisms implies that the occurrence of the empirical laws extends well beyond purely frictional sliding events, in contrast to what is often assumed.
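
    The ETAS model referred to above is usually specified through a conditional intensity of the form below (temporal version, with symbols following common usage rather than the exact parametrization of this study):

```latex
% Temporal ETAS conditional intensity: background rate mu plus contributions
% triggered by all past events i with magnitudes m_i >= m_0 (Omori-type decay)
\lambda(t \mid \mathcal{H}_t) \;=\; \mu \;+\;
  \sum_{i:\,t_i<t} K\, e^{\alpha (m_i - m_0)}\, \big(t - t_i + c\big)^{-p}
```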

  3. A statistical approach to combining multisource information in one-class classifiers

    DOE PAGES

    Simonson, Katherine M.; Derek West, R.; Hansen, Ross L.; ...

    2017-06-08

    A new method is introduced in this paper for combining information from multiple sources to support one-class classification. The contributing sources may represent measurements taken by different sensors of the same physical entity, repeated measurements by a single sensor, or numerous features computed from a single measured image or signal. The approach utilizes the theory of statistical hypothesis testing, and applies Fisher's technique for combining p-values, modified to handle nonindependent sources. Classifier outputs take the form of fused p-values, which may be used to gauge the consistency of unknown entities with one or more class hypotheses. The approach enables rigorous assessment of classification uncertainties, and allows for traceability of classifier decisions back to the constituent sources, both of which are important for high-consequence decision support. Application of the technique is illustrated in two challenge problems, one for skin segmentation and the other for terrain labeling. Finally, the method is seen to be particularly effective for relatively small training samples.
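
    For reference, the classical (independent-source) version of Fisher's combination technique works as sketched below; the paper's contribution is a modification for correlated sources, which this sketch does not attempt to reproduce.

```python
# Classical Fisher combination of p-values from k independent sources:
# X = -2 * sum(log p_i) follows a chi-square distribution with 2k degrees of
# freedom under the joint null. (The reviewed method further corrects for
# dependence between sources; that correction is not shown here.)
import numpy as np
from scipy.stats import chi2

def fisher_combine(p_values):
    p = np.asarray(p_values, dtype=float)
    statistic = -2.0 * np.sum(np.log(p))
    return chi2.sf(statistic, df=2 * len(p))      # fused p-value

print(fisher_combine([0.04, 0.20, 0.11]))         # evidence pooled across sources
```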

  5. Analysis of uncertainties and convergence of the statistical quantities in turbulent wall-bounded flows by means of a physically based criterion

    NASA Astrophysics Data System (ADS)

    Andrade, João Rodrigo; Martins, Ramon Silva; Thompson, Roney Leon; Mompean, Gilmar; da Silveira Neto, Aristeu

    2018-04-01

    The present paper provides an analysis of the statistical uncertainties associated with direct numerical simulation (DNS) results and experimental data for turbulent channel and pipe flows, showing a new physically based quantification of these errors, to improve the determination of the statistical deviations between DNSs and experiments. The analysis is carried out using a recently proposed criterion by Thompson et al. ["A methodology to evaluate statistical errors in DNS data of plane channel flows," Comput. Fluids 130, 1-7 (2016)] for fully turbulent plane channel flows, where the mean velocity error is estimated by considering the Reynolds stress tensor, and using the balance of the mean force equation. It also presents how the residual error evolves in time for a DNS of a plane channel flow, and the influence of the Reynolds number on its convergence rate. The root mean square of the residual error is shown in order to capture a single quantitative value of the error associated with the dimensionless averaging time. The evolution in time of the error norm is compared with the final error provided by DNS data of similar Reynolds numbers available in the literature. A direct consequence of this approach is that it was possible to compare different numerical results and experimental data, providing an improved understanding of the convergence of the statistical quantities in turbulent wall-bounded flows.
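
    For context, a convergence criterion of this kind exploits the fact that statistically converged data for a fully developed plane channel flow must satisfy the mean streamwise momentum balance, so the residual of that balance measures the statistical error. The standard form of the balance (u_tau: friction velocity, h: channel half-height) is

```latex
% Mean streamwise momentum balance for fully developed plane channel flow:
% viscous stress plus Reynolds shear stress equals the linear total-stress profile
\nu \frac{d\langle u\rangle}{dy} \;-\; \langle u'v'\rangle
  \;=\; u_\tau^{2}\left(1 - \frac{y}{h}\right)
```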

  6. Dissociation kinetics of metal clusters on multiple electronic states including electronic level statistics into the vibronic soup

    NASA Astrophysics Data System (ADS)

    Shvartsburg, Alexandre A.; Siu, K. W. Michael

    2001-06-01

    Modeling the delayed dissociation of clusters has been, over the last decade, a frontline development area in chemical physics. It is of fundamental interest how statistical kinetics methods previously validated for regular molecules and atomic nuclei may apply to clusters, as this would help in understanding the transferability of statistical models for the disintegration of complex systems across various classes of physical objects. From a practical perspective, accurate simulation of unimolecular decomposition is critical for the extraction of true thermochemical values from measurements on the decay of energized clusters. Metal clusters are particularly challenging because of the multitude of low-lying electronic states that are coupled to vibrations. This has previously been accounted for by assuming the average electronic structure of a conducting cluster, approximated by the levels of an electron in a cavity. While this provides a reasonable time-averaged description, it ignores the distribution of instantaneous electronic structures in a "boiling" cluster around that average. Here we set up a new treatment that incorporates the statistical distribution of electronic levels around the average picture using random matrix theory. This approach faithfully reflects the completely chaotic "vibronic soup" nature of hot metal clusters. We found that the consideration of electronic level statistics significantly promotes electronic excitation and thus increases the magnitude of its effect. As this excitation always depresses the decay rates, the inclusion of level statistics results in slower dissociation of metal clusters.

  7. MT+, integrating magnetotellurics to determine earth structure, physical state, and processes

    USGS Publications Warehouse

    Bedrosian, P.A.

    2007-01-01

    As one of the few deep-earth imaging techniques, magnetotellurics provides information on both the structure and physical state of the crust and upper mantle. Magnetotellurics is sensitive to electrical conductivity, which varies within the earth by many orders of magnitude and is modified by a range of earth processes. As with all geophysical techniques, magnetotellurics has a non-unique inverse problem and has limitations in resolution and sensitivity. As such, an integrated approach, either via the joint interpretation of independent geophysical models, or through the simultaneous inversion of independent data sets is valuable, and at times essential to an accurate interpretation. Magnetotelluric data and models are increasingly integrated with geological, geophysical and geochemical information. This review considers recent studies that illustrate the ways in which such information is combined, from qualitative comparisons to statistical correlation studies to multi-property inversions. Also emphasized are the range of problems addressed by these integrated approaches, and their value in elucidating earth structure, physical state, and processes. © Springer Science+Business Media B.V. 2007.

  8. On the statistical properties and tail risk of violent conflicts

    NASA Astrophysics Data System (ADS)

    Cirillo, Pasquale; Taleb, Nassim Nicholas

    2016-06-01

    We examine statistical pictures of violent conflicts over the last 2000 years, providing techniques for dealing with the unreliability of historical data. We make use of a novel approach to deal with fat-tailed random variables with a remote but nonetheless finite upper bound, by defining a corresponding unbounded dual distribution (given that potential war casualties are bounded by the world population). This approach can also be applied to other fields of science where power laws play a role in modeling, like geology, hydrology, statistical physics and finance. We apply methods from extreme value theory on the dual distribution and derive its tail properties. The dual method allows us to calculate the real tail mean of war casualties, which proves to be considerably larger than the corresponding sample mean for large thresholds, meaning severe underestimation of the tail risks of conflicts from naive observation. We analyze the robustness of our results to errors in historical reports. We study inter-arrival times between tail events and find that no particular trend can be asserted. All the statistical pictures obtained are at variance with the prevailing claims about a "long peace", namely that violence has been declining over time.
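
    The key device is a smooth map sending the bounded support onto an unbounded one, so that extreme-value machinery can be applied to the "dual" variable and results mapped back. A log-type transformation of this kind is shown below as an illustration of the idea; it is not necessarily the exact form used by the authors.

```latex
% A log-type map taking a variable y with support [L, H) onto an unbounded dual
% variable z with support [L, infinity). As y -> H, z -> infinity; far from the
% bound (H >> y), z is approximately equal to y.
z \;=\; \varphi(y) \;=\; L \;-\; H\,\ln\!\left(\frac{H - y}{H - L}\right)
```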

  9. Robust hypothesis tests for detecting statistical evidence of two-dimensional and three-dimensional interactions in single-molecule measurements

    NASA Astrophysics Data System (ADS)

    Calderon, Christopher P.; Weiss, Lucien E.; Moerner, W. E.

    2014-05-01

    Experimental advances have improved the two- (2D) and three-dimensional (3D) spatial resolution that can be extracted from in vivo single-molecule measurements. This enables researchers to quantitatively infer the magnitude and directionality of forces experienced by biomolecules in their native environment. Situations where such force information is relevant range from mitosis to directed transport of protein cargo along cytoskeletal structures. Models commonly applied to quantify single-molecule dynamics assume that effective forces and velocity in the x, y (or x, y, z) directions are statistically independent, but this assumption is physically unrealistic in many situations. We present a hypothesis testing approach capable of determining if there is evidence of statistical dependence between positional coordinates in experimentally measured trajectories; if the hypothesis of independence between spatial coordinates is rejected, then a new model accounting for 2D (3D) interactions can and should be considered. Our hypothesis testing technique is robust, meaning it can detect interactions, even if the noise statistics are not well captured by the model. The approach is demonstrated on control simulations and on experimental data (directed transport of intraflagellar transport protein 88 homolog in the primary cilium).
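
    A toy version of the underlying question is to test whether the x and y displacement increments of a measured 2D trajectory are correlated. The simple correlation test below is only a stand-in for the robust, likelihood-based hypothesis test developed in the paper; the simulated trajectory is a placeholder.

```python
# Toy check of statistical dependence between x and y motion in a 2D trajectory:
# correlate the per-step displacements. (A stand-in illustration only; the paper
# develops a robust, likelihood-based test on fitted stochastic models.)
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
n = 2000
# Simulate a trajectory whose x and y steps are coupled through a shared force term
shared = rng.normal(size=n)
dx = 0.8 * rng.normal(size=n) + 0.4 * shared
dy = 0.8 * rng.normal(size=n) + 0.4 * shared
x, y = np.cumsum(dx), np.cumsum(dy)

r, p_value = pearsonr(np.diff(x), np.diff(y))
print(f"increment correlation r = {r:.3f}, p = {p_value:.2e}")
# A small p-value rejects independence, motivating a coupled 2D model.
```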

  10. Do employers reward physical attractiveness in transition countries?

    PubMed

    Mavisakalyan, Astghik

    2018-02-01

    This paper studies the labour market returns to physical attractiveness using data from three transition countries of the Caucasus: Armenia, Azerbaijan and Georgia. I estimate a large positive effect of attractive looks on males' probability of employment. Results from the most comprehensive model suggest a marginal effect of 11.1 percentage points. Using a partial identification approach, I show that this relationship is likely to be causal. After accounting for covariates, particularly measures of human capital, there is no evidence for a statistically significant link between females' attractiveness and employment. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Explaining nitrate pollution pressure on the groundwater resource in Kinshasa using a multivariate statistical modelling approach

    NASA Astrophysics Data System (ADS)

    Mfumu Kihumba, Antoine; Vanclooster, Marnik

    2013-04-01

    Drinking water in Kinshasa, the capital of the Democratic Republic of Congo, is provided by extracting groundwater from the local aquifer, particularly in peripheral areas. The exploited groundwater body is mainly unconfined and located within a continuous detrital aquifer, primarily composed of sedimentary formations. However, the aquifer is subjected to an increasing threat of anthropogenic pollution pressure. Understanding the detailed origin of this pollution pressure is important for sustainable drinking water management in Kinshasa. The present study aims to explain the observed nitrate pollution problem, nitrate being considered as a good tracer for other pollution threats. The analysis is made in terms of physical attributes that are readily available using a statistical modelling approach. For the nitrate data, use was made of a historical groundwater quality assessment study, for which the data were re-analysed. The physical attributes are related to the topography, land use, geology and hydrogeology of the region. Prior to the statistical modelling, intrinsic and specific vulnerability for nitrate pollution was assessed. This vulnerability assessment showed that the alluvium area in the northern part of the region is the most vulnerable area. This area consists of urban land use with poor sanitation. Re-analysis of the nitrate pollution data demonstrated that the spatial variability of nitrate concentrations in the groundwater body is high, and coherent with the fragmented land use of the region and the intrinsic and specific vulnerability maps. For the statistical modeling use was made of multiple regression and regression tree analysis. The results demonstrated the significant impact of land use variables on the Kinshasa groundwater nitrate pollution and the need for a detailed delineation of groundwater capture zones around the monitoring stations. Key words: Groundwater , Isotopic, Kinshasa, Modelling, Pollution, Physico-chemical.
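
    A compact sketch of the two statistical models mentioned (multiple linear regression and a regression tree) applied to placeholder attribute data is given below; the feature names and the synthetic response are illustrative assumptions, not the study's variables.

```python
# Sketch: explaining groundwater nitrate with physical attributes via multiple
# regression and a regression tree. Feature names and data are placeholders.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(5)
n = 200
df = pd.DataFrame({
    "urban_fraction": rng.uniform(0, 1, n),          # land use
    "depth_to_water": rng.uniform(2, 60, n),         # hydrogeology
    "slope": rng.uniform(0, 15, n),                  # topography
})
df["nitrate"] = (40 * df["urban_fraction"] - 0.3 * df["depth_to_water"]
                 + rng.normal(scale=5, size=n))      # synthetic response

X, y = df[["urban_fraction", "depth_to_water", "slope"]], df["nitrate"]
print(LinearRegression().fit(X, y).coef_)            # linear attribution
tree = DecisionTreeRegressor(max_depth=3).fit(X, y)  # threshold-type (tree) attribution
print(tree.feature_importances_)
```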

  12. A Statistical Graphical Model of the California Reservoir System

    NASA Astrophysics Data System (ADS)

    Taeb, A.; Reager, J. T.; Turmon, M.; Chandrasekaran, V.

    2017-11-01

    The recent California drought has highlighted the potential vulnerability of the state's water management infrastructure to multiyear dry intervals. Due to the high complexity of the network, dynamic storage changes in California reservoirs on a state-wide scale have previously been difficult to model using either traditional statistical or physical approaches. Indeed, although there is a significant line of research on exploring models for single (or a small number of) reservoirs, these approaches are not amenable to a system-wide modeling of the California reservoir network due to the spatial and hydrological heterogeneities of the system. In this work, we develop a state-wide statistical graphical model to characterize the dependencies among a collection of 55 major California reservoirs across the state; this model is defined with respect to a graph in which the nodes index reservoirs and the edges specify the relationships or dependencies between reservoirs. We obtain and validate this model in a data-driven manner based on reservoir volumes over the period 2003-2016. A key feature of our framework is a quantification of the effects of external phenomena that influence the entire reservoir network. We further characterize the degree to which physical factors (e.g., state-wide Palmer Drought Severity Index (PDSI), average temperature, snow pack) and economic factors (e.g., consumer price index, number of agricultural workers) explain these external influences. As a consequence of this analysis, we obtain a system-wide health diagnosis of the reservoir network as a function of PDSI.
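
    A minimal sketch of estimating a sparse Gaussian graphical model over reservoir volumes, the general class of model described above, is given below. The data are synthetic, and the published framework additionally models system-wide latent influences, which this sketch omits.

```python
# Sketch: learn a sparse dependency graph among reservoirs from (synthetic)
# monthly storage anomalies using the graphical lasso. The published model also
# accounts for system-wide latent factors, which this sketch does not include.
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(6)
n_months, n_reservoirs = 160, 10
common = rng.normal(size=(n_months, 1))              # shared state-wide signal
volumes = 0.7 * common + 0.5 * rng.normal(size=(n_months, n_reservoirs))

model = GraphicalLassoCV().fit(volumes)
precision = model.precision_                         # zeros => conditional independence
edges = np.abs(precision) > 1e-3
print("estimated edges (upper triangle):", int(np.triu(edges, k=1).sum()))
```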

  13. The Standard Model in the history of the Natural Sciences, Econometrics, and the social sciences

    NASA Astrophysics Data System (ADS)

    Fisher, W. P., Jr.

    2010-07-01

    In the late 18th and early 19th centuries, scientists appropriated Newton's laws of motion as a model for the conduct of any other field of investigation that would purport to be a science. This early form of a Standard Model eventually informed the basis of analogies for the mathematical expression of phenomena previously studied qualitatively, such as cohesion, affinity, heat, light, electricity, and magnetism. James Clerk Maxwell is known for his repeated use of a formalized version of this method of analogy in lectures, teaching, and the design of experiments. Economists transferring skills learned in physics made use of the Standard Model, especially after Maxwell demonstrated the value of conceiving it in abstract mathematics instead of as a concrete and literal mechanical analogy. Haavelmo's probability approach in econometrics and R. Fisher's Statistical Methods for Research Workers brought a statistical approach to bear on the Standard Model, quietly reversing the perspective of economics and the social sciences relative to that of physics. Where physicists, and Maxwell in particular, intuited scientific method as imposing stringent demands on the quality and interrelations of data, instruments, and theory in the name of inferential and comparative stability, statistical models and methods disconnected theory from data by removing the instrument as an essential component. New possibilities for reconnecting economics and the social sciences to Maxwell's sense of the method of analogy are found in Rasch's probabilistic models for measurement.

  14. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
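
    As a small illustration of the importance-sampling idea mentioned above (accelerating Monte Carlo estimation of rare failure probabilities), the sketch below estimates a tiny tail probability by sampling from a shifted distribution and reweighting by the likelihood ratio; the numbers are illustrative only.

```python
# Importance sampling for a rare-event probability: P(X > 5) for X ~ N(0,1).
# Sampling from a distribution shifted toward the failure region and reweighting
# by the likelihood ratio gives a far lower-variance estimate than naive Monte Carlo.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
n, threshold, shift = 100_000, 5.0, 5.0

naive = np.mean(rng.normal(size=n) > threshold)                  # almost always 0

y = rng.normal(loc=shift, size=n)                                # proposal N(shift, 1)
weights = norm.pdf(y) / norm.pdf(y, loc=shift)                   # likelihood ratio
is_estimate = np.mean((y > threshold) * weights)

print("naive:", naive, "importance sampling:", is_estimate,
      "exact:", norm.sf(threshold))
```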

  15. Does subtype matter? Assessing the effects of maltreatment on functioning in preadolescent youth in out-of-home care

    PubMed Central

    Petrenko, Christie L. M.; Friend, Angela; Garrido, Edward F.; Taussig, Heather N.; Culhane, Sara E.

    2012-01-01

    Objectives Attempts to understand the effects of maltreatment subtypes on childhood functioning are complicated by the fact that children often experience multiple subtypes. This study assessed the effects of maltreatment subtypes on the cognitive, academic, and mental health functioning of preadolescent youth in out-of-home care using both “variable-centered” and “person-centered” statistical analytic approaches to modeling multiple subtypes of maltreatment. Methods Participants included 334 preadolescent youth (ages 9 to 11) placed in out-of-home care due to maltreatment. The occurrence and severity of maltreatment subtypes (physical abuse, sexual abuse, physical neglect, and supervisory neglect) were coded from child welfare records. The relationships between maltreatment subtypes and children’s cognitive, academic, and mental health functioning were evaluated with the following approaches: “Variable-centered” analytic methods: Regression approach: Multiple regression was used to estimate the effects of each maltreatment subtype (separate analyses for occurrence and severity), controlling for the other subtypes. Hierarchical approach: Contrast coding was used in regression analyses to estimate the effects of discrete maltreatment categories that were assigned based on a subtype occurrence hierarchy (sexual abuse > physical abuse > physical neglect > supervisory neglect). “Person-centered” analytic method: Latent class analysis was used to group children with similar maltreatment severity profiles into discrete classes. The classes were then compared to determine if they differed in terms of their ability to predict functioning. Results The approaches identified similar relationships between maltreatment subtypes and children’s functioning. The most consistent findings indicated that maltreated children who experienced physical or sexual abuse were at highest risk for caregiver-reported externalizing behavior problems, and those who experienced physical abuse and/or physical neglect were more likely to have higher levels of caregiver-reported internalizing problems. Children experiencing predominantly low severity supervisory neglect had relatively better functioning than other maltreated youth. Conclusions Many of the maltreatment subtype differences identified within the maltreated sample in the current study are consistent with those from previous research comparing maltreated youth to non-maltreated comparison groups. Results do not support combining supervisory and physical neglect. The “variable-centered” and “person-centered” analytic approaches produced complementary results. Advantages and disadvantages of each approach are discussed. PMID:22947490
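
    A minimal sketch of the "variable-centered" regression approach described above: an outcome is regressed on occurrence indicators for each maltreatment subtype, so that each coefficient reflects that subtype while controlling for the others. The simulated data and variable names are purely illustrative, not study data.

```python
# Hedged sketch of a subtype-occurrence regression; the data are simulated.
import numpy as np

rng = np.random.default_rng(1)
n = 334
# 0/1 occurrence indicators for four subtypes (illustrative draws)
X = rng.integers(0, 2, size=(n, 4)).astype(float)
outcome = X @ np.array([0.6, 0.5, 0.3, 0.1]) + rng.normal(0, 1, n)

design = np.column_stack([np.ones(n), X])   # intercept + four subtype indicators
coef, *_ = np.linalg.lstsq(design, outcome, rcond=None)
names = ["intercept", "physical_abuse", "sexual_abuse",
         "physical_neglect", "supervisory_neglect"]
for name, b in zip(names, coef):
    print(f"{name:20s} {b:+.2f}")
```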

  16. Physics of Cell Adhesion Failure and Human Diseases

    NASA Astrophysics Data System (ADS)

    Family, Fereydoon

    Emergent phenomena in living systems, including your ability to read these lines, do not obviously follow as a consequence of the fundamental laws of physics. Understanding the physics of living systems clearly falls outside the conventional boundaries of scientific disciplines and requires a collaborative, multidisciplinary approach. Here I will discuss how theoretical and computational techniques from statistical physics can be used to make progress in explaining the physical mechanisms that underlie complex biological phenomena, including major diseases. In the specific cases of macular degeneration and cancer that we have studied recently, we find that the breakdown of the mechanical stability in the local tissue structure caused by weakening of the cell-cell adhesion plays a key role in the initiation and progression of the disease. This finding can help in the development of new therapies that would prevent or halt the initiation and progression of these diseases.

  17. Data mining and statistical inference in selective laser melting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamath, Chandrika

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM is accompanied by complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part makes experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.

  18. Data mining and statistical inference in selective laser melting

    DOE PAGES

    Kamath, Chandrika

    2016-01-11

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM is accompanied by complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part makes experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.
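
    A minimal sketch of the surrogate-modeling and feature-selection idea described in the two records above: fit a cheap data-driven model to process-parameter/response pairs and rank the inputs by importance. The parameters, ranges, and the toy "density" response are assumptions for illustration, not the SLM data used in the paper.

```python
# Hedged sketch: data-driven surrogate with feature importances; toy data only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
n = 200
power = rng.uniform(100, 400, n)    # laser power (W), illustrative range
speed = rng.uniform(500, 2500, n)   # scan speed (mm/s), illustrative range
hatch = rng.uniform(0.05, 0.2, n)   # hatch spacing (mm), illustrative range
X = np.column_stack([power, speed, hatch])
# Toy response: density peaks near a target energy-density-like ratio
density = 99.5 - 5.0 * np.abs(power / (speed * hatch) - 2.0) + rng.normal(0, 0.1, n)

surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, density)
for name, imp in zip(["power", "speed", "hatch"], surrogate.feature_importances_):
    print(f"{name:6s} importance {imp:.2f}")
# The trained surrogate can now stand in for an expensive simulation.
print("predicted density at (250 W, 1200 mm/s, 0.1 mm):",
      surrogate.predict([[250.0, 1200.0, 0.1]])[0])
```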

  19. A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sinitskiy, Anton V.; Voth, Gregory A., E-mail: gavoth@uchicago.edu

    2015-09-07

    Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman’s imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representation of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.

  20. A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals.

    PubMed

    Sinitskiy, Anton V; Voth, Gregory A

    2015-09-07

    Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman's imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representation of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.

  1. Full data acquisition in Kelvin Probe Force Microscopy: Mapping dynamic electric phenomena in real space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balke, Nina; Kalinin, Sergei V.; Jesse, Stephen

    Kelvin probe force microscopy (KPFM) has provided deep insights into the role local electronic, ionic and electrochemical processes play in the global functionality of materials and devices, even down to the atomic scale. Conventional KPFM utilizes heterodyne detection and bias feedback to measure the contact potential difference (CPD) between tip and sample. This measurement paradigm, however, permits only partial recovery of the information encoded in bias- and time-dependent electrostatic interactions between the tip and sample and effectively down-samples the cantilever response to a single measurement of CPD per pixel. This level of detail is insufficient for electroactive materials, devices, or solid-liquid interfaces, where non-linear dielectrics are present or spurious electrostatic events are possible. Here, we simulate and experimentally validate a novel approach for spatially resolved KPFM capable of a full information transfer of the dynamic electric processes occurring between tip and sample. General acquisition mode, or G-Mode, adopts a big data approach utilising high speed detection, compression, and storage of the raw cantilever deflection signal in its entirety at high sampling rates (> 4 MHz), providing a permanent record of the tip trajectory. We develop a range of methodologies for analysing the resultant large multidimensional datasets involving classical, physics-based and information-based approaches. Physics-based analysis of G-Mode KPFM data recovers the parabolic bias dependence of the electrostatic force for each cycle of the excitation voltage, leading to a multidimensional dataset containing spatial and temporal dependence of the CPD and capacitance channels. We use multivariate statistical methods to reduce data volume and separate the complex multidimensional data sets into statistically significant components that can then be mapped onto separate physical mechanisms. Overall, G-Mode KPFM offers a new paradigm to study dynamic electric phenomena in electroactive interfaces as well as offering a promising approach for extending KPFM to solid-liquid interfaces.
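
    A minimal sketch of the physics-based analysis step mentioned above: fitting the parabolic bias dependence of the electrostatic response over one excitation cycle, with the parabola vertex giving a CPD estimate and the curvature a capacitance-gradient-like term. The numbers are synthetic, not G-Mode data.

```python
# Hedged sketch: parabolic fit of electrostatic response vs. bias; synthetic data.
import numpy as np

rng = np.random.default_rng(3)
bias = np.linspace(-5, 5, 201)          # tip-sample bias (V)
cpd_true, curvature_true = 0.35, -2.0   # assumed "true" values for the toy signal
response = curvature_true * (bias - cpd_true) ** 2 + rng.normal(0, 0.05, bias.size)

a, b, c = np.polyfit(bias, response, 2)   # response ≈ a*V^2 + b*V + c
cpd_est = -b / (2 * a)                    # vertex of the fitted parabola
print(f"estimated CPD: {cpd_est:.2f} V, curvature (∝ dC/dz): {a:.2f}")
```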

  2. Full data acquisition in Kelvin Probe Force Microscopy: Mapping dynamic electric phenomena in real space

    DOE PAGES

    Balke, Nina; Kalinin, Sergei V.; Jesse, Stephen; ...

    2016-08-12

    Kelvin probe force microscopy (KPFM) has provided deep insights into the role local electronic, ionic and electrochemical processes play in the global functionality of materials and devices, even down to the atomic scale. Conventional KPFM utilizes heterodyne detection and bias feedback to measure the contact potential difference (CPD) between tip and sample. This measurement paradigm, however, permits only partial recovery of the information encoded in bias- and time-dependent electrostatic interactions between the tip and sample and effectively down-samples the cantilever response to a single measurement of CPD per pixel. This level of detail is insufficient for electroactive materials, devices, or solid-liquid interfaces, where non-linear dielectrics are present or spurious electrostatic events are possible. Here, we simulate and experimentally validate a novel approach for spatially resolved KPFM capable of a full information transfer of the dynamic electric processes occurring between tip and sample. General acquisition mode, or G-Mode, adopts a big data approach utilising high speed detection, compression, and storage of the raw cantilever deflection signal in its entirety at high sampling rates (> 4 MHz), providing a permanent record of the tip trajectory. We develop a range of methodologies for analysing the resultant large multidimensional datasets involving classical, physics-based and information-based approaches. Physics-based analysis of G-Mode KPFM data recovers the parabolic bias dependence of the electrostatic force for each cycle of the excitation voltage, leading to a multidimensional dataset containing spatial and temporal dependence of the CPD and capacitance channels. We use multivariate statistical methods to reduce data volume and separate the complex multidimensional data sets into statistically significant components that can then be mapped onto separate physical mechanisms. Overall, G-Mode KPFM offers a new paradigm to study dynamic electric phenomena in electroactive interfaces as well as offering a promising approach for extending KPFM to solid-liquid interfaces.

  3. Territorial Developments Based on Graffiti: a Statistical Mechanics Approach

    DTIC Science & Technology

    2011-10-28

    Excerpt (fragmentary): We introduce a two-gang Hamiltonian model defined on a lattice, where agents have red or blue affiliation but are otherwise indistinguishable. Lattice models have been extensively used, and have proved extremely useful, in the context of the physical, biological and even chemical sciences. Keywords: Territorial Formation, Spin Systems, Phase Transitions.

  4. Persistence of discrimination: Revisiting Axtell, Epstein and Young

    NASA Astrophysics Data System (ADS)

    Weisbuch, Gérard

    2018-02-01

    We reformulate an earlier model of the "Emergence of classes..." proposed by Axtell et al. (2001) using more elaborate cognitive processes allowing a statistical physics approach. The thorough analysis of the phase space and of the basins of attraction leads to a reconsideration of the previous social interpretations: our model predicts the reinforcement of discrimination biases and their long term stability rather than the emergence of classes.

  5. A bibliometric analysis of statistical terms used in American Physical Therapy Association journals (2011-2012): evidence for educating physical therapists.

    PubMed

    Tilson, Julie K; Marshall, Katie; Tam, Jodi J; Fetters, Linda

    2016-04-22

    A primary barrier to the implementation of evidence based practice (EBP) in physical therapy is therapists' limited ability to understand and interpret statistics. Physical therapists demonstrate limited skills and report low self-efficacy for interpreting results of statistical procedures. While standards for physical therapist education include statistics, little empirical evidence is available to inform what should constitute such curricula. The purpose of this study was to conduct a census of the statistical terms and study designs used in physical therapy literature and to use the results to make recommendations for curricular development in physical therapist education. We conducted a bibliometric analysis of 14 peer-reviewed journals associated with the American Physical Therapy Association over 12 months (Oct 2011-Sept 2012). Trained raters recorded every statistical term appearing in identified systematic reviews, primary research reports, and case series and case reports. Investigator-reported study design was also recorded. Terms representing the same statistical test or concept were combined into a single, representative term. Cumulative percentage was used to identify the most common representative statistical terms. Common representative terms were organized into eight categories to inform curricular design. Of 485 articles reviewed, 391 met the inclusion criteria. These 391 articles used 532 different terms which were combined into 321 representative terms; 13.1 (sd = 8.0) terms per article. Eighty-one representative terms constituted 90% of all representative term occurrences. Of the remaining 240 representative terms, 105 (44%) were used in only one article. The most common study design was prospective cohort (32.5%). Physical therapy literature contains a large number of statistical terms and concepts for readers to navigate. However, in the year sampled, 81 representative terms accounted for 90% of all occurrences. These "common representative terms" can be used to inform curricula to promote physical therapists' skills, competency, and confidence in interpreting statistics in their professional literature. We make specific recommendations for curriculum development informed by our findings.

  6. A data-driven and physics-based single-pass retrieval of active-passive microwave covariation and vegetation parameters for the SMAP mission

    NASA Astrophysics Data System (ADS)

    Entekhabi, D.; Jagdhuber, T.; Das, N. N.; Baur, M.; Link, M.; Piles, M.; Akbar, R.; Konings, A. G.; Mccoll, K. A.; Alemohammad, S. H.; Montzka, C.; Kunstmann, H.

    2016-12-01

    The active-passive soil moisture retrieval algorithm of NASA's SMAP mission depends on robust statistical estimation of active-passive covariation (β) and vegetation structure (Γ) parameters in order to provide reliable global measurements of soil moisture at an intermediate resolution (9 km) compared to the native resolutions of the radiometer (36 km) and radar (3 km) instruments. These parameters apply to the SMAP radiometer-radar combination over the period of record that was cut short with the end of the SMAP radar transmission. They also apply to the current SMAP radiometer and Sentinel 1A/B radar combination for high-resolution surface soil moisture mapping. However, the performance of the statistically-based approach is directly dependent on the selection of a representative time frame in which these parameters can be estimated assuming dynamic soil moisture and stationary soil roughness and vegetation cover. Here, we propose a novel, data-driven and physics-based single-pass retrieval of active-passive microwave covariation and vegetation parameters for the SMAP mission. The algorithm does not depend on time series analyses and can be applied using as little as one pair of active-passive acquisitions. The algorithm stems from the physical link between microwave emission and scattering via conservation of energy. The formulation of the emission radiative transfer is combined with the Distorted Born Approximation of radar scattering for vegetated land surfaces. The two formulations are simultaneously solved for the covariation and vegetation structure parameters. Preliminary results from SMAP active-passive observations (April 13th to July 7th 2015) compare well with the time-series statistical approach and confirm the capability of this method to estimate these parameters. Moreover, the method is not restricted to a given frequency (it applies to both L-band and C-band combinations for the radar) or incidence angle (all angles and not just the fixed 40° incidence). Therefore, the approach is applicable to the combination of SMAP and Sentinel-1A/B data for active-passive and high-resolution soil moisture estimation.

  7. Final Report, DOE Early Career Award: Predictive modeling of complex physical systems: new tools for statistical inference, uncertainty quantification, and experimental design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef

    Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in the support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling *feasible* in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale sequential data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.

  8. Loop series for discrete statistical models on graphs

    NASA Astrophysics Data System (ADS)

    Chertkov, Michael; Chernyak, Vladimir Y.

    2006-06-01

    In this paper we present the derivation details, logic, and motivation for the three loop calculus introduced in Chertkov and Chernyak (2006 Phys. Rev. E 73 065102(R)). Generating functions for each of the three interrelated discrete statistical models are expressed in terms of a finite series. The first term in the series corresponds to the Bethe-Peierls belief-propagation (BP) contribution; the other terms are labelled by loops on the factor graph. All loop contributions are simple rational functions of spin correlation functions calculated within the BP approach. We discuss two alternative derivations of the loop series. One approach implements a set of local auxiliary integrations over continuous fields with the BP contribution corresponding to an integrand saddle-point value. The integrals are replaced by sums in the complementary approach, briefly explained in Chertkov and Chernyak (2006 Phys. Rev. E 73 065102(R)). Local gauge symmetry transformations that clarify an important invariant feature of the BP solution are revealed in both approaches. The individual terms change under the gauge transformation while the partition function remains invariant. The requirement for all individual terms to be nonzero only for closed loops in the factor graph (as opposed to paths with loose ends) is equivalent to fixing the first term in the series to be exactly equal to the BP contribution. Further applications of the loop calculus to problems in statistical physics, computer and information sciences are discussed.
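
    A minimal sketch of the Bethe-Peierls belief-propagation contribution on which the loop series is built: sum-product message passing for a tiny pairwise spin model on a chain, where BP is exact because the factor graph has no loops. The fields and couplings are arbitrary illustrative values.

```python
# Hedged sketch: sum-product belief propagation on a 3-spin chain; toy model only.
import numpy as np

spins = np.array([-1.0, 1.0])
h = [0.2, -0.1, 0.3]                 # local fields on three spins (illustrative)
J = {(0, 1): 0.5, (1, 2): -0.4}      # pairwise couplings on the chain edges

def node_factor(i, si):
    return np.exp(h[i] * si)

def edge_factor(i, j, si, sj):
    return np.exp(J[(i, j)] * si * sj)

# m[(i, j)][k] = message from node i to node j, evaluated at spin value spins[k]
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
m = {e: np.ones(2) for e in edges}
for _ in range(20):                  # iterate to a fixed point (exact on a tree)
    for (i, j) in edges:
        key = (i, j) if (i, j) in J else (j, i)
        incoming = np.ones(2)
        for (k, l) in edges:         # product of messages into i, except from j
            if l == i and k != j:
                incoming = incoming * m[(k, i)]
        new = np.array([
            sum(node_factor(i, si) * edge_factor(*key, si, sj) * inc
                for si, inc in zip(spins, incoming))
            for sj in spins])
        m[(i, j)] = new / new.sum()

# BP belief (marginal) at the middle node from its local factor and incoming messages
belief = np.array([node_factor(1, s) for s in spins]) * m[(0, 1)] * m[(2, 1)]
print("P(s_1 = -1), P(s_1 = +1):", belief / belief.sum())
```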

  9. Comparison of Neural Network and Linear Regression Models in Statistically Predicting Mental and Physical Health Status of Breast Cancer Survivors

    DTIC Science & Technology

    2015-07-15

    Excerpt (fragmentary): Long-term effects on cancer survivors' quality of life of physical training versus physical training combined with cognitive-behavioral therapy.

  10. Stochastic Approaches Within a High Resolution Rapid Refresh Ensemble

    NASA Astrophysics Data System (ADS)

    Jankov, I.

    2017-12-01

    It is well known that global and regional numerical weather prediction (NWP) ensemble systems are under-dispersive, producing unreliable and overconfident ensemble forecasts. Typical approaches to alleviate this problem include the use of multiple dynamic cores, multiple physics suite configurations, or a combination of the two. While these approaches may produce desirable results, they have practical and theoretical deficiencies and are more difficult and costly to maintain. An active area of research that promotes a more unified and sustainable system is the use of stochastic physics. Stochastic approaches include Stochastic Parameter Perturbations (SPP), Stochastic Kinetic Energy Backscatter (SKEB), and Stochastic Perturbation of Physics Tendencies (SPPT). The focus of this study is to assess model performance within a convection-permitting ensemble at 3-km grid spacing across the Contiguous United States (CONUS) using a variety of stochastic approaches. A single physics suite configuration based on the operational High-Resolution Rapid Refresh (HRRR) model was utilized, and ensemble members were produced by employing stochastic methods. Parameter perturbations (using SPP) for select fields were employed in the Rapid Update Cycle (RUC) land surface model (LSM) and Mellor-Yamada-Nakanishi-Niino (MYNN) Planetary Boundary Layer (PBL) schemes. Within MYNN, SPP was applied to sub-grid cloud fraction, mixing length, roughness length, mass fluxes and Prandtl number. In the RUC LSM, SPP was applied to hydraulic conductivity, and perturbing soil moisture at the initial time was also tested. Iterative testing was first conducted to assess the initial performance of several configuration settings (e.g., a variety of spatial and temporal de-correlation lengths). Upon selection of the most promising candidate configurations using SPP, a 10-day time period was run and more robust statistics were gathered. SKEB and SPPT were included in additional retrospective tests to assess the impact of using all three stochastic approaches to address model uncertainty. Results from the stochastic perturbation testing were compared to a baseline multi-physics control ensemble. For probabilistic forecast performance, the Model Evaluation Tools (MET) verification package was used.
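
    A minimal sketch of the SPP idea described above: a smooth random pattern with prescribed spatial and temporal decorrelation scales multiplicatively perturbs a physics parameter for each ensemble member. The grid, the scales, and the "mixing length" parameter are illustrative assumptions, not HRRR settings.

```python
# Hedged sketch: stochastically perturbed parameter field; all values illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(10)
nx, ny, n_steps = 100, 100, 50
spatial_scale = 10.0        # grid points (~ spatial de-correlation length)
tau = 20.0                  # time steps (~ temporal de-correlation)

def smooth_noise():
    field = gaussian_filter(rng.normal(0.0, 1.0, (ny, nx)), spatial_scale)
    return field / field.std()

pattern = smooth_noise()
base_mixing_length = 50.0   # illustrative PBL parameter value (m)
for _ in range(n_steps):
    # AR(1) evolution keeps the pattern correlated in time
    phi = np.exp(-1.0 / tau)
    pattern = phi * pattern + np.sqrt(1.0 - phi ** 2) * smooth_noise()
    perturbed = base_mixing_length * np.exp(0.3 * pattern)   # lognormal perturbation

print(f"final perturbed parameter range (m): {perturbed.min():.1f} - {perturbed.max():.1f}")
```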

  11. A spatial epidemiological analysis of self-rated mental health in the slums of Dhaka

    PubMed Central

    2011-01-01

    Background The deprived physical environments present in slums are well-known to have adverse health effects on their residents. However, little is known about the health effects of the social environments in slums. Moreover, neighbourhood quantitative spatial analyses of the mental health status of slum residents are still rare. The aim of this paper is to study self-rated mental health data in several slums of Dhaka, Bangladesh, by accounting for neighbourhood social and physical associations using spatial statistics. We hypothesised that mental health would show a significant spatial pattern in different population groups, and that the spatial patterns would relate to spatially-correlated health-determining factors (HDF). Methods We applied a spatial epidemiological approach, including non-spatial ANOVA/ANCOVA, as well as global and local univariate and bivariate Moran's I statistics. The WHO-5 Well-being Index was used as a measure of self-rated mental health. Results We found that poor mental health (WHO-5 scores < 13) among the adult population (age ≥15) was prevalent in all slum settlements. We detected spatially autocorrelated WHO-5 scores (i.e., spatial clusters of poor and good mental health among different population groups). Further, we detected spatial associations between mental health and housing quality, sanitation, income generation, environmental health knowledge, education, age, gender, flood non-affectedness, and selected properties of the natural environment. Conclusions Spatial patterns of mental health were detected and could be partly explained by spatially correlated HDF. We thereby showed that the socio-physical neighbourhood was significantly associated with health status, i.e., mental health at one location was spatially dependent on the mental health and HDF prevalent at neighbouring locations. Furthermore, the spatial patterns point to severe health disparities both within and between the slums. In addition to examining health outcomes, the methodology used here is also applicable to residuals of regression models, such as helping to avoid violating the assumption of data independence that underlies many statistical approaches. We assume that similar spatial structures can be found in other studies focussing on neighbourhood effects on health, and therefore argue for a more widespread incorporation of spatial statistics in epidemiological studies. PMID:21599932
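
    A minimal sketch of one of the spatial statistics mentioned above: the global Moran's I for spatial autocorrelation of a health score over point locations, using inverse-distance weights. The coordinates and scores are synthetic, not the Dhaka survey data.

```python
# Hedged sketch: global Moran's I on synthetic point data.
import numpy as np

rng = np.random.default_rng(4)
n = 50
coords = rng.uniform(0, 10, size=(n, 2))           # point locations
score = coords[:, 0] * 0.5 + rng.normal(0, 1, n)   # spatially trending health score

# Inverse-distance spatial weight matrix with a zero diagonal
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
W = 1.0 / (d + np.eye(n))     # eye avoids division by zero on the diagonal
np.fill_diagonal(W, 0.0)

z = score - score.mean()
moran_i = (n / W.sum()) * (z @ W @ z) / (z @ z)
print(f"Moran's I = {moran_i:.3f} (expectation under no autocorrelation: {-1/(n-1):.3f})")
```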

  12. A scaling procedure for the response of an isolated system with high modal overlap factor

    NASA Astrophysics Data System (ADS)

    De Rosa, S.; Franco, F.

    2008-10-01

    The paper deals with a numerical approach that reduces some physical sizes of the solution domain to compute the dynamic response of an isolated system: it has been named Asymptotical Scaled Modal Analysis (ASMA). The proposed numerical procedure alters the input data needed to obtain the classic modal responses to increase the frequency band of validity of the discrete or continuous coordinates model through the definition of a proper scaling coefficient. It is demonstrated that the computational cost remains acceptable while the frequency range of analysis increases. Moreover, with reference to the flexural vibrations of a rectangular plate, the paper compares ASMA with the statistical energy analysis and the energy distribution approach. Some insights are also given about the limits of the scaling coefficient. Finally, it is shown that the linear dynamic response, predicted with the scaling procedure, has the same quality and characteristics of the statistical energy analysis, but it can be useful when the system cannot be solved appropriately by the standard Statistical Energy Analysis (SEA).

  13. A pedagogical approach to the Boltzmann factor through experiments and simulations

    NASA Astrophysics Data System (ADS)

    Battaglia, O. R.; Bonura, A.; Sperandeo-Mineo, R. M.

    2009-09-01

    The Boltzmann factor is the basis of a huge amount of thermodynamic and statistical physics, both classical and quantum. It governs the behaviour of all systems in nature that are exchanging energy with their environment. Understanding why the expression has this specific form involves a deep mathematical analysis, whose flow of logic is hard to see and is not at the level of high school or college students' preparation. We here present some experiments and simulations aimed at directly deriving its mathematical expression and illustrating the fundamental concepts on which it is grounded. Experiments use easily available apparatuses, and simulations are developed in the NetLogo environment that, besides having a user-friendly interface, allows easy interaction with the algorithm. The approach supplies pedagogical support for the introduction of the Boltzmann factor at the undergraduate level to students without a background in statistical mechanics.
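
    A minimal sketch of the kind of simulation the article describes: particles randomly exchange fixed energy quanta, and the resulting single-particle energy distribution approaches the exponential Boltzmann form. The particle number and step count are arbitrary illustrative choices.

```python
# Hedged sketch: random energy-quantum exchange yielding a Boltzmann-like distribution.
import numpy as np

rng = np.random.default_rng(5)
n_particles, n_steps = 2000, 200_000
energy = np.ones(n_particles, dtype=int)   # start with one quantum per particle

for _ in range(n_steps):
    i, j = rng.integers(0, n_particles, 2)
    if energy[i] > 0:                      # move one quantum from i to j
        energy[i] -= 1
        energy[j] += 1

counts = np.bincount(energy)
print("P(E) for E = 0..5:", counts[:6] / n_particles)
print(f"compare with exp(-E/<E>) for <E> = {energy.mean():.2f}")
```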

  14. Sequential geophysical and flow inversion to characterize fracture networks in subsurface systems

    DOE PAGES

    Mudunuru, Maruti Kumar; Karra, Satish; Makedonska, Nataliia; ...

    2017-09-05

    Subsurface applications, including geothermal, geological carbon sequestration, and oil and gas, typically involve maximizing either the extraction of energy or the storage of fluids. Fractures form the main pathways for flow in these systems, and locating these fractures is critical for predicting flow. However, fracture characterization is a highly uncertain process, and data from multiple sources, such as flow and geophysical data, are needed to reduce this uncertainty. We present a nonintrusive, sequential inversion framework for integrating data from geophysical and flow sources to constrain fracture networks in the subsurface. In this framework, we first estimate bounds on the statistics for the fracture orientations using microseismic data. These bounds are estimated through a combination of a focal mechanism (physics-based approach) and clustering analysis (statistical approach) of seismic data. Then, the fracture lengths are constrained using flow data. In conclusion, the efficacy of this inversion is demonstrated through a representative example.

  15. Sequential geophysical and flow inversion to characterize fracture networks in subsurface systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mudunuru, Maruti Kumar; Karra, Satish; Makedonska, Nataliia

    Subsurface applications, including geothermal, geological carbon sequestration, and oil and gas, typically involve maximizing either the extraction of energy or the storage of fluids. Fractures form the main pathways for flow in these systems, and locating these fractures is critical for predicting flow. However, fracture characterization is a highly uncertain process, and data from multiple sources, such as flow and geophysical data, are needed to reduce this uncertainty. We present a nonintrusive, sequential inversion framework for integrating data from geophysical and flow sources to constrain fracture networks in the subsurface. In this framework, we first estimate bounds on the statistics for the fracture orientations using microseismic data. These bounds are estimated through a combination of a focal mechanism (physics-based approach) and clustering analysis (statistical approach) of seismic data. Then, the fracture lengths are constrained using flow data. In conclusion, the efficacy of this inversion is demonstrated through a representative example.

  16. Enabling High-Energy, High-Voltage Lithium-Ion Cells: Standardization of Coin-Cell Assembly, Electrochemical Testing, and Evaluation of Full Cells

    DOE PAGES

    Long, Brandon R.; Rinaldo, Steven G.; Gallagher, Kevin G.; ...

    2016-11-09

    Coin-cells are often the test format of choice for laboratories engaged in battery research and development as they provide a convenient platform for rapid testing of new materials on a small scale. However, obtaining reliable, reproducible data via the coin-cell format is inherently difficult, particularly in the full-cell configuration. In addition, statistical evaluation to prove the consistency and reliability of such data is often neglected. Herein we report on several studies aimed at formalizing physical process parameters and coin-cell construction related to full cells. Statistical analysis and performance benchmarking approaches are advocated as a means to more confidently track changes in cell performance. Finally, we show that trends in the electrochemical data obtained from coin-cells can be reliable and informative when standardized approaches are implemented in a consistent manner.

  17. Remote sensing-aided systems for snow qualification, evapotranspiration estimation, and their application in hydrologic models

    NASA Technical Reports Server (NTRS)

    Korram, S.

    1977-01-01

    The design of general remote sensing-aided methodologies was studied to provide estimates of several important inputs to water yield forecast models. These input parameters are snow area extent, snow water content, and evapotranspiration. The study area is Feather River Watershed (780,000 hectares), Northern California. The general approach involved a stepwise sequence of identification of the required information, sample design, measurement/estimation, and evaluation of results. All the relevant and available information types needed in the estimation process were defined. These include Landsat, meteorological satellite, and aircraft imagery, topographic and geologic data, ground truth data, and climatic data from ground stations. A cost-effective multistage sampling approach was employed in the quantification of all the required parameters. The physical and statistical models for both snow quantification and evapotranspiration estimation were developed. These models use information obtained from aerial and ground data through an appropriate statistical sampling design.

  18. Maximum one-shot dissipated work from Rényi divergences

    NASA Astrophysics Data System (ADS)

    Yunger Halpern, Nicole; Garner, Andrew J. P.; Dahlsten, Oscar C. O.; Vedral, Vlatko

    2018-05-01

    Thermodynamics describes large-scale, slowly evolving systems. Two modern approaches generalize thermodynamics: fluctuation theorems, which concern finite-time nonequilibrium processes, and one-shot statistical mechanics, which concerns small scales and finite numbers of trials. Combining these approaches, we calculate a one-shot analog of the average dissipated work defined in fluctuation contexts: the cost of performing a protocol in finite time instead of quasistatically. The average dissipated work has been shown to be proportional to a relative entropy between phase-space densities, to a relative entropy between quantum states, and to a relative entropy between probability distributions over possible values of work. We derive one-shot analogs of all three equations, demonstrating that the order-infinity Rényi divergence is proportional to the maximum possible dissipated work in each case. These one-shot analogs of fluctuation-theorem results contribute to the unification of these two toolkits for small-scale, nonequilibrium statistical physics.

  19. Maximum one-shot dissipated work from Rényi divergences.

    PubMed

    Yunger Halpern, Nicole; Garner, Andrew J P; Dahlsten, Oscar C O; Vedral, Vlatko

    2018-05-01

    Thermodynamics describes large-scale, slowly evolving systems. Two modern approaches generalize thermodynamics: fluctuation theorems, which concern finite-time nonequilibrium processes, and one-shot statistical mechanics, which concerns small scales and finite numbers of trials. Combining these approaches, we calculate a one-shot analog of the average dissipated work defined in fluctuation contexts: the cost of performing a protocol in finite time instead of quasistatically. The average dissipated work has been shown to be proportional to a relative entropy between phase-space densities, to a relative entropy between quantum states, and to a relative entropy between probability distributions over possible values of work. We derive one-shot analogs of all three equations, demonstrating that the order-infinity Rényi divergence is proportional to the maximum possible dissipated work in each case. These one-shot analogs of fluctuation-theorem results contribute to the unification of these two toolkits for small-scale, nonequilibrium statistical physics.
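
    A minimal sketch of the quantity these two records center on: the order-infinity Rényi divergence between two discrete distributions, D_inf(P||Q) = log max_i p_i/q_i, which the paper relates to the maximum one-shot dissipated work. The two example distributions are arbitrary.

```python
# Hedged sketch: order-infinity Rényi divergence for discrete distributions.
import numpy as np

def renyi_inf(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    support = p > 0
    return np.log(np.max(p[support] / q[support]))

p = [0.5, 0.3, 0.2]   # e.g. an observed work distribution (illustrative)
q = [0.4, 0.4, 0.2]   # e.g. a reference quasistatic distribution (illustrative)
print("D_inf(P||Q) =", renyi_inf(p, q))
```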

  20. deltaGseg: macrostate estimation via molecular dynamics simulations and multiscale time series analysis.

    PubMed

    Low, Diana H P; Motakis, Efthymios

    2013-10-01

    Binding free energy calculations obtained through molecular dynamics simulations reflect intermolecular interaction states through a series of independent snapshots. Typically, the free energies of multiple simulated series (each with slightly different starting conditions) need to be estimated. Previous approaches carry out this task by moving averages at certain decorrelation times, assuming that the system comes from a single conformation description of binding events. Here, we discuss a more general approach that uses statistical modeling, wavelets denoising and hierarchical clustering to estimate the significance of multiple statistically distinct subpopulations, reflecting potential macrostates of the system. We present the deltaGseg R package that performs macrostate estimation from multiple replicated series and allows molecular biologists/chemists to gain physical insight into the molecular details that are not easily accessible by experimental techniques. deltaGseg is a Bioconductor R package available at http://bioconductor.org/packages/release/bioc/html/deltaGseg.html.

  1. What can we learn from noise? — Mesoscopic nonequilibrium statistical physics —

    PubMed Central

    KOBAYASHI, Kensuke

    2016-01-01

    Mesoscopic systems — small electric circuits working in the quantum regime — offer us a unique experimental stage to explore quantum transport in a tunable and precise way. The purpose of this Review is to show how they can contribute to statistical physics. We introduce the significance of fluctuation, or equivalently noise, as noise measurement enables us to address the fundamental aspects of a physical system. The significance of the fluctuation theorem (FT) in statistical physics is noted. We explain what information can be deduced from the current noise measurement in mesoscopic systems. As an important application of the noise measurement to statistical physics, we describe our experimental work on the current and current noise in an electron interferometer, which is the first experimental test of FT in the quantum regime. Our attempt will shed new light on the research field of mesoscopic quantum statistical physics. PMID:27477456

  2. What can we learn from noise? - Mesoscopic nonequilibrium statistical physics.

    PubMed

    Kobayashi, Kensuke

    2016-01-01

    Mesoscopic systems - small electric circuits working in the quantum regime - offer us a unique experimental stage to explore quantum transport in a tunable and precise way. The purpose of this Review is to show how they can contribute to statistical physics. We introduce the significance of fluctuation, or equivalently noise, as noise measurement enables us to address the fundamental aspects of a physical system. The significance of the fluctuation theorem (FT) in statistical physics is noted. We explain what information can be deduced from the current noise measurement in mesoscopic systems. As an important application of the noise measurement to statistical physics, we describe our experimental work on the current and current noise in an electron interferometer, which is the first experimental test of FT in the quantum regime. Our attempt will shed new light on the research field of mesoscopic quantum statistical physics.

  3. Physical fitness modulates incidental but not intentional statistical learning of simultaneous auditory sequences during concurrent physical exercise.

    PubMed

    Daikoku, Tatsuya; Takahashi, Yuji; Futagami, Hiroko; Tarumoto, Nagayoshi; Yasuda, Hideki

    2017-02-01

    In real-world auditory environments, humans are exposed to overlapping auditory information, such as that produced by human voices and musical instruments, even during routine physical activities such as walking and cycling. The present study investigated how concurrent physical exercise affects performance of incidental and intentional learning of overlapping auditory streams, and whether physical fitness modulates learning performance. Participants were divided into lower- and higher-fitness groups of 11 participants each, based on their VO2max value. They were presented with two simultaneous auditory sequences, each with a distinct statistical regularity (i.e. statistical learning), while they were pedaling on a bike or sitting on a bike at rest. In experiment 1, they were instructed to attend to one of the two sequences and to ignore the other sequence. In experiment 2, they were instructed to attend to both of the two sequences. After exposure to the sequences, learning effects were evaluated by a familiarity test. In experiment 1, performance of statistical learning of the ignored sequence during concurrent pedaling could be higher in participants with high physical fitness than in those with low fitness, whereas for the attended sequence there was no significant difference in statistical learning performance between high and low physical fitness. Furthermore, there was no significant effect of physical fitness on learning while resting. In experiment 2, participants with both high and low physical fitness could perform intentional statistical learning of the two simultaneous sequences in both the exercise and rest sessions. The improvement in physical fitness might facilitate incidental but not intentional statistical learning of simultaneous auditory sequences during concurrent physical exercise.

  4. How to assess the impact of a physical parameterization in simulations of moist convection?

    NASA Astrophysics Data System (ADS)

    Grabowski, Wojciech

    2017-04-01

    A numerical model capable of simulating moist convection (e.g., a cloud-resolving model or large-eddy simulation model) consists of a fluid flow solver combined with required representations (i.e., parameterizations) of physical processes. The latter typically include cloud microphysics, radiative transfer, and unresolved turbulent transport. Traditional approaches to investigating the impacts of such parameterizations on convective dynamics involve parallel simulations with different parameterization schemes or with different scheme parameters. Such methodologies are not reliable because of the natural variability of a cloud field that is affected by the feedback between the physics and dynamics. For instance, changing the cloud microphysics typically leads to a different realization of the cloud-scale flow, and separating dynamical and microphysical impacts is difficult. This presentation will introduce a novel modeling methodology, piggybacking, that allows studying the impact of a physical parameterization on cloud dynamics with confidence. The focus will be on the impact of the cloud microphysics parameterization. Specific examples of the piggybacking approach will include simulations concerning the hypothesized deep convection invigoration in polluted environments, the validity of the saturation adjustment in modeling condensation in moist convection, and the separation of physical impacts from statistical uncertainty in simulations applying particle-based Lagrangian microphysics, the super-droplet method.

  5. On-line integration of computer controlled diagnostic devices and medical information systems in undergraduate medical physics education for physicians.

    PubMed

    Hanus, Josef; Nosek, Tomas; Zahora, Jiri; Bezrouk, Ales; Masin, Vladimir

    2013-01-01

    We designed and evaluated an innovative computer-aided-learning environment based on the on-line integration of computer controlled medical diagnostic devices and a medical information system for use in the preclinical medical physics education of medical students. Our learning system simulates the actual clinical environment in a hospital or primary care unit. It uses a commercial medical information system for on-line storage and processing of clinical type data acquired during physics laboratory classes. Every student adopts two roles, the role of 'patient' and the role of 'physician'. As a 'physician' the student operates the medical devices to clinically assess 'patient' colleagues and records all results in an electronic 'patient' record. We also introduced an innovative approach to the use of supportive education materials, based on the methods of adaptive e-learning. A survey of student feedback is included and statistically evaluated. The results from the student feedback confirm the positive response of the latter to this novel implementation of medical physics and informatics in preclinical education. This approach not only significantly improves learning of medical physics and informatics skills but has the added advantage that it facilitates students' transition from preclinical to clinical subjects.

  6. PV cells electrical parameters measurement

    NASA Astrophysics Data System (ADS)

    Cibira, Gabriel

    2017-12-01

    When measuring the optical parameters of a photovoltaic silicon cell, precise results enable good estimation of the electrical parameters by applying well-known physical-mathematical models. Nevertheless, considerable recombination phenomena might occur in both surface and intrinsic thin layers within novel materials. Moreover, rear contact surface parameters may influence close-area recombination phenomena, too. Therefore, the only precise approach is to verify the assumed cell electrical parameters by direct electrical measurement. Based on a theoretical approach supported by experiments, this paper analyses problems in the measurement procedures and equipment used for acquiring the electrical parameters of a photovoltaic silicon cell, as a case study. A statistical appraisal of measurement quality is also contributed.

  7. The Wang-Landau Sampling Algorithm

    NASA Astrophysics Data System (ADS)

    Landau, David P.

    2003-03-01

    Over the past several decades Monte Carlo simulations[1] have evolved into a powerful tool for the study of wide-ranging problems in statistical/condensed matter physics. Standard methods sample the probability distribution for the states of the system, usually in the canonical ensemble, and enormous improvements have been made in performance through the implementation of novel algorithms. Nonetheless, difficulties arise near phase transitions, either due to critical slowing down near 2nd order transitions or to metastability near 1st order transitions, thus limiting the applicability of the method. We shall describe a new and different Monte Carlo approach [2] that uses a random walk in energy space to determine the density of states directly. Once the density of states is estimated, all thermodynamic properties can be calculated at all temperatures. This approach can be extended to multi-dimensional parameter spaces and has already found use in classical models of interacting particles including systems with complex energy landscapes, e.g., spin glasses, protein folding models, etc., as well as for quantum models. 1. A Guide to Monte Carlo Simulations in Statistical Physics, D. P. Landau and K. Binder (Cambridge U. Press, Cambridge, 2000). 2. Fugao Wang and D. P. Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E64, 056101-1 (2001).
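
    A minimal sketch of the Wang-Landau random walk in energy space for a small two-dimensional Ising lattice: the running estimate of ln g(E) penalizes already-visited energies until the walk samples energy roughly uniformly. The lattice size, step count, and the crude modification-factor schedule are simplifications for illustration.

```python
# Hedged sketch: Wang-Landau estimation of ln g(E) for a small 2D Ising model.
import numpy as np

rng = np.random.default_rng(6)
L = 8
spins = rng.choice([-1, 1], size=(L, L))

def total_energy(s):
    # nearest-neighbour Ising energy with periodic boundaries, each bond counted once
    return -np.sum(s * (np.roll(s, 1, 0) + np.roll(s, 1, 1)))

def delta_energy(s, i, j):
    nb = s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]
    return 2 * s[i, j] * nb

log_g = {}     # running estimate of ln g(E), built on the fly
E = total_energy(spins)
ln_f = 1.0     # modification factor

for step in range(200_000):
    i, j = rng.integers(0, L, 2)
    E_new = E + delta_energy(spins, i, j)
    # accept the flip with probability min(1, g(E)/g(E_new))
    if rng.random() < np.exp(log_g.get(E, 0.0) - log_g.get(E_new, 0.0)):
        spins[i, j] *= -1
        E = E_new
    log_g[E] = log_g.get(E, 0.0) + ln_f
    if (step + 1) % 25_000 == 0:   # crude schedule instead of a histogram flatness check
        ln_f /= 2.0

lowest = sorted(log_g.items())[:5]
print("estimated ln g(E) at the lowest visited energies:",
      [(int(e), round(v, 1)) for e, v in lowest])
```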

  8. Statistical physics and physiology: monofractal and multifractal approaches

    NASA Technical Reports Server (NTRS)

    Stanley, H. E.; Amaral, L. A.; Goldberger, A. L.; Havlin, S.; Peng, C. K.

    1999-01-01

    Even under healthy, basal conditions, physiologic systems show erratic fluctuations resembling those found in dynamical systems driven away from a single equilibrium state. Do such "nonequilibrium" fluctuations simply reflect the fact that physiologic systems are being constantly perturbed by external and intrinsic noise? Or do these fluctuations actually contain useful, "hidden" information about the underlying nonequilibrium control mechanisms? We report some recent attempts to understand the dynamics of complex physiologic fluctuations by adapting and extending concepts and methods developed very recently in statistical physics. Specifically, we focus on interbeat interval variability as an important quantity to help elucidate possibly non-homeostatic physiologic variability because (i) the heart rate is under direct neuroautonomic control, (ii) interbeat interval variability is readily measured by noninvasive means, and (iii) analysis of these heart rate dynamics may provide important practical diagnostic and prognostic information not obtainable with current approaches. The analytic tools we discuss may be used on a wider range of physiologic signals. We first review recent progress using two analysis methods--detrended fluctuation analysis and wavelets--sufficient for quantifying monofractal structures. We then describe recent work that quantifies multifractal features of interbeat interval series, and the discovery that the multifractal structure of healthy subjects is different from that of diseased subjects.
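
    A minimal sketch of the first of the two methods mentioned above, detrended fluctuation analysis (DFA), applied to a synthetic uncorrelated series (for which the scaling exponent should come out near 0.5). The window sizes and the test signal are illustrative choices, not physiologic data.

```python
# Hedged sketch: detrended fluctuation analysis on a synthetic series.
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(0, 1, 4000)   # toy uncorrelated "interbeat interval" series

def dfa(signal, scales):
    y = np.cumsum(signal - signal.mean())     # integrated profile
    F = []
    for s in scales:
        n_win = len(y) // s
        ms = []
        for k in range(n_win):
            seg = y[k * s:(k + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
            ms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.array(F)

scales = np.array([16, 32, 64, 128, 256])
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]   # scaling exponent
print(f"DFA scaling exponent alpha ≈ {alpha:.2f} (≈ 0.5 expected for white noise)")
```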

  9. Compositional data analysis for physical activity, sedentary time and sleep research.

    PubMed

    Dumuid, Dorothea; Stanford, Tyman E; Martin-Fernández, Josep-Antoni; Pedišić, Željko; Maher, Carol A; Lewis, Lucy K; Hron, Karel; Katzmarzyk, Peter T; Chaput, Jean-Philippe; Fogelholm, Mikael; Hu, Gang; Lambert, Estelle V; Maia, José; Sarmiento, Olga L; Standage, Martyn; Barreira, Tiago V; Broyles, Stephanie T; Tudor-Locke, Catrine; Tremblay, Mark S; Olds, Timothy

    2017-01-01

    The health effects of daily activity behaviours (physical activity, sedentary time and sleep) are widely studied. While previous research has largely examined activity behaviours in isolation, recent studies have adjusted for multiple behaviours. However, the inclusion of all activity behaviours in traditional multivariate analyses has not been possible due to the perfect multicollinearity of 24-h time budget data. The ensuing lack of adjustment for known effects on the outcome undermines the validity of study findings. We describe a statistical approach that enables the inclusion of all daily activity behaviours, based on the principles of compositional data analysis. Using data from the International Study of Childhood Obesity, Lifestyle and the Environment, we demonstrate the application of compositional multiple linear regression to estimate adiposity from children's daily activity behaviours expressed as isometric log-ratio coordinates. We present a novel method for predicting change in a continuous outcome based on relative changes within a composition, and for calculating associated confidence intervals to allow for statistical inference. The compositional data analysis presented overcomes the lack of adjustment that has plagued traditional statistical methods in the field, and provides robust and reliable insights into the health effects of daily activity behaviours.
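
    A minimal sketch of the compositional workflow described above: a 24-hour activity composition is expressed in isometric log-ratio (ilr) coordinates, which can then enter an ordinary regression. The data are simulated (not ISCOLE data), and the particular ilr basis is one arbitrary choice of sequential binary partition.

```python
# Hedged sketch: ilr coordinates of a 4-part composition used in a linear regression.
import numpy as np

rng = np.random.default_rng(8)
n = 300
# 4-part composition (e.g. sleep, sedentary, light activity, MVPA), rows sum to 1
comp = rng.dirichlet([8, 9, 5, 2], size=n)

def ilr(c):
    # pivot (ilr) coordinates from one sequential binary partition
    z1 = np.sqrt(3 / 4) * np.log(c[:, 0] / c[:, 1:].prod(axis=1) ** (1 / 3))
    z2 = np.sqrt(2 / 3) * np.log(c[:, 1] / c[:, 2:].prod(axis=1) ** (1 / 2))
    z3 = np.sqrt(1 / 2) * np.log(c[:, 2] / c[:, 3])
    return np.column_stack([z1, z2, z3])

Z = ilr(comp)
adiposity = 20 - 2.0 * Z[:, 0] + 0.5 * Z[:, 1] + rng.normal(0, 1, n)   # toy outcome

design = np.column_stack([np.ones(n), Z])
coef, *_ = np.linalg.lstsq(design, adiposity, rcond=None)
print("intercept and ilr coefficients:", np.round(coef, 2))
```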

  10. A study of correlations between crude oil spot and futures markets: A rolling sample test

    NASA Astrophysics Data System (ADS)

    Liu, Li; Wan, Jieqiu

    2011-10-01

    In this article, we investigate the asymmetries of exceedance correlations and cross-correlations between West Texas Intermediate (WTI) spot and futures markets. First, employing the test statistic proposed by Hong et al. [Asymmetries in stock returns: statistical tests and economic evaluation, Review of Financial Studies 20 (2007) 1547-1581], we find that the exceedance correlations were overall symmetric. However, the results from rolling windows show that some occasional events could induce significant asymmetries in the exceedance correlations. Second, employing the test statistic proposed by Podobnik et al. [Quantifying cross-correlations using local and global detrending approaches, European Physical Journal B 71 (2009) 243-250], we find that the cross-correlations were significant even for large lagged orders. Using the detrended cross-correlation analysis proposed by Podobnik and Stanley [Detrended cross-correlation analysis: a new method for analyzing two nonstationary time series, Physical Review Letters 100 (2008) 084102], we find that the cross-correlations were weakly persistent and were stronger between the spot and futures contracts with longer maturity. Our results from the rolling sample test also show the apparent effects of the exogenous events. Additionally, we provide some relevant discussion of the obtained evidence.

  11. Worldwide seismicity in view of non-extensive statistical physics

    NASA Astrophysics Data System (ADS)

    Chochlaki, Kaliopi; Vallianatos, Filippos; Michas, George

    2014-05-01

    In the present work we study the distribution of worldwide shallow seismic events that occurred from 1981 to 2011, extracted from the CMT catalog, with magnitude equal to or greater than Mw 5.0. Our analysis is based on the subdivision of the Earth's surface into seismic zones that are homogeneous with regard to seismic activity and the orientation of the predominant stress field. To this end we use the Flinn-Engdahl regionalization (Flinn and Engdahl, 1965), which consists of 50 seismic zones, as modified by Lombardi and Marzocchi (2007), who grouped the 50 FE zones into larger, tectonically homogeneous ones using the cumulative moment tensor method. As a result, the initial 50 regions are reduced to 39, to which we apply the non-extensive statistical physics approach. Non-extensive statistical physics appears to be a most adequate and promising methodological tool for analyzing complex systems, such as the Earth's interior. In this frame, we introduce the q-exponential formulation as the expression of the probability distribution function that maximizes the Sq entropy as defined by Tsallis (1988). In the present work we analyze the interevent time distribution between successive earthquakes by a q-exponential function in each of the seismic zones defined by Lombardi and Marzocchi (2007), confirming the importance of long-range interactions and the existence of a power-law approximation in the distribution of the interevent times. Our findings support the idea of universality within the Tsallis approach to describe the Earth's seismicity and present strong evidence of temporal clustering of seismic activity in each of the tectonic zones analyzed. Our analysis as applied to worldwide seismicity with magnitude equal to or greater than Mw 5.5 and Mw 6.0 is also presented, and the dependence of our results on the cut-off magnitude is discussed. This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project of the "Education & Lifelong Learning" Operational Programme.
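
    A minimal sketch of the kind of q-exponential fit described above is given below: it fits a Tsallis q-exponential survival function to an empirical interevent-time distribution using scipy's curve_fit. The data are synthetic and the parameterization shown is one common convention, not necessarily the exact form used by the authors.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def q_exp_survival(t, q, tau):
        """Tsallis q-exponential survival function P(T > t) = exp_q(-t/tau), q > 1."""
        return (1.0 + (q - 1.0) * t / tau) ** (-1.0 / (q - 1.0))

    # Hypothetical interevent times (seconds); a real analysis would use the
    # catalog interevent times within one tectonic zone.
    rng = np.random.default_rng(2)
    times = rng.pareto(2.5, size=2000) * 3600.0

    t_sorted = np.sort(times)
    surv = 1.0 - np.arange(1, len(t_sorted) + 1) / len(t_sorted)   # empirical CCDF

    # Fit q and tau to the survival function (drop the last point, where P = 0)
    popt, _ = curve_fit(q_exp_survival, t_sorted[:-1], surv[:-1],
                        p0=(1.3, np.median(times)),
                        bounds=([1.01, 1e-6], [3.0, np.inf]))
    print(f"fitted q = {popt[0]:.2f}, tau = {popt[1]:.1f} s")
    ```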

  12. PEOPLE IN PHYSICS: Nobel prize winners in physics from 1901 to 1990: simple statistics for physics teachers

    NASA Astrophysics Data System (ADS)

    Zhang, Weijia; Fuller, Robert G.

    1998-05-01

    A demographic database for the 139 Nobel prize winners in physics from 1901 to 1990 has been created from a variety of sources. The results of our statistical study are discussed in the light of the implications for physics teaching.

  13. Improving estimates of the number of `fake' leptons and other mis-reconstructed objects in hadron collider events: BoB's your UNCLE

    NASA Astrophysics Data System (ADS)

    Gillam, Thomas P. S.; Lester, Christopher G.

    2014-11-01

    We consider current and alternative approaches to setting limits on new physics signals having backgrounds from misidentified objects; for example jets misidentified as leptons, b-jets or photons. Many ATLAS and CMS analyses have used a heuristic "matrix method" for estimating the background contribution from such sources. We demonstrate that the matrix method suffers from statistical shortcomings that can adversely affect its ability to set robust limits. A rigorous alternative method is discussed, and is seen to produce fake rate estimates and limits with better qualities, but is found to be too costly to use. Having investigated the nature of the approximations used to derive the matrix method, we propose a third strategy that is seen to marry the speed of the matrix method to the performance and physicality of the more rigorous approach.
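
    For context, the sketch below shows the classic single-lepton "matrix method" that the paper critiques: the loose and tight yields are related to the unknown real and fake counts through the real and fake efficiencies, and the resulting 2x2 linear system is inverted. The yields and efficiencies are purely illustrative, and this is not the improved estimator proposed in the paper.

    ```python
    import numpy as np

    def fake_estimate(n_tight, n_loose_not_tight, eff_real, eff_fake):
        """Classic 2x2 matrix method for one lepton candidate category.

        N_tight        = eff_real * N_real + eff_fake * N_fake
        N_loose!tight  = (1 - eff_real) * N_real + (1 - eff_fake) * N_fake
        Returns the estimated number of fake leptons passing the tight selection.
        """
        A = np.array([[eff_real, eff_fake],
                      [1.0 - eff_real, 1.0 - eff_fake]])
        n_real, n_fake = np.linalg.solve(A, [n_tight, n_loose_not_tight])
        return eff_fake * n_fake

    # Hypothetical yields and efficiencies
    print(fake_estimate(n_tight=120.0, n_loose_not_tight=300.0,
                        eff_real=0.90, eff_fake=0.20))
    ```

    The statistical shortcomings discussed in the paper arise, in part, because this inversion can return negative or unphysical yields when the observed counts fluctuate.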

  14. Physics Based Modeling and Rendering of Vegetation in the Thermal Infrared

    NASA Technical Reports Server (NTRS)

    Smith, J. A.; Ballard, J. R., Jr.

    1999-01-01

    We outline a procedure for rendering physically based thermal infrared images of simple vegetation scenes. Our approach incorporates the biophysical processes that affect the temperature distribution of the elements within a scene. Computer graphics plays a key role in two respects: first, in computing the distribution of shaded and sunlit facets in the scene and, second, in the final image rendering once the temperatures of all the elements in the scene have been computed. We illustrate our approach for a simple corn scene where the three-dimensional geometry is constructed from measured morphological attributes of the row crop. Statistical methods are used to construct a representation of the scene in agreement with the measured characteristics. The rendered images exhibit realistic directional behavior as a function of view and sun angle, and the root-mean-square error in measured versus predicted brightness temperatures for the scene was 2.1 deg C.

  15. Lindemann histograms as a new method to analyse nano-patterns and phases

    NASA Astrophysics Data System (ADS)

    Makey, Ghaith; Ilday, Serim; Tokel, Onur; Ibrahim, Muhamet; Yavuz, Ozgun; Pavlov, Ihor; Gulseren, Oguz; Ilday, Omer

    The detection, observation, and analysis of material phases and atomistic patterns are of great importance for understanding systems exhibiting both equilibrium and far-from-equilibrium dynamics. As such, there is intense research on phase transitions and pattern dynamics in soft matter, statistical and nonlinear physics, and polymer physics. In order to identify phases and nano-patterns, the pair correlation function is commonly used. However, this approach is limited in terms of recognizing competing patterns in dynamic systems, and lacks visualisation capabilities. To overcome these limitations, we introduce Lindemann histogram quantification as an alternative method to analyse solid, liquid, and gas phases, along with hexagonal, square, and amorphous nano-pattern symmetries. We show that the proposed approach, based on a Lindemann parameter calculated per particle, maps local number densities to a material phase or particle pattern. We apply the Lindemann histogram method to dynamical colloidal self-assembly experimental data and identify competing patterns.
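
    As a rough illustration of the per-particle idea, the sketch below computes a Lindemann-type parameter for each particle from a trajectory array (RMS displacement about the time-averaged position, normalized by an estimate of the mean inter-particle spacing) and histograms it; the precise definition and normalization used by the authors may differ.

    ```python
    import numpy as np

    def lindemann_per_particle(traj):
        """Per-particle Lindemann-type parameter from a trajectory.

        traj : array (n_frames, n_particles, dim) of positions.
        Returns an array of length n_particles: RMS fluctuation of each particle
        about its time-averaged position, normalized by the mean nearest-neighbour
        spacing of the time-averaged configuration.
        """
        mean_pos = traj.mean(axis=0)                         # (n_particles, dim)
        disp2 = ((traj - mean_pos) ** 2).sum(axis=2)         # (n_frames, n_particles)
        rms = np.sqrt(disp2.mean(axis=0))
        d = np.linalg.norm(mean_pos[:, None, :] - mean_pos[None, :, :], axis=2)
        np.fill_diagonal(d, np.inf)
        spacing = d.min(axis=1).mean()                       # crude spacing estimate
        return rms / spacing

    # Toy data: half the particles jitter weakly (solid-like), half strongly
    rng = np.random.default_rng(3)
    lattice = np.stack(np.meshgrid(np.arange(10.0), np.arange(10.0)), -1).reshape(-1, 2)
    jitter = np.where(np.arange(100)[None, :, None] < 50, 0.05, 0.4)
    traj = lattice[None] + jitter * rng.normal(size=(200, 100, 2))

    delta = lindemann_per_particle(traj)
    hist, edges = np.histogram(delta, bins=20)               # the "Lindemann histogram"
    print(delta[:5].round(3), delta[-5:].round(3))
    ```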

  16. Bayesian component separation: The Planck experience

    NASA Astrophysics Data System (ADS)

    Wehus, Ingunn Kathrine; Eriksen, Hans Kristian

    2018-05-01

    Bayesian component separation techniques have played a central role in the data reduction process of Planck. The most important strength of this approach is its global nature, in which a parametric and physical model is fitted to the data. Such physical modeling allows the user to constrain very general data models, and jointly probe cosmological, astrophysical and instrumental parameters. This approach also supports statistically robust goodness-of-fit tests in terms of data-minus-model residual maps, which are essential for identifying residual systematic effects in the data. The main challenges are high code complexity and computational cost. Whether or not these costs are justified for a given experiment depends on its final uncertainty budget. We therefore predict that the importance of Bayesian component separation techniques is likely to increase with time for intensity mapping experiments, similar to what has happened in the CMB field, as observational techniques mature, and their overall sensitivity improves.

  17. Modelling of physical influences in sea level records for vertical crustal movement detection

    NASA Technical Reports Server (NTRS)

    Anderson, E. G.

    1978-01-01

    Attempts to specify and evaluate such physical influences are reviewed with the intention of identifying problem areas and promising approaches. An example of linear modelling based on air/water temperatures, atmospheric pressure, river discharges, geostrophic and/or local wind velocities, and including forced period terms to allow for the long period tides and Chandlerian polar motion is evaluated and applied to monthly mean sea levels recorded in Atlantic Canada. Refinement of the model to admit phase lag in the response to some of the driving phenomena is demonstrated. Spectral analysis of the residuals is employed to assess the model performance. The results and associated statistical parameters are discussed with emphasis on elucidating the sensitivity of the technique for detection of local episodic and secular vertical crustal movements, the problem areas most critical to the type of approach, and possible further developments.

  18. Approach for estimating the dynamic physical thresholds of phytoplankton production and biomass in the tropical-subtropical Pacific Ocean

    NASA Astrophysics Data System (ADS)

    Gómez-Ocampo, E.; Gaxiola-Castro, G.; Durazo, Reginaldo

    2017-06-01

    A threshold is defined as the point where small changes in an environmental driver produce large responses in the ecosystem. Generalized additive models (GAMs) were used to estimate the thresholds and contribution of key dynamic physical variables in terms of phytoplankton production and variations in biomass in the tropical-subtropical Pacific Ocean off Mexico. The statistical approach used here showed that thresholds were shallower for primary production than for phytoplankton biomass (pycnocline < 68 m and mixed layer < 30 m versus pycnocline < 45 m and mixed layer < 80 m) but were similar for absolute dynamic topography and Ekman pumping (ADT < 59 cm and EkP > 0 cm d-1 versus ADT < 60 cm and EkP > 4 cm d-1). The relatively high productivity on seasonal (spring) and interannual (La Niña 2008) scales was linked to low ADT (45-60 cm) and shallow pycnocline depth (9-68 m) and mixed layer (8-40 m). Statistical estimations from satellite data indicated that the contributions of ocean circulation to phytoplankton variability were 18% (for phytoplankton biomass) and 46% (for phytoplankton production). Although the statistical contribution of models constructed with in situ integrated chlorophyll a and primary production data was lower than the one obtained with satellite data (11%), the fits were better for the former, based on the residual distribution. The results reported here suggest that estimated thresholds may reliably explain the spatial-temporal variations of phytoplankton in the tropical-subtropical Pacific Ocean off the coast of Mexico.

  19. Braiding by Majorana tracking and long-range CNOT gates with color codes

    NASA Astrophysics Data System (ADS)

    Litinski, Daniel; von Oppen, Felix

    2017-11-01

    Color-code quantum computation seamlessly combines Majorana-based hardware with topological error correction. Specifically, as Clifford gates are transversal in two-dimensional color codes, they enable the use of the Majoranas' non-Abelian statistics for gate operations at the code level. Here, we discuss the implementation of color codes in arrays of Majorana nanowires that avoid branched networks such as T junctions, thereby simplifying their realization. We show that, in such implementations, non-Abelian statistics can be exploited without ever performing physical braiding operations. Physical braiding operations are replaced by Majorana tracking, an entirely software-based protocol which appropriately updates the Majoranas involved in the color-code stabilizer measurements. This approach minimizes the required hardware operations for single-qubit Clifford gates. For Clifford completeness, we combine color codes with surface codes, and use color-to-surface-code lattice surgery for long-range multitarget CNOT gates which have a time overhead that grows only logarithmically with the physical distance separating control and target qubits. With the addition of magic state distillation, our architecture describes a fault-tolerant universal quantum computer in systems such as networks of tetrons, hexons, or Majorana box qubits, but can also be applied to nontopological qubit platforms.

  20. Statistical Physics Approaches to Microbial Ecology

    NASA Astrophysics Data System (ADS)

    Mehta, Pankaj

    The unprecedented ability to quantitatively measure and probe complex microbial communities has renewed interest in identifying the fundamental ecological principles governing community ecology in microbial ecosystems. Here, we present work from our group and others showing how ideas from statistical physics can help us uncover these ecological principles. Two major lessons emerge from this work. First, large ecosystems with many species often display new, emergent ecological behaviors that are absent in small ecosystems with just a few species. To paraphrase Nobel laureate Phil Anderson, "More is Different", especially in community ecology. Second, the lack of trophic layer separation in microbial ecology fundamentally distinguishes microbial ecology from classical paradigms of community ecology and leads to qualitatively different rules for community assembly in microbes. I illustrate these ideas using both theoretical modeling and new experiments on large microbial ecosystems performed by our collaborators (Joshua Goldford and Alvaro Sanchez). Work supported by a Simons Investigator award in MMLS and NIH R35 GM119461.

  1. Sociocultural Influences on Weight-Related Behaviors in African American Adolescents.

    PubMed

    Tate, Nutrena H; Davis, Jean E; Yarandi, Hossein N

    2015-12-01

    The purpose of this study was to examine the sociocultural factors related to weight behaviors in African American adolescents utilizing a social ecological approach. A descriptive correlational design included a sample of 145 African American adolescents. Perceived familial socialization, ethnic identity, physical activity, and eating behavior patterns were measured. Data were analyzed using descriptive statistics, Pearson product-moment correlations, and multiple regression equations. Perceived maternal socialization was significantly related to adolescent eating behaviors and physical activity whereas perceived paternal socialization was significantly related only to their physical activity. The adolescents' ethnic identity was not significantly related to their eating behaviors or physical activity. Health care providers who work with adolescents and their families can use the initial findings from this study to encourage healthy weight-related behaviors while reducing the obesity epidemic within the African American adolescent population in a developmentally appropriate and culturally sensitive manner. © The Author(s) 2014.

  2. ANEMOS: Development of a next generation wind power forecasting system for the large-scale integration of onshore and offshore wind farms.

    NASA Astrophysics Data System (ADS)

    Kariniotakis, G.; Anemos Team

    2003-04-01

    Objectives: Accurate forecasting of wind energy production up to two days ahead is recognized as a major contribution to reliable large-scale wind power integration. Especially in a liberalized electricity market, prediction tools enhance the position of wind energy compared to other forms of dispatchable generation. ANEMOS is a new 3.5-year R&D project supported by the European Commission that brings together research organizations and end-users with substantial experience in the domain. The project aims to develop advanced forecasting models that will substantially outperform current methods. Emphasis is given to situations such as complex terrain and extreme weather conditions, as well as to offshore prediction, for which no specific tools currently exist. The prediction models will be implemented in a software platform and installed for online operation at onshore and offshore wind farms by the end-users participating in the project. Approach: The paper presents the methodology of the project. Initially, the prediction requirements are identified according to the profiles of the end-users. The project develops prediction models based on both a physical and an alternative statistical approach. Research on physical models gives emphasis to techniques for use in complex terrain and the development of prediction tools based on CFD techniques, advanced model output statistics or high-resolution meteorological information. Statistical models (i.e. based on artificial intelligence) are developed for downscaling, power curve representation, upscaling for prediction at regional or national level, etc. A benchmarking process is set up to evaluate the performance of the developed models and to compare them with existing ones using a number of case studies. The synergy between statistical and physical approaches is examined to identify promising areas for further improvement of forecasting accuracy. Appropriate physical and statistical prediction models are also developed for offshore wind farms, taking into account advances in marine meteorology (interaction between wind and waves, coastal effects). The benefits from the use of satellite radar images for modeling local weather patterns are investigated. A next-generation forecasting software platform, ANEMOS, will be developed to integrate the various models. The tool is enhanced by advanced Information Communication Technology (ICT) functionality and can operate in stand-alone or remote mode, or be interfaced with standard Energy or Distribution Management Systems (EMS/DMS). Contribution: The project provides an advanced technology for wind resource forecasting applicable on a large scale: at a single wind farm, regional or national level, and for both interconnected and island systems. A major milestone is the on-line operation of the developed software by the participating utilities for onshore and offshore wind farms and the demonstration of the economic benefits. The outcome of the ANEMOS project will support the increase of wind integration at two levels: at an operational level, through better management of wind farms, and by contributing to an increase in the installed capacity of wind farms. This is because accurate prediction of the resource reduces the risk for wind farm developers, who are then more willing to undertake new wind farm installations, especially in a liberalized electricity market environment.

  3. NASA Langley's Approach to the Sandia's Structural Dynamics Challenge Problem

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Kenny, Sean P.; Crespo, Luis G.; Elliott, Kenny B.

    2007-01-01

    The objective of this challenge is to develop a data-based probabilistic model of uncertainty to predict the behavior of subsystems (payloads) by themselves and while coupled to a primary (target) system. Although this type of analysis is routinely performed and representative of issues faced in real-world system design and integration, there are still several key technical challenges that must be addressed when analyzing uncertain interconnected systems. For example, one key technical challenge is related to the fact that there is limited data on target configurations. Moreover, it is typical to have multiple data sets from experiments conducted at the subsystem level, but often sample sizes are not sufficient to compute high-confidence statistics. In this challenge problem additional constraints are placed as ground rules for the participants. One such rule is that mathematical models of the subsystem are limited to linear approximations of the nonlinear physics of the problem at hand. Also, participants are constrained to use these models and the multiple data sets to make predictions about the target system response under completely different input conditions. Our approach initially involved screening several different methods. Three of the ones considered are presented herein. The first one is based on the transformation of the modal data to an orthogonal space where the mean and covariance of the data are matched by the model. The other two approaches work in physical space, where the uncertain parameter set is made of masses, stiffnesses and damping coefficients; one matches confidence intervals of low-order moments of the statistics via optimization while the second one uses a kernel density estimation approach. The paper will touch on all the approaches, lessons learned, validation metrics and their comparison, data quantity restrictions, and assumptions/limitations of each approach. Keywords: Probabilistic modeling, model validation, uncertainty quantification, kernel density
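
    As an illustration of the kernel density estimation ingredient mentioned in the third approach, the sketch below fits a nonparametric density to a hypothetical set of identified stiffness values and resamples from it for forward propagation; it uses scipy's gaussian_kde and is not the NASA Langley implementation.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    # Hypothetical subsystem test data: stiffness values (N/m) identified from
    # repeated modal tests of nominally identical payloads.
    rng = np.random.default_rng(4)
    stiffness_samples = rng.normal(loc=1.2e5, scale=0.08e5, size=30)

    # Nonparametric (kernel) density estimate of the uncertain parameter
    kde = gaussian_kde(stiffness_samples)

    # Draw new parameter realizations for propagation through the (linear)
    # structural model, and evaluate the estimated density on a grid.
    new_draws = kde.resample(size=1000)
    grid = np.linspace(0.9e5, 1.5e5, 200)
    density = kde(grid)
    print(new_draws.shape, density.max())
    ```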

  4. Advanced statistical energy analysis

    NASA Astrophysics Data System (ADS)

    Heron, K. H.

    1994-09-01

    A high-frequency theory (advanced statistical energy analysis (ASEA)) is developed which takes account of the mechanism of tunnelling and uses a ray theory approach to track the power flowing around a plate or a beam network and then uses statistical energy analysis (SEA) to take care of any residual power. ASEA divides the energy of each sub-system into energy that is freely available for transfer to other sub-systems and energy that is fixed within the sub-system. The theory applies to sub-systems that are physically separate and can be interpreted as a series of mathematical models, the first of which is identical to standard SEA, while subsequent higher-order models converge on an accurate prediction. Using a structural assembly of six rods as an example, ASEA is shown to converge onto the exact results while SEA is shown to overpredict by up to 60 dB.

  5. A Statistical Approach to Illustrate the Challenge of Astrobiology for Public Outreach.

    PubMed

    Foucher, Frédéric; Hickman-Lewis, Keyron; Westall, Frances; Brack, André

    2017-10-26

    In this study, we attempt to illustrate the competition that constitutes the main challenge of astrobiology, namely the competition between the probability of extraterrestrial life and its detectability. To illustrate this fact, we propose a simple statistical approach based on our knowledge of the Universe and the Milky Way, the Solar System, and the evolution of life on Earth permitting us to obtain the order of magnitude of the distance between Earth and bodies inhabited by more or less evolved past or present life forms, and the consequences of this probability for the detection of associated biosignatures. We thus show that the probability of the existence of evolved extraterrestrial forms of life increases with distance from the Earth while, at the same time, the number of detectable biosignatures decreases due to technical and physical limitations. This approach allows us to easily explain to the general public why it is very improbable to detect a signal of extraterrestrial intelligence while it is justified to launch space probes dedicated to the search for microbial life in the Solar System.

  6. A Statistical Approach to Illustrate the Challenge of Astrobiology for Public Outreach

    PubMed Central

    Westall, Frances; Brack, André

    2017-01-01

    In this study, we attempt to illustrate the competition that constitutes the main challenge of astrobiology, namely the competition between the probability of extraterrestrial life and its detectability. To illustrate this fact, we propose a simple statistical approach based on our knowledge of the Universe and the Milky Way, the Solar System, and the evolution of life on Earth permitting us to obtain the order of magnitude of the distance between Earth and bodies inhabited by more or less evolved past or present life forms, and the consequences of this probability for the detection of associated biosignatures. We thus show that the probability of the existence of evolved extraterrestrial forms of life increases with distance from the Earth while, at the same time, the number of detectable biosignatures decreases due to technical and physical limitations. This approach allows us to easily explain to the general public why it is very improbable to detect a signal of extraterrestrial intelligence while it is justified to launch space probes dedicated to the search for microbial life in the Solar System. PMID:29072614

  7. A random matrix approach to language acquisition

    NASA Astrophysics Data System (ADS)

    Nicolaidis, A.; Kosmidis, Kosmas; Argyrakis, Panos

    2009-12-01

    Since language is tied to cognition, we expect the linguistic structures to reflect patterns that we encounter in nature and are analyzed by physics. Within this realm we investigate the process of lexicon acquisition, using analytical and tractable methods developed within physics. A lexicon is a mapping between sounds and referents of the perceived world. This mapping is represented by a matrix and the linguistic interaction among individuals is described by a random matrix model. There are two essential parameters in our approach. The strength of the linguistic interaction β, which is considered as a genetically determined ability, and the number N of sounds employed (the lexicon size). Our model of linguistic interaction is analytically studied using methods of statistical physics and simulated by Monte Carlo techniques. The analysis reveals an intricate relationship between the innate propensity for language acquisition β and the lexicon size N, N~exp(β). Thus a small increase of the genetically determined β may lead to an incredible lexical explosion. Our approximate scheme offers an explanation for the biological affinity of different species and their simultaneous linguistic disparity.

  8. Estimation of fine-scale recombination intensity variation in the white-echinus interval of D. melanogaster

    PubMed Central

    Singh, Nadia D.; Aquadro, Charles F.; Clark, Andrew G.

    2009-01-01

    Accurate assessment of local recombination rate variation is crucial for understanding the recombination process and for determining the impact of natural selection on linked sites. In Drosophila, local recombination intensity has been estimated primarily by statistical approaches, estimating the local slope of the relationship between the physical and genetic maps. However, these estimates are limited in resolution, and as a result, the physical scale at which recombination intensity varies in Drosophila is largely unknown. While there is some evidence suggesting as much as a 40-fold variation in crossover rate at a local scale in D. pseudoobscura, little is known about the fine-scale structure of recombination rate variation in D. melanogaster. Here, we experimentally examine the fine-scale distribution of crossover events in a 1.2 Mb region on the D. melanogaster X chromosome using a classic genetic mapping approach. Our results show that crossover frequency is significantly heterogeneous within this region, varying ~ 3.5 fold. Simulations suggest that this degree of heterogeneity is sufficient to affect levels of standing nucleotide diversity, although the magnitude of this effect is small. We recover no statistical association between empirical estimates of nucleotide diversity and recombination intensity, which is likely due to the limited number of loci sampled in our population genetic dataset. However, codon bias is significantly negatively correlated with fine-scale recombination intensity estimates, as expected. Our results shed light on the relevant physical scale to consider in evolutionary analyses relating to recombination rate, and highlight the motivations to increase the resolution of the recombination map in Drosophila. PMID:19504037

  9. Evaluation of Colorado Learning Attitudes about Science Survey

    NASA Astrophysics Data System (ADS)

    Douglas, K. A.; Yale, M. S.; Bennett, D. E.; Haugan, M. P.; Bryan, L. A.

    2014-12-01

    The Colorado Learning Attitudes about Science Survey (CLASS) is a widely used instrument designed to measure student attitudes toward physics and learning physics. Previous research revealed a fairly complex factor structure. In this study, exploratory and confirmatory factor analyses were conducted on data from an undergraduate introductory physics course (n = 3844) to determine whether a more parsimonious factor structure exists. Exploratory factor analysis results indicate that many of the items from the original CLASS have poor psychometric properties and could not be used in a revised factor structure. The cross validation showed acceptable fit statistics for a three factor model found in the exploratory factor analysis. This research suggests that a more optimum measurement of students' attitudes about physics and learning physics is obtained with a 15-item instrument, which describes the factors of personal application, personal effort, and problem solving. The proposed revised version of the CLASS offers researchers the opportunity to test a shortened version of the instrument that may be able to provide information about students' attitudes in the areas of personal application of physics, personal effort in a physics course, and approaches to problem solving.

  10. General covariance, topological quantum field theories and fractional statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamboa, J.

    1992-01-20

    Topological quantum field theories and fractional statistics are both defined in multiply connected manifolds. The authors study the relationship between both theories in 2 + 1 dimensions and the authors show that, due to the multiply-connected character of the manifold, the propagator for any quantum (field) theory always contains a first order pole that can be identified with a physical excitation with fractional spin. The article starts by reviewing the definition of general covariance in the Hamiltonian formalism, the gauge-fixing problem and the quantization following the lines of Batalin, Fradkin and Vilkovisky. The BRST-BFV quantization is reviewed in order to understand the topological approach proposed here.

  11. Deterministic annealing for density estimation by multivariate normal mixtures

    NASA Astrophysics Data System (ADS)

    Kloppenburg, Martin; Tavan, Paul

    1997-03-01

    An approach to maximum-likelihood density estimation by mixtures of multivariate normal distributions for large high-dimensional data sets is presented. Conventionally that problem is tackled by notoriously unstable expectation-maximization (EM) algorithms. We remove these instabilities by the introduction of soft constraints, enabling deterministic annealing. Our developments are motivated by the proof that algorithmically stable fuzzy clustering methods that are derived from statistical physics analogs are special cases of EM procedures.
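
    A minimal sketch of deterministic-annealing EM for a one-dimensional Gaussian mixture is given below: the E-step posteriors are tempered with an inverse temperature beta < 1, which smooths the objective and suppresses some instabilities of plain EM, and beta is then annealed toward 1. The annealing schedule and the soft-constraint formulation of the paper are not reproduced here.

    ```python
    import numpy as np

    def da_em_gmm(x, k=2, betas=(0.2, 0.4, 0.6, 0.8, 1.0), iters=50):
        """Deterministic-annealing EM for a 1-D Gaussian mixture (sketch)."""
        rng = np.random.default_rng(0)
        mu = rng.choice(x, size=k)
        var = np.full(k, np.var(x))
        w = np.full(k, 1.0 / k)
        for beta in betas:                      # anneal inverse temperature to 1
            for _ in range(iters):
                # E-step with tempered (softened) posteriors
                logp = (np.log(w) - 0.5 * np.log(2 * np.pi * var)
                        - 0.5 * (x[:, None] - mu) ** 2 / var)
                logp = beta * logp
                logp -= logp.max(axis=1, keepdims=True)
                r = np.exp(logp)
                r /= r.sum(axis=1, keepdims=True)
                # M-step: standard weighted updates
                nk = r.sum(axis=0) + 1e-12
                mu = (r * x[:, None]).sum(axis=0) / nk
                var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
                w = nk / len(x)
        return w, mu, var

    x = np.concatenate([np.random.normal(-2, 1, 500), np.random.normal(3, 0.5, 500)])
    print(da_em_gmm(x))
    ```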

  12. Proactive and Progressive Approaches in Managing Obesity.

    PubMed

    Eckel, Robert H; Bays, Harold E; Klein, Samuel; Bade Horn, Deborah

    2016-10-01

    Despite the advice clinicians have been giving patients about the importance of restricting their food intake and increasing physical activity levels, the Centers for Disease Control and Prevention (CDC) states that 78.6 million adults in the United States (US) are still obese. With these statistics in mind, this symposium provided insights on the genetic, cultural, and environmental underpinning of obesity and discussed the latest research on pharmacotherapy, surgery, and the need to individualize treatment.

  13. Continuum-Kinetic Models and Numerical Methods for Multiphase Applications

    NASA Astrophysics Data System (ADS)

    Nault, Isaac Michael

    This thesis presents a continuum-kinetic approach for modeling general problems in multiphase solid mechanics. In this context, a continuum model refers to any model, typically on the macro-scale, in which continuous state variables are used to capture the most important physics: conservation of mass, momentum, and energy. A kinetic model refers to any model, typically on the meso-scale, which captures the statistical motion and evolution of microscopic entities. Multiphase phenomena usually involve non-negligible micro- or meso-scopic effects at the interfaces between phases. The approach developed in the thesis attempts to combine the computational performance benefits of a continuum model with the physical accuracy of a kinetic model when applied to a multiphase problem. The approach is applied to modeling a single particle impact in Cold Spray, an engineering process that intimately involves the interaction of crystal grains with high-magnitude elastic waves. Such a situation could be classified as a multiphase application due to the discrete nature of grains on the spatial scale of the problem. For this application, a hyper-elastoplastic model is solved by a finite volume method with an approximate Riemann solver. The results of this model are compared for two types of plastic closure: a phenomenological macro-scale constitutive law, and a physics-based meso-scale Crystal Plasticity model.

  14. Is math anxiety in the secondary classroom limiting physics mastery? A study of math anxiety and physics performance

    NASA Astrophysics Data System (ADS)

    Mercer, Gary J.

    This quantitative study examined the relationship between secondary students with math anxiety and physics performance in an inquiry-based constructivist classroom. The Revised Math Anxiety Rating Scale was used to evaluate math anxiety levels. The results were then compared to the performance on a physics standardized final examination. A simple correlation was performed, followed by a multivariate regression analysis to examine effects based on gender and prior math background. The correlation showed statistical significance between math anxiety and physics performance. The regression analysis showed statistical significance for math anxiety, physics performance, and prior math background, but did not show statistical significance for math anxiety, physics performance, and gender.

  15. Calibration of raw accelerometer data to measure physical activity: A systematic review.

    PubMed

    de Almeida Mendes, Márcio; da Silva, Inácio C M; Ramires, Virgílio V; Reichert, Felipe F; Martins, Rafaela C; Tomasi, Elaine

    2018-03-01

    Most calibration studies based on accelerometry have used count-based analyses. In contrast, calibration studies based on raw acceleration signals are relatively recent and the supporting evidence is still incipient. The aim of the current study was to systematically review the literature in order to summarize methodological characteristics and results from raw data calibration studies. The review was conducted up to May 2017 using four databases: PubMed, Scopus, SPORTDiscus and Web of Science. Methodological quality of the included studies was evaluated using the Landis and Koch guidelines. Initially, 1669 titles were identified and, after assessing titles, abstracts and full articles, 20 studies were included. All studies were conducted in high-income countries, most of them with relatively small samples and specific population groups. Physical activity protocols differed among studies, and indirect calorimetry was the most commonly used criterion measure. High mean values of sensitivity, specificity and accuracy from the intensity thresholds of cut-point-based studies were observed (93.7%, 91.9% and 95.8%, respectively). The most frequent statistical approach applied was machine learning-based modelling, for which the mean coefficient of determination for predicting physical activity energy expenditure was 0.70. Regarding the recognition of physical activity types, the mean values of accuracy for sedentary, household and locomotive activities were 82.9%, 55.4% and 89.7%, respectively. In conclusion, considering the construct of physical activity that each approach assesses, linear regression, machine-learning and cut-point-based approaches presented promising validity parameters. Copyright © 2017 Elsevier B.V. All rights reserved.
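
    As a concrete, simplified example of a cut-point approach applied to raw acceleration, the sketch below averages the ENMO metric over epochs and classifies intensity with illustrative thresholds; the cut-points, epoch length, and sampling rate are placeholders, not the validated values reported in the reviewed studies.

    ```python
    import numpy as np

    def classify_enmo(acc_xyz, fs=50, epoch_s=5, cuts_mg=(50.0, 100.0, 400.0)):
        """Classify activity intensity per epoch from raw tri-axial acceleration.

        acc_xyz : array (n_samples, 3) in g units
        fs      : sampling frequency (Hz); epoch_s: epoch length (s)
        Metric  : ENMO = max(vector magnitude - 1 g, 0), averaged per epoch.
        cuts_mg : illustrative cut-points in mg (0=sedentary ... 3=vigorous),
                  NOT the validated thresholds from any specific study.
        """
        vm = np.linalg.norm(acc_xyz, axis=1)
        enmo = np.clip(vm - 1.0, 0.0, None)
        n_per_epoch = fs * epoch_s
        n_epochs = len(enmo) // n_per_epoch
        epoch_enmo = enmo[:n_epochs * n_per_epoch].reshape(n_epochs, -1).mean(axis=1)
        labels = np.digitize(epoch_enmo * 1000.0, cuts_mg)
        return epoch_enmo, labels

    # Toy signal: gravity plus sensor noise, then a burst of movement
    acc = np.tile([0.0, 0.0, 1.0], (3000, 1)) + np.random.normal(0, 0.01, (3000, 1))
    acc[1500:] += np.random.normal(0, 0.2, (1500, 3))
    print(classify_enmo(acc)[1])
    ```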

  16. A Comparison of Physical and Technical Performance Profiles Between Successful and Less-Successful Professional Rugby League Teams.

    PubMed

    Kempton, Thomas; Sirotic, Anita C; Coutts, Aaron J

    2017-04-01

    To examine differences in physical and technical performance profiles using a large sample of match observations drawn from successful and less-successful professional rugby league teams. Match activity profiles were collected using global positioning satellite (GPS) technology from 29 players from a successful rugby league team during 24 games and 25 players from a less-successful team during 18 games throughout 2 separate competition seasons. Technical performance data were obtained from a commercial statistics provider. A progressive magnitude-based statistical approach was used to compare differences in physical and technical performance variables between the reference teams. There were no clear differences in playing time, absolute and relative total distances, or low-speed running distances between successful and less-successful teams. The successful team possibly to very likely had lower higher-speed running demands and likely had fewer physical collisions than the less-successful team, although they likely to most likely demonstrated more accelerations and decelerations and likely had higher average metabolic power. The successful team very likely gained more territory in attack, very likely had more possessions, and likely committed fewer errors. In contrast, the less-successful team was likely required to attempt more tackles, most likely missed more tackles, and very likely had a lower effective tackle percentage. In the current study, successful match performance was not contingent on higher match running outputs or more physical collisions; rather, proficiency in technical performance components better differentiated successful and less-successful teams.

  17. Statistical Physics on the Eve of the 21st Century: in Honour of J B McGuire on the Occasion of His 65th Birthday

    NASA Astrophysics Data System (ADS)

    Batchelor, Murray T.; Wille, Luc T.

    The Table of Contents for the book is as follows: * Preface * Modelling the Immune System - An Example of the Simulation of Complex Biological Systems * Brief Overview of Quantum Computation * Quantal Information in Statistical Physics * Modeling Economic Randomness: Statistical Mechanics of Market Phenomena * Essentially Singular Solutions of Feigenbaum-Type Functional Equations * Spatiotemporal Chaotic Dynamics in Coupled Map Lattices * Approach to Equilibrium of Chaotic Systems * From Level to Level in Brain and Behavior * Linear and Entropic Transformations of the Hydrophobic Free Energy Sequence Help Characterize a Novel Brain Polyprotein: CART's Protein * Dynamical Systems Response to Pulsed High-Frequency Fields * Bose-Einstein Condensates in the Light of Nonlinear Physics * Markov Superposition Expansion for the Entropy and Correlation Functions in Two and Three Dimensions * Calculation of Wave Center Deflection and Multifractal Analysis of Directed Waves Through the Study of su(1,1) Ferromagnets * Spectral Properties and Phases in Hierarchical Master Equations * Universality of the Distribution Functions of Random Matrix Theory * The Universal Chiral Partition Function for Exclusion Statistics * Continuous Space-Time Symmetries in a Lattice Field Theory * Quelques Cas Limites du Problème à N Corps Unidimensionnel * Integrable Models of Correlated Electrons * On the Riemann Surface of the Three-State Chiral Potts Model * Two Exactly Soluble Lattice Models in Three Dimensions * Competition of Ferromagnetic and Antiferromagnetic Order in the Spin-1/2 XXZ Chain at Finite Temperature * Extended Vertex Operator Algebras and Monomial Bases * Parity and Charge Conjugation Symmetries and S Matrix of the XXZ Chain * An Exactly Solvable Constrained XXZ Chain * Integrable Mixed Vertex Models From the Braid-Monoid Algebra * From Yang-Baxter Equations to Dynamical Zeta Functions for Birational Transformations * Hexagonal Lattice Directed Site Animals * Direction in the Star-Triangle Relations * A Self-Avoiding Walk Through Exactly Solved Lattice Models in Statistical Mechanics

  18. The effect of restructuring student writing in the general chemistry laboratory on student understanding of chemistry and on students' approach to the laboratory course

    NASA Astrophysics Data System (ADS)

    Rudd, James Andrew, II

    Many students encounter difficulties engaging with laboratory-based instruction, and reviews of research have indicated that the value of such instruction is not clearly evident. Traditional forms of writing associated with laboratory activities are commonly in a style used by professional scientists to communicate developed explanations. Students probably lack the interpretative skills of a professional, and writing in this style may not support students in learning how to develop scientific explanations. The Science Writing Heuristic (SWH) is an inquiry-based approach to laboratory instruction designed in part to promote student ability in developing such explanations. However, there is not a convincing body of evidence for the superiority of inquiry-based laboratory instruction in chemistry. In a series of studies, the performance of students using the SWH student template in place of the standard laboratory report format was compared to the performance of students using the standard format. The standard reports had Title, Purpose, Procedure, Data & Observations, Calculations & Graphs, and Discussion sections. The SWH reports had Beginning Questions & Ideas, Tests & Procedures, Observations, Claims, Evidence, and Reflection sections. The pilot study produced evidence that using the SWH improved the quality of laboratory reports, improved student performance on a laboratory exam, and improved student approach to laboratory work. A main study found that SWH students statistically exhibited a better understanding of physical equilibrium when written explanations and equations were analyzed on a lecture exam and performed descriptively better on a physical equilibrium practical exam task. In another main study, the activities covering the general equilibrium concept were restructured as an additional change, and it was found that SWH students exhibited a better understanding of chemical equilibrium as shown by statistically greater success in overcoming the common confusion of interpreting equilibrium as equal concentrations and by statistically better performance when explaining aspects of chemical equilibrium. Both main studies found that students and instructors spent less time on the SWH reports and that students preferred the SWH approach because it increased their level of mental engagement. The studies supported the conclusion that inquiry-based laboratory instruction benefits student learning and attitudes.

  19. Rotation of EOFs by the Independent Component Analysis: Towards A Solution of the Mixing Problem in the Decomposition of Geophysical Time Series

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)

    2001-01-01

    The Independent Component Analysis is a recently developed technique for component extraction. This new method requires the statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has been used recently for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e. exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, the Independent Component Analysis performs a rotation of the classical PCA (or EOF) solution. This rotation uses no localization criterion like other Rotation Techniques (RT), only the global generalization of decorrelation by statistical independence is used. This rotation of the PCA solution seems to be able to solve the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.
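
    The following sketch illustrates the rotation interpretation on a toy linear mixture of two sources: PCA decorrelates the observations but generally mixes the underlying sources, while ICA (here scikit-learn's FastICA, not necessarily the algorithm used in the paper) seeks statistically independent components and effectively applies an additional rotation.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA, FastICA

    # Two toy "physical modes" that are linearly summed in the observations
    rng = np.random.default_rng(6)
    t = np.linspace(0, 50, 2000)
    s1 = np.sign(np.sin(2 * np.pi * 0.3 * t))             # square-wave-like mode
    s2 = np.sin(2 * np.pi * 0.05 * t)                      # slow oscillation
    S = np.column_stack([s1, s2])
    A = np.array([[1.0, 0.6], [0.4, 1.0]])                 # mixing matrix
    X = S @ A.T + 0.05 * rng.normal(size=(len(t), 2))

    pcs = PCA(n_components=2).fit_transform(X)             # decorrelated, but mixed
    ics = FastICA(n_components=2, random_state=0).fit_transform(X)

    def best_match(comp, src):
        """Largest |correlation| between a source and any extracted component."""
        return max(abs(np.corrcoef(comp[:, j], src)[0, 1]) for j in range(comp.shape[1]))

    for name, comp in [("PCA", pcs), ("ICA", ics)]:
        print(name, [round(best_match(comp, s), 2) for s in (s1, s2)])
    ```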

  20. History and physical examination findings predictive of testicular torsion: an attempt to promote clinical diagnosis by house staff.

    PubMed

    Srinivasan, Arun; Cinman, Nadya; Feber, Kevin M; Gitlin, Jordan; Palmer, Lane S

    2011-08-01

    To standardize the history and physical examination of boys who present with acute scrotum and identify parameters that best predict testicular torsion. Over a 5-month period, a standardized history and physical examination form with 22 items was used for all boys presenting with scrotal pain. Management decisions for radiological evaluation and surgical intervention were based on the results. Data were statistically analyzed in correlation with the eventual diagnosis. Of the 79 boys evaluated, 8 (10.1%) had testicular torsion. On univariate analysis, age, worsening pain, nausea/vomiting, severe pain at rest, absence of ipsilateral cremaster reflex, abnormal testicular position and scrotal skin changes were statistically predictive of torsion. After multivariate analysis and adjusting for confounding effect of other co-existing variables, absence of ipsilateral cremaster reflex (P < 0.001), nausea/vomiting (P < 0.05) and scrotal skin changes (P < 0.001) were the only consistent predictive factors of testicular torsion. An accurate history and physical examination of boys with acute scrotum should be primary in deciding upon further radiographic or surgical evaluation. While several forces have led to less consistent overnight resident staffing, consistent and reliable clinical evaluation of the acute scrotum using a standardized approach should reduce error, improve patient care and potentially reduce health care costs. Copyright © 2011 Journal of Pediatric Urology Company. Published by Elsevier Ltd. All rights reserved.

  1. Quantum Mechanics and the Principle of Least Radix Economy

    NASA Astrophysics Data System (ADS)

    Garcia-Morales, Vladimir

    2015-03-01

    A new variational method, the principle of least radix economy, is formulated. The mathematical and physical relevance of the radix economy, also called digit capacity, is established, showing how physical laws can be derived from this concept in a unified way. The principle reinterprets and generalizes the principle of least action yielding two classes of physical solutions: least action paths and quantum wavefunctions. A new physical foundation of the Hilbert space of quantum mechanics is then accomplished and it is used to derive the Schrödinger and Dirac equations and the breaking of the commutativity of spacetime geometry. The formulation provides an explanation of how determinism and random statistical behavior coexist in spacetime and a framework is developed that allows dynamical processes to be formulated in terms of chains of digits. These methods lead to a new (pre-geometrical) foundation for Lorentz transformations and special relativity. The Parker-Rhodes combinatorial hierarchy is encompassed within our approach and this leads to an estimate of the interaction strength of the electromagnetic and gravitational forces that agrees with the experimental values to an error of less than one thousandth. Finally, it is shown how the principle of least-radix economy naturally gives rise to Boltzmann's principle of classical statistical thermodynamics. A new expression for a general (path-dependent) nonequilibrium entropy is proposed satisfying the Second Law of Thermodynamics.

  2. Student Understanding of Taylor Series Expansions in Statistical Mechanics

    ERIC Educational Resources Information Center

    Smith, Trevor I.; Thompson, John R.; Mountcastle, Donald B.

    2013-01-01

    One goal of physics instruction is to have students learn to make physical meaning of specific mathematical expressions, concepts, and procedures in different physical settings. As part of research investigating student learning in statistical physics, we are developing curriculum materials that guide students through a derivation of the Boltzmann…

  3. Does size matter? Statistical limits of paleomagnetic field reconstruction from small rock specimens

    NASA Astrophysics Data System (ADS)

    Berndt, Thomas; Muxworthy, Adrian R.; Fabian, Karl

    2016-01-01

    As samples of ever-decreasing sizes are being studied paleomagnetically, care has to be taken that the underlying assumptions of statistical thermodynamics (Maxwell-Boltzmann statistics) are being met. Here we determine how many grains and how large a magnetic moment a sample needs to have to be able to accurately record an ambient field. It is found that for samples with a thermoremanent magnetic moment larger than 10^-11 Am^2 the assumption of a sufficiently large number of grains is usually satisfied. Standard 25 mm diameter paleomagnetic samples usually contain enough magnetic grains such that statistical errors are negligible, but "single silicate crystal" works on, for example, zircon, plagioclase, and olivine crystals are approaching the limits of what is physically possible, leading to statistical errors in both the angular deviation and paleointensity that are comparable to other sources of error. The reliability of nanopaleomagnetic imaging techniques capable of resolving individual grains (used, for example, to study the cloudy zone in meteorites), however, is questionable due to the limited area of the material covered.

  4. Statistical uncertainty of extreme wind storms over Europe derived from a probabilistic clustering technique

    NASA Astrophysics Data System (ADS)

    Walz, Michael; Leckebusch, Gregor C.

    2016-04-01

    Extratropical wind storms pose one of the most dangerous and loss-intensive natural hazards for Europe. However, due to only 50 years of high-quality observational data, it is difficult to assess the statistical uncertainty of these sparse events based on observations alone. Over the last decade seasonal ensemble forecasts have become indispensable in quantifying the uncertainty of weather prediction on seasonal timescales. In this study seasonal forecasts are used in a climatological context: by making use of the up to 51 ensemble members, a broad and physically consistent statistical base can be created. This base can then be used to assess the statistical uncertainty of extreme wind storm occurrence more accurately. In order to determine the statistical uncertainty of storms with different paths of progression, a probabilistic clustering approach using regression mixture models is used to objectively assign storm tracks (either based on core pressure or on extreme wind speeds) to different clusters. The advantage of this technique is that the entire lifetime of a storm is considered for the clustering algorithm. Quadratic curves are found to describe the storm tracks most accurately. Three main clusters (diagonal, horizontal or vertical progression of the storm track) can be identified, each of which has its own particular features. Basic storm features like average velocity and duration are calculated and compared for each cluster. The main benefit of this clustering technique, however, is to evaluate whether the clusters show different degrees of uncertainty, e.g. more (less) spread for tracks approaching Europe horizontally (diagonally). This statistical uncertainty is compared for different seasonal forecast products.

  5. Plackett-Burman experimental design to facilitate syntactic foam development

    DOE PAGES

    Smith, Zachary D.; Keller, Jennie R.; Bello, Mollie; ...

    2015-09-14

    The use of an eight-experiment Plackett–Burman method can assess six experimental variables and eight responses in a polysiloxane-glass microsphere syntactic foam. The approach aims to decrease the time required to develop a tunable polymer composite by identifying a reduced set of variables and responses suitable for future predictive modeling. The statistical design assesses the main effects of mixing process parameters, polymer matrix composition, microsphere density and volume loading, and the blending of two grades of microspheres, using a dummy factor in statistical calculations. Responses cover rheological, physical, thermal, and mechanical properties. The cure accelerator content of the polymer matrix and the volume loading of the microspheres have the largest effects on foam properties. These factors are the most suitable for controlling the gel point of the curing foam, and the density of the cured foam. The mixing parameters introduce widespread variability and therefore should be fixed at effective levels during follow-up testing. Some responses may require greater contrast in microsphere-related factors. As a result, compared to other possible statistical approaches, the run economy of the Plackett–Burman method makes it a valuable tool for rapidly characterizing new foams.
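
    For illustration, the sketch below constructs the standard 8-run Plackett-Burman design from its cyclic generator and labels six hypothetical factor columns plus a dummy column, mirroring the design strategy described above; the factor names are placeholders, and a main effect would be estimated as the difference of mean responses at the high and low levels of its column.

    ```python
    import numpy as np

    def plackett_burman_8():
        """Standard 8-run Plackett-Burman design for up to 7 two-level factors.

        Rows 1-7 are cyclic shifts of the generator (+ + + - + - -) and row 8
        is all -1; the resulting columns are orthogonal and balanced.
        """
        gen = np.array([1, 1, 1, -1, 1, -1, -1])
        rows = [np.roll(gen, i) for i in range(7)]
        rows.append(-np.ones(7, dtype=int))
        return np.array(rows)

    design = plackett_burman_8()
    factors = ["mix_speed", "mix_time", "accelerator", "sphere_density",
               "sphere_loading", "sphere_blend", "dummy"]   # hypothetical labels
    print(factors)
    print(design)
    # Main effect of a factor = mean(response at +1) - mean(response at -1)
    ```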

  6. Southeast Atlantic Cloud Properties in a Multivariate Statistical Model - How Relevant is Air Mass History for Local Cloud Properties?

    NASA Astrophysics Data System (ADS)

    Fuchs, Julia; Cermak, Jan; Andersen, Hendrik

    2017-04-01

    This study aims at untangling the impacts of external dynamics and local conditions on cloud properties in the Southeast Atlantic (SEA) by combining satellite and reanalysis data using multivariate statistics. The understanding of clouds and their determinants at different scales is important for constraining the Earth's radiative budget, and thus prominent in climate-system research. In this study, SEA stratocumulus cloud properties are observed not only as the result of local environmental conditions but also as affected by external dynamics and spatial origins of air masses entering the study area. In order to assess to what extent cloud properties are impacted by aerosol concentration, air mass history, and meteorology, a multivariate approach is conducted using satellite observations of aerosol and cloud properties (MODIS, SEVIRI), information on aerosol species composition (MACC) and meteorological context (ERA-Interim reanalysis). To account for the often-neglected but important role of air mass origin, information on air mass history based on HYSPLIT modeling is included in the statistical model. This multivariate approach is intended to lead to a better understanding of the physical processes behind observed stratocumulus cloud properties in the SEA.

  7. Systematic and statistical uncertainties in simulated r-process abundances due to uncertain nuclear masses

    DOE PAGES

    Surman, Rebecca; Mumpower, Matthew; McLaughlin, Gail

    2017-02-27

    Unknown nuclear masses are a major source of nuclear physics uncertainty for r-process nucleosynthesis calculations. Here we examine the systematic and statistical uncertainties that arise in r-process abundance predictions due to uncertainties in the masses of nuclear species on the neutron-rich side of stability. There is a long history of examining systematic uncertainties by the application of a variety of different mass models to r-process calculations. Here we expand upon such efforts by examining six DFT mass models, where we capture the full impact of each mass model by updating the other nuclear properties — including neutron capture rates, β-decay lifetimes, and β-delayed neutron emission probabilities — that depend on the masses. Unlike systematic effects, statistical uncertainties in the r-process pattern have just begun to be explored. Here we apply a global Monte Carlo approach, starting from the latest FRDM masses and considering random mass variations within the FRDM rms error. Here, we find in each approach that uncertain nuclear masses produce dramatic uncertainties in calculated r-process yields, which can be reduced in upcoming experimental campaigns.

  8. Systematic and statistical uncertainties in simulated r-process abundances due to uncertain nuclear masses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Surman, Rebecca; Mumpower, Matthew; McLaughlin, Gail

    Unknown nuclear masses are a major source of nuclear physics uncertainty for r-process nucleosynthesis calculations. Here we examine the systematic and statistical uncertainties that arise in r-process abundance predictions due to uncertainties in the masses of nuclear species on the neutron-rich side of stability. There is a long history of examining systematic uncertainties by the application of a variety of different mass models to r-process calculations. Here we expand upon such efforts by examining six DFT mass models, where we capture the full impact of each mass model by updating the other nuclear properties — including neutron capture rates, β-decay lifetimes, and β-delayed neutron emission probabilities — that depend on the masses. Unlike systematic effects, statistical uncertainties in the r-process pattern have just begun to be explored. Here we apply a global Monte Carlo approach, starting from the latest FRDM masses and considering random mass variations within the FRDM rms error. Here, we find in each approach that uncertain nuclear masses produce dramatic uncertainties in calculated r-process yields, which can be reduced in upcoming experimental campaigns.

  9. Projecting changes in the distribution and productivity of living marine resources: A critical review of the suite of modelling approaches used in the large European project VECTORS

    NASA Astrophysics Data System (ADS)

    Peck, Myron A.; Arvanitidis, Christos; Butenschön, Momme; Canu, Donata Melaku; Chatzinikolaou, Eva; Cucco, Andrea; Domenici, Paolo; Fernandes, Jose A.; Gasche, Loic; Huebert, Klaus B.; Hufnagl, Marc; Jones, Miranda C.; Kempf, Alexander; Keyl, Friedemann; Maar, Marie; Mahévas, Stéphanie; Marchal, Paul; Nicolas, Delphine; Pinnegar, John K.; Rivot, Etienne; Rochette, Sébastien; Sell, Anne F.; Sinerchia, Matteo; Solidoro, Cosimo; Somerfield, Paul J.; Teal, Lorna R.; Travers-Trolet, Morgan; van de Wolfshaar, Karen E.

    2018-02-01

    We review and compare four broad categories of spatially-explicit modelling approaches currently used to understand and project changes in the distribution and productivity of living marine resources including: 1) statistical species distribution models, 2) physiology-based, biophysical models of single life stages or the whole life cycle of species, 3) food web models, and 4) end-to-end models. Single pressures are rare and, in the future, models must be able to examine multiple factors affecting living marine resources such as interactions between: i) climate-driven changes in temperature regimes and acidification, ii) reductions in water quality due to eutrophication, iii) the introduction of alien invasive species, and/or iv) (over-)exploitation by fisheries. Statistical (correlative) approaches can be used to detect historical patterns which may not be relevant in the future. Advancing predictive capacity of changes in distribution and productivity of living marine resources requires explicit modelling of biological and physical mechanisms. New formulations are needed which (depending on the question) will need to strive for more realism in ecophysiology and behaviour of individuals, life history strategies of species, as well as trophodynamic interactions occurring at different spatial scales. Coupling existing models (e.g. physical, biological, economic) is one avenue that has proven successful. However, fundamental advancements are needed to address key issues such as the adaptive capacity of species/groups and ecosystems. The continued development of end-to-end models (e.g., physics to fish to human sectors) will be critical if we hope to assess how multiple pressures may interact to cause changes in living marine resources including the ecological and economic costs and trade-offs of different spatial management strategies. Given the strengths and weaknesses of the various types of models reviewed here, confidence in projections of changes in the distribution and productivity of living marine resources will be increased by assessing model structural uncertainty through biological ensemble modelling.

  10. Statistical analysis of bankrupting and non-bankrupting stocks

    NASA Astrophysics Data System (ADS)

    Li, Qian; Wang, Fengzhong; Wei, Jianrong; Liang, Yuan; Huang, Jiping; Stanley, H. Eugene

    2012-04-01

    The recent financial crisis has caused extensive world-wide economic damage, affecting in particular those who invested in companies that eventually filed for bankruptcy. A better understanding of stocks that become bankrupt would be helpful in reducing risk in future investments. Economists have conducted extensive research on this topic, and here we ask whether statistical physics concepts and approaches may offer insights into pre-bankruptcy stock behavior. To this end, we study all 20092 stocks listed in US stock markets for the 20-year period 1989-2008, including 4223 (21 percent) that became bankrupt during that period. We find that, surprisingly, the distributions of the daily returns of those stocks that become bankrupt differ significantly from those that do not. Moreover, these differences are consistent for the entire period studied. We further study the relation between the distribution of returns and the length of time until bankruptcy, and observe that larger differences of the distribution of returns correlate with shorter time periods preceding bankruptcy. This behavior suggests that sharper fluctuations in the stock price occur when the stock is closer to bankruptcy. We also analyze the cross-correlations between the return and the trading volume, and find that stocks approaching bankruptcy tend to have larger return-volume cross-correlations than stocks that are not. Furthermore, the difference increases as bankruptcy approaches. We conclude that before a firm becomes bankrupt its stock exhibits unusual behavior that is statistically quantifiable.
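
    A minimal sketch of the two diagnostics named above (comparing return distributions and measuring return-volume cross-correlations) is given below on synthetic data; the heavier tail assigned to the "troubled" series and all numerical parameters are assumptions chosen purely to make the comparison visible, not properties of the 20092-stock dataset.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic daily returns and volumes for two hypothetical stocks.
healthy_returns = rng.normal(0.0, 0.01, 2000)
troubled_returns = rng.standard_t(df=3, size=2000) * 0.02      # fatter-tailed by construction
healthy_volume = np.abs(healthy_returns) * 1e6 + rng.normal(0, 1e4, 2000)
troubled_volume = np.abs(troubled_returns) * 1e6 + rng.normal(0, 1e4, 2000)

# Compare the two return distributions with a two-sample Kolmogorov-Smirnov test.
ks_stat, p_value = stats.ks_2samp(healthy_returns, troubled_returns)
print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.2e}")

# Cross-correlation between absolute return and trading volume for each series.
def return_volume_corr(returns, volume):
    return np.corrcoef(np.abs(returns), volume)[0, 1]

print(f"healthy  |return|-volume correlation: {return_volume_corr(healthy_returns, healthy_volume):.3f}")
print(f"troubled |return|-volume correlation: {return_volume_corr(troubled_returns, troubled_volume):.3f}")
```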

  11. Information and material flows in complex networks

    NASA Astrophysics Data System (ADS)

    Helbing, Dirk; Armbruster, Dieter; Mikhailov, Alexander S.; Lefeber, Erjen

    2006-04-01

    In this special issue, an overview of the Thematic Institute (TI) on Information and Material Flows in Complex Systems is given. The TI was carried out within EXYSTENCE, the first EU Network of Excellence in the area of complex systems. Its motivation, research approach and subjects are presented here. Among the various methods used are many-particle and statistical physics, nonlinear dynamics, as well as complex systems, network and control theory. The contributions are relevant for complex systems as diverse as vehicle and data traffic in networks, logistics, production, and material flows in biological systems. The key disciplines involved are socio-, econo-, traffic- and bio-physics, and a new research area that could be called “biologistics”.

  12. Wave turbulence

    NASA Astrophysics Data System (ADS)

    Nazarenko, Sergey

    2015-07-01

    Wave turbulence is the statistical mechanics of random waves with a broadband spectrum interacting via non-linearity. To understand its difference from non-random well-tuned coherent waves, one could compare the sound of thunder to a piece of classical music. Wave turbulence is surprisingly common and important in a great variety of physical settings, ranging from the most familiar ocean waves to waves at quantum scales or to much longer waves in astrophysics. We will provide a basic overview of the wave turbulence ideas, approaches and main results, emphasising the physics of the phenomena and using qualitative descriptions that avoid, whenever possible, involved mathematical derivations. In particular, dimensional analysis will be used for obtaining the key scaling solutions in wave turbulence - Kolmogorov-Zakharov (KZ) spectra.
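
    For readers unfamiliar with the dimensional-analysis style of argument mentioned in the last sentence, the classical hydrodynamic (Kolmogorov) cascade is the simplest worked example of it; the wave-turbulence KZ spectra are obtained in the same spirit once the wave dispersion relation enters the bookkeeping. The lines below show only that generic textbook example, not a result specific to this book.

```latex
% Assume the energy spectrum depends only on the energy flux \varepsilon and the wavenumber k:
E(k) = C\,\varepsilon^{a} k^{b},
\qquad [E(k)] = L^{3}T^{-2},\quad [\varepsilon] = L^{2}T^{-3},\quad [k] = L^{-1}.
% Matching powers of time and length:
T:\; -2 = -3a \;\Rightarrow\; a = \tfrac{2}{3},
\qquad
L:\; 3 = 2a - b \;\Rightarrow\; b = -\tfrac{5}{3},
\qquad\text{hence}\qquad
E(k) = C\,\varepsilon^{2/3} k^{-5/3}.
```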

  13. Upon Generating Discrete Expanding Integrable Models of the Toda Lattice Systems and Infinite Conservation Laws

    NASA Astrophysics Data System (ADS)

    Zhang, Yufeng; Zhang, Xiangzhi; Wang, Yan; Liu, Jiangen

    2017-01-01

    With the help of the R-matrix approach, we present the Toda lattice systems that have extensive applications in statistical physics and quantum physics. By constructing a new discrete integrable formula by the R-matrix, the discrete expanding integrable models of the Toda lattice systems and their Lax pairs are generated, respectively. By following the constructing formula again, we obtain the corresponding (2+1)-dimensional Toda lattice systems and their Lax pairs, as well as their (2+1)-dimensional discrete expanding integrable models. Finally, some conservation laws of a (1+1)-dimensional generalised Toda lattice system and a new (2+1)-dimensional lattice system are generated, respectively.

  14. Cluster expansion for ground states of local Hamiltonians

    NASA Astrophysics Data System (ADS)

    Bastianello, Alvise; Sotiriadis, Spyros

    2016-08-01

    A central problem in many-body quantum physics is the determination of the ground state of a thermodynamically large physical system. We construct a cluster expansion for ground states of local Hamiltonians, which naturally incorporates physical requirements inherited by locality as conditions on its cluster amplitudes. Applying a diagrammatic technique we derive the relation of these amplitudes to thermodynamic quantities and local observables. Moreover we derive a set of functional equations that determine the cluster amplitudes for a general Hamiltonian, verify the consistency with perturbation theory and discuss non-perturbative approaches. Lastly we verify the persistence of locality features of the cluster expansion under unitary evolution with a local Hamiltonian and provide applications to out-of-equilibrium problems: a simplified proof of equilibration to the GGE and a cumulant expansion for the statistics of work, for an interacting-to-free quantum quench.

  15. Feasibility of Coherent and Incoherent Backscatter Experiments from the AMPS Laboratory. Technical Section

    NASA Technical Reports Server (NTRS)

    Mozer, F. S.

    1976-01-01

    A computer program simulated the spectrum which resulted when a radar signal was transmitted into the ionosphere for a finite time and received for an equal finite interval. The spectrum derived from this signal is statistical in nature because the signal is scattered from the ionosphere, which is statistical in nature. Many estimates of any property of the ionosphere can be made. Their average value will approach the average property of the ionosphere which is being measured. Due to the statistical nature of the spectrum itself, the estimators will vary about this average. The square root of the variance about this average is called the standard deviation, an estimate of the error which exists in any particular radar measurement. In order to determine the feasibility of the space shuttle radar, the magnitude of these errors for measurements of physical interest must be understood.

  16. The contribution of statistical physics to evolutionary biology.

    PubMed

    de Vladar, Harold P; Barton, Nicholas H

    2011-08-01

    Evolutionary biology shares many concepts with statistical physics: both deal with populations, whether of molecules or organisms, and both seek to simplify evolution in very many dimensions. Often, methodologies have undergone parallel and independent development, as with stochastic methods in population genetics. Here, we discuss aspects of population genetics that have embraced methods from physics: non-equilibrium statistical mechanics, travelling waves and Monte-Carlo methods, among others, have been used to study polygenic evolution, rates of adaptation and range expansions. These applications indicate that evolutionary biology can further benefit from interactions with other areas of statistical physics; for example, by following the distribution of paths taken by a population through time. Copyright © 2011 Elsevier Ltd. All rights reserved.
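
    As a small illustration of the Monte-Carlo methods mentioned above, the sketch below simulates neutral genetic drift in the standard Wright-Fisher model; the population size, initial allele frequency, and number of generations are arbitrary choices for demonstration, not parameters drawn from the review.

```python
import numpy as np

rng = np.random.default_rng(2)

def wright_fisher(n_individuals=100, p0=0.5, generations=200, replicates=1000):
    """Monte Carlo simulation of neutral drift: each generation the allele count
    is binomially resampled from the current frequency (diploid population)."""
    p = np.full(replicates, p0)
    for _ in range(generations):
        counts = rng.binomial(2 * n_individuals, p)
        p = counts / (2 * n_individuals)
    return p

final = wright_fisher()
print(f"fraction of replicates fixed at 0 or 1: {np.mean((final == 0) | (final == 1)):.2f}")
print(f"mean final frequency: {final.mean():.3f}  (drift preserves the mean)")
```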

  17. A Comparison between the WATCH Flare Data Statistical Properties and Predictions of the Statistical Flare Model

    NASA Astrophysics Data System (ADS)

    Crosby, N.; Georgoulis, M.; Vilmer, N.

    1999-10-01

    Solar burst observations in the deka-keV energy range originating from the WATCH experiment aboard the GRANAT spacecraft were used to build frequency distributions of measured X-ray flare parameters (Crosby et al., 1998). The results of the study show that: 1- the overall distribution functions are robust power laws extending over a number of decades. The typical parameters of events (total counts, peak count rates, duration) are all correlated to each other. 2- the overall distribution functions are the convolution of significantly different distribution functions built on parts of the whole data set filtered by the event duration. These "partial" frequency distributions are still power law distributions over several decades, with a slope systematically decreasing with increasing duration. 3- No correlation is found between the elapsed time interval between successive bursts arising from the same active region and the peak intensity of the flare. In this paper, we attempt a tentative comparison between the statistical properties of the self-organized critical (SOC) cellular automaton statistical flare models (see e.g. Lu and Hamilton (1991), Georgoulis and Vlahos (1996, 1998)) and the respective properties of the WATCH flare data. Despite the inherent weaknesses of the SOC models in simulating a number of physical processes in the active region, it is found that most of the observed statistical properties can be reproduced using the SOC models, including the various frequency distributions and scatter plots. We finally conclude that, even if SOC models must be refined to improve the physical links to MHD approaches, they nevertheless represent a good approach to describe the properties of rapid energy dissipation and magnetic field annihilation in complex and magnetized plasmas. References: Crosby N., Vilmer N., Lund N. and Sunyaev R., 1998, A&A, 334, 299-313; Crosby N., Lund N., Vilmer N. and Sunyaev R., 1998, A&A Supplement Series, 130, 233; Georgoulis M. and Vlahos L., 1996, Astrophys. J. Letters, 469, L135; Georgoulis M. and Vlahos L., 1998, in preparation; Lu E.T. and Hamilton R.J., 1991, Astrophys. J., 380, L89.
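
    The frequency-distribution analysis at the heart of this comparison can be mimicked on synthetic data: draw event sizes from a power law, bin them logarithmically, and fit the slope in log-log space. The sketch below does exactly that; the Pareto generator, the index of 1.8, and the least-squares fit are assumptions for illustration and are not the WATCH data or the authors' exact fitting procedure.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "peak count rate" sample from a power law with differential index 1.8.
alpha = 1.8
events = (rng.pareto(alpha - 1.0, 20000) + 1.0) * 10.0

# Logarithmically binned frequency distribution.
bins = np.logspace(np.log10(events.min()), np.log10(events.max()), 30)
counts, edges = np.histogram(events, bins=bins)
centers = np.sqrt(edges[:-1] * edges[1:])
density = counts / np.diff(edges)          # differential distribution

# Least-squares fit of the slope in log-log space (non-empty bins only).
mask = density > 0
slope, intercept = np.polyfit(np.log10(centers[mask]), np.log10(density[mask]), 1)
print(f"fitted power-law index: {-slope:.2f} (input was {alpha:.2f})")
```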

  18. Social marketing approaches to nutrition and physical activity interventions in early care and education centres: a systematic review.

    PubMed

    Luecking, C T; Hennink-Kaminski, H; Ihekweazu, C; Vaughn, A; Mazzucca, S; Ward, D S

    2017-12-01

    Social marketing is a promising planning approach for influencing voluntary lifestyle behaviours, but its application to nutrition and physical activity interventions in the early care and education setting remains unknown. PubMed, ISI Web of Science, PsycInfo and the Cumulative Index of Nursing and Allied Health were systematically searched to identify interventions targeting nutrition and/or physical activity behaviours of children enrolled in early care centres between 1994 and 2016. Content analysis methods were used to capture information reflecting eight social marketing benchmark criteria. The review included 135 articles representing 77 interventions. Two interventions incorporated all eight benchmark criteria, but the majority included fewer than four. Each intervention included behaviour and methods mix criteria, and more than half identified audience segments. Only one-third of interventions incorporated customer orientation, theory, exchange and insight. Only six interventions addressed competing behaviours. We did not find statistical significance for the effectiveness of interventions on child-level diet, physical activity or anthropometric outcomes based on the number of benchmark criteria used. This review highlights opportunities to apply social marketing to obesity prevention interventions in early care centres. Social marketing could be an important strategy for early childhood obesity prevention efforts, and future research investigations into its effects are warranted. © 2017 World Obesity Federation.

  19. Coupled disease-behavior dynamics on complex networks: A review

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Andrews, Michael A.; Wu, Zhi-Xi; Wang, Lin; Bauch, Chris T.

    2015-12-01

    It is increasingly recognized that a key component of successful infection control efforts is understanding the complex, two-way interaction between disease dynamics and human behavioral and social dynamics. Human behavior such as contact precautions and social distancing clearly influence disease prevalence, but disease prevalence can in turn alter human behavior, forming a coupled, nonlinear system. Moreover, in many cases, the spatial structure of the population cannot be ignored, such that social and behavioral processes and/or transmission of infection must be represented with complex networks. Research on coupled disease-behavior dynamics on complex networks in particular is growing rapidly, and frequently makes use of analysis methods and concepts from statistical physics. Here, we review some of the growing literature in this area. We contrast network-based approaches to homogeneous-mixing approaches, point out how their predictions differ, and describe the rich and often surprising behavior of disease-behavior dynamics on complex networks, and compare them to processes in statistical physics. We discuss how these models can capture the dynamics that characterize many real-world scenarios, thereby suggesting ways that policy makers can better design effective prevention strategies. We also describe the growing sources of digital data that are facilitating research in this area. Finally, we suggest pitfalls which might be faced by researchers in the field, and we suggest several ways in which the field could move forward in the coming years.
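
    A minimal sketch of the network-based approach contrasted here with homogeneous mixing is a discrete-time SIR simulation on a random graph, as below; the Erdős–Rényi graph, the transmission and recovery probabilities, and the seeding are all assumptions made only to show the mechanics, not models taken from the review.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(4)

def sir_on_network(graph, beta=0.05, gamma=0.1, initial_infected=5, steps=200):
    """Discrete-time SIR on a network: each step an infected node transmits to each
    susceptible neighbour with probability beta and recovers with probability gamma."""
    status = {n: "S" for n in graph}
    for n in rng.choice(list(graph), initial_infected, replace=False):
        status[n] = "I"
    infected_curve = []
    for _ in range(steps):
        new_status = dict(status)
        for node, state in status.items():
            if state == "I":
                for nb in graph.neighbors(node):
                    if status[nb] == "S" and rng.random() < beta:
                        new_status[nb] = "I"
                if rng.random() < gamma:
                    new_status[node] = "R"
        status = new_status
        infected_curve.append(sum(1 for s in status.values() if s == "I"))
    return infected_curve

g = nx.erdos_renyi_graph(n=2000, p=0.005, seed=4)
curve = sir_on_network(g)
print(f"peak infected: {max(curve)} nodes at step {int(np.argmax(curve))}")
```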

  20. Pooled Genome-Wide Analysis to Identify Novel Risk Loci for Pediatric Allergic Asthma

    PubMed Central

    Ricci, Giampaolo; Astolfi, Annalisa; Remondini, Daniel; Cipriani, Francesca; Formica, Serena; Dondi, Arianna; Pession, Andrea

    2011-01-01

    Background: Genome-wide association studies of pooled DNA samples were shown to be a valuable tool to identify candidate SNPs associated with a phenotype. No such study has so far been applied to childhood allergic asthma, even though the very high complexity of asthma genetics makes it an appropriate field in which to explore the potential of the pooled GWAS approach. Methodology/Principal Findings: We performed a pooled GWAS and individual genotyping in 269 children with allergic respiratory diseases, comparing allergic children with and without asthma. We used a modular approach to identify the most significant loci associated with asthma by combining silhouette statistics and a physical distance method with cluster-adapted thresholding. We found 97% concordance between pooled GWAS and individual genotyping, with 36 out of 37 top-scoring SNPs significant at the individual genotyping level. The most significant SNP is located inside the coding sequence of C5, an already identified asthma susceptibility gene, while the other loci regulate functions that are relevant to bronchial physiopathology, such as immune- or inflammation-mediated mechanisms and airway smooth muscle contraction. Integration with gene expression data showed that almost half of the putative susceptibility genes are differentially expressed in experimental asthma mouse models. Conclusion/Significance: Combined silhouette statistics and cluster-adapted physical distance threshold analysis of pooled GWAS data is an efficient method to identify candidate SNPs associated with asthma development in an allergic pediatric population. PMID:21359210

  1. Parametric and experimental analysis using a power flow approach

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1990-01-01

    A structural power flow approach for the analysis of structure-borne transmission of vibrations is used to analyze the influence of structural parameters on transmitted power. The parametric analysis is also performed using the Statistical Energy Analysis approach and the results are compared with those obtained using the power flow approach. The advantages of structural power flow analysis are demonstrated by comparing the type of results that are obtained by the two analytical methods. Also, to demonstrate that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental study of structural power flow is presented. This experimental study presents results for an L-shaped beam for which a solution was already available. Various methods to measure vibrational power flow are compared to study their advantages and disadvantages.

  2. Complex networks as a unified framework for descriptive analysis and predictive modeling in climate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steinhaeuser, Karsten J K; Chawla, Nitesh; Ganguly, Auroop R

    The analysis of climate data has relied heavily on hypothesis-driven statistical methods, while projections of future climate are based primarily on physics-based computational models. However, in recent years a wealth of new datasets has become available. Therefore, we take a more data-centric approach and propose a unified framework for studying climate, with an aim towards characterizing observed phenomena as well as discovering new knowledge in the climate domain. Specifically, we posit that complex networks are well-suited for both descriptive analysis and predictive modeling tasks. We show that the structural properties of climate networks have useful interpretation within the domain. Further, we extract clusters from these networks and demonstrate their predictive power as climate indices. Our experimental results establish that the network clusters are statistically significantly better predictors than clusters derived using a more traditional clustering approach. Using complex networks as data representation thus enables the unique opportunity for descriptive and predictive modeling to inform each other.
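
    The construction of a climate network and the extraction of clusters can be sketched compactly: compute pairwise correlations between grid-point time series, keep edges above a threshold, and read clusters off the resulting graph. The example below does this on synthetic series driven by two hidden modes; the 0.5 correlation threshold, the use of connected components as "clusters", and the synthetic data are all illustrative assumptions rather than the paper's actual pipeline.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(5)

# Synthetic "grid point" series: two groups driven by two hidden modes plus noise.
t = np.arange(600)
mode_a = np.sin(2 * np.pi * t / 60)
mode_b = np.cos(2 * np.pi * t / 95)
series = np.vstack(
    [mode_a + 0.5 * rng.standard_normal(t.size) for _ in range(20)]
    + [mode_b + 0.5 * rng.standard_normal(t.size) for _ in range(20)]
)

# Correlation network: nodes are grid points, edges where |r| exceeds a threshold.
corr = np.corrcoef(series)
threshold = 0.5
g = nx.Graph()
g.add_nodes_from(range(series.shape[0]))
for i in range(series.shape[0]):
    for j in range(i + 1, series.shape[0]):
        if abs(corr[i, j]) > threshold:
            g.add_edge(i, j, weight=corr[i, j])

# Clusters: here simply the connected components of the thresholded network.
clusters = [sorted(c) for c in nx.connected_components(g)]
print(f"{len(clusters)} clusters with sizes {[len(c) for c in clusters]}")
```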

  3. The study on the new approach to the prediction of the solar flares: The statistical relation from the SOHO archive

    NASA Astrophysics Data System (ADS)

    Lee, S.; Oh, S.; Lee, J.; Hong, S.

    2013-12-01

    We have investigated the statistical relationship between solar active region properties and solar flare occurrence by analyzing the sunspot catalogue, which has been newly constructed from SOHO MDI observation data for the period 1996 to 2011 (Solar Cycles 23 & 24) by the ASSA (Automatic Solar Synoptic Analyzer) algorithms. The prediction relation has been derived with machine-learning algorithms to establish a short-term flare prediction model for operational use in the near future. In this study, continuum and magnetogram images observed by SOHO have been processed to yield a 15-year sunspot group catalogue that contains various physical parameters such as sunspot area, extent, asymmetry measure of the largest penumbral sunspot, and roughness of the magnetic neutral line, as well as McIntosh and Mt. Wilson classification results. The latest results of our study will be presented and the new approach to the prediction of solar flares will be discussed.

  4. Additive effects in high-voltage layered-oxide cells: A statistics of mixtures approach

    DOE PAGES

    Sahore, Ritu; Peebles, Cameron; Abraham, Daniel P.; ...

    2017-07-20

    Li1.03(Ni0.5Mn0.3Co0.2)0.97O2 (NMC)-based coin cells containing the electrolyte additives vinylene carbonate (VC) and tris(trimethylsilyl)phosphite (TMSPi) in the range of 0-2 wt% were cycled between 3.0 and 4.4 V. The changes in capacity at rates of C/10 and C/1 and resistance at 60% state of charge were found to follow linear-with-time kinetic rate laws. Further, the C/10 capacity and resistance data were amenable to modeling by a statistics of mixtures approach. Applying physical meaning to the terms in the empirical models indicated that the interactions between the electrolyte and additives were not simple. For example, there were strong, synergistic interactions between VC and TMSPi affecting C/10 capacity loss, as expected, but there were other, more subtle interactions between the electrolyte components. In conclusion, the interactions between these components controlled the C/10 capacity decline and resistance increase.
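
    The statistics-of-mixtures idea can be illustrated with a Scheffé-type mixture polynomial fitted by least squares: component fractions enter without an intercept, and interaction terms carry the synergy between additives. The model form, the synthetic "capacity loss" response, and all coefficients below are assumptions for demonstration and are not the empirical models reported for these cells.

```python
import numpy as np

rng = np.random.default_rng(6)

# Additive weight fractions (VC, TMSPi) over an assumed 0-2 wt% grid; the base
# electrolyte makes up the remainder so the three fractions sum to one.
vc, tmspi = np.meshgrid(np.linspace(0.0, 0.02, 5), np.linspace(0.0, 0.02, 5))
x1, x2 = vc.ravel(), tmspi.ravel()
x3 = 1.0 - x1 - x2

# Synthetic response with a strong VC*TMSPi interaction plus a little noise.
y = 5.0 * x1 + 3.0 * x2 + 1.0 * x3 - 40.0 * x1 * x2 + rng.normal(0.0, 2e-4, x1.size)

# Scheffe quadratic mixture model (no intercept): y = b1*x1 + b2*x2 + b3*x3 + b12*x1*x2.
X = np.column_stack([x1, x2, x3, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(["b1 (VC)", "b2 (TMSPi)", "b3 (base)", "b12 (VC*TMSPi)"], coef):
    print(f"{name:15s} = {b:8.3f}")
```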

  5. Quantum origin of the primordial fluctuation spectrum and its statistics

    NASA Astrophysics Data System (ADS)

    Landau, Susana; León, Gabriel; Sudarsky, Daniel

    2013-07-01

    The usual account for the origin of cosmic structure during inflation is not fully satisfactory, as it lacks a physical mechanism capable of generating the inhomogeneity and anisotropy of our Universe, from an exactly homogeneous and isotropic initial state associated with the early inflationary regime. The proposal in [A. Perez, H. Sahlmann, and D. Sudarsky, Classical Quantum Gravity 23, 2317 (2006)] considers the spontaneous dynamical collapse of the wave function as a possible answer to that problem. In this work, we review briefly the difficulties facing the standard approach, as well as the answers provided by the above proposal and explore their relevance to the investigations concerning the characterization of the primordial spectrum and other statistical aspects of the cosmic microwave background and large-scale matter distribution. We will see that the new approach leads to novel ways of considering some of the relevant questions, and, in particular, to distinct characterizations of the non-Gaussianities that might have left imprints on the available data.

  6. On Ruch's Principle of Decreasing Mixing Distance in classical statistical physics

    NASA Astrophysics Data System (ADS)

    Busch, Paul; Quadt, Ralf

    1990-10-01

    Ruch's Principle of Decreasing Mixing Distance is reviewed as a statistical physical principle, and its basic support and geometric interpretation, the Ruch-Schranner-Seligman theorem, is generalized to be applicable to a large representative class of classical statistical systems.

  7. Impact of cosmetic result on selection of surgical treatment in patients with localized prostate cancer.

    PubMed

    Rojo, María Alejandra Egui; Martinez-Salamanca, Juan Ignacio; Maestro, Mario Alvarez; Galarza, Ignacio Sola; Rodriguez, Joaquin Carballido

    2014-01-01

    To analyze the effect of cosmetic outcome as an isolated variable in patients undergoing surgical treatment based on the incision used in the 3 variants of radical prostatectomy: open (infraumbilical incision and Pfannenstiel incision) and laparoscopic, or robotic (6 ports) surgery. A total of 612 male patients, 40 to 70 years of age, with a negative history of prostate disease were invited to participate. Each patient was evaluated by questionnaire accompanied by a set of 6 photographs showing the cosmetic appearance of the 3 approaches, with and without undergarments. Participants ranked the approaches according to preference, on the basis of cosmesis. We also recorded demographic variables: age, body mass index, marital status, education level, and physical activity. Among the 577 patients who completed the questionnaires, the 6-port minimally invasive approach was the option preferred by 52% of participants, followed by the Pfannenstiel incision (46%) and the infraumbilical incision (11%). The univariate and multivariate analyses did not show statistically significant differences when comparing the approach preferred by the patients and the sub-analyses for demographic variables, except for patients who exercised, who preferred the Pfannenstiel incision (58%) to the minimally invasive approach (42%), a statistically significant difference. The minimally invasive approach was the approach of choice for the majority of patients in the treatment of prostate cancer. The Pfannenstiel incision represents an acceptable alternative. More research and investment may be necessary to improve cosmetic outcomes.

  8. Seeking parsimony in hydrology and water resources technology

    NASA Astrophysics Data System (ADS)

    Koutsoyiannis, D.

    2009-04-01

    The principle of parsimony, also known as the principle of simplicity, the principle of economy and Ockham's razor, advises scientists to prefer the simplest theory among those that fit the data equally well. In this, it is an epistemic principle but reflects an ontological characterization that the universe is ultimately parsimonious. Is this principle useful and can it really be reconciled with, and implemented in, our modelling approaches to complex hydrological systems, whose elements and events are extraordinarily numerous, different and unique? The answer underlying the mainstream hydrological research of the last two decades seems to be negative. Hopes were invested in the power of computers that would enable faithful and detailed representation of the diverse system elements and the hydrological processes, based merely on "first principles" and resulting in "physically-based" models that tend to approach in complexity the real world systems. Today the account of this research endeavour does not seem positive, as it did not improve model predictive capacity and processes comprehension. A return to parsimonious modelling seems to be again the promising route. The experience from recent research and from comparisons of parsimonious and complicated models indicates that the former can facilitate insight and comprehension, improve accuracy and predictive capacity, and increase efficiency. In addition - and despite the aspiration that "physically based" models will have lower data requirements and even ultimately become "data-free" - parsimonious models require fewer data to achieve the same accuracy as more complicated models. Naturally, the concepts that reconcile the simplicity of parsimonious models with the complexity of hydrological systems are probability theory and statistics. Probability theory provides the theoretical basis for moving from a microscopic to a macroscopic view of phenomena, by mapping sets of diverse elements and events of hydrological systems to single numbers (a probability or an expected value), and statistics provides the empirical basis of summarizing data, making inference from them, and supporting decision making in water resource management. Unfortunately, the current state of the art in probability, statistics and their union, often called stochastics, is not fully satisfactory for the needs of modelling of hydrological and water resource systems. A first problem is that stochastic modelling has traditionally relied on classical statistics, which is based on the independent "coin-tossing" prototype, rather than on the study of real-world systems whose behaviour is very different from the classical prototype. A second problem is that the stochastic models (particularly the multivariate ones) are often not parsimonious themselves. Therefore, substantial advancement of stochastics is necessary in a new paradigm of parsimonious hydrological modelling.
These ideas are illustrated using several examples, namely: (a) hydrological modelling of a karst system in Bosnia and Herzegovina using three different approaches ranging from parsimonious to detailed "physically-based"; (b) parsimonious modelling of a peculiar modified catchment in Greece; (c) a stochastic approach that can replace parameter-excessive ARMA-type models with a generalized algorithm that produces any shape of autocorrelation function (consistent with the accuracy provided by the data) using a couple of parameters; (d) a multivariate stochastic approach which replaces a huge number of parameters estimated from data with coefficients estimated by the principle of maximum entropy; and (e) a parsimonious approach for decision making in multi-reservoir systems using a handful of parameters instead of thousands of decision variables.
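
    Item (c) in this list, replacing a parameter-heavy ARMA model with an autocorrelation function controlled by a couple of parameters, can be sketched as follows. The specific two-parameter form used here (a Cauchy-type decay) is an assumption chosen for illustration, not necessarily the one presented in the talk; the point is only that a full covariance matrix, and hence a simulated series, follows from two numbers.

```python
import numpy as np

rng = np.random.default_rng(7)

def acf(lag, kappa=0.8, beta=0.6):
    """Two-parameter autocorrelation function (Cauchy-type form, for illustration)."""
    return (1.0 + kappa * np.abs(lag)) ** (-beta)

n = 500
rho = acf(np.arange(n))

# Toeplitz covariance matrix for a unit-variance stationary process.
cov = np.empty((n, n))
for i in range(n):
    cov[i] = rho[np.abs(np.arange(n) - i)]

# One realisation via Cholesky factorisation (tiny jitter for numerical safety).
L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))
series = L @ rng.standard_normal(n)

# Compare sample autocorrelations of the realisation with the model values.
for lag in (1, 10):
    sample = np.corrcoef(series[:-lag], series[lag:])[0, 1]
    print(f"lag {lag:2d}: model rho = {rho[lag]:.3f}, sample estimate = {sample:.3f}")
```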

  9. Physics Teachers and Students: A Statistical and Historical Analysis of Women

    NASA Astrophysics Data System (ADS)

    Gregory, Amanda

    2009-10-01

    Historically, women have been denied an education comparable to that available to men. Since women have been allowed into institutions of higher learning, they have been studying and earning physics degrees. The aim of this poster is to discuss the statistical relationship between the number of women enrolled in university physics programs and the number of female physics faculty members. Special care has been given to examining the statistical data in the context of the social climate at the time that these women were teaching or pursuing their education.

  10. A theory of stationarity and asymptotic approach in dissipative systems

    NASA Astrophysics Data System (ADS)

    Rubel, Michael Thomas

    2007-05-01

    The approximate dynamics of many physical phenomena, including turbulence, can be represented by dissipative systems of ordinary differential equations. One often turns to numerical integration to solve them. There is an incompatibility, however, between the answers it can produce (i.e., specific solution trajectories) and the questions one might wish to ask (e.g., what behavior would be typical in the laboratory?). To determine its outcome, numerical integration requires more detailed initial conditions than a laboratory could normally provide. In place of initial conditions, experiments stipulate how tests should be carried out: only under statistically stationary conditions, for example, or only during asymptotic approach to a final state. Stipulations such as these, rather than initial conditions, are what determine outcomes in the laboratory. This theoretical study examines whether the points of view can be reconciled: What is the relationship between one's statistical stipulations for how an experiment should be carried out--stationarity or asymptotic approach--and the expected results? How might those results be determined without invoking initial conditions explicitly? To answer these questions, stationarity and asymptotic approach conditions are analyzed in detail. Each condition is treated as a statistical constraint on the system--a restriction on the probability density of states that might be occupied when measurements take place. For stationarity, this reasoning leads to a singular, invariant probability density which is already familiar from dynamical systems theory. For asymptotic approach, it leads to a new, more regular probability density field. A conjecture regarding what appears to be a limit relationship between the two densities is presented. By making use of the new probability densities, one can derive output statistics directly, avoiding the need to create or manipulate initial data, and thereby avoiding the conceptual incompatibility mentioned above. This approach also provides a clean way to derive reduced-order models, complete with local and global error estimates, as well as a way to compare existing reduced-order models objectively. The new approach is explored in the context of five separate test problems: a trivial one-dimensional linear system, a damped unforced linear oscillator in two dimensions, the isothermal Rayleigh-Plesset equation, Lorenz's equations, and the Stokes limit of Burgers' equation in one space dimension. In each case, various output statistics are deduced without recourse to initial conditions. Further, reduced-order models are constructed for asymptotic approach of the damped unforced linear oscillator, the isothermal Rayleigh-Plesset system, and Lorenz's equations, and for stationarity of Lorenz's equations.

  11. Coagulation-fragmentation for a finite number of particles and application to telomere clustering in the yeast nucleus

    NASA Astrophysics Data System (ADS)

    Hozé, Nathanaël; Holcman, David

    2012-01-01

    We develop a coagulation-fragmentation model to study a system composed of a small number of stochastic objects moving in a confined domain, that can aggregate upon binding to form local clusters of arbitrary sizes. A cluster can also dissociate into two subclusters with a uniform probability. To study the statistics of clusters, we combine a Markov chain analysis with a partition number approach. Interestingly, we obtain explicit formulas for the size and the number of clusters in terms of hypergeometric functions. Finally, we apply our analysis to study the statistical physics of telomeres (ends of chromosomes) clustering in the yeast nucleus and show that the diffusion-coagulation-fragmentation process can predict the organization of telomeres.
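
    A direct stochastic simulation of a finite-particle coagulation-fragmentation process makes the setting concrete: clusters either merge or split at each event, and the cluster-size statistics are collected over many realisations. The event rates and splitting rule below are deliberately simplified assumptions (they are not the paper's exact kernel or its analytical Markov-chain treatment), but they show the kind of statistics the model describes.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(8)

def simulate(n_particles=50, k_coag=1.0, k_frag=0.2, steps=5000):
    """Finite-particle coagulation-fragmentation: at each event either two randomly
    chosen clusters merge, or one cluster splits uniformly into two sub-clusters."""
    clusters = [1] * n_particles                  # start from monomers
    for _ in range(steps):
        can_frag = [i for i, s in enumerate(clusters) if s > 1]
        rate_c = k_coag if len(clusters) > 1 else 0.0
        rate_f = k_frag if can_frag else 0.0
        if rate_c + rate_f == 0.0:
            break
        if rng.random() < rate_c / (rate_c + rate_f):
            i, j = rng.choice(len(clusters), 2, replace=False)
            merged = clusters[i] + clusters[j]
            clusters = [s for k, s in enumerate(clusters) if k not in (i, j)] + [merged]
        else:
            i = int(rng.choice(can_frag))
            size = clusters.pop(i)
            cut = int(rng.integers(1, size))      # uniform split point
            clusters += [cut, size - cut]
    return clusters

sizes = Counter()
for _ in range(100):                              # average over realisations
    sizes.update(simulate())
total = sum(sizes.values())
for s in sorted(sizes)[:8]:
    print(f"cluster size {s:2d}: frequency {sizes[s] / total:.3f}")
```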

  12. Chemical freezeout parameters within generic nonextensive statistics

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel; Yassin, Hayam; Abo Elyazeed, Eman R.

    2018-06-01

    Particle production in relativistic heavy-ion collisions seems to take place in a dynamically disordered system which can best be described by an extended exponential entropy. In distinguishing between the applicability of this and Boltzmann-Gibbs (BG) in generating various particle-ratios, generic (non)extensive statistics is introduced to the hadron resonance gas model. Accordingly, the degree of (non)extensivity is determined by the possible modifications in the phase space. Both BG extensivity and Tsallis nonextensivity are included as very special cases defined by specific values of the equivalence classes (c, d). We found that the particle ratios at energies ranging between 3.8 and 2760 GeV are best reproduced by nonextensive statistics, where c and d range between ~0.9 and ~1. The present work aims at illustrating that the proposed approach is well capable of manifesting the statistical nature of the system of interest. We do not aim at highlighting deeper physical insights. In other words, while the resulting nonextensivity is neither BG nor Tsallis, the freezeout parameters are found to be very compatible with BG and accordingly with the well-known freezeout phase-diagram, which is in excellent agreement with recent lattice calculations. We conclude that particle production is nonextensive but need not be accompanied by a radical change in the intensive or extensive thermodynamic quantities, such as internal energy and temperature. Only the two critical exponents defining the equivalence classes (c, d) are the physical parameters characterizing the (non)extensivity.

  13. An Object-Oriented Approach for Analyzing CALIPSO's Profile Observations

    NASA Astrophysics Data System (ADS)

    Trepte, C. R.

    2016-12-01

    The CALIPSO satellite mission is a pioneering international partnership between NASA and the French Space Agency, CNES. Since launch on 28 April 2006, CALIPSO has been acquiring near-continuous lidar profile observations of clouds and aerosols in the Earth's atmosphere. Many studies have profitably used these observations to advance our understanding of climate, weather and air quality. For the most part, however, these studies have considered CALIPSO profile measurements independent from one another and have not related each to neighboring or family observations within a cloud element or aerosol feature. In this presentation we describe an alternative approach that groups measurements into objects visually identified from CALIPSO browse images. The approach makes use of the Visualization of CALIPSO (VOCAL) software tool that enables a user to outline a region of interest and save coordinates into a database. The selected features or objects can then be analyzed to explore spatial correlations over the feature's domain and construct bulk statistical properties for each structure. This presentation will show examples that examine cirrus and dust layers and will describe how this object-oriented approach can provide added insight into physical processes beyond conventional statistical treatments. It will further show results with combined measurements from other A-Train sensors to highlight advantages of viewing features in this manner.

  14. Childhood physical, environmental, and genetic predictors of adult hypertension: the cardiovascular risk in young Finns study.

    PubMed

    Juhola, Jonna; Oikonen, Mervi; Magnussen, Costan G; Mikkilä, Vera; Siitonen, Niina; Jokinen, Eero; Laitinen, Tomi; Würtz, Peter; Gidding, Samuel S; Taittonen, Leena; Seppälä, Ilkka; Jula, Antti; Kähönen, Mika; Hutri-Kähönen, Nina; Lehtimäki, Terho; Viikari, Jorma S A; Juonala, Markus; Raitakari, Olli T

    2012-07-24

    Hypertension is a major modifiable cardiovascular risk factor. The present longitudinal study aimed to examine the best combination of childhood physical and environmental factors to predict adult hypertension and furthermore whether newly identified genetic variants for blood pressure increase the prediction of adult hypertension. The study cohort included 2625 individuals from the Cardiovascular Risk in Young Finns Study who were followed up for 21 to 27 years since baseline (1980; age, 3-18 years). In addition to dietary factors and biomarkers related to blood pressure, we examined whether a genetic risk score based on 29 newly identified single-nucleotide polymorphisms enhances the prediction of adult hypertension. Hypertension in adulthood was defined as systolic blood pressure ≥ 130 mm Hg and/or diastolic blood pressure ≥ 85 mm Hg or medication for the condition. Independent childhood risk factors for adult hypertension included the individual's own blood pressure (P<0.0001), parental hypertension (P<0.0001), childhood overweight/obesity (P=0.005), low parental occupational status (P=0.003), and high genetic risk score (P<0.0001). Risk assessment based on childhood overweight/obesity status, parental hypertension, and parental occupational status was superior in predicting hypertension compared with the approach using only data on childhood blood pressure levels (C statistics, 0.718 versus 0.733; P=0.0007). Inclusion of both parental hypertension history and data on novel genetic variants for hypertension further improved the C statistics (0.742; P=0.015). Prediction of adult hypertension was enhanced by taking into account known physical and environmental childhood risk factors, family history of hypertension, and novel genetic variants. A multifactorial approach may be useful in identifying children at high risk for adult hypertension.
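
    The role of the genetic risk score and the C statistic comparison can be sketched on synthetic data: build a weighted allele-count score, fit logistic models with and without it, and compare areas under the ROC curve. Everything below (genotype frequencies, effect sizes, the outcome model, and the in-sample evaluation) is an assumption for illustration and does not reproduce the Young Finns analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(9)

n, n_snps = 2000, 29
genotypes = rng.binomial(2, 0.3, size=(n, n_snps))     # 0/1/2 risk-allele counts
effects = rng.normal(0.05, 0.02, n_snps)               # assumed per-SNP weights
risk_score = genotypes @ effects

childhood_sbp = rng.normal(110, 10, n)                 # childhood systolic blood pressure
logit = 0.08 * (childhood_sbp - 110) + 2.0 * (risk_score - risk_score.mean()) - 0.5
hypertension = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))   # adult outcome

def c_statistic(features):
    model = LogisticRegression(max_iter=1000).fit(features, hypertension)
    return roc_auc_score(hypertension, model.predict_proba(features)[:, 1])

print(f"C statistic, childhood BP only:       {c_statistic(childhood_sbp[:, None]):.3f}")
print(f"C statistic, BP + genetic risk score: {c_statistic(np.column_stack([childhood_sbp, risk_score])):.3f}")
```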

  15. Comparison of two views of maximum entropy in biodiversity: Frank (2011) and Pueyo et al. (2007).

    PubMed

    Pueyo, Salvador

    2012-05-01

    An increasing number of authors agree in that the maximum entropy principle (MaxEnt) is essential for the understanding of macroecological patterns. However, there are subtle but crucial differences among the approaches by several of these authors. This poses a major obstacle for anyone interested in applying the methodology of MaxEnt in this context. In a recent publication, Frank (2011) gives some arguments why his own approach would represent an improvement as compared to the earlier paper by Pueyo et al. (2007) and also to the views by Edwin T. Jaynes, who first formulated MaxEnt in the context of statistical physics. Here I show that his criticisms are flawed and that there are fundamental reasons to prefer the original approach.
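
    For readers who want the MaxEnt calculation being argued about in its bare form, the generic Jaynes derivation fits in a few lines; this is the standard textbook version, with an abundance-like quantity E_i playing the role of the constrained observable, and is not specific to either paper.

```latex
\max_{\{p_i\}}\; -\sum_i p_i \ln p_i
\quad\text{subject to}\quad
\sum_i p_i = 1,\qquad \sum_i p_i E_i = \bar{E}.
```

    Introducing Lagrange multipliers and setting the derivative of the Lagrangian with respect to each p_i to zero gives

```latex
-\ln p_i - 1 - \lambda_0 - \lambda_1 E_i = 0
\;\;\Longrightarrow\;\;
p_i = \frac{e^{-\lambda_1 E_i}}{Z},
\qquad Z = \sum_j e^{-\lambda_1 E_j},
```

    with the multiplier \lambda_1 fixed by the mean-value constraint.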

  17. Effectiveness of motivational interviewing for improving physical activity self-management for adults with type 2 diabetes: A review.

    PubMed

    Soderlund, Patricia Davern

    2018-03-01

    Objectives: This review examines the effectiveness of motivational interviewing for physical activity self-management for adults diagnosed with diabetes mellitus type 2. Motivational interviewing is a patient-centered, individually tailored counseling intervention that aims to elicit a patient's own motivation for health behavior change. Review questions include (a) How have motivational interviewing methods been applied to physical activity interventions for adults with diabetes mellitus type 2? (b) What motivational interviewing approaches are associated with successful physical activity outcomes with diabetes mellitus type 2? Methods: Database searches used PubMed, CINAHL, and PsycINFO for the years 2000 to 2016. Criteria for inclusion were motivational interviewing used as the principal intervention in the tradition of Miller and Rollnick, measurement of physical activity, statistical significance reported for physical activity outcomes, quantitative research, and articles written in English. Results: A total of nine studies met review criteria and four included motivational interviewing interventions associated with significant physical activity outcomes. Discussion: Findings suggest motivational interviewing sessions should target a minimal number of self-management behaviors, be delivered by counselors proficient in motivational interviewing, and use motivational interviewing protocols with an emphasis placed either on duration or frequency of sessions.

  18. Biocultural approach of the association between maturity and physical activity in youth.

    PubMed

    Werneck, André O; Silva, Danilo R; Collings, Paul J; Fernandes, Rômulo A; Ronque, Enio R V; Coelho-E-Silva, Manuel J; Sardinha, Luís B; Cyrino, Edilson S

    2017-11-13

    To test the biocultural model through direct and indirect associations between biological maturation, adiposity, cardiorespiratory fitness, feelings of sadness, social relationships, and physical activity in adolescents. This was a cross-sectional study conducted with 1,152 Brazilian adolescents aged between 10 and 17 years. Somatic maturation was estimated through Mirwald's method (peak height velocity). Physical activity was assessed through the Baecke questionnaire (occupational, leisure, and sport contexts). Body mass index, body fat (sum of skinfolds), cardiorespiratory fitness (20-m shuttle run test), self-perceptions of social relationship, and frequency of sadness feelings were obtained for statistical modeling. Somatic maturation is directly related to sport practice and leisure time physical activity only among girls (β=0.12 and β=0.09, respectively; both p<0.05). Moreover, biological (adiposity and cardiorespiratory fitness), psychological (sadness), and social (satisfaction with social relationships) variables mediated the association between maturity and physical activity in boys and for occupational physical activity in girls. In general, models presented good fit coefficients. The biocultural model presents good fit, and emotional/biological factors mediate part of the relationship between somatic maturation and physical activity. Copyright © 2017 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.

  19. Low-complexity stochastic modeling of wall-bounded shear flows

    NASA Astrophysics Data System (ADS)

    Zare, Armin

    Turbulent flows are ubiquitous in nature and they appear in many engineering applications. Transition to turbulence, in general, increases skin-friction drag in air/water vehicles compromising their fuel-efficiency and reduces the efficiency and longevity of wind turbines. While traditional flow control techniques combine physical intuition with costly experiments, their effectiveness can be significantly enhanced by control design based on low-complexity models and optimization. In this dissertation, we develop a theoretical and computational framework for the low-complexity stochastic modeling of wall-bounded shear flows. Part I of the dissertation is devoted to the development of a modeling framework which incorporates data-driven techniques to refine physics-based models. We consider the problem of completing partially known sample statistics in a way that is consistent with underlying stochastically driven linear dynamics. Neither the statistics nor the dynamics are precisely known. Thus, our objective is to reconcile the two in a parsimonious manner. To this end, we formulate optimization problems to identify the dynamics and directionality of input excitation in order to explain and complete available covariance data. For problem sizes that general-purpose solvers cannot handle, we develop customized optimization algorithms based on alternating direction methods. The solution to the optimization problem provides information about critical directions that have maximal effect in bringing model and statistics in agreement. In Part II, we employ our modeling framework to account for statistical signatures of turbulent channel flow using low-complexity stochastic dynamical models. We demonstrate that white-in-time stochastic forcing is not sufficient to explain turbulent flow statistics and develop models for colored-in-time forcing of the linearized Navier-Stokes equations. We also examine the efficacy of stochastically forced linearized NS equations and their parabolized equivalents in the receptivity analysis of velocity fluctuations to external sources of excitation as well as capturing the effect of the slowly-varying base flow on streamwise streaks and Tollmien-Schlichting waves. In Part III, we develop a model-based approach to design surface actuation of turbulent channel flow in the form of streamwise traveling waves. This approach is capable of identifying the drag reducing trends of traveling waves in a simulation-free manner. We also use the stochastically forced linearized NS equations to examine the Reynolds number independent effects of spanwise wall oscillations on drag reduction in turbulent channel flows. This allows us to extend the predictive capability of our simulation-free approach to high Reynolds numbers.
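
    The "stochastically driven linear dynamics" consistency that anchors this framework can be stated compactly: for dx = A x dt + B dw with stable A, the steady-state covariance X satisfies the algebraic Lyapunov equation A X + X A^T + B B^T = 0. The toy example below just verifies that relation with an off-the-shelf solver; it is a sanity check, not the customized covariance-completion algorithms developed in the dissertation.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# A small stable linear system driven by white noise: dx = A x dt + B dw.
A = np.array([[-1.0, 2.0],
              [0.0, -3.0]])
B = np.array([[1.0],
              [0.5]])

# Steady-state covariance X solves  A X + X A^T + B B^T = 0.
X = solve_continuous_lyapunov(A, -B @ B.T)

print("steady-state covariance X:\n", X)
print("Lyapunov residual norm:", np.linalg.norm(A @ X + X @ A.T + B @ B.T))
```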

  20. Learning from physics-based earthquake simulators: a minimal approach

    NASA Astrophysics Data System (ADS)

    Artale Harris, Pietro; Marzocchi, Warner; Melini, Daniele

    2017-04-01

    Physics-based earthquake simulators are aimed at generating synthetic seismic catalogs of arbitrary length, accounting for fault interaction, elastic rebound, realistic fault networks, and some simple earthquake nucleation process such as rate-and-state friction. Through comparison of synthetic and real catalogs, seismologists can gain insight into the earthquake occurrence process. Moreover, earthquake simulators can be used to infer some aspects of the statistical behavior of earthquakes within the simulated region, by analyzing timescales not accessible through observations. The development of earthquake simulators is commonly led by the approach "the more physics, the better", pushing seismologists towards increasingly earth-like simulators. However, despite the immediate attractiveness, we argue that this kind of approach makes it more and more difficult to understand which physical parameters are really relevant to describe the features of the seismic catalog in which we are interested. For this reason, here we take an opposite minimal approach and analyze the behavior of a purposely simple earthquake simulator applied to a set of California faults. The idea is that a simple model may be more informative than a complex one for some specific scientific objectives, because it is more understandable. The model has three main components: the first one is a realistic tectonic setting, i.e., a fault dataset of California; the other two components are quantitative laws for earthquake generation on each single fault, and the Coulomb Failure Function for modeling fault interaction. The final goal of this work is twofold. On one hand, we aim to identify the minimum set of physical ingredients that can satisfactorily reproduce the features of the real seismic catalog, such as short-term seismic clustering, and to investigate the hypothetical long-term behavior and fault synchronization. On the other hand, we want to investigate the limits of predictability of the model itself.

  1. A dynamical model of plasma turbulence in the solar wind

    PubMed Central

    Howes, G. G.

    2015-01-01

    A dynamical approach, rather than the usual statistical approach, is taken to explore the physical mechanisms underlying the nonlinear transfer of energy, the damping of the turbulent fluctuations, and the development of coherent structures in kinetic plasma turbulence. It is argued that the linear and nonlinear dynamics of Alfvén waves are responsible, at a very fundamental level, for some of the key qualitative features of plasma turbulence that distinguish it from hydrodynamic turbulence, including the anisotropic cascade of energy and the development of current sheets at small scales. The first dynamical model of kinetic turbulence in the weakly collisional solar wind plasma that combines self-consistently the physics of Alfvén waves with the development of small-scale current sheets is presented and its physical implications are discussed. This model leads to a simplified perspective on the nature of turbulence in a weakly collisional plasma: the nonlinear interactions responsible for the turbulent cascade of energy and the formation of current sheets are essentially fluid in nature, while the collisionless damping of the turbulent fluctuations and the energy injection by kinetic instabilities are essentially kinetic in nature. PMID:25848075

  2. The Effectiveness of Computer-Assisted Instruction to Teach Physical Examination to Students and Trainees in the Health Sciences Professions: A Systematic Review and Meta-Analysis.

    PubMed

    Tomesko, Jennifer; Touger-Decker, Riva; Dreker, Margaret; Zelig, Rena; Parrott, James Scott

    2017-01-01

    To explore knowledge and skill acquisition outcomes related to learning physical examination (PE) through computer-assisted instruction (CAI) compared with a face-to-face (F2F) approach. A systematic literature review and meta-analysis of studies published between January 2001 and December 2016 was conducted. Databases searched included Medline, Cochrane, CINAHL, ERIC, Ebsco, Scopus, and Web of Science. Studies were synthesized by study design, intervention, and outcomes. Statistical analyses included a DerSimonian-Laird random-effects model. In total, 7 studies were included in the review, and 5 in the meta-analysis. There were no statistically significant differences for knowledge (mean difference [MD] = 5.39, 95% confidence interval [CI]: -2.05 to 12.84) or skill acquisition (MD = 0.35, 95% CI: -5.30 to 6.01). The evidence does not suggest a strong consistent preference for either CAI or F2F instruction to teach students/trainees PE. Further research is needed to identify the conditions under which knowledge and skill acquisition outcomes favor one mode of instruction over the other.
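
    For readers unfamiliar with the pooling step, the DerSimonian-Laird random-effects calculation named above is short enough to write out directly; the per-study mean differences and variances below are placeholder numbers, not data from the review.

```python
import numpy as np

# Hypothetical per-study mean differences and within-study variances (placeholders).
y = np.array([4.0, 6.5, 2.0, 8.0, 5.0])
v = np.array([4.0, 9.0, 2.5, 16.0, 6.0])

# Fixed-effect weights and Cochran's Q.
w = 1.0 / v
y_fixed = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - y_fixed) ** 2)

# DerSimonian-Laird estimate of the between-study variance tau^2.
k = len(y)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Random-effects pooled estimate and 95% confidence interval.
w_star = 1.0 / (v + tau2)
pooled = np.sum(w_star * y) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
print(f"tau^2 = {tau2:.3f}")
print(f"pooled MD = {pooled:.2f} (95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f})")
```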

  3. Service innovation: a comparison of two approaches for physical screening of psychiatric inpatients.

    PubMed

    Harrison, Mark Richard; McMillan, Catherine Frances; Dickinson, Timothy

    2012-06-01

    Psychiatric medications have clear links to obesity, diabetes, dyslipidaemia, hypertension, hyperprolactinaemia and movement disorders. These disorders are a common cause of morbidity and mortality in psychiatric patients but physical screening by health services is often haphazard. We report the findings of an audit of physical screening across two hospital wards. Each ward undertook a process of service improvement. One ward modified the admissions proforma and the other developed a discharge screening clinic. The effectiveness of each of these interventions was then compared through a reaudit of practice across both wards. At baseline, screening was performed inconsistently and infrequently. On average, the modified admissions proforma increased screening rates by 4.7% compared to 30.7% for discharge screening clinics. The discharge screening clinic demonstrated statistically significant improvements in screening rates and effectively delivered health promotion advice. Discharge screening clinics are significantly more likely than improved admissions procedures to detect clinically significant abnormalities. If these abnormalities are detected and treated then the long-term physical health of psychiatric patients may be improved.

  4. Device-independent point estimation from finite data and its application to device-independent property estimation

    NASA Astrophysics Data System (ADS)

    Lin, Pei-Sheng; Rosset, Denis; Zhang, Yanbao; Bancal, Jean-Daniel; Liang, Yeong-Cherng

    2018-03-01

    The device-independent approach to physics is one where conclusions are drawn directly from the observed correlations between measurement outcomes. In quantum information, this approach allows one to make strong statements about the properties of the underlying systems or devices solely via the observation of Bell-inequality-violating correlations. However, since one can only perform a finite number of experimental trials, statistical fluctuations necessarily accompany any estimation of these correlations. Consequently, an important gap remains between the many theoretical tools developed for the asymptotic scenario and the experimentally obtained raw data. In particular, a physical and concurrently practical way to estimate the underlying quantum distribution has so far remained elusive. Here, we show that the natural analogs of the maximum-likelihood estimation technique and the least-square-error estimation technique in the device-independent context result in point estimates of the true distribution that are physical, unique, computationally tractable, and consistent. They thus serve as sound algorithmic tools allowing one to bridge the aforementioned gap. As an application, we demonstrate how such estimates of the underlying quantum distribution can be used to provide, in certain cases, trustworthy estimates of the amount of entanglement present in the measured system. In stark contrast to existing approaches to device-independent parameter estimations, our estimation does not require the prior knowledge of any Bell inequality tailored for the specific property and the specific distribution of interest.

  5. Collective Phase in Resource Competition in a Highly Diverse Ecosystem.

    PubMed

    Tikhonov, Mikhail; Monasson, Remi

    2017-01-27

    Organisms shape their own environment, which in turn affects their survival. This feedback becomes especially important for communities containing a large number of species; however, few existing approaches allow studying this regime, except in simulations. Here, we use methods of statistical physics to analytically solve a classic ecological model of resource competition introduced by MacArthur in 1969. We show that the nonintuitive phenomenology of highly diverse ecosystems includes a phase where the environment constructed by the community becomes fully decoupled from the outside world.
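    As context for the model class, the sketch below integrates a generic MacArthur-type consumer-resource system numerically. This is not the authors' analytic, statistical-physics treatment, and all species counts, consumption matrices, and rates are arbitrary illustrative values.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    rng = np.random.default_rng(1)
    S, R = 8, 4                          # numbers of species and resources (arbitrary)
    C = rng.uniform(0.0, 1.0, (S, R))    # consumption preferences (made-up values)
    K = np.full(R, 5.0)                  # resource supply levels
    m = C.sum(axis=1) * 0.8              # maintenance costs, loosely tied to total uptake

    def rhs(t, y):
        n, r = y[:S], y[S:]
        growth = C @ r - m                   # per-capita growth from harvested resources
        dn = n * growth
        dr = r * (K - r) - r * (C.T @ n)     # logistic renewal minus consumption
        return np.concatenate([dn, dr])

    y0 = np.concatenate([np.full(S, 0.1), K.copy()])
    sol = solve_ivp(rhs, (0, 200), y0, rtol=1e-8, atol=1e-10)
    survivors = np.sum(sol.y[:S, -1] > 1e-3)
    print(f"{survivors} of {S} species persist at the final time")
    ```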

  6. A multi-resolution approach for optimal mass transport

    NASA Astrophysics Data System (ADS)

    Dominitz, Ayelet; Angenent, Sigurd; Tannenbaum, Allen

    2007-09-01

    Optimal mass transport is an important technique with numerous applications in econometrics, fluid dynamics, automatic control, statistical physics, shape optimization, expert systems, and meteorology. Motivated by certain problems in image registration and medical image visualization, in this note, we describe a simple gradient descent methodology for computing the optimal L2 transport mapping which may be easily implemented using a multiresolution scheme. We also indicate how the optimal transport map may be computed on the sphere. A numerical example is presented illustrating our ideas.
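    The note's multiresolution gradient-descent scheme is not reproduced here; as a simpler point of reference, in one dimension the optimal L2 transport map has the closed form of a monotone rearrangement, T = G^{-1} o F, which the sketch below estimates from samples by quantile matching.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, 5000)        # samples from the source density
    y = rng.gamma(2.0, 1.0, 5000)         # samples from the target density

    # Empirical 1D optimal (monotone) transport map: T(x) = G^{-1}(F(x)),
    # approximated by matching the quantiles of the two samples.
    qs = np.linspace(0.001, 0.999, 999)
    Fq = np.quantile(x, qs)               # source quantiles
    Gq = np.quantile(y, qs)               # target quantiles

    def transport(x_new):
        return np.interp(x_new, Fq, Gq)   # piecewise-linear quantile matching

    print(transport(np.array([-1.0, 0.0, 1.0])))
    ```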

  7. Innovative Multimodal Physical Therapy Reduces Incidence of Repeat Manipulation under Anesthesia in Post-Total Knee Arthroplasty Patients Who Had an Initial Manipulation under Anesthesia.

    PubMed

    Chughtai, Morad; McGinn, Tanner; Bhave, Anil; Khan, Sabahat; Vashist, Megha; Khlopas, Anton; Mont, Michael A

    2016-11-01

    Manipulation under anesthesia (MUA) is performed for knee stiffness following a total knee arthroplasty (TKA) when nonoperative treatments fail. It is important to develop an optimal outpatient physical therapy protocol following an MUA, to avoid a repeat procedure. The purpose of this study was to evaluate and compare: (1) range of motion and (2) the rate of repeat MUA in patients who either underwent innovative multimodal physical therapy (IMMPT) or standard-of-care physical therapy (standard) following an MUA after a TKA. We performed a retrospective database study of patients who underwent an MUA following a TKA between January 2013 and December 2014 (N = 57). There were 16 (28%) men and 41 (72%) women who had a mean age of 59 years (range, 32-81 years). The patients were stratified into those who underwent IMMPT (n = 22) and those who underwent standard physical therapy (n = 35). The 6-month range of motion and the rate of repeat manipulation were compared between the two cohorts using Student's t-tests and chi-square tests. In addition, we performed a Kaplan-Meier analysis of time to repeat MUA. The IMMPT cohort had a statistically significant higher proportion of TKAs with an optimal range of motion as compared with the standard cohort. There was a statistically significantly lower proportion of patients who underwent a repeat MUA in the IMMPT cohort as compared with the standard cohort. There was also a significantly lower incidence of, and longer time to, repeat MUA in the IMMPT cohort as compared with the standard cohort in the Kaplan-Meier analysis. The group who underwent IMMPT utilizing Astym therapy had a significantly higher proportion of patients with optimal range of motion, which implies the potential efficacy of this regimen to improve range of motion. Furthermore, the IMMPT cohort had a significantly lower proportion of repeat manipulations as compared with the standard cohort, which implies that an IMMPT approach could potentially reduce the need for a repeat MUA. These findings warrant further investigation into outcomes of different rehabilitation approaches.


  8. Improving nutrition and physical activity in the workplace: a meta-analysis of intervention studies.

    PubMed

    Hutchinson, Amanda D; Wilson, Carlene

    2012-06-01

    A comprehensive search of the literature for studies examining physical activity or nutrition interventions in the workplace, published between 1999 and March 2009, was conducted. This search identified 29 relevant studies. Interventions were grouped according to the theoretical framework on which the interventions were based (e.g. education, cognitive-behavioural, motivation enhancement, social influence, exercise). Weighted Cohen's d effect sizes, percentage overlap statistics, confidence intervals and fail safe Ns were calculated. Most theoretical approaches were associated with small effects. However, large effects were found for some measures of interventions using motivation enhancement. Effect sizes were larger for studies focusing on one health behaviour and for randomized controlled trials. The workplace is a suitable environment for making modest changes in the physical activity, nutrition and health of employees. Further research is necessary to determine whether these changes can be maintained in the long term.
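    A hedged sketch of the effect-size computation named above follows: an unweighted Cohen's d with a pooled standard deviation, plus a simple sample-size-weighted mean effect size across studies. All numbers are placeholders, not data from the review.

    ```python
    import numpy as np

    def cohens_d(treatment, control):
        """Cohen's d with a pooled standard deviation."""
        t, c = np.asarray(treatment, float), np.asarray(control, float)
        nt, nc = len(t), len(c)
        sp = np.sqrt(((nt - 1) * t.var(ddof=1) + (nc - 1) * c.var(ddof=1)) / (nt + nc - 2))
        return (t.mean() - c.mean()) / sp

    # Placeholder post-intervention activity scores for two workplace groups
    rng = np.random.default_rng(2)
    intervention = rng.normal(55, 10, 80)
    control = rng.normal(50, 10, 75)
    d = cohens_d(intervention, control)

    # Sample-size weighting across studies, as in a weighted mean effect size
    ds = np.array([d, 0.20, 0.65])        # made-up per-study effect sizes
    ns = np.array([155, 120, 90])         # made-up total sample sizes
    weighted_d = np.sum(ds * ns) / np.sum(ns)
    print(round(d, 2), round(weighted_d, 2))
    ```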

  9. A genetically informed study of the association between harsh punishment and offspring behavioral problems.

    PubMed

    Lynch, Stacy K; Turkheimer, Eric; D'Onofrio, Brian M; Mendle, Jane; Emery, Robert E; Slutske, Wendy S; Martin, Nicholas G

    2006-06-01

    Conclusions about the effects of harsh parenting on children have been limited by research designs that cannot control for genetic or shared environmental confounds. The present study used a sample of children of twins and a hierarchical linear modeling statistical approach to analyze the consequences of varying levels of punishment while controlling for many confounding influences. The sample of 887 twin pairs and 2,554 children came from the Australian Twin Registry. Although corporal punishment per se did not have significant associations with negative childhood outcomes, harsher forms of physical punishment did appear to have specific and significant effects. The observed association between harsh physical punishment and negative outcomes in children survived a relatively rigorous test of its causal status, thereby increasing the authors' conviction that harsh physical punishment is a serious risk factor for children.

  11. Assessing Coupled Social Ecological Flood Vulnerability from Uttarakhand, India, to the State of New York with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Tellman, B.; Schwarz, B.

    2014-12-01

    This talk describes the development of a web application to predict and communicate vulnerability to floods given publicly available data, disaster science, and geotech cloud capabilities. The proof of concept in the Google Earth Engine API, with initial testing on case studies in New York and Uttarakhand, India, demonstrates the potential of highly parallelized cloud computing to model socio-ecological disaster vulnerability at high spatial and temporal resolution and in near real time. Cloud computing facilitates statistical modeling with variables derived from large public social and ecological data sets, including census data, nighttime lights (NTL), and World Pop to derive social parameters together with elevation, satellite imagery, rainfall, and observed flood data from Dartmouth Flood Observatory to derive biophysical parameters. While more traditional, physically based hydrological models that rely on flow algorithms and numerical methods are currently unavailable in parallelized computing platforms like Google Earth Engine, there is high potential to explore "data driven" modeling that trades physics for statistics in a parallelized environment. A data driven approach to flood modeling with geographically weighted logistic regression has been initially tested on Hurricane Irene in southeastern New York. Comparison of model results with observed flood data reveals a 97% accuracy of the model to predict flooded pixels. Testing on multiple storms is required to further validate this initial promising approach. A statistical social-ecological flood model that could produce rapid vulnerability assessments, predicting who might require immediate evacuation and where, could serve as an early warning. This type of early warning system would be especially relevant in data poor places lacking the computing power, high resolution data such as LiDar and stream gauges, or hydrologic expertise to run physically based models in real time. As the data-driven model presented relies on globally available data, the only real time data input required would be typical data from a weather service, e.g. precipitation or coarse resolution flood prediction. However, model uncertainty will vary locally depending upon the resolution and frequency of observed flood and socio-economic damage impact data.
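    As a toy illustration of the "data-driven" idea, the sketch below fits a plain (not geographically weighted) logistic regression to synthetic pixel features and flood labels; feature names, coefficients, and the label-generating rule are all invented for the example.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(3)
    n = 5000
    elevation = rng.uniform(0, 300, n)          # m above the river (synthetic)
    rainfall = rng.gamma(2.0, 40.0, n)          # storm-total rainfall, mm (synthetic)
    dist_to_stream = rng.uniform(0, 2000, n)    # m (synthetic)

    # Synthetic "observed flood" labels: low, wet, near-stream pixels flood more often
    logit = 2.0 - 0.03 * elevation + 0.02 * rainfall - 0.002 * dist_to_stream
    flooded = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    X = np.column_stack([elevation, rainfall, dist_to_stream])
    X_tr, X_te, y_tr, y_te = train_test_split(X, flooded, test_size=0.3, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
    ```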

  12. Microbial composition analyses by 16S rRNA sequencing: A proof of concept approach to provenance determination of archaeological ochre.

    PubMed

    Lenehan, Claire E; Tobe, Shanan S; Smith, Renee J; Popelka-Filcoff, Rachel S

    2017-01-01

    Many archaeological science studies use the concept of "provenance", where the origins of cultural material can be determined through physical or chemical properties that relate back to the origins of the material. Recent studies using DNA profiling of bacteria have been used for the forensic determination of soils, towards determination of geographic origin. This manuscript presents a novel approach to the provenance of archaeological minerals and related materials through the use of 16S rRNA sequencing analysis of microbial DNA. Through the microbial DNA characterization from ochre and multivariate statistics, we have demonstrated the clear discrimination between four distinct Australian cultural ochre sites.

  13. Partitioning a macroscopic system into independent subsystems

    NASA Astrophysics Data System (ADS)

    Delle Site, Luigi; Ciccotti, Giovanni; Hartmann, Carsten

    2017-08-01

    We discuss the problem of partitioning a macroscopic system into a collection of independent subsystems. The partitioning of a system into replica-like subsystems is nowadays a subject of major interest in several fields of theoretical and applied physics. The thermodynamic approach currently favoured by practitioners is based on a phenomenological definition of an interface energy associated with the partition, due to a lack of easily computable expressions for a microscopic (i.e. particle-based) interface energy. In this article, we outline a general approach to derive sharp and computable bounds for the interface free energy in terms of microscopic statistical quantities. We discuss potential applications in nanothermodynamics and outline possible future directions.

  14. Performance estimation for threat detection in CT systems

    NASA Astrophysics Data System (ADS)

    Montgomery, Trent; Karl, W. Clem; Castañón, David A.

    2017-05-01

    Detecting the presence of hazardous materials in suitcases and carry-on luggage is an important problem in aviation security. As the set of threats is expanding, there is a corresponding need to increase the capabilities of explosive detection systems to address these threats. However, there is a lack of principled tools for predicting the performance of alternative designs for detection systems. In this paper, we describe an approach for computing bounds on the achievable classification performance of material discrimination systems based on empirical statistics that estimate the f-divergence of the underlying features. Our approach can be used to examine alternative physical observation modalities and measurement configurations, as well as variations in reconstruction and feature extraction algorithms.
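    The paper's f-divergence machinery is not reproduced here; as one member of that family, the Bhattacharyya coefficient gives simple two-class bounds on the Bayes error (assuming equal priors), estimated below from histograms of a synthetic scalar feature. Feature distributions and bin choices are illustrative assumptions only.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    # Synthetic reconstructed-attenuation features for "benign" and "threat" materials
    benign = rng.normal(1.0, 0.15, 20000)
    threat = rng.normal(1.4, 0.20, 20000)

    # Histogram-based estimate of the Bhattacharyya coefficient rho = sum sqrt(p * q)
    bins = np.linspace(0.0, 2.5, 120)
    p, _ = np.histogram(benign, bins=bins)
    q, _ = np.histogram(threat, bins=bins)
    p = p / p.sum()
    q = q / q.sum()
    rho = np.sum(np.sqrt(p * q))

    # Equal-prior bounds on the Bayes error of any classifier using this feature
    upper = 0.5 * rho
    lower = 0.5 * (1.0 - np.sqrt(1.0 - rho**2))
    print(f"rho = {rho:.3f}; Bayes error in [{lower:.3f}, {upper:.3f}]")
    ```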

  15. Use of Multiscale Entropy to Facilitate Artifact Detection in Electroencephalographic Signals

    PubMed Central

    Mariani, Sara; Borges, Ana F. T.; Henriques, Teresa; Goldberger, Ary L.; Costa, Madalena D.

    2016-01-01

    Electroencephalographic (EEG) signals present a myriad of challenges to analysis, beginning with the detection of artifacts. Prior approaches to noise detection have utilized multiple techniques, including visual methods, independent component analysis and wavelets. However, no single method is broadly accepted, inviting alternative ways to address this problem. Here, we introduce a novel approach based on a statistical physics method, multiscale entropy (MSE) analysis, which quantifies the complexity of a signal. We postulate that noise corrupted EEG signals have lower information content, and, therefore, reduced complexity compared with their noise free counterparts. We test the new method on an open-access database of EEG signals with and without added artifacts due to electrode motion. PMID:26738116
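    A minimal sketch of the MSE procedure (coarse-graining followed by sample entropy at each scale) is given below. The signals are synthetic stand-ins, not the open-access EEG database mentioned, and the tolerance and embedding parameters are common defaults rather than the authors' settings.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, tol=0.1):
        """Naive sample entropy SampEn(m, tol) of a 1D signal (O(N^2), fine for short traces)."""
        x = np.asarray(x, float)
        def count_matches(mm):
            templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
            iu = np.triu_indices(len(templates), k=1)
            return np.sum(d[iu] <= tol)
        B, A = count_matches(m), count_matches(m + 1)
        return -np.log(A / B) if A > 0 and B > 0 else np.inf

    def multiscale_entropy(x, max_scale=5, m=2, r=0.15):
        """Coarse-grain the signal at scales 1..max_scale and compute SampEn at each scale."""
        x = np.asarray(x, float)
        tol = r * x.std()     # tolerance fixed from the original signal, per the usual MSE convention
        mse = []
        for tau in range(1, max_scale + 1):
            n = (len(x) // tau) * tau
            coarse = x[:n].reshape(-1, tau).mean(axis=1)
            mse.append(sample_entropy(coarse, m, tol))
        return np.array(mse)

    rng = np.random.default_rng(5)
    clean = np.cumsum(rng.normal(size=1000)) * 0.05 + np.sin(np.linspace(0, 60, 1000))
    noisy = clean + rng.normal(0, 2.0, 1000)   # crude stand-in for a motion artifact
    print(multiscale_entropy(clean))
    print(multiscale_entropy(noisy))
    ```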

  16. Mean-field approximation for spacing distribution functions in classical systems.

    PubMed

    González, Diego Luis; Pimpinelli, Alberto; Einstein, T L

    2012-01-01

    We propose a mean-field method to calculate approximately the spacing distribution functions p^(n)(s) in one-dimensional classical many-particle systems. We compare our method with two other commonly used methods, the independent interval approximation and the extended Wigner surmise. In our mean-field approach, p^(n)(s) is calculated from a set of Langevin equations, which are decoupled by using a mean-field approximation. We find that in spite of its simplicity, the mean-field approximation provides good results in several systems. We offer many examples illustrating that the three previously mentioned methods give a reasonable description of the statistical behavior of the system. The physical interpretation of each method is also discussed.

  17. Quality-of-Life Outcomes of Patients following Patellofemoral Stabilization Surgery: The Influence of Trochlear Dysplasia.

    PubMed

    Hiemstra, Laurie Anne; Kerslake, Sarah; Lafave, Mark R

    2017-11-01

    Trochlear dysplasia is a well-described risk factor for recurrent patellofemoral instability. Despite its clear association with the incidence of patellofemoral instability, it is unclear whether the presence of high-grade trochlear dysplasia influences clinical outcome after patellofemoral stabilization. The purpose of this study was to assess whether trochlear dysplasia influenced patient-reported, disease-specific outcomes in surgically treated patellar instability patients, when risk factors were addressed in accordance with the à la carte surgical approach to the treatment of patellofemoral instability. The study design is a case series. A total of 318 patellar stabilization procedures were performed during the study period. Of these procedures, 260 had adequate lateral radiographs and complete Banff Patellar Instability Instrument (BPII) scores available for assessment. A Pearson r correlation was calculated between four characteristics of trochlear dysplasia and the BPII total and BPII symptoms and physical complaints scores, a mean of 24 months following patellofemoral stabilization. Independent t-tests were performed between stratified trochlear dysplasia groups (no/low grade and high grade) and all BPII measures. There was a statistically significant correlation between measures of trochlear dysplasia and quality-of-life physical symptoms scores, an average of 2 years following patellofemoral stabilization surgery. The BPII symptoms and physical complaints domain score, as well as the individual weakness and stiffness questions, correlated with the classification of trochlear dysplasia as well as the presence of a trochlear bump (p < 0.05). Independent t-tests demonstrated statistically significant differences between the no/low-grade and high-grade dysplasia groups for the BPII stiffness (p = 0.002), BPII weakness (p = 0.05), and BPII symptoms and physical complaints values (p = 0.04). Two additional measures, the 24-month postoperative total BPII score (p = 0.11) and the BPII pain score (p = 0.07), demonstrated trends toward statistical significance. This research has established a statistically significant correlation between trochlear dysplasia and disease-specific quality-of-life outcomes following patellofemoral stabilization surgery. There was a significant correlation between patient-reported physical symptoms after surgery and high-grade trochlear dysplasia.

  18. To b or not to b? A nonextensive view of b-value in the Gutenberg-Richter law.

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos

    2014-05-01

    The Gutenberg-Richter (GR) law (Gutenberg and Richter, 1944), one of the cornerstones of modern seismology, has been considered a paradigm of self-organized criticality, since the cumulative number of earthquakes as a function of energy, i.e., the number of earthquakes with energy greater than E, behaves as a power law whose b-value is related to the critical exponent. A great number of seismic hazard studies have originated from this law. The GR law is an empirical relationship which recent efforts relate to general physical principles (Kagan and Knopoff, 1981; Wesnousky, 1999; Sarlis et al., 2010; Telesca, 2012; Vallianatos and Sammonds, 2013). Nonextensive statistical mechanics, pioneered by Tsallis (Tsallis, 2009), provides a consistent theoretical framework for the study of complex systems in their nonequilibrium stationary states, systems with multifractal and self-similar structures, long-range interacting systems, etc. The Earth is such a system. In the present work we analyze the different pathways (originating with Sotolongo-Costa and Posadas, 2004, and Silva et al., 2006) to a generalization of the GR law obtained in the framework of nonextensive statistical physics. We estimate the b-value and discuss its underlying physics. This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project of the "Education & Lifelong Learning" Operational Programme. References: Gutenberg, B. and C. F. Richter (1944). Bull. Seismol. Soc. Am. 34, 185-188. Kagan, Y. Y. and L. Knopoff (1981). J. Geophys. Res. 86, 2853-2862. Sarlis, N., E. Skordas and P. Varotsos (2010). Phys. Rev. E 82(2), 021110. Silva, R., G. Franca, C. Vilar and J. Alcaniz (2006). Phys. Rev. E 73, 026102. Sotolongo-Costa, O. and A. Posadas (2004). Phys. Rev. Lett. 92, 048501. Telesca, L. (2012). Bull. Seismol. Soc. Am. 102, 886-891. Tsallis, C. (2009). Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World. Springer, New York. Vallianatos, F. and P. Sammonds (2013). Tectonophysics 590, 52-58. Wesnousky, S. G. (1999). Bull. Seismol. Soc. Am. 89, 1131-1137.
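    For orientation, a minimal sketch of the standard (non-Tsallis) maximum-likelihood b-value estimate (Aki, 1965) on a synthetic catalogue is given below; the nonextensive generalisation discussed in the abstract is not implemented here, and all catalogue parameters are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    b_true, Mc = 1.0, 2.0
    # Under the GR law, magnitudes above the completeness threshold Mc are exponential
    # with rate beta = b * ln(10); simulate a synthetic catalogue of 5000 events.
    mags = Mc + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=5000)

    # Aki (1965) maximum-likelihood estimate of b; for magnitudes binned with width dM,
    # Utsu's correction replaces Mc by Mc - dM/2.
    b_hat = np.log10(np.e) / (mags.mean() - Mc)
    b_err = b_hat / np.sqrt(len(mags))      # first-order standard error estimate
    print(f"b = {b_hat:.3f} +/- {b_err:.3f}")
    ```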

  19. Testing the Self-Consistency of the Excursion Set Approach to Predicting the Dark Matter Halo Mass Function

    NASA Astrophysics Data System (ADS)

    Achitouv, I.; Rasera, Y.; Sheth, R. K.; Corasaniti, P. S.

    2013-12-01

    The excursion set approach provides a framework for predicting how the abundance of dark matter halos depends on the initial conditions. A key ingredient of this formalism is the specification of a critical overdensity threshold (barrier) which protohalos must exceed if they are to form virialized halos at a later time. However, to make its predictions, the excursion set approach explicitly averages over all positions in the initial field, rather than the special ones around which halos form, so it is not clear that the barrier has physical motivation or meaning. In this Letter we show that once the statistical assumptions which underlie the excursion set approach are considered a drifting diffusing barrier model does provide a good self-consistent description both of halo abundance as well as of the initial overdensities of the protohalo patches.

  20. Physics-based statistical model and simulation method of RF propagation in urban environments

    DOEpatents

    Pao, Hsueh-Yuan; Dvorak, Steven L.

    2010-09-14

    A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient closed-form parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields from which predictions of communications capability may be made.

  1. Algorithms for Spectral Decomposition with Applications to Optical Plume Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Srivastava, Askok N.; Matthews, Bryan; Das, Santanu

    2008-01-01

    The analysis of spectral signals for features that represent physical phenomenon is ubiquitous in the science and engineering communities. There are two main approaches that can be taken to extract relevant features from these high-dimensional data streams. The first set of approaches relies on extracting features using a physics-based paradigm where the underlying physical mechanism that generates the spectra is used to infer the most important features in the data stream. We focus on a complementary methodology that uses a data-driven technique that is informed by the underlying physics but also has the ability to adapt to unmodeled system attributes and dynamics. We discuss the following four algorithms: Spectral Decomposition Algorithm (SDA), Non-Negative Matrix Factorization (NMF), Independent Component Analysis (ICA) and Principal Components Analysis (PCA) and compare their performance on a spectral emulator which we use to generate artificial data with known statistical properties. This spectral emulator mimics the real-world phenomena arising from the plume of the space shuttle main engine and can be used to validate the results that arise from various spectral decomposition algorithms and is very useful for situations where real-world systems have very low probabilities of fault or failure. Our results indicate that methods like SDA and NMF provide a straightforward way of incorporating prior physical knowledge while NMF with a tuning mechanism can give superior performance on some tests. We demonstrate these algorithms to detect potential system-health issues on data from a spectral emulator with tunable health parameters.
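    NASA's SDA is not publicly sketched here; as an illustration of the non-negative factorization step the abstract names, the snippet below applies scikit-learn's NMF to synthetic mixed spectra. The line shapes, mixing weights, and component count are invented for the example.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(7)
    wavelengths = np.linspace(300, 800, 400)

    def line(center, width):
        return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

    # Two synthetic, non-negative "source" spectra (stand-ins for plume species)
    sources = np.vstack([line(450, 8) + 0.5 * line(520, 10),
                         line(610, 12) + 0.3 * line(660, 6)])
    # Observed spectra: random non-negative mixtures plus small positive noise
    mixing = rng.uniform(0, 1, (200, 2))
    observed = mixing @ sources + rng.uniform(0, 0.02, (200, len(wavelengths)))

    model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
    W = model.fit_transform(observed)      # per-observation abundances
    H = model.components_                  # recovered basis spectra
    print(W.shape, H.shape, f"reconstruction error: {model.reconstruction_err_:.3f}")
    ```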

  2. NASA GPM GV Science Requirements

    NASA Technical Reports Server (NTRS)

    Smith, E.

    2003-01-01

    An important scientific objective of the NASA portion of the GPM Mission is to generate quantitatively-based error characterization information along with the rainrate retrievals emanating from the GPM constellation of satellites. These data must serve four main purposes: (1) they must be of sufficient quality, uniformity, and timeliness to govern the observation weighting schemes used in the data assimilation modules of numerical weather prediction models; (2) they must extend over that portion of the globe accessible by the GPM core satellite on which the NASA GV program is focused (approximately 65 degree inclination); (3) they must have sufficient specificity to enable detection of physically-formulated microphysical and meteorological weaknesses in the standard physical level 2 rainrate algorithms to be used in the GPM Precipitation Processing System (PPS), i.e., algorithms which will have evolved from the TRMM standard physical level 2 algorithms; and (4) they must support the use of physical error modeling as a primary validation tool and as the eventual replacement of the conventional GV approach of statistically intercomparing surface rainrates from ground and satellite measurements. This approach to ground validation research represents a paradigm shift vis-à-vis the program developed for the TRMM mission, which conducted ground validation largely as a statistical intercomparison process between raingauge-derived or radar-derived rainrates and the TRMM satellite rainrate retrievals -- long after the original satellite retrievals were archived. This approach has been able to quantify averaged rainrate differences between the satellite algorithms and the ground instruments, but has not been able to explain causes of algorithm failures or produce error information directly compatible with the cost functions of data assimilation schemes. These schemes require periodic and near-realtime bias uncertainty (i.e., global space-time distributed conditional accuracy of the retrieved rainrates) and local error covariance structure (i.e., global space-time distributed error correlation information for the local 4-dimensional space-time domain -- or in simpler terms, the matrix form of precision error). This can only be accomplished by establishing a network of high-quality, heavily instrumented supersites selectively distributed at a few oceanic, continental, and coastal sites. Economics and pragmatics dictate that the network must be made up of a relatively small number of sites (6-8) created through international cooperation. This presentation will address some of the details of the methodology behind the error characterization approach, some proposed solutions for expanding site-developed error properties to regional scales, a data processing and communications concept that would enable rapid implementation of algorithm improvement by the algorithm developers, and the likely available options for developing the supersite network.

  3. A new-old approach for shallow landslide analysis and susceptibility zoning in fine-grained weathered soils of southern Italy

    NASA Astrophysics Data System (ADS)

    Cascini, Leonardo; Ciurleo, Mariantonietta; Di Nocera, Silvio; Gullà, Giovanni

    2015-07-01

    Rainfall-induced shallow landslides involve several geo-environmental contexts and different types of soils. In clayey soils, they affect the most superficial layer, which is generally constituted by physically weathered soils characterised by a diffuse pattern of cracks. This type of landslide most commonly occurs in the form of multiple-occurrence landslide phenomena simultaneously involving large areas and thus has several consequences in terms of environmental and economic damage. Indeed, landslide susceptibility zoning is a relevant issue for land use planning and/or design purposes. This study proposes a multi-scale approach to reach this goal. The proposed approach is tested and validated over an area in southern Italy affected by widespread shallow landslides that can be classified as earth slides and earth slide-flows. Specifically, by moving from a small (1:100,000) to a medium scale (1:25,000), with the aid of heuristic and statistical methods, the approach identifies the main factors leading to landslide occurrence and effectively detects the areas potentially affected by these phenomena. Finally, at a larger scale (1:5000), deterministic methods, i.e., physically based models (TRIGRS and TRIGRS-unsaturated), allow quantitative landslide susceptibility assessment, starting from sample areas representative of those that can be affected by shallow landslides. Considering the reliability of the obtained results, the proposed approach seems useful for analysing other case studies in similar geological contexts.

  4. Statistical inference approach to structural reconstruction of complex networks from binary time series

    NASA Astrophysics Data System (ADS)

    Ma, Chuang; Chen, Han-Shuang; Lai, Ying-Cheng; Zhang, Hai-Feng

    2018-02-01

    Complex networks hosting binary-state dynamics arise in a variety of contexts. In spite of previous works, to fully reconstruct the network structure from observed binary data remains challenging. We articulate a statistical inference based approach to this problem. In particular, exploiting the expectation-maximization (EM) algorithm, we develop a method to ascertain the neighbors of any node in the network based solely on binary data, thereby recovering the full topology of the network. A key ingredient of our method is the maximum-likelihood estimation of the probabilities associated with actual or nonexistent links, and we show that the EM algorithm can distinguish the two kinds of probability values without any ambiguity, insofar as the length of the available binary time series is reasonably long. Our method does not require any a priori knowledge of the detailed dynamical processes, is parameter-free, and is capable of accurate reconstruction even in the presence of noise. We demonstrate the method using combinations of distinct types of binary dynamical processes and network topologies, and provide a physical understanding of the underlying reconstruction mechanism. Our statistical inference based reconstruction method contributes an additional piece to the rapidly expanding "toolbox" of data based reverse engineering of complex networked systems.
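    The paper's full EM reconstruction is not reproduced here. As a toy illustration of the key idea, that EM can separate the probability values associated with actual and nonexistent links, the sketch below fits a two-component binomial mixture to synthetic per-pair co-activation counts; all probabilities, counts, and sizes are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    T = 500                                  # length of the binary time series
    # Toy stand-in: each node pair co-activates with a high probability if a link
    # exists and a low probability otherwise (values are illustrative only).
    links = rng.random(300) < 0.2
    p_true = np.where(links, 0.30, 0.05)
    counts = rng.binomial(T, p_true)         # observed co-activation counts per pair

    # EM for a two-component binomial mixture: separates the two probability values
    pi, p_lo, p_hi = 0.5, 0.02, 0.5          # crude initial guesses
    for _ in range(200):
        def loglik(p):
            return counts * np.log(p) + (T - counts) * np.log(1 - p)
        # E-step: responsibility that each pair belongs to the "link" component
        log_r = np.log(pi) + loglik(p_hi) - np.logaddexp(np.log(pi) + loglik(p_hi),
                                                         np.log(1 - pi) + loglik(p_lo))
        r = np.exp(log_r)
        # M-step: update the mixing weight and the two probabilities
        pi = r.mean()
        p_hi = np.sum(r * counts) / (T * np.sum(r))
        p_lo = np.sum((1 - r) * counts) / (T * np.sum(1 - r))

    predicted_links = r > 0.5
    accuracy = np.mean(predicted_links == links)
    print(f"p_lo = {p_lo:.3f}, p_hi = {p_hi:.3f}, link-recovery accuracy = {accuracy:.2f}")
    ```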

  5. Modelling nitrate pollution pressure using a multivariate statistical approach: the case of Kinshasa groundwater body, Democratic Republic of Congo

    NASA Astrophysics Data System (ADS)

    Mfumu Kihumba, Antoine; Ndembo Longo, Jean; Vanclooster, Marnik

    2016-03-01

    A multivariate statistical modelling approach was applied to explain the anthropogenic pressure of nitrate pollution on the Kinshasa groundwater body (Democratic Republic of Congo). Multiple regression and regression tree models were compared and used to identify major environmental factors that control the groundwater nitrate concentration in this region. The analyses were made in terms of physical attributes related to the topography, land use, geology and hydrogeology in the capture zone of different groundwater sampling stations. For the nitrate data, groundwater datasets from two different surveys were used. The statistical models identified the topography, the residential area, the service land (cemetery), and the surface-water land-use classes as major factors explaining nitrate occurrence in the groundwater. Also, groundwater nitrate pollution depends not on one single factor but on the combined influence of factors representing nitrogen loading sources and aquifer susceptibility characteristics. The groundwater nitrate pressure was better predicted with the regression tree model than with the multiple regression model. Furthermore, the results elucidated the sensitivity of the model performance towards the method of delineation of the capture zones. For pollution modelling at the monitoring points, therefore, it is better to identify capture-zone shapes based on a conceptual hydrogeological model rather than to adopt arbitrary circular capture zones.
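    To illustrate the comparison of the two model classes, the sketch below cross-validates a multiple regression and a regression tree on synthetic capture-zone attributes; the predictors, the nitrate-generating rule, and the tree depth are assumptions for the example, not the study's data.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(9)
    n = 400
    # Synthetic capture-zone attributes (stand-ins for the study's predictors)
    slope = rng.uniform(0, 20, n)                 # topographic slope, %
    residential = rng.uniform(0, 1, n)            # residential land-use fraction
    cemetery = rng.binomial(1, 0.1, n)            # service land (cemetery) present
    surface_water = rng.uniform(0, 1, n)          # surface-water land-use fraction

    # Synthetic nitrate response with an interaction that a tree can pick up
    nitrate = (10 + 40 * residential + 25 * cemetery - 0.8 * slope
               - 30 * residential * surface_water + rng.normal(0, 5, n))

    X = np.column_stack([slope, residential, cemetery, surface_water])
    for name, model in [("multiple regression", LinearRegression()),
                        ("regression tree", DecisionTreeRegressor(max_depth=4, random_state=0))]:
        r2 = cross_val_score(model, X, nitrate, cv=5, scoring="r2").mean()
        print(f"{name}: mean cross-validated R^2 = {r2:.2f}")
    ```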

  6. Statistical inference approach to structural reconstruction of complex networks from binary time series.

    PubMed

    Ma, Chuang; Chen, Han-Shuang; Lai, Ying-Cheng; Zhang, Hai-Feng

    2018-02-01

    Complex networks hosting binary-state dynamics arise in a variety of contexts. In spite of previous works, to fully reconstruct the network structure from observed binary data remains challenging. We articulate a statistical inference based approach to this problem. In particular, exploiting the expectation-maximization (EM) algorithm, we develop a method to ascertain the neighbors of any node in the network based solely on binary data, thereby recovering the full topology of the network. A key ingredient of our method is the maximum-likelihood estimation of the probabilities associated with actual or nonexistent links, and we show that the EM algorithm can distinguish the two kinds of probability values without any ambiguity, insofar as the length of the available binary time series is reasonably long. Our method does not require any a priori knowledge of the detailed dynamical processes, is parameter-free, and is capable of accurate reconstruction even in the presence of noise. We demonstrate the method using combinations of distinct types of binary dynamical processes and network topologies, and provide a physical understanding of the underlying reconstruction mechanism. Our statistical inference based reconstruction method contributes an additional piece to the rapidly expanding "toolbox" of data based reverse engineering of complex networked systems.

  7. Modelling the hydraulic conductivity of porous media using physical-statistical model

    NASA Astrophysics Data System (ADS)

    Usowicz, B.; Usowicz, L. B.; Lipiec, J.

    2009-04-01

    Soils and other porous media can be represented by a pattern (net) of more or less cylindrically interconnected channels. The capillary radius, r, can represent an elementary capillary formed between soil particles in one case, and in another case it can represent a mean hydrodynamic radius. When we view a porous medium as a net of interconnected capillaries, we can apply a statistical approach for the description of liquid or gas flow. A soil phase is included in the porous medium, and its configuration is decisive for the pore distribution in this medium; hence, it conditions the course of the water retention curve of the medium. In this work, a method of estimating the hydraulic conductivity of porous media based on the physical-statistical model proposed by B. Usowicz is presented. The physical-statistical model considers the pore space as a capillary net. The net of capillary connections is represented by parallel and serial connections of hydraulic resistors in the layer and between the layers, respectively. The polynomial distribution was used in this model to determine the probability of the occurrence of a given capillary configuration. The model was calibrated using a measured water retention curve and two values of hydraulic conductivity (saturated and unsaturated), and the model parameters were determined. The model was used for predicting hydraulic conductivity as a function of soil water content, K(theta). The model was validated by comparing the measured and predicted K data for various soils and other porous media (e.g. sandstone). Agreement between measured and predicted data was good, as indicated by R2 values (>0.9). It was also confirmed that the random variables used for the calculations and the model parameters were chosen correctly. The study was funded in part by the Polish Ministry of Science and Higher Education under Grant No. N305 046 31/1707.

  8. Scientific Data Services -- A High-Performance I/O System with Array Semantics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Kesheng; Byna, Surendra; Rotem, Doron

    2011-09-21

    As high-performance computing approaches exascale, the existing I/O system design is having trouble keeping pace in both performance and scalability. We propose to address this challenge by adopting database principles and techniques in parallel I/O systems. First, we propose to adopt an array data model because many scientific applications represent their data in arrays. This strategy follows a cardinal principle from database research, which separates the logical view from the physical layout of data. This high-level data model gives the underlying implementation more freedom to optimize the physical layout and to choose the most effective way of accessing the data. For example, knowing that a set of write operations is working on a single multi-dimensional array makes it possible to keep the subarrays in a log structure during the write operations and reassemble them later into another physical layout as resources permit. While maintaining the high-level view, the storage system could compress the user data to reduce the physical storage requirement, collocate data records that are frequently used together, or replicate data to increase availability and fault-tolerance. Additionally, the system could generate secondary data structures such as database indexes and summary statistics. We expect the proposed Scientific Data Services approach to create a "live" storage system that dynamically adjusts to user demands and evolves with the massively parallel storage hardware.

  9. Some Research Centers for Plasma Physics and Solid State Physics in the Netherlands and Belgium. Part II. Belgium,

    DTIC Science & Technology

    plasma column and observed the interesting phenomenon of plasma ejection. At FUB, Balescu and Prigogine direct a group of sixty theoreticians doing...outstanding work in statistical physics. Balescu is writing another graduate textbook on non-equilibrium statistical mechanics. He is tackling the

  10. Physics in Perspective Volume II, Part C, Statistical Data.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC. Physics Survey Committee.

    Statistical data relating to the sociology and economics of the physics enterprise are presented and explained. The data are divided into three sections: manpower data, data on funding and costs, and data on the literature of physics. Each section includes numerous studies, with notes on the sources and types of data, gathering procedures, and…

  11. Coupled disease-behavior dynamics on complex networks: A review.

    PubMed

    Wang, Zhen; Andrews, Michael A; Wu, Zhi-Xi; Wang, Lin; Bauch, Chris T

    2015-12-01

    It is increasingly recognized that a key component of successful infection control efforts is understanding the complex, two-way interaction between disease dynamics and human behavioral and social dynamics. Human behavior such as contact precautions and social distancing clearly influence disease prevalence, but disease prevalence can in turn alter human behavior, forming a coupled, nonlinear system. Moreover, in many cases, the spatial structure of the population cannot be ignored, such that social and behavioral processes and/or transmission of infection must be represented with complex networks. Research on studying coupled disease-behavior dynamics in complex networks in particular is growing rapidly, and frequently makes use of analysis methods and concepts from statistical physics. Here, we review some of the growing literature in this area. We contrast network-based approaches to homogeneous-mixing approaches, point out how their predictions differ, and describe the rich and often surprising behavior of disease-behavior dynamics on complex networks, and compare them to processes in statistical physics. We discuss how these models can capture the dynamics that characterize many real-world scenarios, thereby suggesting ways that policy makers can better design effective prevention strategies. We also describe the growing sources of digital data that are facilitating research in this area. Finally, we suggest pitfalls which might be faced by researchers in the field, and we suggest several ways in which the field could move forward in the coming years.
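    A toy illustration of such a coupling, not any specific model from the review, is sketched below: a discrete-time SIR process on a random network in which the effective transmission rate drops as prevalence (and hence perceived risk) rises. The network size, rates, and feedback form are all arbitrary.

    ```python
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(10)
    G = nx.erdos_renyi_graph(2000, 0.004, seed=1)
    beta0, gamma, alpha = 0.08, 0.05, 20.0   # baseline transmission, recovery, caution strength

    state = np.zeros(G.number_of_nodes(), dtype=int)   # 0 = S, 1 = I, 2 = R
    state[rng.choice(G.number_of_nodes(), 10, replace=False)] = 1

    prevalence_series = []
    for t in range(300):
        prevalence = np.mean(state == 1)
        prevalence_series.append(prevalence)
        # Behavioral feedback: perceived risk lowers the effective transmission rate
        beta = beta0 / (1.0 + alpha * prevalence)
        new_state = state.copy()
        for node in np.where(state == 1)[0]:
            for nbr in G.neighbors(node):
                if state[nbr] == 0 and rng.random() < beta:
                    new_state[nbr] = 1
            if rng.random() < gamma:
                new_state[node] = 2
        state = new_state

    print(f"final attack rate: {np.mean(state == 2):.2f}, "
          f"peak prevalence: {max(prevalence_series):.3f}")
    ```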

  12. Fecal indicator organism modeling and microbial source tracking in environmental waters: Chapter 3.4.6

    USGS Publications Warehouse

    Nevers, Meredith; Byappanahalli, Muruleedhara; Phanikumar, Mantha S.; Whitman, Richard L.

    2016-01-01

    Mathematical models have been widely applied to surface waters to estimate rates of settling, resuspension, flow, dispersion, and advection in order to calculate movement of particles that influence water quality. Of particular interest are the movement, survival, and persistence of microbial pathogens or their surrogates, which may contaminate recreational water, drinking water, or shellfish. Most models devoted to microbial water quality have been focused on fecal indicator organisms (FIO), which act as a surrogate for pathogens and viruses. Process-based modeling and statistical modeling have been used to track contamination events to source and to predict future events. The use of these two types of models require different levels of expertise and input; process-based models rely on theoretical physical constructs to explain present conditions and biological distribution while data-based, statistical models use extant paired data to do the same. The selection of the appropriate model and interpretation of results is critical to proper use of these tools in microbial source tracking. Integration of the modeling approaches could provide insight for tracking and predicting contamination events in real time. A review of modeling efforts reveals that process-based modeling has great promise for microbial source tracking efforts; further, combining the understanding of physical processes influencing FIO contamination developed with process-based models and molecular characterization of the population by gene-based (i.e., biological) or chemical markers may be an effective approach for locating sources and remediating contamination in order to protect human health better.

  13. Worrying trends in econophysics

    NASA Astrophysics Data System (ADS)

    Gallegati, Mauro; Keen, Steve; Lux, Thomas; Ormerod, Paul

    2006-10-01

    Econophysics has already made a number of important empirical contributions to our understanding of the social and economic world. These fall mainly into the areas of finance and industrial economics, where in each case there is a large amount of reasonably well-defined data. More recently, Econophysics has also begun to tackle other areas of economics where data is much more sparse and much less reliable. In addition, econophysicists have attempted to apply the theoretical approach of statistical physics to try to understand empirical findings. Our concerns are fourfold. First, a lack of awareness of work that has been done within economics itself. Second, resistance to more rigorous and robust statistical methodology. Third, the belief that universal empirical regularities can be found in many areas of economic activity. Fourth, the theoretical models which are being used to explain empirical phenomena. The latter point is of particular concern. Essentially, the models are based upon models of statistical physics in which energy is conserved in exchange processes. There are examples in economics where the principle of conservation may be a reasonable approximation to reality, such as primitive hunter-gatherer societies. But in the industrialised capitalist economies, income is most definitely not conserved. The process of production and not exchange is responsible for this. Models which focus purely on exchange and not on production cannot by definition offer a realistic description of the generation of income in the capitalist, industrialised economies.

  14. Detailed Spectral Analysis of the 260 ks XMM-Newton Data of 1E 1207.4-5209 and Significance of a 2.1 keV Absorption Feature

    NASA Astrophysics Data System (ADS)

    Mori, Kaya; Chonko, James C.; Hailey, Charles J.

    2005-10-01

    We have reanalyzed the 260 ks XMM-Newton observation of 1E 1207.4-5209. There are several significant improvements over previous work. First, a much broader range of physically plausible spectral models was used. Second, we have used a more rigorous statistical analysis. The standard F-distribution was not employed, but rather the exact finite statistics F-distribution was determined by Monte Carlo simulations. This approach was motivated by the recent work of Protassov and coworkers and Freeman and coworkers. They demonstrated that the standard F-distribution is not even asymptotically correct when applied to assess the significance of additional absorption features in a spectrum. With our improved analysis we do not find a third and fourth spectral feature in 1E 1207.4-5209 but only the two broad absorption features previously reported. Two additional statistical tests, one line model dependent and the other line model independent, confirmed our modified F-test analysis. For all physically plausible continuum models in which the weak residuals are strong enough to fit, the residuals occur at the instrument Au M edge. As a sanity check we confirmed that the residuals are consistent in strength and position with the instrument Au M residuals observed in 3C 273.

  15. Hunting Solomonoff's Swans: Exploring the Boundary Between Physics and Statistics in Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Nearing, G. S.

    2014-12-01

    Statistical models consistently out-perform conceptual models in the short term; however, to account for a nonstationary future (or an unobserved past) scientists prefer to base predictions on unchanging and commutable properties of the universe - i.e., physics. The problem with physically-based hydrology models is, of course, that they aren't really based on physics - they are based on statistical approximations of physical interactions, and we almost uniformly lack an understanding of the entropy associated with these approximations. Thermodynamics is successful precisely because entropy statistics are computable for homogeneous (well-mixed) systems, and ergodic arguments explain the success of Newton's laws to describe systems that are fundamentally quantum in nature. Unfortunately, similar arguments do not hold for systems like watersheds that are heterogeneous at a wide range of scales. Ray Solomonoff formalized the situation in 1968 by showing that given infinite evidence, simultaneously minimizing model complexity and entropy in predictions always leads to the best possible model. The open question in hydrology is about what happens when we don't have infinite evidence - for example, when the future will not look like the past, or when one watershed does not behave like another. How do we isolate stationary and commutable components of watershed behavior? I propose that one possible answer to this dilemma lies in a formal combination of physics and statistics. In this talk I outline my recent analogue of Solomonoff's idea (his theorem was digital) that allows us to quantify the complexity/entropy tradeoff in a way that is intuitive to physical scientists. I show how to formally combine "physical" and statistical methods for model development in a way that allows us to derive the theoretically best possible model for any given physics approximation(s) and the available observations. Finally, I apply an analogue of Solomonoff's theorem to evaluate the tradeoff between model complexity and prediction power.

  16. Anatomy of the Higgs fits: A first guide to statistical treatments of the theoretical uncertainties

    NASA Astrophysics Data System (ADS)

    Fichet, Sylvain; Moreau, Grégory

    2016-04-01

    The studies of the Higgs boson couplings based on the recent and upcoming LHC data open up a new window on physics beyond the Standard Model. In this paper, we propose a statistical guide to the consistent treatment of the theoretical uncertainties entering the Higgs rate fits. Both the Bayesian and frequentist approaches are systematically analysed in a unified formalism. We present analytical expressions for the marginal likelihoods, useful to implement simultaneously the experimental and theoretical uncertainties. We review the various origins of the theoretical errors (QCD, EFT, PDF, production mode contamination…). All these individual uncertainties are thoroughly combined with the help of moment-based considerations. The theoretical correlations among Higgs detection channels appear to affect the location and size of the best-fit regions in the space of Higgs couplings. We discuss the recurrent question of the shape of the prior distributions for the individual theoretical errors and find that a nearly Gaussian prior arises from the error combinations. We also develop the bias approach, which is an alternative to marginalisation providing more conservative results. The statistical framework to apply the bias principle is introduced and two realisations of the bias are proposed. Finally, depending on the statistical treatment, the Standard Model prediction for the Higgs signal strengths is found to lie within either the 68% or 95% confidence level region obtained from the latest analyses of the 7 and 8 TeV LHC datasets.

  17. Exploring Explanations of Subglacial Bedform Sizes Using Statistical Models.

    PubMed

    Hillier, John K; Kougioumtzoglou, Ioannis A; Stokes, Chris R; Smith, Michael J; Clark, Chris D; Spagnolo, Matteo S

    2016-01-01

    Sediments beneath modern ice sheets exert a key control on their flow, but are largely inaccessible except through geophysics or boreholes. In contrast, palaeo-ice sheet beds are accessible, and typically characterised by numerous bedforms. However, the interaction between bedforms and ice flow is poorly constrained and it is not clear how bedform sizes might reflect ice flow conditions. To better understand this link we present a first exploration of a variety of statistical models to explain the size distribution of some common subglacial bedforms (i.e., drumlins, ribbed moraine, MSGL). By considering a range of models, constructed to reflect key aspects of the physical processes, it is possible to infer that the size distributions are most effectively explained when the dynamics of ice-water-sediment interaction associated with bedform growth is fundamentally random. A 'stochastic instability' (SI) model, which integrates random bedform growth and shrinking through time with exponential growth, is preferred and is consistent with other observations of palaeo-bedforms and geophysical surveys of active ice sheets. Furthermore, we give a proof-of-concept demonstration that our statistical approach can bridge the gap between geomorphological observations and physical models, directly linking measurable size-frequency parameters to properties of ice sheet flow (e.g., ice velocity). Moreover, statistically developing existing models as proposed allows quantitative predictions to be made about sizes, making the models testable; a first illustration of this is given for a hypothesised repeat geophysical survey of bedforms under active ice. Thus, we further demonstrate the potential of size-frequency distributions of subglacial bedforms to assist the elucidation of subglacial processes and better constrain ice sheet models.

  18. Proceedings of the Workshop on Change of Representation and Problem Reformulation

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.

    1992-01-01

    The proceedings of the third Workshop on Change of Representation and Problem Reformulation are presented. In contrast to the first two workshops, this workshop was focused on analytic or knowledge-based approaches, as opposed to statistical or empirical approaches called 'constructive induction'. The organizing committee believes that there is a potential for combining analytic and inductive approaches at a future date. However, it became apparent at the previous two workshops that the communities pursuing these different approaches are currently interested in largely non-overlapping issues. The constructive induction community has been holding its own workshops, principally in conjunction with the machine learning conference. While this workshop is more focused on analytic approaches, the organizing committee has made an effort to include more application domains. The workshop has greatly expanded from its origins in the machine learning community. Participants in this workshop come from the full spectrum of AI application domains including planning, qualitative physics, software engineering, knowledge representation, and machine learning.

  19. CALL FOR PAPERS: Progress in Supersymmetric Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    2003-11-01

    This is a call for contributions to a Special Issue of Journal of Physics A: Mathematical and General entitled `Statistical Physics of Disordered Systems: from Real Materials to Optimization and Codes'. This issue should be a place for high quality original work. We stress the fact that we are interested in having the topic interpreted broadly: we would like to have contributions ranging from equilibrium and dynamical studies of spin glasses, glassy behaviour in amorphous materials, and low temperature physics, to applications in non-conventional areas such as error correcting codes, image analysis and reconstruction, optimization, and algorithms based on statistical mechanical ideas. We believe that we have arrived at a very exciting moment for the development of this multidisciplinary approach, and that this issue will be of a high standard and prove to be a very useful tool in the future. The Editorial Board has invited E Marinari, H Nishimori and F Ricci-Tersenghi to serve as Guest Editors for the Special Issue. Their criteria for acceptance of contributions are the following: • The subject of the paper should relate to the statistical physics of disordered systems. • Contributions will be refereed and processed according to the usual procedures of the journal. • Papers should be original (they should not be simply reviews of authors' own work that is already published elsewhere). • Review articles will be considered for inclusion in the Special Issue only in very special cases. The editors will analyse potential proposals of reviews, and if needed they will ask for some review contributions. The guidelines for the preparation of contributions are the following: • The deadline for submission of contributions is 31 March 2003. This deadline will allow the Special Issue to appear in about October 2003. • There is a nominal page limit of 15 printed pages (approximately 9000 words) per research contribution. The contributions that have been approved by the Guest Editors as review articles will have a limit of 30 printed pages (18000 words). Papers exceeding these limits may be accepted at the discretion of the Guest Editors. Further advice on publishing your work in Journal of Physics A: Mathematical and General may be found at www.iop.org/Journals/jphysa. • Contributions to the Special Issue should if possible be submitted electronically at www.iop.org/Journals/jphysa or by e-mail to jphysa@iop.org, quoting `JPhysA Special Issue -- Statistical Physics of Disordered Systems'. Submissions should ideally be in either standard LaTeX form or Microsoft Word. Please see the web site for further information on electronic submissions. • Authors unable to submit electronically may send hard copy contributions to: Publishing Administrators, Journal of Physics A, Institute of Physics Publishing, Dirac House, Temple Back, Bristol BS1 6BE, UK, enclosing the electronic code on floppy disk if available and quoting `JPhysA Special Issue -- Statistical Physics of Disordered Systems'. • All contributions should be accompanied by a read-me file or covering letter giving the postal and e-mail addresses for correspondence. The Publishing Office should be notified of any subsequent change of address. This Special Issue will be published in both paper and online editions of the journal. The corresponding author of each contribution will receive a complimentary copy of the issue in addition to the usual 25 free offprints of their article.
E Marinari, H Nishimori and F Ricci-Tersenghi Guest Editors

  20. A Mokken scale analysis of the peer physical examination questionnaire.

    PubMed

    Vaughan, Brett; Grace, Sandra

    2018-01-01

    Peer physical examination (PPE) is a teaching and learning strategy utilised in most health profession education programs. Perceptions of participating in PPE have been described in the literature, focusing on areas of the body students are willing, or unwilling, to examine. A small number of questionnaires exist to evaluate these perceptions; however, none has described the measurement properties that would allow them to be used longitudinally. The present study undertook a Mokken scale analysis of the Peer Physical Examination Questionnaire (PPEQ) to evaluate its dimensionality and structure when used with Australian osteopathy students. Students enrolled in Year 1 of the osteopathy programs at Victoria University (Melbourne, Australia) and Southern Cross University (Lismore, Australia) were invited to complete the PPEQ prior to their first practical skills examination class. R, an open-source statistics program, was used to generate the descriptive statistics and perform a Mokken scale analysis. Mokken scale analysis is a non-parametric item response theory approach that is used to cluster items measuring a latent construct. Initial analysis suggested the PPEQ did not form a single scale. Further analysis identified three subscales: 'comfort', 'concern', and 'professionalism and education'. The properties of each subscale suggested they were unidimensional with variable internal structures. The 'comfort' subscale was the strongest of the three identified. All subscales demonstrated acceptable reliability estimation statistics (McDonald's omega > 0.75), supporting the calculation of a sum score for each subscale. The subscales identified are consistent with the literature. The 'comfort' subscale may be useful to longitudinally evaluate student perceptions of PPE. Further research is required to evaluate changes with PPE and the utility of the questionnaire with other health profession education programs.
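
    Since the abstract above describes Mokken scaling only in outline, the following is a minimal sketch of the pairwise Loevinger scalability coefficient H for dichotomous items, written in Python with hypothetical responses; the actual analysis was performed in R (e.g., the mokken package), handles polytomous Likert items and automated scale partitioning, and is not reproduced here.

    ```python
    import numpy as np

    def loevinger_h_pair(x_i, x_j):
        """Pairwise Loevinger coefficient H_ij for two dichotomous items.

        H_ij = 1 - F_ij / E_ij, where F_ij counts observed Guttman errors
        (endorsing the harder item but not the easier one) and E_ij is the
        count expected under marginal independence of the two items.
        """
        x_i, x_j = np.asarray(x_i), np.asarray(x_j)
        if x_i.mean() < x_j.mean():          # make x_i the "easier" item
            x_i, x_j = x_j, x_i
        n = len(x_i)
        observed_errors = np.sum((x_i == 0) & (x_j == 1))
        expected_errors = np.sum(x_i == 0) * np.sum(x_j == 1) / n
        return 1.0 - observed_errors / expected_errors

    # Hypothetical yes/no responses from 8 students to two comfort items
    item_a = [1, 1, 1, 0, 1, 1, 0, 1]
    item_b = [1, 0, 1, 0, 1, 0, 0, 1]
    print(loevinger_h_pair(item_a, item_b))  # items are Mokken-scalable together if H > 0.3
    ```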

  1. Effect of a Trampoline Exercise on the Anthropometric Measures and Motor Performance of Adolescent Students

    PubMed Central

    Aalizadeh, Bahman; Mohammadzadeh, Hassan; Khazani, Ali; Dadras, Ali

    2016-01-01

    Background: Physical exercises can influence some anthropometric and fitness components differently. The aim of the present study was to evaluate how a relatively long-term training program in 11-14-year-old male Iranian students affects their anthropometric and motor performance measures. Methods: Measurements were conducted on the anthropometric and fitness components of participants (n = 28) prior to and following the program. They trained for 20 weeks in four trampoline training sessions per week, 1.5 h/session with 10 min rest. Motor performance of all participants was assessed using the standing long jump and vertical jump based on the Eurofit Test Battery. Results: The repeated-measures analysis of variance (ANOVA) showed a statistically significant main effect of time on calf girth (P = 0.001), fat% (P = 0.01), vertical jump (P = 0.001), and long jump (P = 0.001), and a statistically significant main effect of group on fat% (P = 0.001). Post hoc paired t-tests indicated statistically significant differences in the trampoline group between the two measurements for calf girth (t = −4.35, P = 0.001), fat% (t = 5.87, P = 0.001), vertical jump (t = −5.53, P = 0.001), and long jump (t = −10.00, P = 0.001). Conclusions: We can conclude that 20-week trampoline training with four physical activity sessions/week in 11–14-year-old students seems to have a significant effect on body fat% reduction and effective results in terms of anaerobic physical fitness. Therefore, it is suggested that training approaches such as trampoline exercise can help students to promote their level of health and motor performance. PMID:27512557

  2. Effect of a Trampoline Exercise on the Anthropometric Measures and Motor Performance of Adolescent Students.

    PubMed

    Aalizadeh, Bahman; Mohammadzadeh, Hassan; Khazani, Ali; Dadras, Ali

    2016-01-01

    Physical exercises can influence some anthropometric and fitness components differently. The aim of the present study was to evaluate how a relatively long-term training program in 11-14-year-old male Iranian students affects their anthropometric and motor performance measures. Measurements were conducted on the anthropometric and fitness components of participants (n = 28) prior to and following the program. They trained for 20 weeks in four trampoline training sessions per week, 1.5 h/session with 10 min rest. Motor performance of all participants was assessed using the standing long jump and vertical jump based on the Eurofit Test Battery. The repeated-measures analysis of variance (ANOVA) showed a statistically significant main effect of time on calf girth (P = 0.001), fat% (P = 0.01), vertical jump (P = 0.001), and long jump (P = 0.001), and a statistically significant main effect of group on fat% (P = 0.001). Post hoc paired t-tests indicated statistically significant differences in the trampoline group between the two measurements for calf girth (t = -4.35, P = 0.001), fat% (t = 5.87, P = 0.001), vertical jump (t = -5.53, P = 0.001), and long jump (t = -10.00, P = 0.001). We can conclude that 20-week trampoline training with four physical activity sessions/week in 11-14-year-old students seems to have a significant effect on body fat% reduction and effective results in terms of anaerobic physical fitness. Therefore, it is suggested that training approaches such as trampoline exercise can help students to promote their level of health and motor performance.
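
    The pre/post comparisons reported above are standard paired (repeated-measures) tests; a minimal SciPy sketch with hypothetical jump heights, not the study's data:

    ```python
    from scipy import stats

    # Hypothetical pre/post vertical-jump heights (cm) for a small training group
    pre = [30.1, 28.5, 33.0, 29.7, 31.2, 27.9, 32.4, 30.8]
    post = [33.0, 30.2, 35.1, 31.5, 33.8, 29.0, 34.6, 32.9]

    t_stat, p_value = stats.ttest_rel(pre, post)  # post hoc paired t-test
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    ```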

  3. Statistical mechanics framework for static granular matter.

    PubMed

    Henkes, Silke; Chakraborty, Bulbul

    2009-06-01

    The physical properties of granular materials have been extensively studied in recent years. So far, however, there exists no theoretical framework which can explain the observations in a unified manner beyond the phenomenological jamming diagram. This work focuses on the case of static granular matter, where we have constructed a statistical ensemble which mirrors equilibrium statistical mechanics. This ensemble, which is based on the conservation properties of the stress tensor, is distinct from the original Edwards ensemble and applies to packings of deformable grains. We combine it with a field theoretical analysis of the packings, where the field is the Airy stress function derived from the force and torque balance conditions. In this framework, Point J is characterized by a diverging stiffness of the pressure fluctuations. Separately, we present a phenomenological mean-field theory of the jamming transition, which incorporates the mean contact number as a variable. We link both approaches in the context of the marginal rigidity picture proposed by Wyart and others.

  4. Robust approaches to quantification of margin and uncertainty for sparse data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hund, Lauren; Schroeder, Benjamin B.; Rumsey, Kelin

    Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low probability, high consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.
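
    As a toy illustration of the tail-extrapolation risk described above (not the report's methodology), the sketch below fits a normal distribution to a small sample drawn from a skewed "true" distribution and extrapolates to the 99.9th percentile; the spread of the estimates across resamples shows how fragile the extrapolated tail is. All distributions and sample sizes are arbitrary choices for illustration.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    true_dist = stats.lognorm(s=0.8)   # "reality" is skewed, unknown to the analyst
    extreme_q = 0.999                  # low-probability, high-consequence tail

    estimates = []
    for _ in range(200):
        sample = true_dist.rvs(30, random_state=rng)             # only 30 physical experiments
        mu, sigma = sample.mean(), sample.std(ddof=1)
        estimates.append(stats.norm(mu, sigma).ppf(extreme_q))   # normal-tail extrapolation

    print(f"true 99.9th percentile  : {true_dist.ppf(extreme_q):.2f}")
    print(f"extrapolated (5/50/95%) : {np.percentile(estimates, [5, 50, 95]).round(2)}")
    ```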

  5. On the statistical mechanics of species abundance distributions.

    PubMed

    Bowler, Michael G; Kelly, Colleen K

    2012-09-01

    A central issue in ecology is that of the factors determining the relative abundance of species within a natural community. The proper application of the principles of statistical physics to species abundance distributions (SADs) shows that simple ecological properties could account for the near universal features observed. These properties are (i) a limit on the number of individuals in an ecological guild and (ii) per capita birth and death rates. They underpin the neutral theory of Hubbell (2001), the master equation approach of Volkov et al. (2003, 2005) and the idiosyncratic (extreme niche) theory of Pueyo et al. (2007); they result in an underlying log series SAD, regardless of neutral or niche dynamics. The success of statistical mechanics in this application implies that communities are in dynamic equilibrium and hence that niches must be flexible and that temporal fluctuations on all sorts of scales are likely to be important in community structure. Copyright © 2012 Elsevier Inc. All rights reserved.
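
    For reference, the underlying log-series SAD mentioned above has the closed form S(n) = αxⁿ/n (Fisher's log series), with total species richness S_T = −α ln(1 − x) and community size N = αx/(1 − x). A minimal numerical sketch, with α and x chosen arbitrarily for illustration:

    ```python
    import numpy as np

    alpha, x = 5.0, 0.98                       # illustrative log-series parameters
    n = np.arange(1, 201)                      # abundance classes
    expected_species = alpha * x**n / n        # S(n) = alpha * x^n / n

    total_species = -alpha * np.log(1.0 - x)   # S_T = -alpha * ln(1 - x)
    total_individuals = alpha * x / (1.0 - x)  # N   = alpha * x / (1 - x)
    print(f"predicted species richness : {total_species:.1f}")
    print(f"predicted community size   : {total_individuals:.0f}")
    print(f"expected singleton species : {expected_species[0]:.1f}")
    ```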

  6. Renormalization-group theory for finite-size scaling in extreme statistics

    NASA Astrophysics Data System (ADS)

    Györgyi, G.; Moloney, N. R.; Ozogány, K.; Rácz, Z.; Droz, M.

    2010-04-01

    We present a renormalization-group (RG) approach to explain universal features of extreme statistics applied here to independent identically distributed variables. The outlines of the theory have been described in a previous paper, the main result being that finite-size shape corrections to the limit distribution can be obtained from a linearization of the RG transformation near a fixed point, leading to the computation of stable perturbations as eigenfunctions. Here we show details of the RG theory which exhibit remarkable similarities to the RG known in statistical physics. Besides the fixed points explaining universality, and the least stable eigendirections accounting for convergence rates and shape corrections, the similarities include marginally stable perturbations which turn out to be generic for the Fisher-Tippett-Gumbel class. Distribution functions containing unstable perturbations are also considered. We find that, after a transitory divergence, they return to the universal fixed line at the same or at a different point depending on the type of perturbation.
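
    For orientation only (this is not the paper's RG calculation), the Fisher-Tippett-Gumbel class can be illustrated numerically: block maxima of i.i.d. exponential variables approach a Gumbel law, with finite-size corrections that shrink as the block size grows. A short sketch with SciPy:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    block_size, n_blocks = 1000, 5000

    # Maxima of i.i.d. exponential blocks fall in the Fisher-Tippett-Gumbel class
    maxima = rng.exponential(size=(n_blocks, block_size)).max(axis=1)

    loc, scale = stats.gumbel_r.fit(maxima)
    ks = stats.kstest(maxima, "gumbel_r", args=(loc, scale))
    print(f"fitted Gumbel: loc = {loc:.2f} (≈ ln N = {np.log(block_size):.2f}), scale = {scale:.2f}")
    print(f"KS statistic vs fitted Gumbel: {ks.statistic:.3f}")
    ```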

  7. [Health-related behavior in a sample of Brazilian college students: gender differences].

    PubMed

    Colares, Viviane; Franca, Carolina da; Gonzalez, Emília

    2009-03-01

    This study investigated whether undergraduate students' health-risk behaviors differed according to gender. The sample consisted of 382 subjects, aged 20-29 years, from public universities in Pernambuco State, Brazil. Data were collected using the National College Health Risk Behavior Survey, previously validated in Portuguese. Descriptive and inferential statistical techniques were used. Associations were analyzed with the chi-square test or Fisher's exact test. Statistical significance was set at p ≤ 0.05. In general, females engaged in the following risk behaviors less frequently than males: alcohol consumption (p = 0.005), smoking (p = 0.002), experimenting with marijuana (p = 0.002), consumption of inhalants (p ≤ 0.001), steroid use (p = 0.003), carrying weapons (p = 0.001), and involvement in physical fights (p = 0.014). Meanwhile, female students displayed more concern about losing or maintaining weight, although they exercised less frequently than males. The findings thus showed statistically different health behaviors between genders. In conclusion, different approaches need to be used for the two genders.
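
    The gender comparisons above rest on 2 × 2 contingency tests; a minimal SciPy sketch with hypothetical counts (not the survey's data):

    ```python
    from scipy import stats

    # Hypothetical 2 x 2 table: rows = male/female, columns = smokers/non-smokers
    table = [[45, 140],
             [22, 175]]

    chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
    odds_ratio, p_fisher = stats.fisher_exact(table)  # preferred when expected counts are small

    print(f"chi-square p = {p_chi2:.3f}, Fisher exact p = {p_fisher:.3f}")
    print("significant at p <= 0.05" if min(p_chi2, p_fisher) <= 0.05 else "not significant")
    ```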

  8. Statistical issues in searches for new phenomena in High Energy Physics

    NASA Astrophysics Data System (ADS)

    Lyons, Louis; Wardle, Nicholas

    2018-03-01

    Many analyses of data in High Energy Physics are concerned with searches for New Physics. We review the statistical issues that arise in such searches, and then illustrate these using the specific example of the recent successful search for the Higgs boson, produced in collisions between high energy protons at CERN’s Large Hadron Collider.

  9. Could the clinical interpretability of subgroups detected using clustering methods be improved by using a novel two-stage approach?

    PubMed

    Kent, Peter; Stochkendahl, Mette Jensen; Christensen, Henrik Wulff; Kongsted, Alice

    2015-01-01

    Recognition of homogeneous subgroups of patients can usefully improve prediction of their outcomes and the targeting of treatment. There are a number of research approaches that have been used to recognise homogeneity in such subgroups and to test their implications. One approach is to use statistical clustering techniques, such as Cluster Analysis or Latent Class Analysis, to detect latent relationships between patient characteristics. Influential patient characteristics can come from diverse domains of health, such as pain, activity limitation, physical impairment, social role participation, psychological factors, biomarkers and imaging. However, such 'whole person' research may result in data-driven subgroups that are complex, difficult to interpret and challenging to recognise clinically. This paper describes a novel approach to applying statistical clustering techniques that may improve the clinical interpretability of derived subgroups and reduce sample size requirements. This approach involves clustering in two sequential stages. The first stage involves clustering within health domains and therefore requires creating as many clustering models as there are health domains in the available data. This first stage produces scoring patterns within each domain. The second stage involves clustering using the scoring patterns from each health domain (from the first stage) to identify subgroups across all domains. We illustrate this using chest pain data from the baseline presentation of 580 patients. The new two-stage clustering resulted in two subgroups that approximated the classic textbook descriptions of musculoskeletal chest pain and atypical angina chest pain. The traditional single-stage clustering resulted in five clusters that were also clinically recognisable but displayed less distinct differences. In this paper, a new approach to using clustering techniques to identify clinically useful subgroups of patients is suggested. Research designs, statistical methods and outcome metrics suitable for performing that testing are also described. This approach has potential benefits but requires broad testing, in multiple patient samples, to determine its clinical value. The usefulness of the approach is likely to be context-specific, depending on the characteristics of the available data and the research question being asked of it.
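
    The two-stage idea can be sketched in a few lines; the example below uses scikit-learn's KMeans on synthetic data purely to show the structure (the paper used Cluster Analysis/Latent Class Analysis, and the domains, sizes and cluster counts here are hypothetical):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    n_patients = 200

    # Stage 1: cluster patients within each health domain separately
    domains = {
        "pain":       rng.normal(size=(n_patients, 4)),   # hypothetical item scores
        "activity":   rng.normal(size=(n_patients, 6)),
        "psychology": rng.normal(size=(n_patients, 5)),
    }
    domain_patterns = np.column_stack([
        KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
        for X in domains.values()
    ])  # one within-domain "scoring pattern" label per patient per domain

    # Stage 2: cluster patients on their vector of domain-level patterns
    subgroups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(domain_patterns)
    print(np.bincount(subgroups))  # sizes of the cross-domain subgroups
    ```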

  10. Modelling of electronic excitation and radiation in the Direct Simulation Monte Carlo Macroscopic Chemistry Method

    NASA Astrophysics Data System (ADS)

    Goldsworthy, M. J.

    2012-10-01

    One of the most useful tools for modelling rarefied hypersonic flows is the Direct Simulation Monte Carlo (DSMC) method. Simulator particle movement and collision calculations are combined with statistical procedures to model thermal non-equilibrium flow-fields described by the Boltzmann equation. The Macroscopic Chemistry Method for DSMC simulations was developed to simplify the inclusion of complex thermal non-equilibrium chemistry. The macroscopic approach uses statistical information which is calculated during the DSMC solution process in the modelling procedures. Here it is shown how inclusion of macroscopic information in models of chemical kinetics, electronic excitation, ionization, and radiation can enhance the capabilities of DSMC to model flow-fields where a range of physical processes occur. The approach is applied to the modelling of a 6.4 km/s nitrogen shock wave and results are compared with those from existing shock-tube experiments and continuum calculations. Reasonable agreement between the methods is obtained. The quality of the comparison is highly dependent on the set of vibrational relaxation and chemical kinetic parameters employed.

  11. A main sequence for quasars

    NASA Astrophysics Data System (ADS)

    Marziani, Paola; Dultzin, Deborah; Sulentic, Jack W.; Del Olmo, Ascensión; Negrete, C. A.; Martínez-Aldama, Mary L.; D'Onofrio, Mauro; Bon, Edi; Bon, Natasa; Stirpe, Giovanna M.

    2018-03-01

    The last 25 years saw a major step forward in the analysis of optical and UV spectroscopic data of large quasar samples. Multivariate statistical approaches have led to the definition of systematic trends in observational properties that are the basis of physical and dynamical modeling of quasar structure. We discuss the empirical correlates of the so-called “main sequence” associated with the quasar Eigenvector 1, its governing physical parameters and several implications on our view of the quasar structure, as well as some luminosity effects associated with the virialized component of the line emitting regions. We also briefly discuss quasars in a segment of the main sequence that includes the strongest FeII emitters. These sources show a small dispersion around a well-defined Eddington ratio value, a property which makes them potential Eddington standard candles.

  12. Revolutions in the earth sciences

    PubMed Central

    Allègre, C.

    1999-01-01

    The 20th century has been a century of scientific revolutions for many disciplines: quantum mechanics in physics, the atomic approach in chemistry, the nonlinear revolution in mathematics, the introduction of statistical physics. The major breakthroughs in these disciplines had all occurred by about 1930. In contrast, the revolutions in the so-called natural sciences, that is in the earth sciences and in biology, waited until the last half of the century. These revolutions were indeed late, but they were no less deep and drastic, and they occurred quite suddenly. Actually, one can say that not one but three revolutions occurred in the earth sciences: in plate tectonics, planetology and the environment. They occurred essentially independently from each other, but as time passed, their effects developed, amplified and started interacting. These effects continue strongly to this day.

  13. Neural Networks

    NASA Astrophysics Data System (ADS)

    Schwindling, Jerome

    2010-04-01

    This course presents an overview of the concepts of neural networks and their application in the framework of High Energy Physics analyses. After a brief introduction to the concept of neural networks, the concept is explained in the framework of neurobiology, introducing the multi-layer perceptron, learning, and their use as data classifiers. The concept is then presented in more detail in a second part using the mathematical approach, focussing on typical use cases faced in particle physics. Finally, the last part presents the best way to use such statistical tools for event classification, putting the emphasis on the setup of the multi-layer perceptron. The full article (15 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.
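
    A minimal multi-layer perceptron used as an event classifier, in the spirit described above; this scikit-learn sketch on synthetic "signal vs background" data is not the course's code, and all sizes and features are arbitrary:

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Synthetic "signal vs background" events described by a few kinematic-like features
    X, y = make_classification(n_samples=5000, n_features=8, n_informative=5, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    # One hidden layer of 16 units: a classic multi-layer perceptron setup
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
    clf.fit(X_train, y_train)
    print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
    ```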

  14. Self-organization and feedback effects in the shock compressed media

    NASA Astrophysics Data System (ADS)

    Khantuleva, Tatyana

    2005-07-01

    A new theoretical approach to transport in condensed matter far from equilibrium combines methods of statistical mechanics and cybernetic physics in order to construct a closed mathematical model of a system with self-organization and self-regulation. Mesoscopic effects are considered as a result of structure formation and feedback effects in an open system under dynamic loading. Nonequilibrium state equations are introduced to incorporate the velocity dispersion. Integrodifferential balance equations describe both wave and dissipative transport properties. Boundary conditions determine the internal scale spectra. The model is completed by a feedback that introduces the structure evolution, based on the methods of cybernetic physics. The obtained results open wide prospects for control methods in applications to new technologies, intelligent systems and the prediction of catastrophic phenomena.

  15. An adaptive community-based participatory approach to formative assessment with high schools for obesity intervention*.

    PubMed

    Kong, Alberta S; Farnsworth, Seth; Canaca, Jose A; Harris, Amanda; Palley, Gabriel; Sussman, Andrew L

    2012-03-01

    In the emerging debate around obesity intervention in schools, recent calls have been made for researchers to include local community opinions in the design of interventions. Community-based participatory research (CBPR) is an effective approach for forming community partnerships and integrating local opinions. We used CBPR principles to conduct formative research in identifying acceptable and potentially sustainable obesity intervention strategies in 8 New Mexico school communities. We collected formative data from 8 high schools on areas of community interest for school health improvement through collaboration with local School Health Advisory Councils (SHACs) and interviews with students and parents. A survey based on formative results was created to assess acceptability of specific intervention strategies and was provided to SHACs. Quantitative data were analyzed using descriptive statistics while qualitative data were evaluated using an iterative analytic process for thematic identification. Key themes identified through the formative process included lack of healthy food options, infrequent curricular/extracurricular physical activity opportunities, and inadequate exposure to health/nutritional information. Key strategies identified as most acceptable by SHAC members included healthier food options and preparation, a healthy foods marketing campaign, yearly taste tests, an after-school noncompetitive physical activity program, and community linkages to physical activity opportunities. An adaptive CBPR approach for formative assessment can be used to identify obesity intervention strategies that address community school health concerns. Eight high school SHACs identified 6 school-based strategies to address parental and student concerns related to obesity. © 2012, American School Health Association.

  16. An Adaptive Community-Based Participatory Approach to Formative Assessment With High Schools for Obesity Intervention*

    PubMed Central

    Kong, Alberta S.; Farnsworth, Seth; Canaca, Jose A.; Harris, Amanda; Palley, Gabriel; Sussman, Andrew L.

    2013-01-01

    BACKGROUND In the emerging debate around obesity intervention in schools, recent calls have been made for researchers to include local community opinions in the design of interventions. Community-based participatory research (CBPR) is an effective approach for forming community partnerships and integrating local opinions. We used CBPR principles to conduct formative research in identifying acceptable and potentially sustainable obesity intervention strategies in 8 New Mexico school communities. METHODS We collected formative data from 8 high schools on areas of community interest for school health improvement through collaboration with local School Health Advisory Councils (SHACs) and interviews with students and parents. A survey based on formative results was created to assess acceptability of specific intervention strategies and was provided to SHACs. Quantitative data were analyzed using descriptive statistics while qualitative data were evaluated using an iterative analytic process for thematic identification. RESULTS Key themes identified through the formative process included lack of healthy food options, infrequent curricular/extracurricular physical activity opportunities, and inadequate exposure to health/nutritional information. Key strategies identified as most acceptable by SHAC members included healthier food options and preparation, a healthy foods marketing campaign, yearly taste tests, an after-school noncompetitive physical activity program, and community linkages to physical activity opportunities. CONCLUSION An adaptive CBPR approach for formative assessment can be used to identify obesity intervention strategies that address community school health concerns. Eight high school SHACs identified 6 school-based strategies to address parental and student concerns related to obesity. PMID:22320339

  17. Credit risk and the instability of the financial system: An ensemble approach

    NASA Astrophysics Data System (ADS)

    Schmitt, Thilo A.; Chetalova, Desislava; Schäfer, Rudi; Guhr, Thomas

    2014-02-01

    The instability of the financial system as experienced in recent years and in previous periods is often linked to credit defaults, i.e., to the failure of obligors to make promised payments. Given the large number of credit contracts, this problem is amenable to be treated with approaches developed in statistical physics. We introduce the idea of ensemble averaging and thereby uncover generic features of credit risk. We then show that the often advertised concept of diversification, i.e., reducing the risk by distributing it, is deeply flawed when it comes to credit risk. The risk of extreme losses remains due to the ever present correlations, implying a substantial and persistent intrinsic danger to the financial system.
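
    A toy Monte Carlo version of the point made above: with correlated defaults (a one-factor Gaussian latent-variable model here, chosen only for illustration), enlarging the portfolio does not remove the heavy tail of the loss distribution. All parameters are arbitrary.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    n_scenarios, p_default, rho = 10_000, 0.02, 0.30   # illustrative parameters
    default_barrier = stats.norm.ppf(p_default)

    def extreme_loss(n_obligors):
        """99.9% quantile of the portfolio loss fraction in a one-factor model."""
        market = rng.standard_normal((n_scenarios, 1))           # common risk factor
        idio = rng.standard_normal((n_scenarios, n_obligors))    # obligor-specific noise
        latent = np.sqrt(rho) * market + np.sqrt(1.0 - rho) * idio
        losses = (latent < default_barrier).mean(axis=1)         # fraction defaulting
        return np.quantile(losses, 0.999)

    for n in (10, 100, 1000):
        print(f"{n:5d} obligors -> 99.9% loss quantile = {extreme_loss(n):.3f}")
    ```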

  18. The extraction of N,N-dialkylamides III. A thermodynamical approach of the multicomponent extraction organic media by a statistical mechanic theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Condamines, N.; Musikas, C.; Turq, P.

    1993-04-01

    The non-ideality of multicomponent media is difficult to describe, especially for situations as complex as the extraction of metals into organic media. We present a simplified model which takes into account 'hard-sphere' effects and physical interactions between some solutes of the studied media in the case of actinide ion liquid-liquid extraction. We focus our interest on N,N-dialkylamide extractants, which have a strongly non-ideal behaviour. 24 refs., 10 figs., 6 tabs.

  19. Schrödinger equation revisited

    PubMed Central

    Schleich, Wolfgang P.; Greenberger, Daniel M.; Kobe, Donald H.; Scully, Marlan O.

    2013-01-01

    The time-dependent Schrödinger equation is a cornerstone of quantum physics and governs all phenomena of the microscopic world. However, despite its importance, its origin is still not widely appreciated and properly understood. We obtain the Schrödinger equation from a mathematical identity by a slight generalization of the formulation of classical statistical mechanics based on the Hamilton–Jacobi equation. This approach brings out most clearly the fact that the linearity of quantum mechanics is intimately connected to the strong coupling between the amplitude and phase of a quantum wave. PMID:23509260
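
    For orientation, the standard Madelung/Hamilton–Jacobi connection behind statements of this kind can be written in two lines (a generic sketch, not necessarily the exact identity used by the authors): the classical continuity and Hamilton–Jacobi equations, supplemented by the quantum-potential term, combine into a single linear equation once amplitude and phase are packaged into one complex field.

    ```latex
    % Continuity plus a Hamilton-Jacobi equation carrying the quantum potential term
    \[
      \frac{\partial \rho}{\partial t} + \nabla\cdot\Big(\rho\,\frac{\nabla S}{m}\Big) = 0,
      \qquad
      \frac{\partial S}{\partial t} + \frac{(\nabla S)^{2}}{2m} + V
      - \frac{\hbar^{2}}{2m}\,\frac{\nabla^{2}\sqrt{\rho}}{\sqrt{\rho}} = 0 .
    \]
    % Packaging amplitude and phase into one complex field linearizes the pair:
    \[
      \psi = \sqrt{\rho}\,e^{iS/\hbar}
      \quad\Longrightarrow\quad
      i\hbar\,\frac{\partial \psi}{\partial t}
      = -\frac{\hbar^{2}}{2m}\,\nabla^{2}\psi + V\psi .
    \]
    ```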

  20. Ensemble inequivalence and Maxwell construction in the self-gravitating ring model

    NASA Astrophysics Data System (ADS)

    Rocha Filho, T. M.; Silvestre, C. H.; Amato, M. A.

    2018-06-01

    The statement that Gibbs equilibrium ensembles are equivalent is a baseline in many approaches in the context of equilibrium statistical mechanics. However, as a known fact, for some physical systems this equivalence may not be true. In this paper we illustrate from first principles the inequivalence between the canonical and microcanonical ensembles for a system with long-range interactions. We make use of molecular dynamics simulations and Monte Carlo simulations to explore the thermodynamic properties of the self-gravitating ring model and discuss under what conditions the Maxwell construction is applicable.

  1. Native American Participation among Bachelors in Physical Sciences and Engineering: Results from 2003-13 Data of the National Center for Education Statistics. Focus On

    ERIC Educational Resources Information Center

    Merner, Laura; Tyler, John

    2017-01-01

    Using the National Center of Education Statistics' Integrated Postsecondary Education Data System (IPEDS), this report analyzes data on Native American recipients of bachelor's degrees among 16 physical science and engineering fields. Overall, Native Americans are earning physical science and engineering bachelor's degrees at lower rates than the…

  2. Chemical and Physical Sensing in the Petroleum Industry

    NASA Astrophysics Data System (ADS)

    Disko, Mark

    2008-03-01

    World-scale oil, gas and petrochemical production relies on a myriad of advanced technologies for discovering, producing, transporting, processing and distributing hydrocarbons. Sensing systems provide rapid and targeted information that can be used for expanding resources, improving product quality, and assuring environmentally sound operations. For example, equipment such as reactors and pipelines can be operated with high efficiency and safety with improved chemical and physical sensors for corrosion and hydrocarbon detection. At the interface between chemical engineering and multiphase flow physics, ``multi-scale'' phenomena such as catalysis and heat flow benefit from new approaches to sensing and data modeling. We are combining chemically selective micro-cantilevers, fiber optic sensing, and acoustic monitoring with statistical data fusion approaches to maximize control information. Miniaturized analyzers represent a special opportunity, including the nanotech-based quantum cascade laser systems for mid-infrared spectroscopy. Specific examples for use of these new micro-systems include rapid monocyclic aromatic molecule identification and measurement under ambient conditions at weight ppb levels. We see promise from emerging materials and devices based on nanotechnology, which can one day be available at modest cost for impact in existing operations. Controlled surface energies and emerging chemical probes hold the promise for reduction in greenhouse gas emissions for current fuels and future transportation and energy technologies.

  3. A data-driven approach for evaluating multi-modal therapy in traumatic brain injury

    PubMed Central

    Haefeli, Jenny; Ferguson, Adam R.; Bingham, Deborah; Orr, Adrienne; Won, Seok Joon; Lam, Tina I.; Shi, Jian; Hawley, Sarah; Liu, Jialing; Swanson, Raymond A.; Massa, Stephen M.

    2017-01-01

    Combination therapies targeting multiple recovery mechanisms have the potential for additive or synergistic effects, but experimental design and analyses of multimodal therapeutic trials are challenging. To address this problem, we developed a data-driven approach to integrate and analyze raw source data from separate pre-clinical studies and evaluated interactions between four treatments following traumatic brain injury. Histologic and behavioral outcomes were measured in 202 rats treated with combinations of an anti-inflammatory agent (minocycline), a neurotrophic agent (LM11A-31), and physical therapy consisting of assisted exercise with or without botulinum toxin-induced limb constraint. Data was curated and analyzed in a linked workflow involving non-linear principal component analysis followed by hypothesis testing with a linear mixed model. Results revealed significant benefits of the neurotrophic agent LM11A-31 on learning and memory outcomes after traumatic brain injury. In addition, modulations of LM11A-31 effects by co-administration of minocycline and by the type of physical therapy applied reached statistical significance. These results suggest a combinatorial effect of drug and physical therapy interventions that was not evident by univariate analysis. The study designs and analytic techniques applied here form a structured, unbiased, internally validated workflow that may be applied to other combinatorial studies, both in animals and humans. PMID:28205533

  4. A data-driven approach for evaluating multi-modal therapy in traumatic brain injury.

    PubMed

    Haefeli, Jenny; Ferguson, Adam R; Bingham, Deborah; Orr, Adrienne; Won, Seok Joon; Lam, Tina I; Shi, Jian; Hawley, Sarah; Liu, Jialing; Swanson, Raymond A; Massa, Stephen M

    2017-02-16

    Combination therapies targeting multiple recovery mechanisms have the potential for additive or synergistic effects, but experimental design and analyses of multimodal therapeutic trials are challenging. To address this problem, we developed a data-driven approach to integrate and analyze raw source data from separate pre-clinical studies and evaluated interactions between four treatments following traumatic brain injury. Histologic and behavioral outcomes were measured in 202 rats treated with combinations of an anti-inflammatory agent (minocycline), a neurotrophic agent (LM11A-31), and physical therapy consisting of assisted exercise with or without botulinum toxin-induced limb constraint. Data was curated and analyzed in a linked workflow involving non-linear principal component analysis followed by hypothesis testing with a linear mixed model. Results revealed significant benefits of the neurotrophic agent LM11A-31 on learning and memory outcomes after traumatic brain injury. In addition, modulations of LM11A-31 effects by co-administration of minocycline and by the type of physical therapy applied reached statistical significance. These results suggest a combinatorial effect of drug and physical therapy interventions that was not evident by univariate analysis. The study designs and analytic techniques applied here form a structured, unbiased, internally validated workflow that may be applied to other combinatorial studies, both in animals and humans.

  5. High Throughput Determination of Mercury in Tobacco and Mainstream Smoke from Little Cigars

    PubMed Central

    Fresquez, Mark R.; Gonzalez-Jimenez, Nathalie; Gray, Naudia; Watson, Clifford H.; Pappas, R. Steven

    2015-01-01

    A method was developed that utilizes a platinum trap for mercury from mainstream tobacco smoke which represents an improvement over traditional approaches that require impingers and long sample preparation procedures. In this approach, the trapped mercury is directly released for analysis by heating the trap in a direct mercury analyzer. The method was applied to the analysis of mercury in the mainstream smoke of little cigars. The mercury levels in little cigar smoke obtained under Health Canada Intense smoking machine conditions ranged from 7.1 × 10−3 mg/m3 to 1.2 × 10−2 mg/m3. These air mercury levels exceed the chronic inhalation Minimal Risk Level corrected for intermittent exposure to metallic mercury (e.g., 1 or 2 hours per day, 5 days per week) determined by the Agency for Toxic Substances and Disease Registry. Multivariate statistical analysis was used to assess associations between mercury levels and little cigar physical design properties. Filter ventilation was identified as the principal physical parameter influencing mercury concentrations in mainstream little cigar smoke generated under ISO machine smoking conditions. With filter ventilation blocked under Health Canada Intense smoking conditions, mercury concentrations in tobacco and puff number (smoke volume) were the primary physical parameters that influenced mainstream smoke mercury concentrations. PMID:26051388

  6. Reaction rates for mesoscopic reaction-diffusion kinetics

    DOE PAGES

    Hellander, Stefan; Hellander, Andreas; Petzold, Linda

    2015-02-23

    The mesoscopic reaction-diffusion master equation (RDME) is a popular modeling framework frequently applied to stochastic reaction-diffusion kinetics in systems biology. The RDME is derived from assumptions about the underlying physical properties of the system, and it may produce unphysical results for models where those assumptions fail. In that case, other more comprehensive models are better suited, such as hard-sphere Brownian dynamics (BD). Although the RDME is a model in its own right, and not inferred from any specific microscale model, it proves useful to attempt to approximate a microscale model by a specific choice of mesoscopic reaction rates. In this paper we derive mesoscopic scale-dependent reaction rates by matching certain statistics of the RDME solution to statistics of the solution of a widely used microscopic BD model: the Smoluchowski model with a Robin boundary condition at the reaction radius of two molecules. We also establish fundamental limits on the range of mesh resolutions for which this approach yields accurate results and show both theoretically and in numerical examples that as we approach the lower fundamental limit, the mesoscopic dynamics approach the microscopic dynamics. Finally, we show that for mesh sizes below the fundamental lower limit, results are less accurate. Thus, the lower limit determines the mesh size for which we obtain the most accurate results.

  7. Reaction rates for mesoscopic reaction-diffusion kinetics

    PubMed Central

    Hellander, Stefan; Hellander, Andreas; Petzold, Linda

    2016-01-01

    The mesoscopic reaction-diffusion master equation (RDME) is a popular modeling framework frequently applied to stochastic reaction-diffusion kinetics in systems biology. The RDME is derived from assumptions about the underlying physical properties of the system, and it may produce unphysical results for models where those assumptions fail. In that case, other more comprehensive models are better suited, such as hard-sphere Brownian dynamics (BD). Although the RDME is a model in its own right, and not inferred from any specific microscale model, it proves useful to attempt to approximate a microscale model by a specific choice of mesoscopic reaction rates. In this paper we derive mesoscopic scale-dependent reaction rates by matching certain statistics of the RDME solution to statistics of the solution of a widely used microscopic BD model: the Smoluchowski model with a Robin boundary condition at the reaction radius of two molecules. We also establish fundamental limits on the range of mesh resolutions for which this approach yields accurate results and show both theoretically and in numerical examples that as we approach the lower fundamental limit, the mesoscopic dynamics approach the microscopic dynamics. We show that for mesh sizes below the fundamental lower limit, results are less accurate. Thus, the lower limit determines the mesh size for which we obtain the most accurate results. PMID:25768640

  8. Integrating non-colocated well and geophysical data to capture subsurface heterogeneity at an aquifer recharge and recovery site

    NASA Astrophysics Data System (ADS)

    Gottschalk, Ian P.; Hermans, Thomas; Knight, Rosemary; Caers, Jef; Cameron, David A.; Regnery, Julia; McCray, John E.

    2017-12-01

    Geophysical data have proven to be very useful for lithological characterization. However, quantitatively integrating the information gained from acquiring geophysical data generally requires colocated lithological and geophysical data for constructing a rock-physics relationship. In this contribution, the issue of integrating noncolocated geophysical and lithological data is addressed, and the results are applied to simulate groundwater flow in a heterogeneous aquifer in the Prairie Waters Project North Campus aquifer recharge site, Colorado. Two methods of constructing a rock-physics transform between electrical resistivity tomography (ERT) data and lithology measurements are assessed. In the first approach, a maximum likelihood estimation (MLE) is used to fit a bimodal lognormal distribution to horizontal cross-sections of the ERT resistivity histogram. In the second approach, a spatial bootstrap is applied to approximate the rock-physics relationship. The rock-physics transforms provide soft data for multiple point statistics (MPS) simulations. Subsurface models are used to run groundwater flow and tracer test simulations. Each model's uncalibrated, predicted breakthrough time is evaluated based on its agreement with measured subsurface travel time values from infiltration basins to selected groundwater recovery wells. We find that incorporating geophysical information into uncalibrated flow models reduces the difference with observed values, as compared to flow models without geophysical information incorporated. The integration of geophysical data also narrows the variance of predicted tracer breakthrough times substantially. Accuracy is highest and variance is lowest in breakthrough predictions generated by the MLE-based rock-physics transform. Calibrating the ensemble of geophysically constrained models would help produce a suite of realistic flow models for predictive purposes at the site. We find that the success of breakthrough predictions is highly sensitive to the definition of the rock-physics transform; it is therefore important to model this transfer function accurately.
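
    A minimal sketch of the first rock-physics step described above: a bimodal lognormal is a two-component Gaussian mixture in log-resistivity, so it can be fit, for illustration, with scikit-learn's GaussianMixture. The synthetic resistivities and facies names below are hypothetical; the study fit horizontal cross-sections of the ERT histogram by maximum likelihood rather than using this routine.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(3)
    # Synthetic resistivities (ohm-m): a conductive (clay-rich) and a resistive (sandy) population
    resistivity = np.concatenate([
        rng.lognormal(mean=np.log(20.0), sigma=0.3, size=400),
        rng.lognormal(mean=np.log(120.0), sigma=0.4, size=600),
    ])

    # Fit a bimodal lognormal as a 2-component Gaussian mixture in log(resistivity)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(np.log(resistivity)[:, None])
    modes = np.exp(gmm.means_.ravel())
    print(f"facies modes ~ {np.sort(modes).round(1)} ohm-m, weights = {gmm.weights_.round(2)}")

    # Soft data for the MPS simulation: facies probabilities for a given resistivity
    print(gmm.predict_proba(np.log([[50.0]])).round(2))
    ```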

  9. Patterns of Circulating Inflammatory Biomarkers in Older Persons with Varying Levels of Physical Performance: A Partial Least Squares-Discriminant Analysis Approach

    PubMed Central

    Marzetti, Emanuele; Landi, Francesco; Marini, Federico; Cesari, Matteo; Buford, Thomas W.; Manini, Todd M.; Onder, Graziano; Pahor, Marco; Bernabei, Roberto; Leeuwenburgh, Christiaan; Calvani, Riccardo

    2014-01-01

    Background: Chronic, low-grade inflammation and declining physical function are hallmarks of the aging process. However, previous attempts to correlate individual inflammatory biomarkers with physical performance in older people have produced mixed results. Given the complexity of the inflammatory response, the simultaneous analysis of an array of inflammatory mediators may provide more insights into the relationship between inflammation and age-related physical function decline. This study was designed to explore the association between a panel of inflammatory markers and physical performance in older adults through a multivariate statistical approach. Methods: Community-dwelling older persons were categorized into “normal walkers” (NWs; n = 27) or “slow walkers” (SWs; n = 11) groups using 0.8 m s−1 as the 4-m gait speed cutoff. A panel of 14 circulating inflammatory biomarkers was assayed by multiplex analysis. Partial least squares-discriminant analysis (PLS-DA) was used to identify patterns of inflammatory mediators associated with gait speed categories. Results: The optimal complexity of the PLS-DA model was found to be five latent variables. The proportion of correct classification was 88.9% for NW subjects (74.1% in cross-validation) and 90.9% for SW individuals (81.8% in cross-validation). Discriminant biomarkers in the model were interleukin 8, myeloperoxidase, and tumor necrosis factor alpha (all higher in the SW group), and P-selectin, interferon gamma, and granulocyte–macrophage colony-stimulating factor (all higher in the NW group). Conclusion: Distinct profiles of circulating inflammatory biomarkers characterize older subjects with different levels of physical performance. The dissection of these patterns may provide novel insights into the role played by inflammation in the disabling cascade and possible new targets for interventions. PMID:25593902
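
    PLS-DA is ordinary PLS regression against a dummy-coded class label followed by a decision rule on the predicted score; a minimal scikit-learn sketch on synthetic "biomarker" data (all names, sizes and the five-component choice are hypothetical, not the study's model):

    ```python
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_predict

    # Synthetic panel of 14 correlated "biomarkers" for 38 subjects in two gait-speed classes
    X, y = make_classification(n_samples=38, n_features=14, n_informative=6, random_state=0)

    # PLS-DA: PLS regression on the 0/1 class label with a small number of latent variables
    pls = PLSRegression(n_components=5)
    scores = cross_val_predict(pls, X, y.astype(float), cv=5).ravel()
    y_pred = (scores > 0.5).astype(int)          # simple decision rule on the PLS score

    print(f"cross-validated correct classification: {(y_pred == y).mean():.1%}")
    ```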

  10. Risk Factors for Problem Gambling in California: Demographics, Comorbidities and Gambling Participation.

    PubMed

    Volberg, Rachel A; McNamara, Lauren M; Carris, Kari L

    2018-06-01

    While population surveys have been carried out in numerous jurisdictions internationally, little has been done to assess the relative strength of different risk factors that may contribute to the development of problem gambling. This is an important preparatory step for future research on the etiology of problem gambling. Using data from the 2006 California Problem Gambling Prevalence Survey, a telephone survey of adult California residents that used the NODS to assess respondents for gambling problems, binary logistic regression analysis was used to identify demographic characteristics, health-related behaviors, and gambling participation variables that statistically predicted the odds of being a problem or pathological gambler. In a separate approach, linear regression analysis was used to assess the impact of changes in these variables on the severity of the disorder. In both of the final models, the greatest statistical predictor of problem gambling status was past year Internet gambling. Furthermore, the unique finding of a significant interaction between physical or mental disability, Internet gambling, and problem gambling highlights the importance of exploring the interactions between different forms of gambling, the experience of mental and physical health issues, and the development of problem gambling using a longitudinal lens.
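
    A minimal sketch of the binary logistic regression step described above, using statsmodels on simulated survey-like data (the variables, effect sizes and prevalence are invented; this is not the California survey data):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(7)
    n = 2000
    df = pd.DataFrame({
        "internet_gambling": rng.binomial(1, 0.05, n),   # hypothetical predictors
        "disability":        rng.binomial(1, 0.15, n),
        "age":               rng.integers(18, 85, n),
    })
    logit_p = -3.5 + 1.8 * df["internet_gambling"] + 0.6 * df["disability"]
    df["problem_gambler"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

    model = smf.logit("problem_gambler ~ internet_gambling + disability + age", data=df).fit(disp=False)
    print(np.exp(model.params).round(2))   # odds ratios for each predictor
    ```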

  11. The Effectiveness of Computer-Assisted Instruction to Teach Physical Examination to Students and Trainees in the Health Sciences Professions: A Systematic Review and Meta-Analysis

    PubMed Central

    Tomesko, Jennifer; Touger-Decker, Riva; Dreker, Margaret; Zelig, Rena; Parrott, James Scott

    2017-01-01

    Purpose: To explore knowledge and skill acquisition outcomes related to learning physical examination (PE) through computer-assisted instruction (CAI) compared with a face-to-face (F2F) approach. Method: A systematic review and meta-analysis of literature published between January 2001 and December 2016 was conducted. Databases searched included Medline, Cochrane, CINAHL, ERIC, Ebsco, Scopus, and Web of Science. Studies were synthesized by study design, intervention, and outcomes. Statistical analyses included the DerSimonian-Laird random-effects model. Results: In total, 7 studies were included in the review, and 5 in the meta-analysis. There were no statistically significant differences for knowledge (mean difference [MD] = 5.39, 95% confidence interval [CI]: −2.05 to 12.84) or skill acquisition (MD = 0.35, 95% CI: −5.30 to 6.01). Conclusions: The evidence does not suggest a strong consistent preference for either CAI or F2F instruction to teach students/trainees PE. Further research is needed to identify conditions under which knowledge and skill acquisition outcomes favor one mode of instruction over the other. PMID:29349338
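
    The DerSimonian-Laird random-effects estimator referenced above has a short closed form; a sketch with hypothetical per-study mean differences and variances (not the review's data):

    ```python
    import numpy as np

    # Hypothetical study-level mean differences and their variances
    effects = np.array([4.2, 7.1, -1.3, 6.0, 9.5])
    variances = np.array([6.0, 9.0, 12.0, 8.0, 15.0])

    w = 1.0 / variances                                  # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    Q = np.sum(w * (effects - fixed) ** 2)               # Cochran's Q
    k = len(effects)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))  # DL tau^2

    w_star = 1.0 / (variances + tau2)                    # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    print(f"tau^2 = {tau2:.2f}, pooled MD = {pooled:.2f} "
          f"(95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f})")
    ```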

  12. A Bayesian Approach to Evaluating Consistency between Climate Model Output and Observations

    NASA Astrophysics Data System (ADS)

    Braverman, A. J.; Cressie, N.; Teixeira, J.

    2010-12-01

    Like other scientific and engineering problems that involve physical modeling of complex systems, climate models can be evaluated and diagnosed by comparing their output to observations of similar quantities. Though the global remote sensing data record is relatively short by climate research standards, these data offer opportunities to evaluate model predictions in new ways. For example, remote sensing data are spatially and temporally dense enough to provide distributional information that goes beyond simple moments to allow quantification of temporal and spatial dependence structures. In this talk, we propose a new method for exploiting these rich data sets using a Bayesian paradigm. For a collection of climate models, we calculate posterior probabilities that its members best represent the physical system each seeks to reproduce. The posterior probability is based on the likelihood that a chosen summary statistic, computed from observations, would be obtained when the model's output is considered as a realization from a stochastic process. By exploring how posterior probabilities change with different statistics, we may paint a more quantitative and complete picture of the strengths and weaknesses of the models relative to the observations. We demonstrate our method using model output from the CMIP archive, and observations from NASA's Atmospheric Infrared Sounder.
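
    One way to read the proposal above in code: for each model, estimate the sampling distribution of the chosen summary statistic from that model's output ensemble, evaluate the observed statistic under it, and normalize. The sketch below uses kernel density estimates and invented numbers; it is schematic, not the authors' implementation.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)

    # Each "climate model" yields an ensemble of values of the chosen summary statistic
    model_ensembles = {
        "model_A": rng.normal(1.0, 0.30, size=500),
        "model_B": rng.normal(1.4, 0.25, size=500),
        "model_C": rng.normal(0.6, 0.50, size=500),
    }
    observed_statistic = 1.25          # same statistic computed from remote sensing data
    prior = {name: 1.0 / len(model_ensembles) for name in model_ensembles}

    # Likelihood of the observation under each model, via a kernel density estimate
    likelihood = {name: stats.gaussian_kde(vals)(observed_statistic)[0]
                  for name, vals in model_ensembles.items()}
    evidence = sum(prior[m] * likelihood[m] for m in model_ensembles)
    posterior = {m: prior[m] * likelihood[m] / evidence for m in model_ensembles}

    for name, p in posterior.items():
        print(f"P({name} | observation) = {p:.2f}")
    ```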

  13. SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.

    2013-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically-derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIGVISA signal model finds statistical evidence for arrivals even at stations for which the IMS station processing failed to register any detection.

  14. Physical descriptions of the bacterial nucleoid at large scales, and their biological implications

    NASA Astrophysics Data System (ADS)

    Benza, Vincenzo G.; Bassetti, Bruno; Dorfman, Kevin D.; Scolari, Vittore F.; Bromek, Krystyna; Cicuta, Pietro; Cosentino Lagomarsino, Marco

    2012-07-01

    Recent experimental and theoretical approaches have attempted to quantify the physical organization (compaction and geometry) of the bacterial chromosome with its complement of proteins (the nucleoid). The genomic DNA exists in a complex and dynamic protein-rich state, which is highly organized at various length scales. This has implications for modulating (when not directly enabling) the core biological processes of replication, transcription and segregation. We overview the progress in this area, driven in the last few years by new scientific ideas and new interdisciplinary experimental techniques, ranging from high space- and time-resolution microscopy to high-throughput genomics employing sequencing to map different aspects of the nucleoid-related interactome. The aim of this review is to present the wide spectrum of experimental and theoretical findings coherently, from a physics viewpoint. In particular, we highlight the role that statistical and soft condensed matter physics play in describing this system of fundamental biological importance, specifically reviewing classic and more modern tools from the theory of polymers. We also discuss some attempts toward unifying interpretations of the current results, pointing to possible directions for future investigation.

  15. Three-dimensional virtual planning in orthognathic surgery enhances the accuracy of soft tissue prediction.

    PubMed

    Van Hemelen, Geert; Van Genechten, Maarten; Renier, Lieven; Desmedt, Maria; Verbruggen, Elric; Nadjmi, Nasser

    2015-07-01

    Throughout the history of computing, shortening the gap between the physical and digital world behind the screen has always been strived for. Recent advances in three-dimensional (3D) virtual surgery programs have reduced this gap significantly. Although 3D assisted surgery is now widely available for orthognathic surgery, one might still argue whether a 3D virtual planning approach is a better alternative to a conventional two-dimensional (2D) planning technique. The purpose of this study was to compare the accuracy of a traditional 2D technique and a 3D computer-aided prediction method. A double blind randomised prospective study was performed to compare the prediction accuracy of a traditional 2D planning technique versus a 3D computer-aided planning approach. The accuracy of the hard and soft tissue profile predictions using both planning methods was investigated. There was a statistically significant difference between 2D and 3D soft tissue planning (p < 0.05). The statistically significant difference found between 2D and 3D planning and the actual soft tissue outcome was not confirmed by a statistically significant difference between methods. The 3D planning approach provides more accurate soft tissue planning. However, the 2D orthognathic planning is comparable to 3D planning when it comes to hard tissue planning. This study provides relevant results for choosing between 3D and 2D planning in clinical practice. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  16. The pdf approach to turbulent polydispersed two-phase flows

    NASA Astrophysics Data System (ADS)

    Minier, Jean-Pierre; Peirano, Eric

    2001-10-01

    The purpose of this paper is to develop a probabilistic approach to turbulent polydispersed two-phase flows. The two-phase flows considered are composed of a continuous phase, which is a turbulent fluid, and a dispersed phase, which represents an ensemble of discrete particles (solid particles, droplets or bubbles). Gathering the difficulties of turbulent flows and of particle motion, the challenge is to work out a general modelling approach that meets three requirements: to treat accurately the physically relevant phenomena, to provide enough information to address issues of complex physics (combustion, polydispersed particle flows, …) and to remain tractable for general non-homogeneous flows. The present probabilistic approach models the statistical dynamics of the system and consists in simulating the joint probability density function (pdf) of a number of fluid and discrete particle properties. A new point is that both the fluid and the particles are included in the pdf description. The derivation of the joint pdf model for the fluid and for the discrete particles is worked out in several steps. The mathematical properties of stochastic processes are first recalled. The various hierarchies of pdf descriptions are detailed and the physical principles that are used in the construction of the models are explained. The Lagrangian one-particle probabilistic description is developed first for the fluid alone, then for the discrete particles and finally for the joint fluid and particle turbulent systems. In the case of the probabilistic description for the fluid alone or for the discrete particles alone, numerical computations are presented and discussed to illustrate how the method works in practice and the kind of information that can be extracted from it. Comments on the current modelling state and propositions for future investigations which try to link the present work with other ideas in physics are made at the end of the paper.

  17. Joint inversion of marine seismic AVA and CSEM data using statistical rock-physics models and Markov random fields: Stochastic inversion of AVA and CSEM data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, J.; Hoversten, G.M.

    2011-09-15

    Joint inversion of seismic AVA and CSEM data requires rock-physics relationships to link seismic attributes to electrical properties. Ideally, we can connect them through reservoir parameters (e.g., porosity and water saturation) by developing physics-based models, such as Gassmann’s equations and Archie’s law, using nearby borehole logs. This could be difficult in the exploration stage because the information available is typically insufficient for choosing suitable rock-physics models and for subsequently obtaining reliable estimates of the associated parameters. The use of improper rock-physics models and the inaccuracy of the estimates of model parameters may cause misleading inversion results. Conversely, it is easy to derive statistical relationships among seismic and electrical attributes and reservoir parameters from distant borehole logs. In this study, we develop a Bayesian model to jointly invert seismic AVA and CSEM data for reservoir parameter estimation using statistical rock-physics models; the spatial dependence of geophysical and reservoir parameters is carried by lithotypes through Markov random fields. We apply the developed model to a synthetic case, which simulates a CO2 monitoring application. We derive statistical rock-physics relations from borehole logs at one location and estimate seismic P- and S-wave velocity ratio, acoustic impedance, density, electrical resistivity, lithotypes, porosity, and water saturation at three different locations by conditioning to seismic AVA and CSEM data. Comparison of the inversion results with their corresponding true values shows that the correlation-based statistical rock-physics models provide significant information for improving the joint inversion results.

  18. Forecasting runout of rock and debris avalanches

    USGS Publications Warehouse

    Iverson, Richard M.; Evans, S.G.; Mugnozza, G.S.; Strom, A.; Hermanns, R.L.

    2006-01-01

    Physically based mathematical models and statistically based empirical equations each may provide useful means of forecasting runout of rock and debris avalanches. This paper compares the foundations, strengths, and limitations of a physically based model and a statistically based forecasting method, both of which were developed to predict runout across three-dimensional topography. The chief advantage of the physically based model results from its ties to physical conservation laws and well-tested axioms of soil and rock mechanics, such as the Coulomb friction rule and effective-stress principle. The output of this model provides detailed information about the dynamics of avalanche runout, at the expense of high demands for accurate input data, numerical computation, and experimental testing. In comparison, the statistical method requires relatively modest computation and no input data except identification of prospective avalanche source areas and a range of postulated avalanche volumes. Like the physically based model, the statistical method yields maps of predicted runout, but it provides no information on runout dynamics. Although the two methods differ significantly in their structure and objectives, insights gained from one method can aid refinement of the other.

  19. A new universality class in corpus of texts; A statistical physics study

    NASA Astrophysics Data System (ADS)

    Najafi, Elham; Darooneh, Amir H.

    2018-05-01

    Text can be regarded as a complex system. There are some methods in statistical physics which can be used to study this system. In this work, by means of statistical physics methods, we reveal new universal behaviors of texts associated with the fractality values of words in a text. The fractality measure indicates the importance of words in a text by considering the distribution pattern of words throughout the text. We observed a power-law relation between the fractality of a text and its vocabulary size for texts and corpora. We also observed this behavior when studying biological data.

  20. A statistical physics perspective on criticality in financial markets

    NASA Astrophysics Data System (ADS)

    Bury, Thomas

    2013-11-01

    Stock markets are complex systems exhibiting collective phenomena and particular features such as synchronization, fluctuations distributed as power-laws, non-random structures and similarity to neural networks. Such specific properties suggest that markets operate at a very special point. Financial markets are believed to be critical by analogy to physical systems, but little statistically founded evidence has been given. Through a data-based methodology and comparison to simulations inspired by the statistical physics of complex systems, we show that the Dow Jones and index sets are not rigorously critical. However, financial systems are closer to criticality in the crash neighborhood.

  1. Some past and present challenges of econophysics

    NASA Astrophysics Data System (ADS)

    Mantegna, R. N.

    2016-12-01

    We discuss the cultural background that was shared by some of the first econophysicists when they started to work on economic and financial problems with the methods and tools of statistical physics. In particular, we discuss the role of stylized facts and of statistical physical laws in economics and statistical physics, respectively. As an example of the problems and potential associated with the interaction of different communities of scholars dealing with problems observed in economic and financial systems, we briefly discuss the development and the perspectives of the use of tools and concepts of networks in econophysics, economics and finance.

  2. Statistical physics inspired energy-efficient coded-modulation for optical communications.

    PubMed

    Djordjevic, Ivan B; Xu, Lei; Wang, Ting

    2012-04-15

    Because Shannon's entropy can be obtained by Stirling's approximation of the thermodynamic entropy, the energy-minimization methods of statistical physics are directly applicable to signal constellation design. We demonstrate that statistical physics inspired energy-efficient (EE) signal constellation designs, in combination with large-girth low-density parity-check (LDPC) codes, significantly outperform conventional LDPC-coded polarization-division multiplexed quadrature amplitude modulation schemes. We also describe an EE signal constellation design algorithm. Finally, we propose a discrete-time implementation of the D-dimensional transceiver and the corresponding EE polarization-division multiplexed system. © 2012 Optical Society of America
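
    The energy-minimization idea can be pictured with the minimal, hypothetical sketch below, which is not the authors' algorithm: constellation points repel each other through a 1/d pairwise potential and are renormalized to unit average power after each gradient step. The constellation size, dimension and step size are arbitrary choices.

```python
# Hedged sketch: gradient descent on a pairwise 1/d "potential energy" of
# constellation points under a unit average-power constraint. Not the paper's
# algorithm; all parameters below are illustrative.
import numpy as np

rng = np.random.default_rng(10)
m, dim, steps, lr = 16, 2, 2000, 0.01     # constellation size, dimension, iterations, step
x = rng.normal(size=(m, dim))

for _ in range(steps):
    diff = x[:, None, :] - x[None, :, :]                 # pairwise differences x_i - x_j
    d = np.linalg.norm(diff, axis=-1) + np.eye(m)        # pad diagonal to avoid 0-division
    grad = -(diff / d[..., None] ** 3).sum(axis=1)       # gradient of E = sum_{i<j} 1/d_ij
    x -= lr * grad                                       # descend: points repel each other
    x /= np.sqrt(np.mean(np.sum(x ** 2, axis=1)))        # re-impose unit average symbol energy

pair_d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
print("minimum pairwise distance:", round(float(pair_d[pair_d > 0].min()), 3))
```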

  3. On the role of fluctuations in the modeling of complex systems.

    NASA Astrophysics Data System (ADS)

    Droz, Michel; Pekalski, Andrzej

    2016-09-01

    The study of models is ubiquitous in sciences like physics, chemistry, ecology, biology or sociology. Models are used to explain experimental facts or to make new predictions. For any system, one can distinguish several levels of description. In the simplest, mean-field-like description, the dynamics is described in terms of spatially averaged quantities, while in a microscopic approach local properties are taken into account and local fluctuations of the relevant variables are present. The properties predicted by these two different approaches may be drastically different. In a large body of research literature concerning complex systems this problem is often overlooked, and simple mean-field-like approximations are used without asking the question of the robustness of the corresponding predictions. The goal of this paper is twofold: first, to illustrate the importance of fluctuations in a self-contained and pedagogical way, by revisiting two different classes of problems where thorough investigations have been conducted (equilibrium and non-equilibrium statistical physics); second, to present our original research on the dynamics of a population of annual plants competing among themselves for a single resource (water) through a stochastic dynamics. Depending on the observable considered, the mean-field-like and microscopic approaches agree or totally disagree. There is no general criterion allowing one to decide a priori when the two approaches will agree.

  4. Theory of atomic spectral emission intensity

    NASA Astrophysics Data System (ADS)

    Yngström, Sten

    1994-07-01

    The theoretical derivation of a new spectral line intensity formula for atomic radiative emission is presented. The theory is based on first principles of quantum physics, electrodynamics, and statistical physics. Quantum rules lead to revision of the conventional principle of local thermal equilibrium of matter and radiation. Study of electrodynamics suggests absence of spectral emission from fractions of the numbers of atoms and ions in a plasma due to radiative inhibition caused by electromagnetic force fields. Statistical probability methods are extended by the statement: A macroscopic physical system develops in the most probable of all conceivable ways consistent with the constraining conditions for the system. The crucial role of statistical physics in transforming quantum logic into common sense logic is stressed. The theory is strongly supported by experimental evidence.

  5. Phase Transitions in Combinatorial Optimization Problems: Basics, Algorithms and Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Hartmann, Alexander K.; Weigt, Martin

    2005-10-01

    A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.

  6. Nonperturbative Renormalization Group Approach to Polymerized Membranes

    NASA Astrophysics Data System (ADS)

    Essafi, Karim; Kownacki, Jean-Philippe; Mouhanna, Dominique

    2014-03-01

    Membranes or membrane-like materials play an important role in many fields ranging from biology to physics. These systems form a very rich domain in statistical physics. The interplay between geometry and thermal fluctuations leads to exciting phases such as flat, tubular and disordered flat phases. Roughly speaking, membranes can be divided into two groups: fluid membranes, in which the molecules are free to diffuse and which therefore have no shear modulus, and polymerized membranes, in which the connectivity is fixed, which leads to elastic forces. This difference between fluid and polymerized membranes leads to a difference in their critical behaviour. For instance, fluid membranes are always crumpled, whereas polymerized membranes exhibit a phase transition between a crumpled phase and a flat phase. In this talk, I will focus only on polymerized phantom, i.e. non-self-avoiding, membranes. The critical behaviour of both isotropic and anisotropic polymerized membranes is studied using a nonperturbative renormalization group (NPRG) approach. This allows for the investigation of the phase transitions and the low-temperature flat phase in any internal dimension D and embedding dimension d. Interestingly, graphene behaves just as a polymerized membrane in its flat phase.

  7. The Gaussian CL_s method for searches of new physics

    DOE PAGES

    Qian, X.; Tan, A.; Ling, J. J.; ...

    2016-04-23

    Here we describe a method based on the CL_s approach to present results in searches of new physics, under the condition that the relevant parameter space is continuous. Our method relies on a class of test statistics developed for non-nested hypothesis testing problems, denoted by ΔT, which has a Gaussian approximation to its parent distribution when the sample size is large. This leads to a simple procedure of forming exclusion sets for the parameters of interest, which we call the Gaussian CL_s method. Our work provides a self-contained mathematical proof for the Gaussian CL_s method that explicitly outlines the required conditions. These conditions are milder than those required by Wilks' theorem to set confidence intervals (CIs). We illustrate the Gaussian CL_s method with an example of searching for a sterile neutrino, where the CL_s approach was rarely used before. We also compare data analysis results produced by the Gaussian CL_s method and various CI methods to showcase their differences.
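
    A rough, hypothetical sketch of how such a Gaussian CL_s exclusion could be computed is shown below. It assumes the means and common width of the Gaussian-approximated ΔT under the two hypotheses are already known; it is not the authors' implementation, and the tail convention and the 95% threshold are illustrative choices.

```python
# Hedged sketch of a Gaussian CL_s exclusion over a continuous parameter grid.
# Assumes DeltaT is approximately Gaussian under both hypotheses; mu_h0, mu_h1
# and sigma are hypothetical inputs, and the tail orientation (larger DeltaT
# favouring H0) is a convention chosen for this illustration only.
import numpy as np
from scipy.stats import norm

def gaussian_cls(delta_t_obs, mu_h0, mu_h1, sigma):
    """CL_s = P(DeltaT >= obs | H1) / P(DeltaT >= obs | H0)."""
    p_h1 = norm.sf(delta_t_obs, loc=mu_h1, scale=sigma)   # analogous to CL_{s+b}
    p_h0 = norm.sf(delta_t_obs, loc=mu_h0, scale=sigma)   # analogous to CL_b
    return p_h1 / p_h0

# Scan a grid of alternative-hypothesis points and exclude those with CL_s < 0.05.
mu_h1_grid = np.linspace(-5.0, 5.0, 101)        # hypothetical means of DeltaT under H1
excluded = [mu for mu in mu_h1_grid
            if gaussian_cls(delta_t_obs=1.0, mu_h0=0.0, mu_h1=mu, sigma=1.0) < 0.05]
print(f"{len(excluded)} of {mu_h1_grid.size} grid points excluded at 95% CL")
```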

  8. Improving Metallic Thermal Protection System Hypervelocity Impact Resistance Through Design of Experiments Approach

    NASA Technical Reports Server (NTRS)

    Poteet, Carl C.; Blosser, Max L.

    2001-01-01

    A design-of-experiments approach has been implemented using computational hypervelocity impact simulations to determine the most effective place to add mass to an existing metallic Thermal Protection System (TPS) to improve hypervelocity impact protection. Simulations were performed using axisymmetric models in CTH, a shock-physics code developed by Sandia National Laboratories, and validated by comparison with existing test data. The axisymmetric models were then used in a statistical sensitivity analysis to determine the influence of five design parameters on the degree of hypervelocity particle dispersion. Several damage metrics were identified and evaluated. Damage metrics related to the extent of substructure damage were seen to produce misleading results; however, damage metrics related to the degree of dispersion of the hypervelocity particle produced results that corresponded to physical intuition. Based on analysis-of-variance results, it was concluded that the most effective way to increase hypervelocity impact resistance is to increase the thickness of the outer foil layer. Increasing the spacing between the outer surface and the substructure is also very effective at increasing dispersion.

  9. On a simple molecular–statistical model of a liquid-crystal suspension of anisometric particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zakhlevnykh, A. N., E-mail: anz@psu.ru; Lubnin, M. S.; Petrov, D. A.

    2016-11-15

    A molecular–statistical mean-field theory is constructed for suspensions of anisometric particles in nematic liquid crystals (NLCs). The spherical approximation, well known in the physics of ferromagnetic materials, is considered; it allows one to obtain an analytic expression for the free energy and simple equations for the orientational state of a suspension that describe the temperature dependence of the order parameters of the suspension components. The transition temperature from the ordered to the isotropic state and the jumps in the order parameters at the phase-transition point are studied as functions of the anchoring energy of the dispersed particles to the matrix, the concentration of the impurity phase, and the size of the particles. The proposed approach allows one to generalize the model to the case of biaxial ordering.

  10. The time-local view of nonequilibrium statistical mechanics. I. Linear theory of transport and relaxation

    NASA Astrophysics Data System (ADS)

    der, R.

    1987-01-01

    The various approaches to nonequilibrium statistical mechanics may be subdivided into convolution and convolutionless (time-local) ones. While the former, put forward by Zwanzig, Mori, and others, are used most commonly, the latter are less well developed, but have proven very useful in recent applications. The aim of the present series of papers is to develop the time-local picture (TLP) of nonequilibrium statistical mechanics on a new footing and to consider its physical implications for topics such as the formulation of irreversible thermodynamics. The most natural approach to TLP is seen to derive from the Fourier–Laplace transform $\widetilde{C}(z)$ of pertinent time correlation functions, which on the physical sheet typically displays an essential singularity at $z=\infty$ and a number of macroscopic and microscopic poles in the lower half-plane corresponding to long- and short-lived modes, respectively, the former giving rise to the autonomous macrodynamics, whereas the latter are interpreted as doorway modes mediating the transfer of information from relevant to irrelevant channels. Possible implications of this doorway-mode concept for so-called extended irreversible thermodynamics are briefly discussed. The pole structure is used for deriving new kinds of generalized Green–Kubo relations expressing macroscopic quantities, e.g. transport coefficients, by contour integrals over current–current correlation functions obeying Hamiltonian dynamics, the contour integration replacing projection. The conventional Green–Kubo relations, valid for conserved quantities only, are rederived for illustration. Moreover, $\widetilde{C}(z)$ may be expressed by a Laurent series expansion in positive and negative powers of z, from which a rigorous, general, and straightforward method is developed for extracting all macroscopic quantities from so-called secularly divergent expansions of $\widetilde{C}(z)$ as obtained from the application of conventional many-body techniques to the calculation of $\widetilde{C}(z)$. The expressions are formulated as time-scale expansions, which should rapidly converge if macroscopic and microscopic time scales are sufficiently well separated, i.e., if lifetime ("memory") effects are not too large.

  11. Estimation of lifetime distributions on 1550-nm DFB laser diodes using Monte-Carlo statistic computations

    NASA Astrophysics Data System (ADS)

    Deshayes, Yannick; Verdier, Frederic; Bechou, Laurent; Tregon, Bernard; Danto, Yves; Laffitte, Dominique; Goudard, Jean Luc

    2004-09-01

    High performance and high reliability are two of the most important goals driving the penetration of optical transmission into telecommunication systems ranging from 880 nm to 1550 nm. Lifetime prediction, defined as the time at which a parameter reaches its maximum acceptable shift, remains the main result in terms of reliability estimation for a technology. For optoelectronic emissive components, selection tests and life testing are specifically used for reliability evaluation according to Telcordia GR-468 CORE requirements. This approach is based on extrapolation of degradation laws, based on physics of failure and on electrical or optical parameters, allowing both a strong reduction in test time and long-term reliability prediction. Unfortunately, in the case of a mature technology it becomes increasingly difficult to calculate average lifetimes and failure rates (FITs) using ageing tests, in particular because of the extremely low failure rates. For present laser diode technologies, times to failure tend to be around 10^6 hours under typical operating conditions (Popt = 10 mW and T = 80°C). These ageing tests must be performed on more than 100 components aged for 10,000 hours, mixing different temperature and drive-current conditions and leading to acceleration factors above 300-400. Such conditions are costly, time consuming and cannot give a complete distribution of times to failure. A new approach consists in using statistical computations to extrapolate the lifetime distribution and failure rates under operating conditions from the physical parameters of experimental degradation laws. In this paper, Distributed Feedback single-mode laser diodes (DFB-LD) used in 1550 nm telecommunication networks operating at a 2.5 Gbit/s transfer rate are studied. Electrical and optical parameters have been measured before and after ageing tests, performed at constant current, according to Telcordia GR-468 requirements. Cumulative failure rates and lifetime distributions are computed using statistical calculations and equations of drift mechanisms versus time fitted from experimental measurements.
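
    The statistical-computation step can be pictured with the hedged sketch below: degradation-law parameters are drawn from fitted distributions, each draw is extrapolated to the time at which the monitored parameter crosses a failure criterion, and the resulting lifetime distribution and failure rate are summarized. The drift law d(t) = A·t^m, the 20% criterion and all distribution parameters are placeholders, not values from the paper.

```python
# Hedged sketch: Monte-Carlo extrapolation of a lifetime distribution from a
# fitted degradation law d(t) = A * t**m, with failure defined as a 20% drift of
# the monitored parameter. All numerical values are made-up placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
A = rng.lognormal(mean=np.log(1e-4), sigma=0.5, size=n)   # hypothetical drift amplitude
m = rng.normal(loc=0.6, scale=0.05, size=n)               # hypothetical drift exponent
criterion = 0.20                                          # failure at 20% parameter drift

t_fail = (criterion / A) ** (1.0 / m)                     # solve A * t**m = criterion (hours)

mission_h = 10 * 365 * 24                                 # 10-year mission, in hours
frac_failed = np.mean(t_fail < mission_h)
fit_rate = frac_failed / mission_h * 1e9                  # failures per 1e9 device-hours (FIT)
print(f"median lifetime ~ {np.median(t_fail):.2e} h, ~{fit_rate:.1f} FIT over the mission")
```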

  12. Identifying subgroups of patients using latent class analysis: should we use a single-stage or a two-stage approach? A methodological study using a cohort of patients with low back pain.

    PubMed

    Nielsen, Anne Molgaard; Kent, Peter; Hestbaek, Lise; Vach, Werner; Kongsted, Alice

    2017-02-01

    Heterogeneity in patients with low back pain (LBP) is well recognised and different approaches to subgrouping have been proposed. Latent Class Analysis (LCA) is a statistical technique that is increasingly being used to identify subgroups based on patient characteristics. However, as LBP is a complex multi-domain condition, the optimal approach when using LCA is unknown. Therefore, this paper describes the exploration of two approaches to LCA that may help improve the identification of clinically relevant and interpretable LBP subgroups. From 928 LBP patients consulting a chiropractor, baseline data were used as input to the statistical subgrouping. In a single-stage LCA, all variables were modelled simultaneously to identify patient subgroups. In a two-stage LCA, we used the latent class membership from our previously published LCA within each of six domains of health (activity, contextual factors, pain, participation, physical impairment and psychology) (first stage) as the variables entered into the second stage of the two-stage LCA to identify patient subgroups. The description of the results of the single-stage and two-stage LCA was based on a combination of statistical performance measures, qualitative evaluation of clinical interpretability (face validity) and a subgroup membership comparison. For the single-stage LCA, a model solution with seven patient subgroups was preferred, and for the two-stage LCA, a nine patient subgroup model. Both approaches identified similar, but not identical, patient subgroups characterised by (i) mild intermittent LBP, (ii) recent severe LBP and activity limitations, (iii) very recent severe LBP with both activity and participation limitations, (iv) work-related LBP, (v) LBP and several negative consequences and (vi) LBP with nerve root involvement. Both approaches identified clinically interpretable patient subgroups. The potential importance of these subgroups needs to be investigated by exploring whether they can be identified in other cohorts and by examining their possible association with patient outcomes. This may inform the selection of a preferred LCA approach.
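
    To make the single-stage versus two-stage distinction concrete, the sketch below uses Gaussian mixture models as a simple stand-in for LCA (the paper's variables are categorical and the real analysis used dedicated LCA software); the domains, class counts and data are synthetic placeholders, and the final cross-tabulation mirrors the subgroup-membership comparison described above.

```python
# Illustrative sketch only: Gaussian mixtures stand in for latent class analysis
# to contrast a single-stage model over all variables with a two-stage model
# fitted to first-stage class memberships from each health domain. Domains,
# class counts and data are synthetic.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(8)
n = 900
domains = {name: rng.normal(size=(n, 3))
           for name in ["activity", "pain", "psychology"]}   # placeholders for the health domains

# Single-stage: all variables modelled simultaneously.
all_vars = np.hstack(list(domains.values()))
single = GaussianMixture(n_components=5, random_state=0).fit(all_vars)

# Two-stage: first-stage class labels per domain, then a second-stage model on them.
stage1 = np.column_stack([
    GaussianMixture(n_components=3, random_state=0).fit_predict(x)
    for x in domains.values()
]).astype(float)
two_stage = GaussianMixture(n_components=5, random_state=0).fit(stage1)

# Subgroup-membership comparison between the two solutions.
crosstab = np.zeros((5, 5), dtype=int)
for a, b in zip(single.predict(all_vars), two_stage.predict(stage1)):
    crosstab[a, b] += 1
print(crosstab)
```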

  13. A direct sensitivity approach to predict hourly ozone resulting from compliance with the National Ambient Air Quality Standard.

    PubMed

    Simon, Heather; Baker, Kirk R; Akhtar, Farhan; Napelenok, Sergey L; Possiel, Norm; Wells, Benjamin; Timin, Brian

    2013-03-05

    In setting primary ambient air quality standards, the EPA's responsibility under the law is to establish standards that protect public health. As part of the current review of the ozone National Ambient Air Quality Standard (NAAQS), the US EPA evaluated the health exposure and risks associated with ambient ozone pollution using a statistical approach to adjust recent air quality to simulate just meeting the current standard level, without specifying emission control strategies. One drawback of this purely statistical concentration rollback approach is that it does not take into account spatial and temporal heterogeneity of ozone response to emissions changes. The application of the higher-order decoupled direct method (HDDM) in the community multiscale air quality (CMAQ) model is discussed here to provide an example of a methodology that could incorporate this variability into the risk assessment analyses. Because this approach includes a full representation of the chemical production and physical transport of ozone in the atmosphere, it does not require assumed background concentrations, which have been applied to constrain estimates from past statistical techniques. The CMAQ-HDDM adjustment approach is extended to measured ozone concentrations by determining typical sensitivities at each monitor location and hour of the day based on a linear relationship between first-order sensitivities and hourly ozone values. This approach is demonstrated by modeling ozone responses for monitor locations in Detroit and Charlotte to domain-wide reductions in anthropogenic NOx and VOCs emissions. As seen in previous studies, ozone response calculated using HDDM compared well to brute-force emissions changes up to approximately a 50% reduction in emissions. A new stepwise approach is developed here to apply this method to emissions reductions beyond 50% allowing for the simulation of more stringent reductions in ozone concentrations. Compared to previous rollback methods, this application of modeled sensitivities to ambient ozone concentrations provides a more realistic spatial response of ozone concentrations at monitors inside and outside the urban core and at hours of both high and low ozone concentrations.
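
    A hedged sketch of the sensitivity-based adjustment is given below: modeled first-order sensitivities are regressed against modeled hourly ozone to give a typical sensitivity per ozone level (per monitor and hour of day in the actual study), and observed values are then adjusted stepwise so that reductions beyond roughly 50% are built from smaller increments. The synthetic inputs and the 10% step size are assumptions, not values from the paper.

```python
# Hedged sketch: adjust observed hourly ozone for an emissions reduction using a
# linear relation between modeled first-order sensitivities and hourly ozone,
# applied stepwise so that large reductions are composed of small increments.
# All inputs are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
model_o3 = rng.uniform(20, 90, size=500)                  # modeled hourly ozone (ppb)
model_s1 = 0.3 * model_o3 - 5 + rng.normal(0, 2, 500)     # modeled first-order sensitivities

# Typical sensitivity as a linear function of the ozone level.
slope, intercept = np.polyfit(model_o3, model_s1, deg=1)

def adjust(obs_o3, total_reduction, step=0.10):
    """Apply `total_reduction` (e.g. 0.7 for 70%) in `step`-sized increments."""
    o3 = np.asarray(obs_o3, dtype=float)
    applied = 0.0
    while applied < total_reduction - 1e-12:
        frac = min(step, total_reduction - applied)
        o3 = o3 - frac * (slope * o3 + intercept)   # local first-order response
        applied += frac
    return o3

print(adjust([65.0, 80.0, 40.0], total_reduction=0.7))
```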

  14. Statistical Analysis of the Uncertainty in Pre-Flight Aerodynamic Database of a Hypersonic Vehicle

    NASA Astrophysics Data System (ADS)

    Huh, Lynn

    The objective of the present research was to develop a new method to derive the aerodynamic coefficients and the associated uncertainties for flight vehicles via post-flight inertial navigation analysis using data from the inertial measurement unit. Statistical estimates of the vehicle state and aerodynamic coefficients are derived using Monte Carlo simulation. Trajectory reconstruction using the inertial navigation system (INS) is a simple and widely used method. However, deriving realistic uncertainties in the reconstructed state and any associated parameters is not so straightforward. Extended Kalman filters, batch minimum-variance estimation and other approaches have been used. However, these methods generally depend on assumed physical models, assumed statistical distributions (usually Gaussian) or have convergence issues for non-linear problems. The approach here assumes no physical models, is applicable to any statistical distribution, and does not have any convergence issues. The new approach obtains the statistics directly from a sufficient number of Monte Carlo samples using only the generally well-known gyro and accelerometer specifications and could be applied to systems of non-linear form and non-Gaussian distribution. When redundant data are available, the set of Monte Carlo simulations is constrained to satisfy the redundant data within the uncertainties specified for the additional data. The proposed method was applied to validate the uncertainty in the pre-flight aerodynamic database of the X-43A Hyper-X research vehicle. In addition to gyro and acceleration data, the actual flight data include redundant measurements of position and velocity from the global positioning system (GPS). Criteria derived from the blend of the GPS and INS accuracies were used to select valid trajectories for statistical analysis. The aerodynamic coefficients were derived from the selected trajectories either by a direct extraction method based on the equations of dynamics or by querying the pre-flight aerodynamic database. After the application of the proposed method to the case of the X-43A Hyper-X research vehicle, it was found that 1) there were consistent differences in the aerodynamic coefficients from the pre-flight aerodynamic database and the post-flight analysis, 2) the pre-flight estimation of the pitching-moment coefficients was significantly different from the post-flight analysis, 3) the type of distribution of the states from the Monte Carlo simulation was affected by that of the perturbation parameters, 4) the uncertainties in the pre-flight model were overestimated, 5) the range where the aerodynamic coefficients from the pre-flight aerodynamic database and post-flight analysis are in closest agreement is between Mach *.* and *.*, and more data points may be needed between Mach * and ** in the pre-flight aerodynamic database, 6) the selection criterion for valid trajectories from the Monte Carlo simulations was mostly driven by the horizontal velocity error, 7) the selection criterion must be based on a reasonable model to ensure the validity of the statistics from the proposed method, and 8) the results from the proposed method applied to two different flights with identical geometry and similar flight profiles were consistent.
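
    The core idea can be illustrated with the minimal one-dimensional toy below (an assumption-laden sketch, not the thesis code): measured accelerometer data are perturbed within hypothetical sensor specifications, each Monte Carlo trajectory is integrated, only trajectories consistent with a redundant GPS-like measurement are retained, and statistics are taken over the retained set.

```python
# 1-D toy sketch (assumptions throughout): perturb IMU data within hypothetical
# accelerometer specs, integrate each Monte Carlo trajectory, keep trajectories
# consistent with a redundant GPS-like end-point fix, and take statistics over
# the retained set.
import numpy as np

rng = np.random.default_rng(2)
dt, n_steps, n_mc = 0.1, 600, 5000
true_acc = 2.0 * np.sin(np.linspace(0, 6, n_steps))        # synthetic "truth" (m/s^2)
meas_acc = true_acc + rng.normal(0, 0.05, n_steps)         # one recorded IMU stream

bias_sigma, noise_sigma = 0.02, 0.05                       # hypothetical accel specification
true_pos = np.cumsum(np.cumsum(true_acc) * dt) * dt
gps_pos, gps_sigma = true_pos[-1], 5.0                     # redundant end-point measurement

final_pos, final_vel = [], []
for _ in range(n_mc):
    acc = meas_acc + rng.normal(0, bias_sigma) + rng.normal(0, noise_sigma, n_steps)
    vel = np.cumsum(acc) * dt
    final_vel.append(vel[-1])
    final_pos.append(np.cumsum(vel)[-1] * dt)
final_pos, final_vel = np.array(final_pos), np.array(final_vel)

keep = np.abs(final_pos - gps_pos) < 2 * gps_sigma          # GPS/INS consistency criterion
print(f"kept {keep.sum()}/{n_mc} trajectories; final velocity = "
      f"{final_vel[keep].mean():.2f} +/- {final_vel[keep].std():.2f} m/s")
```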

  15. The need and approach for characterization - U.S. air force perspectives on materials state awareness

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Lindgren, Eric A.

    2018-04-01

    This paper expands on the objective and motivation for NDE-based characterization and includes a discussion of the current approach using model-assisted inversion being pursued within the Air Force Research Laboratory (AFRL). This includes a discussion of the multiple model-based methods that can be used, including physics-based models, deep machine learning, and heuristic approaches. The benefits and drawbacks of each method are reviewed and the potential to integrate multiple methods is discussed. Initial successes are included to highlight the ability to obtain quantitative values of damage. The additional steps remaining to realize this capability with statistical metrics of accuracy are discussed, and how these results can be used to enable probabilistic life management is addressed. The outcome of this initiative will realize the long-term desired capability of NDE methods to provide quantitative characterization to accelerate certification of new materials and enhance life management of engineered systems.

  16. A probabilistic approach to radiative energy loss calculations for optically thick atmospheres - Hydrogen lines and continua

    NASA Technical Reports Server (NTRS)

    Canfield, R. C.; Ricchiazzi, P. J.

    1980-01-01

    An approximate probabilistic radiative transfer equation and the statistical equilibrium equations are simultaneously solved for a model hydrogen atom consisting of three bound levels and an ionization continuum. The transfer equation for L-alpha, L-beta, H-alpha, and the Lyman continuum is explicitly solved assuming complete redistribution. The accuracy of this approach is tested by comparing source functions and radiative loss rates to values obtained with a method that solves the exact transfer equation. Two recent model solar-flare chromospheres are used for this test. It is shown that for the test atmospheres the probabilistic method gives values of the radiative loss rate that are characteristically good to a factor of 2. The advantage of this probabilistic approach is that it retains a description of the dominant physical processes of radiative transfer in the complete-redistribution case, yet it achieves a major reduction in computational requirements.

  17. A Bayesian sequential processor approach to spectroscopic portal system decisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sale, K; Candy, J; Breitfeller, E

    The development of faster, more reliable techniques to detect radioactive contraband in a portal-type scenario is an extremely important problem, especially in this era of constant terrorist threats. Towards this goal the development of a model-based, Bayesian sequential data processor for the detection problem is discussed. In the sequential processor each datum (detector energy deposit and pulse arrival time) is used to update the posterior probability distribution over the space of model parameters. The nature of the sequential processor approach is that a detection is produced as soon as it is statistically justified by the data rather than waiting for a fixed counting interval before any analysis is performed. In this paper the Bayesian model-based approach, physics and signal processing models and decision functions are discussed along with the first results of our research.
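
    A minimal sketch of the sequential idea, under assumptions that are not in the paper (Poisson counts per fixed interval, a discrete grid of source strengths and an arbitrary odds threshold), is shown below; the actual processor works on individual energy deposits and arrival times.

```python
# Hedged toy sketch of a sequential Bayesian detector: each per-interval count
# updates a posterior over a grid of source strengths, and a detection is
# declared as soon as the posterior odds for "source present" pass a threshold.
# Rates, grid and threshold are all hypothetical.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(3)
bkg_rate = 5.0                                   # expected background counts per interval
src_grid = np.linspace(0.0, 10.0, 51)            # candidate source strengths (0 = none)
posterior = np.full(src_grid.size, 1.0 / src_grid.size)   # flat prior

counts = rng.poisson(bkg_rate + 3.0, size=50)    # simulated stream with a real source present

for i, c in enumerate(counts, start=1):
    posterior *= poisson.pmf(c, bkg_rate + src_grid)
    posterior /= posterior.sum()
    odds = posterior[src_grid > 0].sum() / posterior[0]
    if odds > 99.0:                              # arbitrary decision threshold
        print(f"detection declared after {i} intervals "
              f"(MAP source strength {src_grid[posterior.argmax()]:.1f})")
        break
```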

  18. General purpose graphic processing unit implementation of adaptive pulse compression algorithms

    NASA Astrophysics Data System (ADS)

    Cai, Jingxiao; Zhang, Yan

    2017-07-01

    This study introduces a practical approach to implement real-time signal processing algorithms for general surveillance radar based on NVIDIA graphical processing units (GPUs). The pulse compression algorithms are implemented using compute unified device architecture (CUDA) libraries such as CUDA basic linear algebra subroutines and CUDA fast Fourier transform library, which are adopted from open source libraries and optimized for the NVIDIA GPUs. For more advanced, adaptive processing algorithms such as adaptive pulse compression, customized kernel optimization is needed and investigated. A statistical optimization approach is developed for this purpose without needing much knowledge of the physical configurations of the kernels. It was found that the kernel optimization approach can significantly improve the performance. Benchmark performance is compared with the CPU performance in terms of processing accelerations. The proposed implementation framework can be used in various radar systems including ground-based phased array radar, airborne sense and avoid radar, and aerospace surveillance radar.
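
    For orientation, the following is a plain NumPy (CPU) sketch of the frequency-domain pulse compression that the paper off-loads to the GPU; the waveform parameters and target position are invented, and a CUDA version would replace the np.fft calls with the corresponding cuFFT operations.

```python
# CPU reference sketch of FFT-based pulse compression (matched filtering) of a
# received stream with a linear-FM chirp. Parameters are illustrative; on the
# GPU the FFT/IFFT steps would be performed with cuFFT, as described above.
import numpy as np

fs, tau, bw = 10e6, 20e-6, 5e6                         # sample rate, pulse width, chirp bandwidth
t = np.arange(int(fs * tau)) / fs
chirp = np.exp(1j * np.pi * (bw / tau) * t ** 2)       # transmitted LFM pulse

rng = np.random.default_rng(9)
n_rx = 4096
rx = 0.1 * (rng.normal(size=n_rx) + 1j * rng.normal(size=n_rx))   # noise-only stream
rx[1000:1000 + chirp.size] += 0.5 * chirp              # inject one target echo at sample 1000

nfft = int(2 ** np.ceil(np.log2(n_rx + chirp.size - 1)))
matched = np.conj(np.fft.fft(chirp, nfft))             # matched-filter spectrum
compressed = np.fft.ifft(np.fft.fft(rx, nfft) * matched)
print("detected peak at sample", int(np.argmax(np.abs(compressed[:n_rx]))))
```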

  19. Single Cell Proteomics in Biomedicine: High-dimensional Data Acquisition, Visualization and Analysis

    PubMed Central

    Su, Yapeng; Shi, Qihui; Wei, Wei

    2017-01-01

    New insights into cellular heterogeneity over the last decade have provoked the development of a variety of single-cell omics tools at a lightning pace. The resultant high-dimensional single-cell data generated by these tools require new theoretical approaches and analytical algorithms for effective visualization and interpretation. In this review, we briefly survey the state-of-the-art single-cell proteomic tools with a particular focus on data acquisition and quantification, followed by an elaboration of a number of statistical and computational approaches developed to date for dissecting the high-dimensional single-cell data. The underlying assumptions, unique features and limitations of the analytical methods, together with the designated biological questions they seek to answer, will be discussed. Particular attention will be given to those information-theoretical approaches that are anchored in a set of first principles of physics and can yield detailed (and often surprising) predictions. PMID:28128880

  20. An intelligent tutoring system for teaching fundamental physics concepts

    NASA Astrophysics Data System (ADS)

    Albacete, Patricia Lucia

    1999-12-01

    Students in traditional elementary mechanics classes can master problem solving of a quantitative nature but not problems of a qualitative type. Moreover, students' naive conceptions of physics remain unchanged after completing their class. A few approaches have been implemented to improve this situation; however, none has met with great success. Since elementary mechanics is the foundation for all of physics and is a required course for most science majors, there is a clear need to improve the instruction of the subject. To address this problem I developed an intelligent tutoring system, called the Conceptual Helper, which coaches students during homework problem solving. The tutor uses a unique cognitively based approach to teaching physics, which presents innovations in three areas. (1) The teaching strategy, which focuses on teaching those links among the concepts of the domain that are essential for conceptual understanding yet are seldom learned by students. (2) The manner in which the knowledge is taught, which is based on a combination of effective human tutoring techniques (e.g., hinting), effective pedagogical methods (e.g., a microscopic view of matter), and less cognitively demanding approaches (e.g., anthropomorphism). (3) The way in which misconceptions are handled, which uses the underlying scientifically correct line of reasoning to describe to the student the phenomenon that is the basis for the misconception. From a technological point of view, the Conceptual Helper was implemented as a model-tracing tutor which intervenes when students make errors and after completion of each problem, at which time the tutor scaffolds the students on post-problem reflection. The remediation is guided by probabilistic assessment of mastery and the interventions are adapted to the errors. The thesis also presents the results of the evaluation of the system, which revealed that the gain scores of the experimental group were statistically significantly higher than those of the control group, suggesting that the Conceptual Helper was indeed capable of effectively teaching the conceptual aspects of physics as well as helping students abandon common misconceptions. Furthermore, the evaluation showed that the students' performance on a standardized test was comparable to that of other more complex approaches.

  1. A comparison of two approaches to modelling snow cover dynamics at the Polish Polar Station at Hornsund

    NASA Astrophysics Data System (ADS)

    Luks, B.; Osuch, M.; Romanowicz, R. J.

    2012-04-01

    We compare two approaches to modelling snow cover dynamics at the Polish Polar Station at Hornsund. In the first approach we apply the physically-based Utah Energy Balance Snow Accumulation and Melt Model (UEB) (Tarboton et al., 1995; Tarboton and Luce, 1996). The model uses a lumped representation of the snowpack with two primary state variables: snow water equivalence and energy. Its main driving inputs are air temperature, precipitation, wind speed, humidity and radiation (estimated from the diurnal temperature range). Those variables are used for physically-based calculations of radiative, sensible, latent and advective heat exchanges with a 3-hour time step. The second method is an application of a statistically efficient lumped-parameter time series approach to modelling the dynamics of snow cover, based on daily meteorological measurements from the same area. A dynamic Stochastic Transfer Function model is developed that follows the Data Based Mechanistic approach, where a stochastic data-based identification of model structure and an estimation of its parameters are followed by a physical interpretation. We focus on the analysis of uncertainty of both model outputs. In the time series approach, the applied techniques also provide estimates of the modelling errors and the uncertainty of the model parameters. In the first, physically-based approach the applied UEB model is deterministic. It assumes that the observations are without errors and that the model structure perfectly describes the processes within the snowpack. To take the model and observation errors into account, we applied a version of the Generalized Likelihood Uncertainty Estimation (GLUE) technique, which also provides estimates of the modelling errors and the uncertainty of the model parameters. The observed snowpack water-equivalent values are compared with those simulated with 95% confidence bounds. This work was supported by the National Science Centre of Poland (grant no. 7879/B/P01/2011/40). Tarboton, D. G., T. G. Chowdhury and T. H. Jackson, 1995. A Spatially Distributed Energy Balance Snowmelt Model. In K. A. Tonnessen, M. W. Williams and M. Tranter (Ed.), Proceedings of a Boulder Symposium, July 3-14, IAHS Publ. no. 228, pp. 141-155. Tarboton, D. G. and C. H. Luce, 1996. Utah Energy Balance Snow Accumulation and Melt Model (UEB). Computer model technical description and users guide, Utah Water Research Laboratory and USDA Forest Service Intermountain Research Station (http://www.engineering.usu.edu/dtarb/). 64 pp.
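
    The GLUE step can be illustrated with the hedged sketch below, which replaces the UEB model with a toy degree-day snowpack model: many parameter sets are sampled, each run is weighted by an informal likelihood against (synthetic) observations, and weighted quantiles give the 95% bounds. The forcing, likelihood shape and parameter ranges are placeholders, not the study's setup.

```python
# Hedged GLUE sketch with a toy degree-day snow model standing in for UEB:
# sample parameter sets, weight each run by an informal likelihood, and report
# weighted 95% bounds of simulated snow water equivalent. Data are synthetic.
import numpy as np

rng = np.random.default_rng(4)
days = 120
temp = 5 * np.sin(np.linspace(0, 2 * np.pi, days)) - 2 + rng.normal(0, 2, days)  # deg C
precip = rng.exponential(3.0, days)                                              # mm/day

def degree_day_swe(ddf, t_crit):
    """Toy snowpack: accumulate precipitation below t_crit, melt ddf*(T - t_crit) above."""
    swe = np.zeros(days)
    for d in range(1, days):
        snow = precip[d] if temp[d] < t_crit else 0.0
        melt = max(0.0, ddf * (temp[d] - t_crit))
        swe[d] = max(0.0, swe[d - 1] + snow - melt)
    return swe

obs = degree_day_swe(3.0, 0.5) + rng.normal(0, 5, days)      # pseudo-observations

n_sets = 2000
ddf_s, tcrit_s = rng.uniform(1.0, 6.0, n_sets), rng.uniform(-1.0, 2.0, n_sets)
sims = np.array([degree_day_swe(a, b) for a, b in zip(ddf_s, tcrit_s)])
weights = np.exp(-0.5 * np.mean((sims - obs) ** 2, axis=1) / 25.0)   # informal likelihood
weights /= weights.sum()

# Weighted 95% bounds for the final-day snow water equivalent.
order = np.argsort(sims[:, -1])
cdf = np.cumsum(weights[order])
lo = sims[order[np.searchsorted(cdf, 0.025)], -1]
hi = sims[order[np.searchsorted(cdf, 0.975)], -1]
print(f"final-day SWE 95% bounds: [{lo:.1f}, {hi:.1f}] mm")
```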

  2. Teaching dementia care to physical therapy doctoral students: A multimodal experiential learning approach.

    PubMed

    Lorio, Anne K; Gore, Jane B; Warthen, Lindsey; Housley, Stephen N; Burgess, Elisabeth O

    2017-01-01

    As the population aged 65 and older grows, it becomes imperative for health care providers to expand their knowledge regarding geriatric conditions and concerns. Dementia is a devastating degenerative disease process that is affecting millions of individuals in the United States, with significant economic and emotional burden on family and caregivers. The need for further dementia education in physical therapy school is essential to improve attitudes and treatment that affect patient outcomes and quality of care. This physical therapy program implemented a 12-hour multimodal experiential learning module designed to educate their students on the challenges associated with dementia to increase knowledge and confidence when treating these patients. The results of this study showed statistically significant improvements in overall confidence and knowledge of treating patients with dementia. The study finds the addition of experiential learning to traditional didactic coursework improves students' reported confidence in working with patients with dementia and understanding the challenges associated with treating patients with dementia.

  3. SDE decomposition and A-type stochastic interpretation in nonequilibrium processes

    NASA Astrophysics Data System (ADS)

    Yuan, Ruoshi; Tang, Ying; Ao, Ping

    2017-12-01

    An innovative theoretical framework for stochastic dynamics based on the decomposition of a stochastic differential equation (SDE) into a dissipative component, a detailed-balance-breaking component, and a dual-role potential landscape has been developed, which has fruitful applications in physics, engineering, chemistry, and biology. It introduces the A-type stochastic interpretation of the SDE beyond the traditional Ito or Stratonovich interpretation or even the α-type interpretation for multidimensional systems. The potential landscape serves as a Hamiltonian-like function in nonequilibrium processes without detailed balance, which extends this important concept from equilibrium statistical physics to the nonequilibrium region. A question on the uniqueness of the SDE decomposition was recently raised. Our review of both the mathematical and physical aspects shows that uniqueness is guaranteed. The demonstration leads to a better understanding of the robustness of the novel framework. In addition, we discuss related issues including the limitations of an approach to obtaining the potential function from a steady-state distribution.

  4. Science of Ball Lightning (Fire Ball)

    NASA Astrophysics Data System (ADS)

    Ohtsuki, Yoshi-Hiko

    1989-08-01

    The Table of Contents for the full book PDF is as follows: * Organizing Committee * Preface * Ball Lightning -- The Continuing Challenge * Hungarian Ball Lightning Observations in 1987 * Nature of Ball Lightning in Japan * Phenomenological and Psychological Analysis of 150 Austrian Ball Lightning Reports * Physical Problems and Physical Properties of Ball Lightning * Statistical Analysis of the Ball Lightning Properties * A Fluid-Dynamical Model for Ball Lightning and Bead Lightning * The Lifetime of Hill's Vortex * Electrical and Radiative Properties of Ball Lightning * The Candle Flame as a Model of Ball Lightning * A Model for Ball Lightning * The High-Temperature Physico-Chemical Processes in the Lightning Storm Atmosphere (A Physico-Chemical Model of Ball Lightning) * New Approach to Ball Lightning * A Calculation of Electric Field of Ball Lightning * The Physical Explanation to the UFO over Xinjiang, Northern West China * Electric Reconnection, Critical Ionization Velocity, Ponderomotive Force, and Their Applications to Triggered and Ball Lightning * The PLASMAK™ Configuration and Ball Lightning * Experimental Research on Ball Lightning * Performance of High-Voltage Test Facility Designed for Investigation of Ball Lightning * List of Participants

  5. Insights into teaching quantum mechanics in secondary and lower undergraduate education

    NASA Astrophysics Data System (ADS)

    Krijtenburg-Lewerissa, K.; Pol, H. J.; Brinkman, A.; van Joolingen, W. R.

    2017-06-01

    This study presents a review of the current state of research on teaching quantum mechanics in secondary and lower undergraduate education. A conceptual approach to quantum mechanics is being implemented in more and more introductory physics courses around the world. Because of the differences between the conceptual nature of quantum mechanics and classical physics, research on misconceptions, testing, and teaching strategies for introductory quantum mechanics is needed. For this review, 74 articles were selected and analyzed for the misconceptions, research tools, teaching strategies, and multimedia applications investigated. Outcomes were categorized according to their contribution to the various subtopics of quantum mechanics. Analysis shows that students have difficulty relating quantum physics to physical reality. It also shows that the teaching of complex quantum behavior, such as time dependence, superposition, and the measurement problem, has barely been investigated for the secondary and lower undergraduate level. At the secondary school level, this article shows a need to investigate student difficulties concerning wave functions and potential wells. Investigation of research tools shows the necessity for the development of assessment tools for secondary and lower undergraduate education, which cover all major topics and are suitable for statistical analysis. Furthermore, this article shows the existence of very diverse ideas concerning teaching strategies for quantum mechanics and a lack of research into which strategies promote understanding. This article underlines the need for more empirical research into student difficulties, teaching strategies, activities, and research tools intended for a conceptual approach for quantum mechanics.

  6. Moment-Based Physical Models of Broadband Clutter due to Aggregations of Fish

    DTIC Science & Technology

    2013-09-30

    statistical models for signal-processing algorithm development. These in turn will help to develop a capability to statistically forecast the impact of...aggregations of fish based on higher-order statistical measures describable in terms of physical and system parameters. Environmentally, these models...processing. In this experiment, we had good ground truth on (1) and (2), and had control over (3) and (4) except for environmentally-imposed restrictions

  7. Parametric and experimental analysis using a power flow approach

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1988-01-01

    Having defined and developed a structural power flow approach for the analysis of structure-borne transmission of structural vibrations, the technique is used to perform an analysis of the influence of structural parameters on the transmitted energy. As a base for comparison, the parametric analysis is first performed using a Statistical Energy Analysis approach and the results compared with those obtained using the power flow approach. The advantages of using structural power flow are thus demonstrated by comparing the type of results obtained by the two methods. Additionally, to demonstrate the advantages of using the power flow method and to show that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental investigation of structural power flow is also presented. Results are presented for an L-shaped beam for which an analytical solution has already been obtained. Furthermore, the various methods available to measure vibrational power flow are compared to investigate the advantages and disadvantages of each method.

  8. Integrating Space with Place in Health Research: A Multilevel Spatial Investigation Using Child Mortality in 1880 Newark, New Jersey

    PubMed Central

    Xu, Hongwei; Logan, John R.; Short, Susan E.

    2014-01-01

    Research on neighborhoods and health increasingly acknowledges the need to conceptualize, measure, and model spatial features of social and physical environments. In ignoring underlying spatial dynamics, we run the risk of biased statistical inference and misleading results. In this paper, we propose an integrated multilevel-spatial approach for Poisson models of discrete responses. In an empirical example of child mortality in 1880 Newark, New Jersey, we compare this multilevel-spatial approach with the more typical aspatial multilevel approach. Results indicate that spatially-defined egocentric neighborhoods, or distance-based measures, outperform administrative areal units, such as census units. In addition, although results did not vary by specific definitions of egocentric neighborhoods, they were sensitive to geographic scale and modeling strategy. Overall, our findings confirm that adopting a spatial-multilevel approach enhances our ability to disentangle the effect of space from that of place, and point to the need for more careful spatial thinking in population research on neighborhoods and health. PMID:24763980
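
    The contrast between areal and distance-based neighbourhood measures can be sketched with a plain (single-level) Poisson GLM on synthetic data, as below; the radius, grid-cell size, exposure variable and outcome model are all invented for illustration, and the published analysis is a fully multilevel spatial model.

```python
# Hedged sketch: a Poisson model of child deaths with an area-based neighbourhood
# covariate versus a distance-based (egocentric) covariate built by averaging
# over households within a radius. All data are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 1500
xy = rng.uniform(0, 10, size=(n, 2))                 # household coordinates (km)
crowding = rng.gamma(2.0, 1.0, n)                    # a hypothetical exposure variable

# Egocentric neighbourhood: mean crowding of households within 0.5 km.
dist = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
ego = np.array([crowding[d <= 0.5].mean() for d in dist])

# Areal unit: mean crowding within a 2 km x 2 km grid cell ("ward").
cell = (xy // 2).astype(int)
cell_id = cell[:, 0] * 10 + cell[:, 1]
areal = np.array([crowding[cell_id == c].mean() for c in cell_id])

deaths = rng.poisson(np.exp(-2.0 + 0.4 * ego))       # synthetic outcome driven by ego exposure

for name, cov in [("egocentric", ego), ("areal", areal)]:
    X = sm.add_constant(cov)
    fit = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()
    print(name, "AIC:", round(fit.aic, 1))
```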

  9. A simple rapid approach using coupled multivariate statistical methods, GIS and trajectory models to delineate areas of common oil spill risk

    NASA Astrophysics Data System (ADS)

    Guillen, George; Rainey, Gail; Morin, Michelle

    2004-04-01

    Currently, the Minerals Management Service uses the Oil Spill Risk Analysis model (OSRAM) to predict the movement of potential oil spills greater than 1000 bbl originating from offshore oil and gas facilities. OSRAM generates oil spill trajectories using meteorological and hydrological data input from either actual physical measurements or estimates generated from other hydrological models. OSRAM and many other models produce output matrices of average, maximum and minimum contact probabilities to specific landfall or target segments (columns) from oil spills at specific points (rows). Analysts and managers are often interested in identifying geographic areas or groups of facilities that pose similar risks to specific targets or groups of targets if a spill occurred. Unfortunately, due to the potentially large matrix generated by many spill models, this question is difficult to answer without the use of data reduction and visualization methods. In our study we utilized a multivariate statistical method called cluster analysis to group areas of similar risk based on the potential distribution of landfall target trajectory probabilities. We also utilized ArcView™ GIS to display spill launch point groupings. The combination of GIS and multivariate statistical techniques in the post-processing of trajectory model output is a powerful tool for identifying and delineating areas of similar risk from multiple spill sources. We strongly encourage modelers, statistical and GIS software programmers to closely collaborate to produce a more seamless integration of these technologies and approaches to analyzing data. They are complementary methods that strengthen the overall assessment of spill risks.
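
    A hedged sketch of the data-reduction step, before any GIS display, is given below: the rows of a launch-point by landfall-segment contact-probability matrix are grouped by hierarchical cluster analysis so that launch points with similar risk profiles fall into common groups. The matrix, linkage method and number of groups are illustrative assumptions.

```python
# Illustrative sketch: hierarchically cluster spill launch points by the rows of
# a launch-point x landfall-segment contact-probability matrix, so that points
# posing similar risks are grouped. The probability matrix here is random, and
# Ward linkage with 5 groups is an arbitrary choice.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(6)
n_launch, n_segments = 40, 25
p = rng.dirichlet(np.ones(n_segments), size=n_launch)   # each row: contact probabilities

z = linkage(p, method="ward")                           # agglomerative clustering of the rows
groups = fcluster(z, t=5, criterion="maxclust")         # cut the tree into 5 risk groups

for g in np.unique(groups):
    members = np.where(groups == g)[0].tolist()
    print(f"risk group {g}: launch points {members}")
```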

  10. Addressing the mischaracterization of extreme rainfall in regional climate model simulations - A synoptic pattern based bias correction approach

    NASA Astrophysics Data System (ADS)

    Li, Jingwan; Sharma, Ashish; Evans, Jason; Johnson, Fiona

    2018-01-01

    Addressing systematic biases in regional climate model simulations of extreme rainfall is a necessary first step before assessing changes in future rainfall extremes. Commonly used bias correction methods are designed to match statistics of the overall simulated rainfall with observations. This assumes that a change in the mix of different types of extreme rainfall events (i.e. convective and non-convective) in a warmer climate is of little relevance in the estimation of overall change, an assumption that is not supported by empirical or physical evidence. This study proposes an alternative approach to account for the potential change of alternate rainfall types, characterized here by synoptic weather patterns (SPs) using a self-organizing-map classification. The objective of this study is to evaluate the added influence of SPs on the bias correction, which is achieved by comparing the corrected distribution of future extreme rainfall with that obtained using conventional quantile mapping. A comprehensive synthetic experiment is first defined to investigate the conditions under which the additional information of SPs makes a significant difference to the bias correction. Using over 600,000 synthetic cases, statistically significant differences are found to be present in 46% of cases. This is followed by a case study over the Sydney region using a high-resolution run of the Weather Research and Forecasting (WRF) regional climate model, which indicates a small change in the proportions of the SPs and a statistically significant change in the extreme rainfall over the region, although the differences between the changes obtained from the two bias correction methods are not statistically significant.
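
    A hedged sketch contrasting the conventional and SP-conditioned corrections is shown below; the SP labels (which in the study come from a self-organizing-map classification), the rainfall distributions and the reported percentile are synthetic stand-ins.

```python
# Hedged sketch: quantile-map simulated rainfall to observations separately
# within each synoptic pattern (SP) class, versus one overall mapping. The SP
# labels and rainfall values below are random placeholders.
import numpy as np

rng = np.random.default_rng(7)
n = 5000
sp = rng.integers(0, 4, size=n)                          # placeholder SP labels (4 classes)
obs = rng.gamma(2.0, 5.0 + 2.0 * sp)                     # observed daily rainfall by SP
sim = rng.gamma(2.0, 4.0 + 1.0 * sp) * 1.3               # biased model rainfall by SP

def quantile_map(x, ref_sim, ref_obs):
    """Map values x through the empirical sim -> obs quantile relation."""
    q = np.searchsorted(np.sort(ref_sim), x) / len(ref_sim)
    q = np.clip(q, 0.0, 1.0 - 1e-9)
    return np.quantile(ref_obs, q)

# Overall correction versus SP-conditioned correction of the same simulations.
corrected_overall = quantile_map(sim, sim, obs)
corrected_by_sp = np.empty_like(sim)
for k in np.unique(sp):
    m = sp == k
    corrected_by_sp[m] = quantile_map(sim[m], sim[m], obs[m])

for name, x in [("overall", corrected_overall), ("per-SP", corrected_by_sp)]:
    print(name, "99th percentile:", round(float(np.quantile(x, 0.99)), 1))
```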

  11. Self-consistent field theory simulations of polymers on arbitrary domains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ouaknin, Gaddiel, E-mail: gaddielouaknin@umail.ucsb.edu; Laachi, Nabil; Delaney, Kris

    2016-12-15

    We introduce a framework for simulating the mesoscale self-assembly of block copolymers in arbitrary confined geometries subject to Neumann boundary conditions. We employ a hybrid finite difference/volume approach to discretize the mean-field equations on an irregular domain represented implicitly by a level-set function. The numerical treatment of the Neumann boundary conditions is sharp, i.e. it avoids an artificial smearing in the irregular domain boundary. This strategy enables the study of self-assembly in confined domains and enables the computation of physically meaningful quantities at the domain interface. In addition, we employ adaptive grids encoded with Quad-/Oc-trees in parallel to automatically refine the grid where the statistical fields vary rapidly as well as at the boundary of the confined domain. This approach results in a significant reduction in the number of degrees of freedom and makes the simulations in arbitrary domains using effective boundary conditions computationally efficient in terms of both speed and memory requirement. Finally, in the case of regular periodic domains, where pseudo-spectral approaches are superior to finite differences in terms of CPU time and accuracy, we use the adaptive strategy to store chain propagators, reducing the memory footprint without loss of accuracy in computed physical observables.

  12. q-bosons and the q-analogue quantized field

    NASA Technical Reports Server (NTRS)

    Nelson, Charles A.

    1995-01-01

    The q-analogue coherent states are used to identify physical signatures for the presence of a q-analogue quantized radiation field in the q-CS classical limits where $|z|$ is large. In this quantum-optics-like limit, the fractional uncertainties of most physical quantities (momentum, position, amplitude, phase) which characterize the quantum field are O(1). They only vanish as $O(1/|z|)$ when q = 1. However, for the number operator, N, and the N-Hamiltonian for a free q-boson gas, $H_N = \hbar\omega(N + 1/2)$, the fractional uncertainties do still approach zero. A signature for q-boson counting statistics is that $(\Delta N)^2/\langle N \rangle \to 0$ as $|z| \to \infty$. Except for its O(1) fractional uncertainty, the q-generalization of the Hermitian phase operator of Pegg and Barnett, $\phi_q$, still exhibits normal classical behavior. The standard number-phase uncertainty relation, $\Delta N\,\Delta\phi_q = 1/2$, and the approximate commutation relation, $[N, \phi_q] = i$, still hold for the single-mode q-analogue quantized field. So, N and $\phi_q$ are almost canonically conjugate operators in the q-CS classical limit. The q-analogue CS's minimize this uncertainty relation for moderate $|z|^2$.

  13. Stochastic analysis of surface roughness models in quantum wires

    NASA Astrophysics Data System (ADS)

    Nedjalkov, Mihail; Ellinghaus, Paul; Weinbub, Josef; Sadi, Toufik; Asenov, Asen; Dimov, Ivan; Selberherr, Siegfried

    2018-07-01

    We present a signed particle computational approach for the Wigner transport model and use it to analyze the electron state dynamics in quantum wires focusing on the effect of surface roughness. Usually surface roughness is considered as a scattering model, accounted for by the Fermi Golden Rule, which relies on approximations like statistical averaging and in the case of quantum wires incorporates quantum corrections based on the mode space approach. We provide a novel computational approach to enable physical analysis of these assumptions in terms of phase space and particles. Utilized is the signed particles model of Wigner evolution, which, besides providing a full quantum description of the electron dynamics, enables intuitive insights into the processes of tunneling, which govern the physical evolution. It is shown that the basic assumptions of the quantum-corrected scattering model correspond to the quantum behavior of the electron system. Of particular importance is the distribution of the density: Due to the quantum confinement, electrons are kept away from the walls, which is in contrast to the classical scattering model. Further quantum effects are retardation of the electron dynamics and quantum reflection. Far from equilibrium the assumption of homogeneous conditions along the wire breaks even in the case of ideal wire walls.

  14. Modelling the participation decision and duration of sporting activity in Scotland

    PubMed Central

    Eberth, Barbara; Smith, Murray D.

    2010-01-01

    Motivating individuals to actively engage in physical activity due to its beneficial health effects has been an integral part of Scotland's health policy agenda. The current Scottish guidelines recommend individuals participate in physical activity of moderate vigour for 30 min at least five times per week. For an individual contemplating the recommendation, decisions have to be made in regard of participation, intensity, duration and multiplicity. For the policy maker, understanding the determinants of each decision will assist in designing an intervention to effect the recommended policy. With secondary data sourced from the 2003 Scottish Health Survey (SHeS) we statistically model the combined decisions process, employing a copula approach to model specification. In taking this approach the model flexibly accounts for any statistical associations that may exist between the component decisions. Thus, we model the endogenous relationship between the decision of individuals to participate in sporting activities and, amongst those who participate, the duration of time spent undertaking their chosen activities. The main focus is to establish whether dependence exists between the two random variables assuming the vigour with which sporting activity is performed to be independent of the participation and duration decision. We allow for a variety of controls including demographic factors such as age and gender, economic factors such as income and educational attainment, lifestyle factors such as smoking, alcohol consumption, healthy eating and medical history. We use the model to compare the effect of interventions designed to increase the vigour with which individuals undertake their sport, relating it to obesity as a health outcome. PMID:20640033

  15. Thermodynamics of mixtures of patchy and spherical colloids of different sizes: A multi-body association theory with complete reference fluid information.

    PubMed

    Bansal, Artee; Valiya Parambathu, Arjun; Asthagiri, D; Cox, Kenneth R; Chapman, Walter G

    2017-04-28

    We present a theory to predict the structure and thermodynamics of mixtures of colloids of different diameters, building on our earlier work [A. Bansal et al., J. Chem. Phys. 145, 074904 (2016)] that considered mixtures with all particles constrained to have the same size. The patchy, solvent particles have short-range directional interactions, while the solute particles have short-range isotropic interactions. The hard-sphere mixture without any association site forms the reference fluid. An important ingredient within the multi-body association theory is the description of clustering of the reference solvent around the reference solute. Here we account for the physical, multi-body clusters of the reference solvent around the reference solute in terms of occupancy statistics in a defined observation volume. These occupancy probabilities are obtained from enhanced sampling simulations, but we also present statistical mechanical models to estimate these probabilities with limited simulation data. Relative to an approach that describes only up to three-body correlations in the reference, incorporating the complete reference information better predicts the bonding state and thermodynamics of the physical solute for a wide range of system conditions. Importantly, analysis of the residual chemical potential of the infinitely dilute solute from molecular simulation and theory shows that whereas the chemical potential is somewhat insensitive to the description of the structure of the reference fluid, the energetic and entropic contributions are not, with the results from the complete reference approach being in better agreement with particle simulations.

  16. Thermodynamics of mixtures of patchy and spherical colloids of different sizes: A multi-body association theory with complete reference fluid information

    NASA Astrophysics Data System (ADS)

    Bansal, Artee; Valiya Parambathu, Arjun; Asthagiri, D.; Cox, Kenneth R.; Chapman, Walter G.

    2017-04-01

    We present a theory to predict the structure and thermodynamics of mixtures of colloids of different diameters, building on our earlier work [A. Bansal et al., J. Chem. Phys. 145, 074904 (2016)] that considered mixtures with all particles constrained to have the same size. The patchy, solvent particles have short-range directional interactions, while the solute particles have short-range isotropic interactions. The hard-sphere mixture without any association site forms the reference fluid. An important ingredient within the multi-body association theory is the description of clustering of the reference solvent around the reference solute. Here we account for the physical, multi-body clusters of the reference solvent around the reference solute in terms of occupancy statistics in a defined observation volume. These occupancy probabilities are obtained from enhanced sampling simulations, but we also present statistical mechanical models to estimate these probabilities with limited simulation data. Relative to an approach that describes only up to three-body correlations in the reference, incorporating the complete reference information better predicts the bonding state and thermodynamics of the physical solute for a wide range of system conditions. Importantly, analysis of the residual chemical potential of the infinitely dilute solute from molecular simulation and theory shows that whereas the chemical potential is somewhat insensitive to the description of the structure of the reference fluid, the energetic and entropic contributions are not, with the results from the complete reference approach being in better agreement with particle simulations.

  17. Automatically Characterizing Sensory-Motor Patterns Underlying Reach-to-Grasp Movements on a Physical Depth Inversion Illusion.

    PubMed

    Nguyen, Jillian; Majmudar, Ushma V; Ravaliya, Jay H; Papathomas, Thomas V; Torres, Elizabeth B

    2015-01-01

    Recently, movement variability has been of great interest to motor control physiologists as it constitutes a physical, quantifiable form of sensory feedback to aid in planning, updating, and executing complex actions. In marked contrast, the psychological and psychiatric arenas mainly rely on verbal descriptions and interpretations of behavior via observation. Consequently, a large gap exists between the body's manifestations of mental states and their descriptions, creating a disembodied approach in the psychological and neural sciences: contributions of the peripheral nervous system to central control, executive functions, and decision-making processes are poorly understood. How do we shift from a psychological, theorizing approach to characterize complex behaviors more objectively? We introduce a novel, objective, statistical framework, and visuomotor control paradigm to help characterize the stochastic signatures of minute fluctuations in overt movements during a visuomotor task. We also quantify a new class of covert movements that spontaneously occur without instruction. These are largely beneath awareness, but inevitably present in all behaviors. The inclusion of these motions in our analyses introduces a new paradigm in sensory-motor integration. As it turns out, these movements, often overlooked as motor noise, contain valuable information that contributes to the emergence of different kinesthetic percepts. We apply these new methods to help better understand perception-action loops. To investigate how perceptual inputs affect reach behavior, we use a depth inversion illusion (DII): the same physical stimulus produces two distinct depth percepts that are nearly orthogonal, enabling a robust comparison of competing percepts. We find that the moment-by-moment empirically estimated motor output variability can inform us of the participants' perceptual states, detecting physiologically relevant signals from the peripheral nervous system that reveal internal mental states evoked by the bi-stable illusion. Our work proposes a new statistical platform to objectively separate changes in visual perception by quantifying the unfolding of movement, emphasizing the importance of including in the motion analyses all overt and covert aspects of motor behavior.

  18. Exploring Explanations of Subglacial Bedform Sizes Using Statistical Models

    PubMed Central

    Kougioumtzoglou, Ioannis A.; Stokes, Chris R.; Smith, Michael J.; Clark, Chris D.; Spagnolo, Matteo S.

    2016-01-01

    Sediments beneath modern ice sheets exert a key control on their flow, but are largely inaccessible except through geophysics or boreholes. In contrast, palaeo-ice sheet beds are accessible, and typically characterised by numerous bedforms. However, the interaction between bedforms and ice flow is poorly constrained and it is not clear how bedform sizes might reflect ice flow conditions. To better understand this link we present a first exploration of a variety of statistical models to explain the size distribution of some common subglacial bedforms (i.e., drumlins, ribbed moraine, MSGL). By considering a range of models, constructed to reflect key aspects of the physical processes, it is possible to infer that the size distributions are most effectively explained when the dynamics of ice-water-sediment interaction associated with bedform growth is fundamentally random. A ‘stochastic instability’ (SI) model, which integrates random bedform growth and shrinking through time with exponential growth, is preferred and is consistent with other observations of palaeo-bedforms and geophysical surveys of active ice sheets. Furthermore, we give a proof-of-concept demonstration that our statistical approach can bridge the gap between geomorphological observations and physical models, directly linking measurable size-frequency parameters to properties of ice sheet flow (e.g., ice velocity). Moreover, statistically developing existing models as proposed allows quantitative predictions to be made about sizes, making the models testable; a first illustration of this is given for a hypothesised repeat geophysical survey of bedforms under active ice. Thus, we further demonstrate the potential of size-frequency distributions of subglacial bedforms to assist the elucidation of subglacial processes and better constrain ice sheet models. PMID:27458921
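
    As a toy illustration of the 'stochastic instability' idea, the sketch below grows a population of bedforms by a multiplicative random walk with a weak deterministic drift; all parameter values are invented for illustration and are not those of the SI model fitted in the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      n_bedforms, n_steps = 20000, 500
      drift, noise = 0.002, 0.05              # illustrative values only

      # Multiplicative random growth/shrink with weak exponential growth:
      # a toy analogue of the 'stochastic instability' process
      log_h = np.zeros(n_bedforms)
      for _ in range(n_steps):
          log_h += drift + noise * rng.standard_normal(n_bedforms)

      sizes = np.exp(log_h)
      skew = float(((sizes - sizes.mean()) ** 3).mean() / sizes.std() ** 3)
      print(f"mean size: {sizes.mean():.2f}, positive skew: {skew:.2f}")

    The multiplicative dynamics produce a positively skewed, log-normal-like size distribution, the qualitative signature that such models are asked to reproduce for drumlins and related bedforms.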

  19. Covariation of depressive mood and spontaneous physical activity in major depressive disorder: toward continuous monitoring of depressive mood.

    PubMed

    Kim, Jinhyuk; Nakamura, Toru; Kikuchi, Hiroe; Yoshiuchi, Kazuhiro; Sasaki, Tsukasa; Yamamoto, Yoshiharu

    2015-07-01

    The objective evaluation of depressive mood is considered to be useful for the diagnosis and treatment of depressive disorders. Thus, we investigated psychobehavioral correlates, particularly the statistical associations between momentary depressive mood and behavioral dynamics measured objectively, in patients with major depressive disorder (MDD) and healthy subjects. Patients with MDD (n = 14) and healthy subjects (n = 43) wore a watch-type computer device and rated their momentary symptoms using ecological momentary assessment. Spontaneous physical activity in daily life, referred to as locomotor activity, was also continuously measured by an activity monitor built into the device. A multilevel modeling approach was used to model the associations between changes in depressive mood scores and the local statistics of locomotor activity simultaneously measured. We further examined the cross validity of such associations across groups. The statistical model established indicated that worsening of the depressive mood was associated with the increased intermittency of locomotor activity, as characterized by a lower mean and higher skewness. The model was cross validated across groups, suggesting that the same psychobehavioral correlates are shared by both healthy subjects and patients, although the latter had significantly higher mean levels of depressive mood scores. Our findings suggest the presence of robust as well as common associations between momentary depressive mood and behavioral dynamics in healthy individuals and patients with depression, which may lead to the continuous monitoring of the pathogenic processes (from healthy states) and pathological states of MDD.
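
    A brief sketch of the kind of local statistics referred to above, computed on a hypothetical minute-by-minute activity series with pandas; the window length and the gamma-distributed toy data are assumptions, not the study's actual device output.

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(2)
      # Hypothetical minute-by-minute activity counts from a wrist-worn device
      activity = pd.Series(rng.gamma(0.8, 50.0, size=24 * 60))

      window = 60                              # local statistics over a 60-minute window
      local_mean = activity.rolling(window).mean()
      local_skew = activity.rolling(window).skew()

      # In the study's framing, a lower local mean and higher local skewness
      # (more intermittent activity) accompany worse momentary depressive mood.
      print(round(local_mean.iloc[-1], 1), round(local_skew.iloc[-1], 2))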

  20. Dynamic and thermodynamic processes driving the January 2014 precipitation record in southern UK

    NASA Astrophysics Data System (ADS)

    Oueslati, B.; Yiou, P.; Jezequel, A.

    2017-12-01

    Regional extreme precipitation events are projected to intensify in response to planetary climate change, with important impacts on societies. Understanding and anticipating those events remains a major challenge. In this study, we revisit the mechanisms of the winter precipitation record that occurred in the southern United Kingdom in January 2014. The physical drivers of this event are analyzed using the water vapor budget. Precipitation changes are decomposed into dynamic contributions, related to changes in atmospheric circulation, and thermodynamic contributions, related to changes in water vapor. We attempt to quantify the relative importance of the two contributions during this event and examine the applicability of Clausius-Clapeyron scaling. This work provides a physical interpretation of the mechanisms associated with southern UK's wettest event, which is complementary to other studies based on statistical approaches (Schaller et al., 2016; Yiou et al., 2017). The analysis is carried out using the ERA-Interim reanalysis, motivated by the horizontal resolution of this dataset. It is then applied to present-day simulations and future projections of CMIP5 models for selected extreme precipitation events in southern UK that are comparable to January 2014 in terms of atmospheric circulation. References: Schaller, N. et al., Human influence on climate in the 2014 southern England winter floods and their impacts, Nature Clim. Change, 2016, 6, 627-634; Yiou, P. et al., A statistical framework for conditional extreme event attribution, Advances in Statistical Climatology, Meteorology and Oceanography, 2017, 3, 17-31.
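
    A schematic form of the moisture-budget decomposition referred to above, written in LaTeX; the exact budget terms and averaging conventions used by the authors are not reproduced here, so this should be read as the generic decomposition only.

      \[
      \delta P \;\approx\;
      \underbrace{-\frac{1}{g}\int_{0}^{p_s} \overline{\omega}\,\frac{\partial (\delta q)}{\partial p}\,\mathrm{d}p}_{\text{thermodynamic}}
      \;-\;
      \underbrace{\frac{1}{g}\int_{0}^{p_s} \delta\omega\,\frac{\partial \overline{q}}{\partial p}\,\mathrm{d}p}_{\text{dynamic}}
      \;+\; \text{residual}
      \]

    Here \(\overline{(\cdot)}\) denotes the climatological mean, \(\delta(\cdot)\) the anomaly during the event, \(\omega\) the vertical pressure velocity and \(q\) the specific humidity; Clausius-Clapeyron scaling bounds the thermodynamic term at roughly 7% per kelvin of warming.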

  1. Stability of knotted vortices in wave chaos

    NASA Astrophysics Data System (ADS)

    Taylor, Alexander; Dennis, Mark

    Large scale tangles of disordered filaments occur in many diverse physical systems, from turbulent superfluids to optical volume speckle to liquid crystal phases. They can exhibit particular large scale random statistics despite very different local physics. We have previously used the topological statistics of knotting and linking to characterise the large scale tangling, using the vortices of three-dimensional wave chaos as a universal model system whose physical lengthscales are set only by the wavelength. Unlike geometrical quantities, the statistics of knotting depend strongly on the physical system and boundary conditions. Although knotting patterns characterise different systems, the topology of vortices is highly unstable to perturbation, under which they may reconnect with one another. In systems of constructed knots, these reconnections generally rapidly destroy the knot, but for vortex tangles the topological statistics must be stable. Using large scale simulations of chaotic eigenfunctions, we numerically investigate the prevalence and impact of reconnection events, and their effect on the topology of the tangle.

  2. Statistical physics of the symmetric group.

    PubMed

    Williams, Mobolaji

    2017-04-01

    Ordered chains (such as chains of amino acids) are ubiquitous in biological cells, and these chains perform specific functions contingent on the sequence of their components. Using the existence and general properties of such sequences as a theoretical motivation, we study the statistical physics of systems whose state space is defined by the possible permutations of an ordered list, i.e., the symmetric group, and whose energy is a function of how certain permutations deviate from some chosen correct ordering. Such a nonfactorizable state space is quite different from the state spaces typically considered in statistical physics systems and consequently has novel behavior in systems with interacting and even noninteracting Hamiltonians. Various parameter choices of a mean-field model reveal the system to contain five different physical regimes defined by two transition temperatures, a triple point, and a quadruple point. Finally, we conclude by discussing how the general analysis can be extended to state spaces with more complex combinatorial properties and to other standard questions of statistical mechanics models.
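
    A toy numerical illustration of a Boltzmann distribution over a permutation state space, assuming a deliberately simple energy (the number of misplaced elements); the mean-field Hamiltonian actually analyzed in the paper is more elaborate.

      import math
      from itertools import permutations

      # "Energy" of a permutation: number of elements out of place relative to
      # the chosen correct ordering (an illustrative choice, not the paper's model)
      correct = tuple(range(5))
      def energy(perm):
          return sum(p != c for p, c in zip(perm, correct))

      states = list(permutations(correct))          # the symmetric group S_5, 5! = 120 states
      for T in (0.25, 1.0, 4.0):
          weights = [math.exp(-energy(s) / T) for s in states]
          Z = sum(weights)                          # partition function
          mean_E = sum(w * energy(s) for w, s in zip(weights, states)) / Z
          print(f"T = {T}:  Z = {Z:.2f},  <E> = {mean_E:.2f}")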

  3. Statistical physics of the symmetric group

    NASA Astrophysics Data System (ADS)

    Williams, Mobolaji

    2017-04-01

    Ordered chains (such as chains of amino acids) are ubiquitous in biological cells, and these chains perform specific functions contingent on the sequence of their components. Using the existence and general properties of such sequences as a theoretical motivation, we study the statistical physics of systems whose state space is defined by the possible permutations of an ordered list, i.e., the symmetric group, and whose energy is a function of how certain permutations deviate from some chosen correct ordering. Such a nonfactorizable state space is quite different from the state spaces typically considered in statistical physics systems and consequently has novel behavior in systems with interacting and even noninteracting Hamiltonians. Various parameter choices of a mean-field model reveal the system to contain five different physical regimes defined by two transition temperatures, a triple point, and a quadruple point. Finally, we conclude by discussing how the general analysis can be extended to state spaces with more complex combinatorial properties and to other standard questions of statistical mechanics models.

  4. A pedestrian approach to the measurement problem in quantum mechanics

    NASA Astrophysics Data System (ADS)

    Boughn, Stephen; Reginatto, Marcel

    2013-09-01

    The quantum theory of measurement has been a matter of debate for over eighty years. Most of the discussion has focused on theoretical issues with the consequence that other aspects (such as the operational prescriptions that are an integral part of experimental physics) have been largely ignored. This has undoubtedly exacerbated attempts to find a solution to the "measurement problem". How the measurement problem is defined depends to some extent on how the theoretical concepts introduced by the theory are interpreted. In this paper, we fully embrace the minimalist statistical (ensemble) interpretation of quantum mechanics espoused by Einstein, Ballentine, and others. According to this interpretation, the quantum state description applies only to a statistical ensemble of similarly prepared systems rather than representing an individual system. Thus, the statistical interpretation obviates the need to entertain reduction of the state vector, one of the primary dilemmas of the measurement problem. The other major aspect of the measurement problem, the necessity of describing measurements in terms of classical concepts that lay outside of quantum theory, remains. A consistent formalism for interacting quantum and classical systems, like the one based on ensembles on configuration space that we refer to in this paper, might seem to eliminate this facet of the measurement problem; however, we argue that the ultimate interface with experiments is described by operational prescriptions and not in terms of the concepts of classical theory. There is no doubt that attempts to address the measurement problem have yielded important advances in fundamental physics; however, it is also very clear that the measurement problem is still far from being resolved. The pedestrian approach presented here suggests that this state of affairs is in part the result of searching for a theoretical/mathematical solution to what is fundamentally an experimental/observational question. It suggests also that the measurement problem is, in some sense, ill-posed and might never be resolved. This point of view is tenable so long as one is willing to view physical theories as providing models of nature rather than complete descriptions of reality. Among other things, these considerations lead us to suggest that the Copenhagen interpretation's insistence on the classicality of the measurement apparatus should be replaced by the requirement that a measurement, which is specified operationally, should simply be of sufficient precision.

  5. Estimating urban ground-level PM10 using MODIS 3km AOD product and meteorological parameters from WRF model

    NASA Astrophysics Data System (ADS)

    Ghotbi, Saba; Sotoudeheian, Saeed; Arhami, Mohammad

    2016-09-01

    Satellite remote sensing products of AOD from MODIS, along with appropriate meteorological parameters, were used to develop statistical models and estimate ground-level PM10. Most previous studies obtained meteorological data from synoptic weather stations, with a rather sparse spatial distribution, and used them along with the 10 km AOD product to develop statistical models applicable to PM variations at regional scale (resolution of ≥10 km). In the current study, meteorological parameters were simulated at 3 km resolution using the WRF model and used along with the rather new 3 km AOD product (launched in 2014). The resulting PM statistical models were assessed for a polluted and highly variable urban area, Tehran, Iran. Despite the critical particulate pollution problem, very few PM studies have been conducted in this area. The issue of rather poor direct PM-AOD associations existed, due to factors such as variations in particle optical properties, in addition to the bright-background issue for satellite data, as the studied area is located in the semi-arid Middle East. The statistical approach of linear mixed effects (LME) was used, and three types of statistical models were examined: a single-variable LME model (using AOD as the independent variable) and multi-variable LME models using meteorological data from two sources, the WRF model and synoptic stations. Meteorological simulations were performed using a multiscale approach with a physics configuration appropriate for the studied region, and the results showed rather good agreement with recordings from the synoptic stations. The single-variable LME model was able to explain about 61%-73% of daily PM10 variations, reflecting a rather acceptable performance. The statistical models' performance improved with the multi-variable LME models, which incorporate meteorological data as auxiliary variables, particularly when using fine-resolution outputs from WRF (R2 = 0.73-0.81). In addition, rather fine-resolution PM estimates were mapped for the studied city, and the resulting concentration maps were consistent with PM recordings at the existing stations.
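
    A minimal sketch of a linear mixed-effects PM10 model of this kind using statsmodels, with invented column names and synthetic data standing in for the Tehran observations; the published models use different predictors and random-effect structure.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(3)
      n = 400
      # Hypothetical station-day records: 3 km AOD plus WRF meteorology and observed PM10
      df = pd.DataFrame({
          "day": rng.integers(0, 40, n),
          "aod": rng.gamma(2.0, 0.15, n),
          "blh": rng.normal(800, 200, n),      # boundary-layer height from WRF (m)
          "rh": rng.uniform(20, 70, n),        # relative humidity (%)
      })
      df["pm10"] = (40 + 120 * df["aod"] - 0.02 * df["blh"]
                    + 0.3 * df["rh"] + rng.normal(0, 10, n))

      # Linear mixed-effects model with a day-specific random intercept,
      # in the spirit of the multi-variable LME models described above
      result = smf.mixedlm("pm10 ~ aod + blh + rh", df, groups=df["day"]).fit()
      print(result.summary())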

  6. Global CO2 flux inversions from remote-sensing data with systematic errors using hierarchical statistical models

    NASA Astrophysics Data System (ADS)

    Zammit-Mangion, Andrew; Stavert, Ann; Rigby, Matthew; Ganesan, Anita; Rayner, Peter; Cressie, Noel

    2017-04-01

    The Orbiting Carbon Observatory-2 (OCO-2) satellite was launched on 2 July 2014, and it has been a source of atmospheric CO2 data since September 2014. The OCO-2 dataset contains a number of variables, but the one of most interest for flux inversion has been the column-averaged dry-air mole fraction (in units of ppm). These global level-2 data offer the possibility of inferring CO2 fluxes at Earth's surface and tracking those fluxes over time. However, as well as having a component of random error, the OCO-2 data have a component of systematic error that is dependent on the instrument's mode, namely land nadir, land glint, and ocean glint. Our statistical approach to CO2-flux inversion starts with constructing a statistical model for the random and systematic errors with parameters that can be estimated from the OCO-2 data and possibly in situ sources from flasks, towers, and the Total Column Carbon Observing Network (TCCON). Dimension reduction of the flux field is achieved through the use of physical basis functions, while temporal evolution of the flux is captured by modelling the basis-function coefficients as a vector autoregressive process. For computational efficiency, flux inversion uses only three months of sensitivities of mole fraction to changes in flux, computed using MOZART; any residual variation is captured through the modelling of a stochastic process that varies smoothly as a function of latitude. The second stage of our statistical approach is to simulate from the posterior distribution of the basis-function coefficients and all unknown parameters given the data using a fully Bayesian Markov chain Monte Carlo (MCMC) algorithm. Estimates and posterior variances of the flux field can then be obtained straightforwardly from this distribution. Our statistical approach is different from others, as it simultaneously makes inference (and quantifies uncertainty) on both the error components' parameters and the CO2 fluxes. We compare it to more classical approaches through an Observing System Simulation Experiment (OSSE) on a global scale. By changing the size of the random and systematic errors in the OSSE, we can determine the corresponding spatial and temporal resolutions at which useful flux signals could be detected from the OCO-2 data.
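
    A compact random-walk Metropolis sketch for a toy linear-Gaussian inversion, meant only to indicate how basis-function coefficients can be sampled; the full hierarchical model above also samples error-model parameters and uses a vector-autoregressive prior, none of which is reproduced here.

      import numpy as np

      rng = np.random.default_rng(4)
      # Toy inversion: y = X @ beta + noise, with the columns of X standing in for
      # flux basis functions and beta for the basis-function coefficients
      n_obs, n_basis = 200, 3
      X = rng.normal(size=(n_obs, n_basis))
      beta_true = np.array([1.0, -0.5, 0.25])
      y = X @ beta_true + rng.normal(scale=0.3, size=n_obs)

      def log_post(beta, sigma=0.3, prior_sd=2.0):
          resid = y - X @ beta
          return -0.5 * np.sum(resid ** 2) / sigma ** 2 - 0.5 * np.sum(beta ** 2) / prior_sd ** 2

      # Random-walk Metropolis over the coefficients
      beta, samples = np.zeros(n_basis), []
      for _ in range(20000):
          prop = beta + 0.05 * rng.normal(size=n_basis)
          if np.log(rng.uniform()) < log_post(prop) - log_post(beta):
              beta = prop
          samples.append(beta.copy())
      print("posterior mean:", np.mean(samples[5000:], axis=0).round(2))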

  7. Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.

  8. Searching for "Preparation for Future Learning" in Physics

    NASA Astrophysics Data System (ADS)

    Etkina, Eugenia; Gentile, Michael; Karelina, Anna; Ruibal-Villasenor, Maria R.; Suran, Gregory

    2009-11-01

    "Preparation for future learning" is a term describing a new approach to transfer. In addition to focusing on learning environments that help students better apply developed knowledge in new situations; education researchers are searching for educational interventions that better prepare students to learn new information. The pioneering studies in this field were conducted by J. Branford and D. Schwartz in psychology and mathematics, specifically in the area of statistics. They found that students who engaged in innovation before being exposed to new material, learned better. We attempted to replicate their experiments in the field of physics, specifically in the area of conductivity. Using two experimental conditions and one control, we compared student learning of thermal and electrical conductivity from a written text. We present the results of groups' performance on seven qualitative questions after their learning in this area.

  9. Regularly arranged indium islands on glass/molybdenum substrates upon femtosecond laser and physical vapor deposition processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ringleb, F.; Eylers, K.; Teubner, Th.

    2016-03-14

    A bottom-up approach is presented for the production of arrays of indium islands on a molybdenum layer on glass, which can serve as micro-sized precursors for indium compounds such as copper-indium-gallium-diselenide used in photovoltaics. Femtosecond laser ablation of glass and a subsequent deposition of a molybdenum film or direct laser processing of the molybdenum film both allow the preferential nucleation and growth of indium islands at the predefined locations in a following indium-based physical vapor deposition (PVD) process. A proper choice of laser and deposition parameters ensures the controlled growth of indium islands exclusively at the laser ablated spots. Based on a statistical analysis, these results are compared to the non-structured molybdenum surface, leading to randomly grown indium islands after PVD.

  10. Bubbles, shocks and elementary technical trading strategies

    NASA Astrophysics Data System (ADS)

    Fry, John

    2014-01-01

    In this paper we provide a unifying framework for a set of seemingly disparate models for bubbles, shocks and elementary technical trading strategies in financial markets. Markets operate by balancing intrinsic levels of risk and return. This seemingly simple observation is commonly overlooked by academics and practitioners alike. Our model shares its origins in statistical physics with others. However, under our approach, changes in market regime can be explicitly shown to represent a phase transition from random to deterministic behaviour in prices. This structure leads to an improved physical and econometric model. We develop models for bubbles, shocks and elementary technical trading strategies. The list of empirical applications is both interesting and topical and includes real-estate bubbles and the on-going Eurozone crisis. We close by comparing the results of our model with purely qualitative findings from the finance literature.

  11. A simple approach to nonlinear estimation of physical systems

    USGS Publications Warehouse

    Christakos, G.

    1988-01-01

    Recursive algorithms for estimating the states of nonlinear physical systems are developed. This requires some key hypotheses regarding the structure of the underlying processes. Members of this class of random processes have several desirable properties for the nonlinear estimation of random signals. An assumption is made about the form of the estimator, which may then take account of a wide range of applications. Under the above assumption, the estimation algorithm is mathematically suboptimal but effective and computationally attractive. It may be compared favorably to Taylor series-type filters, nonlinear filters which approximate the probability density by Edgeworth or Gram-Charlier series, as well as to conventional statistical linearization-type estimators. To link theory with practice, some numerical results for a simulated system are presented, in which the responses from the proposed and the extended Kalman algorithms are compared.
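
    For context on the comparison mentioned above, here is a minimal scalar extended Kalman filter in Python; the system, noise levels and nonlinearities are invented for illustration, and the abstract's own suboptimal estimator is not reproduced.

      import numpy as np

      rng = np.random.default_rng(5)
      # Minimal scalar extended Kalman filter for
      #   x_k = 0.9 x_{k-1} + 0.2 sin(x_{k-1}) + w,   y_k = x_k^2 / 5 + v
      f = lambda x: 0.9 * x + 0.2 * np.sin(x)
      h = lambda x: x ** 2 / 5.0
      F = lambda x: 0.9 + 0.2 * np.cos(x)      # df/dx, linearized dynamics
      H = lambda x: 2.0 * x / 5.0              # dh/dx, linearized observation
      Q, R = 0.05, 0.1

      x_true, x_est, P = 1.0, 0.5, 1.0
      for _ in range(50):
          x_true = f(x_true) + rng.normal(scale=np.sqrt(Q))
          y = h(x_true) + rng.normal(scale=np.sqrt(R))
          x_pred, P_pred = f(x_est), F(x_est) ** 2 * P + Q            # predict
          K = P_pred * H(x_pred) / (H(x_pred) ** 2 * P_pred + R)      # Kalman gain
          x_est = x_pred + K * (y - h(x_pred))                        # update
          P = (1 - K * H(x_pred)) * P_pred
      print(f"true state: {x_true:.3f}, EKF estimate: {x_est:.3f}")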

  12. Effectiveness of Neuromuscular Electrical Stimulation on Patients With Dysphagia With Medullary Infarction.

    PubMed

    Zhang, Ming; Tao, Tao; Zhang, Zhao-Bo; Zhu, Xiao; Fan, Wen-Guo; Pu, Li-Jun; Chu, Lei; Yue, Shou-Wei

    2016-03-01

    To evaluate and compare the effects of neuromuscular electrical stimulation (NMES) acting on the sensory input or motor muscle in treating patients with dysphagia with medullary infarction. Prospective randomized controlled study. Department of physical medicine and rehabilitation. Patients with dysphagia with medullary infarction (N=82). Participants were randomized over 3 intervention groups: traditional swallowing therapy, sensory approach combined with traditional swallowing therapy, and motor approach combined with traditional swallowing therapy. Electrical stimulation sessions were for 20 minutes, twice a day, for 5d/wk, over a 4-week period. Swallowing function was evaluated by the water swallow test and Standardized Swallowing Assessment, oral intake was evaluated by the Functional Oral Intake Scale, quality of life was evaluated by the Swallowing-Related Quality of Life (SWAL-QOL) Scale, and cognition was evaluated by the Mini-Mental State Examination (MMSE). There were no statistically significant differences between the groups in age, sex, duration, MMSE score, or severity of the swallowing disorder (P>.05). All groups showed improved swallowing function (P≤.01); the sensory approach combined with traditional swallowing therapy group showed significantly greater improvement than the other 2 groups, and the motor approach combined with traditional swallowing therapy group showed greater improvement than the traditional swallowing therapy group (P<.05). SWAL-QOL Scale scores increased more significantly in the sensory approach combined with traditional swallowing therapy and motor approach combined with traditional swallowing therapy groups than in the traditional swallowing therapy group, and the sensory approach combined with traditional swallowing therapy and motor approach combined with traditional swallowing therapy groups showed statistically significant differences (P=.04). NMES that targets either sensory input or motor muscle coupled with traditional therapy is conducive to recovery from dysphagia and improves quality of life for patients with dysphagia with medullary infarction. A sensory approach appears to be better than a motor approach. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  13. The effects of estimation of censoring, truncation, transformation and partial data vectors

    NASA Technical Reports Server (NTRS)

    Hartley, H. O.; Smith, W. B.

    1972-01-01

    The purpose of this research was to attack statistical problems concerning the estimation of distributions for the purpose of predicting and measuring assembly performance as it appears in biological and physical situations. Various statistical procedures were proposed to attack problems of this sort, that is, to produce the statistical distributions of the outcomes of biological and physical situations that employ characteristics measured on constituent parts. The techniques are described.

  14. Statistical Physics for Adaptive Distributed Control

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    A viewgraph presentation on statistical physics for distributed adaptive control is shown. The topics include: 1) The Golden Rule; 2) Advantages; 3) Roadmap; 4) What is Distributed Control? 5) Review of Information Theory; 6) Iterative Distributed Control; 7) Minimizing L(q) Via Gradient Descent; and 8) Adaptive Distributed Control.

  15. Nonextensive models for earthquakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, R.; Franca, G.S.; Vilar, C.S.

    2006-02-15

    We have revisited the fragment-asperity interaction model recently introduced by Sotolongo-Costa and Posadas [Phys. Rev. Lett. 92, 048501 (2004)] by considering a different definition for mean values in the context of Tsallis nonextensive statistics and introducing a scale between the earthquake energy and the size of fragment, ε ∝ r³. The energy-distribution function (EDF) deduced in our approach is considerably different from the one obtained in the above reference. We have also tested the viability of this EDF with data from two different catalogs (in three different areas), namely, the NEIC and the Bulletin Seismic of the Revista Brasileira de Geofísica. Although both approaches provide very similar values for the nonextensive parameter q, other physical quantities, e.g., energy density, differ considerably by several orders of magnitude.

  16. Information loss method to measure node similarity in networks

    NASA Astrophysics Data System (ADS)

    Li, Yongli; Luo, Peng; Wu, Chong

    2014-09-01

    Similarity measurement for the network node has been paid increasing attention in the field of statistical physics. In this paper, we propose an entropy-based information loss method to measure the node similarity. The whole model is established based on the idea that less information loss is caused by seeing two more similar nodes as the same. The proposed new method has relatively low algorithm complexity, making it less time-consuming and more efficient for dealing with large-scale real-world networks. In order to clarify its availability and accuracy, this new approach was compared with some other selected approaches on two artificial examples and synthetic networks. Furthermore, the proposed method is also successfully applied to predict network evolution and to predict the unknown nodes' attributions in the two application examples.

  17. A Scientific Approach to the Investigation on Anomalous Atmospheric Light Phenomena

    NASA Astrophysics Data System (ADS)

    Teodorani, M.

    2011-12-01

    Anomalous atmospheric light phenomena tend to occur recurrently in several places on our planet. Statistical studies show that a phenomenon's real recurrence area can be identified only after weighting reported cases by population and by the diffusion of communication media. The main scientific results that have been obtained so far from explorative instrumented missions are presented, including the empirical models that have been set up in order to describe the observed reality. Subsequently, a focused theorization is discussed in order to attack the physical problem concerning the structure and dynamics of "light balls" and the enigma related to the central force that maintains them in spherical shape. Finally, several important issues are discussed regarding methodology, strategy, tactics and interdisciplinary approaches.

  18. Nonextensive models for earthquakes.

    PubMed

    Silva, R; França, G S; Vilar, C S; Alcaniz, J S

    2006-02-01

    We have revisited the fragment-asperity interaction model recently introduced by Sotolongo-Costa and Posadas [Phys. Rev. Lett. 92, 048501 (2004)] by considering a different definition for mean values in the context of Tsallis nonextensive statistics and introducing a scale between the earthquake energy and the size of fragment, ε ∝ r³. The energy-distribution function (EDF) deduced in our approach is considerably different from the one obtained in the above reference. We have also tested the viability of this EDF with data from two different catalogs (in three different areas), namely, the NEIC and the Bulletin Seismic of the Revista Brasileira de Geofísica. Although both approaches provide very similar values for the nonextensive parameter q, other physical quantities, e.g., energy density, differ considerably by several orders of magnitude.
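
    For reference, the Tsallis q-exponential that underlies nonextensive models of this kind can be written in LaTeX as below; this is the generic form only, not the specific energy-distribution function derived in the paper.

      \[
      \exp_q(x) \;=\; \bigl[\,1 + (1-q)\,x\,\bigr]^{\frac{1}{1-q}} \quad (q \neq 1),
      \qquad
      \lim_{q \to 1} \exp_q(x) = e^{x}
      \]

    Densities of the form \(p(\epsilon) \propto \exp_q(-\beta\epsilon)\) therefore reduce to the Boltzmann-Gibbs form as \(q \to 1\), while \(q > 1\) produces the heavy tails relevant to earthquake energy statistics.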

  19. A Hierarchical Approach to Fracture Mechanics

    NASA Technical Reports Server (NTRS)

    Saether, Erik; Taasan, Shlomo

    2004-01-01

    Recent research conducted under NASA LaRC's Creativity and Innovation Program has led to the development of an initial approach for a hierarchical fracture mechanics. This methodology unites failure mechanisms occurring at different length scales and provides a framework for a physics-based theory of fracture. At the nanoscale, parametric molecular dynamic simulations are used to compute the energy associated with atomic level failure mechanisms. This information is used in a mesoscale percolation model of defect coalescence to obtain statistics of fracture paths and energies through Monte Carlo simulations. The mathematical structure of predicted crack paths is described using concepts of fractal geometry. The non-integer fractal dimension relates geometric and energy measures between meso- and macroscales. For illustration, a fractal-based continuum strain energy release rate is derived for inter- and transgranular fracture in polycrystalline metals.

  20. An experimental validation of a statistical-based damage detection approach.

    DOT National Transportation Integrated Search

    2011-01-01

    In this work, a previously-developed, statistical-based, damage-detection approach was validated for its ability to autonomously detect damage in bridges. The damage-detection approach uses statistical differences in the actual and predicted beha...

  1. Urban pavement surface temperature. Comparison of numerical and statistical approach

    NASA Astrophysics Data System (ADS)

    Marchetti, Mario; Khalifa, Abderrahmen; Bues, Michel; Bouilloud, Ludovic; Martin, Eric; Chancibaut, Katia

    2015-04-01

    The forecast of pavement surface temperature is very specific in the context of urban winter maintenance, where it is used to manage snow plowing and salting of roads. Such forecasts mainly rely on numerical models based on a description of the energy balance between the atmosphere, the buildings and the pavement, with a canyon configuration. Nevertheless, there is a specific need for the physical description and the numerical implementation of traffic in the energy flux balance. Traffic was originally considered as a constant. Many changes were made to a numerical model to describe as accurately as possible the effects of traffic on this urban energy balance, such as tire friction, the pavement-air exchange coefficient, and the net infrared flux balance. Experiments based on infrared thermography and radiometry were then conducted to quantify the effect of traffic on urban pavement surfaces. Based on meteorological data, corresponding pavement temperature forecasts were calculated and compared with field measurements. Results indicated good agreement between the forecasts from the numerical model based on this energy balance approach and the measurements. A complementary forecast approach based on principal component analysis (PCA) and partial least-squares regression (PLS) was also developed, with data from thermal mapping using infrared radiometry. The forecast of pavement surface temperature from air temperature was obtained in the specific case of an urban configuration, with traffic included in the measurements used for the statistical analysis. A comparison between results from the numerical model based on the energy balance and from PCA/PLS was then conducted, indicating the advantages and limits of each approach.
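
    A minimal sketch of the PCA/PLS step on synthetic data with scikit-learn; the predictor set, number of components and thermal-mapping data are invented for illustration and do not reproduce the study's regression.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(6)
      # Hypothetical thermal-mapping dataset: rows are road segments, columns are
      # predictors (air temperature, sky-view factor, traffic count, wind speed, ...)
      n, p = 300, 6
      X = rng.normal(size=(n, p))
      true_w = np.array([1.5, -0.8, 0.6, 0.3, 0.0, 0.0])
      y = X @ true_w + rng.normal(scale=0.5, size=n)   # pavement surface temperature

      pca = PCA(n_components=3).fit(X)                 # dominant modes of the predictors
      print("explained variance ratios:", pca.explained_variance_ratio_.round(2))

      pls = PLSRegression(n_components=3).fit(X, y)    # PLS link to surface temperature
      print("PLS R^2 on training data:", round(pls.score(X, y), 2))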

  2. Improvements in cognition, quality of life, and physical performance with clinical Pilates in multiple sclerosis: a randomized controlled trial

    PubMed Central

    Küçük, Fadime; Kara, Bilge; Poyraz, Esra Çoşkuner; İdiman, Egemen

    2016-01-01

    [Purpose] The aim of this study was to determine the effects of clinical Pilates in multiple sclerosis patients. [Subjects and Methods] Twenty multiple sclerosis patients were enrolled in this study. The participants were divided into two groups as the clinical Pilates and control groups. Cognition (Multiple Sclerosis Functional Composite), balance (Berg Balance Scale), physical performance (timed performance tests, Timed up and go test), tiredness (Modified Fatigue Impact scale), depression (Beck Depression Inventory), and quality of life (Multiple Sclerosis International Quality of Life Questionnaire) were measured before and after treatment in all participants. [Results] There were statistically significant differences in balance, timed performance, tiredness and Multiple Sclerosis Functional Composite tests between before and after treatment in the clinical Pilates group. We also found significant differences in timed performance tests, the Timed up and go test and the Multiple Sclerosis Functional Composite between before and after treatment in the control group. According to the difference analyses, there were significant differences in Multiple Sclerosis Functional Composite and Multiple Sclerosis International Quality of Life Questionnaire scores between the two groups in favor of the clinical Pilates group. There were statistically significant clinical differences in favor of the clinical Pilates group in comparison of measurements between the groups. Clinical Pilates improved cognitive functions and quality of life compared with traditional exercise. [Conclusion] In Multiple Sclerosis treatment, clinical Pilates should be used as a holistic approach by physical therapists. PMID:27134355

  3. Improvements in cognition, quality of life, and physical performance with clinical Pilates in multiple sclerosis: a randomized controlled trial.

    PubMed

    Küçük, Fadime; Kara, Bilge; Poyraz, Esra Çoşkuner; İdiman, Egemen

    2016-03-01

    [Purpose] The aim of this study was to determine the effects of clinical Pilates in multiple sclerosis patients. [Subjects and Methods] Twenty multiple sclerosis patients were enrolled in this study. The participants were divided into two groups as the clinical Pilates and control groups. Cognition (Multiple Sclerosis Functional Composite), balance (Berg Balance Scale), physical performance (timed performance tests, Timed up and go test), tiredness (Modified Fatigue Impact scale), depression (Beck Depression Inventory), and quality of life (Multiple Sclerosis International Quality of Life Questionnaire) were measured before and after treatment in all participants. [Results] There were statistically significant differences in balance, timed performance, tiredness and Multiple Sclerosis Functional Composite tests between before and after treatment in the clinical Pilates group. We also found significant differences in timed performance tests, the Timed up and go test and the Multiple Sclerosis Functional Composite between before and after treatment in the control group. According to the difference analyses, there were significant differences in Multiple Sclerosis Functional Composite and Multiple Sclerosis International Quality of Life Questionnaire scores between the two groups in favor of the clinical Pilates group. There were statistically significant clinical differences in favor of the clinical Pilates group in comparison of measurements between the groups. Clinical Pilates improved cognitive functions and quality of life compared with traditional exercise. [Conclusion] In Multiple Sclerosis treatment, clinical Pilates should be used as a holistic approach by physical therapists.

  4. Maximum entropy models of ecosystem functioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertram, Jason, E-mail: jason.bertram@anu.edu.au

    2014-12-05

    Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense, using a savanna plant ecology model as an example.
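
    The core MaxEnt step referred to above, written out in LaTeX for the generic case of a single mean-value constraint; the ecological constraints actually used in such models (trait or abundance averages) are not specified here.

      \[
      \text{maximise } S[p] = -\sum_i p_i \ln p_i
      \quad \text{subject to} \quad \sum_i p_i = 1, \;\; \sum_i p_i f_i = \bar{f},
      \]
      \[
      \Rightarrow \quad p_i = \frac{e^{-\lambda f_i}}{Z(\lambda)}, \qquad
      Z(\lambda) = \sum_i e^{-\lambda f_i}, \qquad
      -\frac{\partial \ln Z}{\partial \lambda} = \bar{f}
      \]

    The Lagrange multiplier \(\lambda\) plays the role of an inverse temperature, which is what makes the resulting ecological models "statistical mechanical in the traditional sense".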

  5. Energy expenditure, heart rate response, and metabolic equivalents (METs) of adults taking part in children's games.

    PubMed

    Fischer, S L; Watts, P B; Jensen, R L; Nelson, J

    2004-12-01

    The need for physical activity can be seen in the low numbers participating in regular physical activity as well as the increasing prevalence of certain diseases such as Type II diabetes (especially in children), cardiovascular diseases, and some cancers. With the increase in preventable diseases that are caused in part by a sedentary lifestyle, a closer look needs to be taken at the role of family interaction as a means of increasing physical activity for both adults and children. Because of the many benefits of physical activity in relation to health, a family approach to achieving recommended levels of physical activity may be quite applicable. Forty volunteers were recruited from the community (20 adults and 20 children). The volunteers played 2 games: soccer and nerfball. Data were collected over 10 minutes (5 min per game). Expired air analysis was used to calculate energy expenditure and metabolic equivalents (METs). Descriptive statistics were calculated along with a regression analysis to determine differences between the 2 games, and an ANCOVA to determine any significant effects of age, child age, gender, and physical activity level on the results. For both games, average heart rate was approximately 88% of maximum, average METs were approximately 6, and average energy expenditure was approximately 40 kcal. This study showed that adults can achieve recommended physical activity levels through these specific activities if sustained for approximately 20 min.

  6. Modeling of carbon dioxide condensation in the high pressure flows using the statistical BGK approach

    NASA Astrophysics Data System (ADS)

    Kumar, Rakesh; Li, Zheng; Levin, Deborah A.

    2011-05-01

    In this work, we propose a new heat accommodation model to simulate freely expanding homogeneous condensation flows of gaseous carbon dioxide using a new approach, the statistical Bhatnagar-Gross-Krook method. The motivation for the present work comes from the earlier work of Li et al. [J. Phys. Chem. 114, 5276 (2010)] in which condensation models were proposed and used in the direct simulation Monte Carlo method to simulate the flow of carbon dioxide from supersonic expansions of small nozzles into near-vacuum conditions. Simulations conducted for stagnation pressures of one and three bar were compared with the measurements of gas and cluster number densities, cluster size, and carbon dioxide rotational temperature obtained by Ramos et al. [Phys. Rev. A 72, 3204 (2005)]. Due to the high computational cost of the direct simulation Monte Carlo method, comparison between simulations and data could only be performed for these stagnation pressures, with good agreement obtained beyond the condensation onset point, in the farfield. As the stagnation pressure increases, the degree of condensation also increases; therefore, to improve the modeling of condensation onset, one must be able to simulate higher stagnation pressures. In simulations of an expanding flow of argon through a nozzle, Kumar et al. [AIAA J. 48, 1531 (2010)] found that the statistical Bhatnagar-Gross-Krook method provides the same accuracy as the direct simulation Monte Carlo method, but at one half of the computational cost. In this work, the statistical Bhatnagar-Gross-Krook method was modified to account for internal degrees of freedom for multi-species polyatomic gases. With the computational approach in hand, we developed and tested a new heat accommodation model for a polyatomic system to properly account for the heat release of condensation. We then developed condensation models in the framework of the statistical Bhatnagar-Gross-Krook method. Simulations were found to agree well with the experiment for all stagnation pressure cases (1-5 bar), validating the accuracy of the Bhatnagar-Gross-Krook based condensation model in capturing the physics of condensation.

  7. Characteristics of level-spacing statistics in chaotic graphene billiards.

    PubMed

    Huang, Liang; Lai, Ying-Cheng; Grebogi, Celso

    2011-03-01

    A fundamental result in nonrelativistic quantum nonlinear dynamics is that the spectral statistics of quantum systems that possess no geometric symmetry, but whose classical dynamics are chaotic, are described by those of the Gaussian orthogonal ensemble (GOE) or the Gaussian unitary ensemble (GUE), in the presence or absence of time-reversal symmetry, respectively. For massless spin-half particles such as neutrinos in relativistic quantum mechanics in a chaotic billiard, the seminal work of Berry and Mondragon established the GUE nature of the level-spacing statistics, due to the combination of the chirality of Dirac particles and the confinement, which breaks the time-reversal symmetry. A question is whether the GOE or the GUE statistics can be observed in experimentally accessible, relativistic quantum systems. We demonstrate, using graphene confinements in which the quasiparticle motions are governed by the Dirac equation in the low-energy regime, that the level-spacing statistics are persistently those of GOE random matrices. We present extensive numerical evidence obtained from the tight-binding approach and a physical explanation for the GOE statistics. We also find that the presence of a weak magnetic field switches the statistics to those of GUE. For a strong magnetic field, Landau levels become influential, causing the level-spacing distribution to deviate markedly from the random-matrix predictions. Issues addressed also include the effects of a number of realistic factors on level-spacing statistics such as next nearest-neighbor interactions, different lattice orientations, enhanced hopping energy for atoms on the boundary, and staggered potential due to graphene-substrate interactions.
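
    The random-matrix predictions against which the graphene level-spacing statistics are compared are, in the Wigner-surmise approximation,

      \[
      P_{\mathrm{GOE}}(s) = \frac{\pi}{2}\, s\, e^{-\pi s^{2}/4},
      \qquad
      P_{\mathrm{GUE}}(s) = \frac{32}{\pi^{2}}\, s^{2}\, e^{-4 s^{2}/\pi},
      \]

    where \(s\) is the nearest-neighbour level spacing in units of the mean spacing; the linear versus quadratic level repulsion at small \(s\) is what distinguishes the two ensembles in practice.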

  8. Physical Functioning, Physical Activity, Exercise Self-Efficacy, and Quality of Life Among Individuals With Chronic Heart Failure in Korea: A Cross-Sectional Descriptive Study.

    PubMed

    Lee, Haejung; Boo, Sunjoo; Yu, Jihyoung; Suh, Soon-Rim; Chun, Kook Jin; Kim, Jong Hyun

    2017-04-01

    Both the beneficial relationship between exercise and quality of life and the important role played by exercise self-efficacy in maintaining an exercise regimen among individuals with chronic heart failure are well known. However, most nursing interventions for Korean patients with chronic heart failure focus only on providing education related to risk factors and symptoms. Little information is available regarding the influence of physical functions, physical activity, and exercise self-efficacy on quality of life. This study was conducted to examine the impact of physical functioning, physical activity, and exercise self-efficacy on quality of life among individuals with chronic heart failure. This study used a cross-sectional descriptive design. Data were collected from 116 outpatients with chronic heart failure in Korea. Left ventricular ejection fraction and New York Heart Association classifications were chart reviewed. Information pertaining to levels of physical activity, exercise self-efficacy, and quality of life were collected using self-administered questionnaires. Data were analyzed using descriptive statistics, t tests, analyses of variance, correlations, and hierarchical multiple regressions. About 60% of participants were physically inactive, and most showed relatively low exercise self-efficacy. The mean quality-of-life score was 80.09. The significant correlates for quality of life were poverty, functional status, physical inactivity, and exercise self-efficacy. Collectively, these four variables accounted for 50% of the observed total variance in quality of life. Approaches that focus on enhancing exercise self-efficacy may improve patient-centered outcomes in those with chronic heart failure. In light of the low level of exercise self-efficacy reported and the demonstrated ability of this factor to predict quality of life, the development of effective strategies to enhance exercise self-efficacy offers a novel and effective approach to improving the quality of life of patients with chronic heart failure. Nurses should be proactive in advising patients with chronic heart failure to be more physically active and to enhance their self-confidence in diverse ways.

  9. Applications of statistical physics methods in economics: Current state and perspectives

    NASA Astrophysics Data System (ADS)

    Lux, Thomas

    2016-12-01

    This note discusses the development of applications of statistical physics to economics since the beginning of the 'econophysics' movement about twenty years ago. I attempt to assess which of these applications appear particularly valuable and successful, and where important overlaps exist between research conducted by economists and 'econophysicists'.

  10. Structures and Statistics of Citation Networks

    DTIC Science & Technology

    2011-05-01

    the citations among them. The papers are in the field of high-energy physics, and they were added to the online library between 1992-2003. Each paper... energy, physics:astrophysics, mathematics, computer science, statistics and many others. The value of the setSpec field can be any of these. However... the value of the categories field might contain multiple set names listed. For instance, a paper can primarily be considered as a high-energy physics

  11. A DMAIC approach for process capability improvement an engine crankshaft manufacturing process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa

    2014-05-01

    The define-measure-analyze-improve-control (DMAIC) approach is a five-strata approach comprising the define, measure, analyze, improve and control strata. It is a scientific approach for reducing deviations and improving the capability levels of manufacturing processes. The present work elaborates on the DMAIC approach applied to reducing the process variations of the stub-end-hole boring operation in the manufacture of a crankshaft. This statistical process control study starts with selection of the critical-to-quality (CTQ) characteristic in the define stratum. The next stratum constitutes the collection of dimensional measurement data for the CTQ characteristic identified. This is followed by the analysis and improvement strata, where various quality control tools like the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis, and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the concerned CTQ characteristic. By adopting the DMAIC approach, the standard deviation is reduced from 0.003 to 0.002. The process potential capability index (Cp) values improved from 1.29 to 2.02 and the process performance capability index (Cpk) values improved from 0.32 to 1.45, respectively.
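
    For reference, the two capability indices quoted above are defined in their standard form as

      \[
      C_p = \frac{\mathrm{USL} - \mathrm{LSL}}{6\sigma},
      \qquad
      C_{pk} = \min\!\left(\frac{\mathrm{USL} - \mu}{3\sigma},\; \frac{\mu - \mathrm{LSL}}{3\sigma}\right),
      \]

    where USL and LSL are the upper and lower specification limits, \(\mu\) the process mean and \(\sigma\) the process standard deviation; with fixed specification limits, the reported reduction of \(\sigma\) from 0.003 to 0.002 directly raises both indices.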

  12. Delineating ecological regions in marine systems: Integrating physical structure and community composition to inform spatial management in the eastern Bering Sea

    NASA Astrophysics Data System (ADS)

    Baker, Matthew R.; Hollowed, Anne B.

    2014-11-01

    Characterizing spatial structure and delineating meaningful spatial boundaries have useful applications to understanding regional dynamics in marine systems, and are integral to ecosystem approaches to fisheries management. Physical structure and drivers combine with biological responses and interactions to organize marine systems in unique ways at multiple scales. We apply multivariate statistical methods to define spatially coherent ecological units or ecoregions in the eastern Bering Sea. We also illustrate a practical approach to integrate data on species distribution, habitat structure and physical forcing mechanisms to distinguish areas with distinct biogeography as one means to define management units in large marine ecosystems. We use random forests to quantify the relative importance of habitat and environmental variables to the distribution of individual species, and to quantify shifts in multispecies assemblages or community composition along environmental gradients. Threshold shifts in community composition are used to identify regions with distinct physical and biological attributes, and to evaluate the relative importance of predictor variables to determining regional boundaries. Depth, bottom temperature and frontal boundaries were dominant factors delineating distinct biological communities in this system, with a latitudinal divide at approximately 60°N. Our results indicate that distinct climatic periods will shift habitat gradients and that dynamic physical variables such as temperature and stratification are important to understanding temporal stability of ecoregion boundaries. We note distinct distribution patterns among functional guilds and also evidence for resource partitioning among individual species within each guild. By integrating physical and biological data to determine spatial patterns in community composition, we partition ecosystems along ecologically significant gradients. This may provide a basis for defining spatial management units or serve as a baseline index for analyses of structural shifts in the physical environment, species abundance and distribution, and community dynamics over time.
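
    As a rough sketch of the random-forest step described above, the following Python code (scikit-learn) fits a regression forest to simulated station data and reports the relative importance of depth, bottom temperature, and latitude. The variable names, the response (a catch-per-unit-effort proxy), and the data are illustrative assumptions, not the authors' survey data.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        # Hypothetical survey: rows are stations, columns are environmental predictors.
        rng = np.random.default_rng(1)
        n = 500
        X = np.column_stack([
            rng.uniform(20, 200, n),    # depth (m)
            rng.uniform(-1.5, 6.0, n),  # bottom temperature (deg C)
            rng.uniform(54, 62, n),     # latitude (deg N)
        ])
        cpue = 5.0 - 0.02 * X[:, 0] + 0.8 * X[:, 1] + rng.normal(0, 1, n)

        rf = RandomForestRegressor(n_estimators=300, random_state=0)
        rf.fit(X, cpue)
        for name, imp in zip(["depth", "bottom_temp", "latitude"], rf.feature_importances_):
            print(f"{name:12s} importance = {imp:.2f}")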

  13. Mapping sea ice leads with a coupled numeric/symbolic system

    NASA Technical Reports Server (NTRS)

    Key, J.; Schweiger, A. J.; Maslanik, J. A.

    1990-01-01

    A method is presented that facilitates the detection and delineation of leads in single-channel Landsat data by coupling numeric and symbolic procedures. The procedure consists of three steps: (1) using the dynamic threshold method, the image is mapped to a binary lead/no-lead image; (2) the likelihood that fragments are real leads is examined with a set of numeric rules; and (3) pairs of objects are examined geometrically and merged where possible. Processing ends when all fragments are merged and their statistical characteristics are determined, leaving a map of valid lead objects that summarizes useful physical information about the lead complexes. Direct implementation of domain knowledge and rapid prototyping are two benefits of the rule-based system. The approach is found to be more successfully applied to mid- and high-level processing, and the system can retrieve statistics about sea-ice leads as well as detect the leads.

  14. The search for causal inferences: using propensity scores post hoc to reduce estimation error with nonexperimental research.

    PubMed

    Tumlinson, Samuel E; Sass, Daniel A; Cano, Stephanie M

    2014-03-01

    While experimental designs are regarded as the gold standard for establishing causal relationships, such designs are usually impractical owing to common methodological limitations. The objective of this article is to illustrate how propensity score matching (PSM) and the use of propensity scores (PS) as a covariate are viable alternatives for reducing estimation error when experimental designs cannot be implemented. To mimic common pediatric research practices, data from 140 simulated participants were used to construct both an experimental and a nonexperimental design that assessed the effect of treatment status on weight loss among participants with diabetes. Pretreatment participant characteristics (age, gender, physical activity, etc.) were then used to generate PS for use in the various statistical approaches. Results demonstrate how PSM and the PS-as-covariate approach can reduce estimation error and improve statistical inferences. References addressing issues related to the implementation of these procedures are provided to assist researchers.
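
    A minimal sketch of the propensity-score workflow described above is given below in Python: propensity scores are estimated with a logistic regression on pretreatment covariates and then used for 1:1 nearest-neighbour matching (with replacement). The covariates, sample size, and treatment-assignment model are assumptions for illustration only, not the simulated data of the article.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical pretreatment covariates and a binary treatment indicator.
        rng = np.random.default_rng(2)
        n = 140
        age = rng.uniform(8, 17, n)
        activity = rng.normal(3.0, 1.0, n)  # hours of physical activity per week
        p_treat = 1 / (1 + np.exp(-(0.2 * age - 0.5 * activity - 1)))
        treated = (rng.random(n) < p_treat).astype(int)

        # Step 1: estimate propensity scores from pretreatment characteristics.
        X = np.column_stack([age, activity])
        ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

        # Step 2: 1:1 nearest-neighbour matching on the propensity score.
        treated_idx = np.where(treated == 1)[0]
        control_idx = np.where(treated == 0)[0]
        matches = {i: control_idx[np.argmin(np.abs(ps[control_idx] - ps[i]))]
                   for i in treated_idx}
        print(f"{len(matches)} treated units matched to controls")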

  15. Theory of the Sea Ice Thickness Distribution

    NASA Astrophysics Data System (ADS)

    Toppaladoddi, Srikanth; Wettlaufer, J. S.

    2015-10-01

    We use concepts from statistical physics to transform the original evolution equation for the sea ice thickness distribution g(h) from Thorndike et al. into a Fokker-Planck-like conservation law. The steady solution is g(h) = N(q) h^q e^{-h/H}, where q and H are expressible in terms of moments over the transition probabilities between thickness categories. The solution exhibits the functional form used in observational fits and shows that for h ≪ 1, g(h) is controlled by both thermodynamics and mechanics, whereas for h ≫ 1 only mechanics controls g(h). Finally, we derive the underlying Langevin equation governing the dynamics of the ice thickness h, from which we predict the observed g(h). The genericity of our approach provides a framework for studying the geophysical-scale structure of the ice pack using methods of broad relevance in statistical mechanics.
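
    The steady solution can be evaluated numerically once q and H are specified. The short Python sketch below fixes the normalisation constant by requiring g(h) to integrate to one, which gives N(q) = 1 / (H^(q+1) Γ(q+1)), and checks this numerically; the parameter values are illustrative assumptions, not values fitted to observations.

        import numpy as np
        from scipy.special import gamma

        def g(h, q, H):
            """Steady-state thickness distribution g(h) = N(q) * h**q * exp(-h/H),
            with N(q) = 1 / (H**(q + 1) * Gamma(q + 1)) fixed by normalisation."""
            N = 1.0 / (H ** (q + 1) * gamma(q + 1))
            return N * h ** q * np.exp(-h / H)

        # Illustrative (not fitted) parameter values, thickness in metres.
        h = np.linspace(0.01, 10.0, 500)
        pdf = g(h, q=1.5, H=1.2)
        print(f"normalisation check: {np.sum(pdf) * (h[1] - h[0]):.3f}")  # close to 1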

  16. Deep data mining in a real space: Separation of intertwined electronic responses in a lightly doped BaFe2As2

    DOE PAGES

    Ziatdinov, Maxim; Maksov, Artem; Li, Li; ...

    2016-10-25

    Electronic interactions present in material compositions close to the superconducting dome play a key role in the manifestation of high-Tc superconductivity. In many correlated electron systems, however, the parent or underdoped states exhibit a strongly inhomogeneous electronic landscape at the nanoscale that may be associated with competing, coexisting, or intertwined chemical disorder, strain, magnetic, and structural order parameters. Here we demonstrate an approach based on a combination of scanning tunneling microscopy/spectroscopy and advanced statistical learning for the automatic separation and extraction of statistically significant electronic behaviors in the spin density wave regime of lightly (~1%) gold-doped BaFe2As2. Lastly, we show that the decomposed STS spectral features have direct relevance to fundamental physical properties of the system, such as the SDW-induced gap, a pseudogap-like state, and impurity resonance states.

  17. Deep data mining in a real space: Separation of intertwined electronic responses in a lightly doped BaFe2As2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ziatdinov, Maxim; Maksov, Artem; Li, Li

    Electronic interactions present in material compositions close to the superconducting dome play a key role in the manifestation of high-Tc superconductivity. In many correlated electron systems, however, the parent or underdoped states exhibit a strongly inhomogeneous electronic landscape at the nanoscale that may be associated with competing, coexisting, or intertwined chemical disorder, strain, magnetic, and structural order parameters. Here we demonstrate an approach based on a combination of scanning tunneling microscopy/spectroscopy and advanced statistical learning for the automatic separation and extraction of statistically significant electronic behaviors in the spin density wave regime of lightly (~1%) gold-doped BaFe2As2. Lastly, we show that the decomposed STS spectral features have direct relevance to fundamental physical properties of the system, such as the SDW-induced gap, a pseudogap-like state, and impurity resonance states.

  18. Cox process representation and inference for stochastic reaction-diffusion processes

    NASA Astrophysics Data System (ADS)

    Schnoerr, David; Grima, Ramon; Sanguinetti, Guido

    2016-05-01

    Complex behaviour in many systems arises from the stochastic interactions of spatially distributed particles or agents. Stochastic reaction-diffusion processes are widely used to model such behaviour in disciplines ranging from biology to the social sciences, yet they are notoriously difficult to simulate and calibrate to observational data. Here we use ideas from statistical physics and machine learning to provide a solution to the inverse problem of learning a stochastic reaction-diffusion process from data. Our solution relies on a non-trivial connection between stochastic reaction-diffusion processes and spatio-temporal Cox processes, a well-studied class of models from computational statistics. This connection leads to an efficient and flexible algorithm for parameter inference and model selection. Our approach shows excellent accuracy on numeric and real data examples from systems biology and epidemiology. Our work provides both insights into spatio-temporal stochastic systems, and a practical solution to a long-standing problem in computational modelling.

  19. Near-equilibrium dumb-bell-shaped figures for cohesionless small bodies

    NASA Astrophysics Data System (ADS)

    Descamps, Pascal

    2016-02-01

    In a previous paper (Descamps, P. [2015]. Icarus 245, 64-79), we developed a specific method aimed at retrieving the main physical characteristics (shape, density, surface scattering properties) of highly elongated bodies from their rotational lightcurves through the use of dumb-bell-shaped equilibrium figures. The present work is a test of this method. For that purpose we introduce near-equilibrium dumb-bell-shaped figures, which are base dumb-bell equilibrium shapes modulated by lognormal statistics. Such synthetic irregular models are used to generate lightcurves to which our method is successfully applied. The shape statistical parameters of such near-equilibrium dumb-bell-shaped objects are in good agreement with those calculated, for example, for the Asteroid (216) Kleopatra from its dog-bone radar model. This may suggest that such bilobed and elongated asteroids can be approximated by equilibrium figures perturbed by the interplay with a substantial internal friction modeled by a Gaussian random sphere.

  20. Taking Ockham's razor to enzyme dynamics and catalysis.

    PubMed

    Glowacki, David R; Harvey, Jeremy N; Mulholland, Adrian J

    2012-01-29

    The role of protein dynamics in enzyme catalysis is a matter of intense current debate. Enzyme-catalysed reactions that involve significant quantum tunnelling can give rise to experimental kinetic isotope effects with complex temperature dependences, and it has been suggested that standard statistical rate theories, such as transition-state theory, are inadequate for their explanation. Here we introduce aspects of transition-state theory relevant to the study of enzyme reactivity, taking cues from chemical kinetics and dynamics studies of small molecules in the gas phase and in solution--where breakdowns of statistical theories have received significant attention and their origins are relatively better understood. We discuss recent theoretical approaches to understanding enzyme activity and then show how experimental observations for a number of enzymes may be reproduced using a transition-state-theory framework with physically reasonable parameters. Essential to this simple model is the inclusion of multiple conformations with different reactivity.

  1. Theory of the Sea Ice Thickness Distribution.

    PubMed

    Toppaladoddi, Srikanth; Wettlaufer, J S

    2015-10-02

    We use concepts from statistical physics to transform the original evolution equation for the sea ice thickness distribution g(h) from Thorndike et al. into a Fokker-Planck-like conservation law. The steady solution is g(h) = N(q) h^q e^{-h/H}, where q and H are expressible in terms of moments over the transition probabilities between thickness categories. The solution exhibits the functional form used in observational fits and shows that for h ≪ 1, g(h) is controlled by both thermodynamics and mechanics, whereas for h ≫ 1 only mechanics controls g(h). Finally, we derive the underlying Langevin equation governing the dynamics of the ice thickness h, from which we predict the observed g(h). The genericity of our approach provides a framework for studying the geophysical-scale structure of the ice pack using methods of broad relevance in statistical mechanics.

  2. Nonextensive statistical mechanics approach to electron trapping in degenerate plasmas

    NASA Astrophysics Data System (ADS)

    Mebrouk, Khireddine; Gougam, Leila Ait; Tribeche, Mouloud

    2016-06-01

    The electron trapping in a weakly nondegenerate plasma is reformulated and re-examined by incorporating the nonextensive entropy prescription. Using the q-deformed Fermi-Dirac distribution function including the quantum as well as the nonextensive statistical effects, we derive a new generalized electron density with a new contribution proportional to the electron temperature T, which may dominate the usual thermal correction (∼T²) at very low temperatures. To make the physics behind the effect of this new contribution more transparent, we analyze the modifications arising in the propagation of ion-acoustic solitary waves. Interestingly, we find that due to the nonextensive correction, our plasma model allows the possibility of existence of quantum ion-acoustic solitons with velocity higher than the Fermi ion-sound velocity. Moreover, as the nonextensive parameter q increases, the critical temperature Tc beyond which coexistence of compressive and rarefactive solitons sets in, is shifted towards higher values.

  3. A Method for Retrieving Ground Flash Fraction from Satellite Lightning Imager Data

    NASA Technical Reports Server (NTRS)

    Koshak, William J.

    2009-01-01

    A general theory is provided for retrieving the fraction of ground flashes in a set of N lightning flashes observed by a satellite-based lightning imager. An "exponential model" is applied as a physically reasonable constraint to describe the measured optical parameter distributions, and population statistics (i.e., mean, variance) are invoked to add further constraints to the retrieval process. The retrieval itself is expressed in terms of a Bayesian inference, and the maximum a posteriori (MAP) solution is obtained. The approach is tested by performing simulated retrievals, and retrieval error statistics are provided. The ability to retrieve ground flash fraction has important benefits for the atmospheric chemistry community. For example, using the method to partition the existing satellite global lightning climatology into separate ground and cloud flash climatologies will improve estimates of lightning nitrogen oxides (NOx) production; this in turn will improve both regional air quality and global chemistry/climate model predictions.

  4. A new approach for remediation of As-contaminated soil: ball mill-based technique.

    PubMed

    Shin, Yeon-Jun; Park, Sang-Min; Yoo, Jong-Chan; Jeon, Chil-Sung; Lee, Seung-Woo; Baek, Kitae

    2016-02-01

    In this study, a physical ball mill process, instead of chemical extraction using toxic chemical agents, was applied to remove arsenic (As) from contaminated soil. A statistical analysis was carried out to establish the optimal conditions for ball mill processing. As a result of the statistical analysis, approximately 70% of the As was removed from the soil under the following conditions: an operating time of 5 min, a media size of 1.0 cm, a rotational velocity of 10 rpm, and a soil loading of 5%. A significant amount of As remained in the ground fine soil after ball mill processing, while more than 90% of the soil retained its original properties and could be reused or recycled. The ball mill process could therefore remove metals bound strongly to the soil surface by surface grinding, and could be applied as a pretreatment before chemical extraction to reduce the extraction load.

  5. Destructive testings: dry drilling operations with TruPro system to collect samples in a powder form, from two hulls containing immobilized wastes in a hydraulic binder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pombet, Denis; Desnoyers, Yvon; Charters, Grant

    2013-07-01

    The TruPro® process enables a significant number of samples to be collected for the characterization of radiological materials. This innovative, alternative technique is being tested for the ANDRA quality-control inspection of cemented packages and proves to be quicker and more prolific than the current methodology. Using classical statistics and geostatistics approaches, the physical and radiological characteristics of two hulls containing wastes (sludges or concentrates) immobilized in a hydraulic binder are assessed in this paper. The waste homogeneity is also evaluated against the ANDRA criterion. Sensitivity to sample size (support effect), the presence of extreme values, the acceptable deviation rate, and the minimum number of data are discussed. The final objectives are to check the homogeneity of the two characterized radwaste packages and to validate and reinforce this alternative characterization methodology. (authors)

  6. Amplitude and Phase Characteristics of Signals at the Output of Spatially Separated Antennas for Paths with Scattering

    NASA Astrophysics Data System (ADS)

    Anikin, A. S.

    2018-06-01

    Conditional statistical characteristics of the phase difference are considered depending on the ratio of instantaneous output signal amplitudes of spatially separated weakly directional antennas for the normal field model for paths with radio-wave scattering. The dependences obtained are related to the physical processes on the radio-wave propagation path. The normal model parameters are established at which the statistical characteristics of the phase difference depend on the ratio of the instantaneous amplitudes and hence can be used to measure the phase difference. Using Shannon's formula, the amount of information on the phase difference of signals contained in the ratio of their amplitudes is calculated depending on the parameters of the normal field model. Approaches are suggested to reduce the shift of phase difference measured for paths with radio-wave scattering. A comparison with results of computer simulation by the Monte Carlo method is performed.

  7. Control of exciton spin statistics through spin polarization in organic optoelectronic devices

    PubMed Central

    Wang, Jianpu; Chepelianskii, Alexei; Gao, Feng; Greenham, Neil C.

    2012-01-01

    Spintronics based on organic semiconductor materials is attractive because of its rich fundamental physics and potential for device applications. Manipulating spins is obviously important for spintronics, and is usually achieved by using magnetic electrodes. Here we show a new approach where spin populations can be controlled primarily by energetics rather than kinetics. We find that exciton spin statistics can be substantially controlled by spin-polarizing carriers after injection using high magnetic fields and low temperatures, where the Zeeman energy is comparable with the thermal energy. By using this method, we demonstrate that singlet exciton formation can be suppressed by up to 53% in organic light-emitting diodes, and the dark conductance of organic photovoltaic devices can be increased by up to 45% due to enhanced formation of triplet charge-transfer states, leading to less recombination to the ground state. PMID:23149736

  8. Universality classes of fluctuation dynamics in hierarchical complex systems

    NASA Astrophysics Data System (ADS)

    Macêdo, A. M. S.; González, Iván R. Roa; Salazar, D. S. P.; Vasconcelos, G. L.

    2017-03-01

    A unified approach is proposed to describe the statistics of the short-time dynamics of multiscale complex systems. The probability density function of the relevant time series (signal) is represented as a statistical superposition of a large time-scale distribution weighted by the distribution of certain internal variables that characterize the slowly changing background. The dynamics of the background is formulated as a hierarchical stochastic model whose form is derived from simple physical constraints, which in turn restrict the dynamics to only two possible classes. The probability distributions of both the signal and the background have simple representations in terms of Meijer G functions. The two universality classes for the background dynamics manifest themselves in the signal distribution as two types of tails: power law and stretched exponential, respectively. A detailed analysis of empirical data from classical turbulence and financial markets shows excellent agreement with the theory.

  9. Computing Interactions Of Free-Space Radiation With Matter

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Cucinotta, F. A.; Shinn, J. L.; Townsend, L. W.; Badavi, F. F.; Tripathi, R. K.; Silberberg, R.; Tsao, C. H.; Badwar, G. D.

    1995-01-01

    The High Charge and Energy Transport (HZETRN) computer program is a computationally efficient, user-friendly software package addressing the problem of the transport of, and shielding against, radiation in free space. It is designed as a "black box" for design engineers who are not concerned with the physics of the underlying atomic and nuclear radiation processes in the free-space environment, but rather are primarily interested in obtaining fast and accurate dosimetric information for the design and construction of modules and devices for use in free space. Computational efficiency is achieved by a unique algorithm based on a deterministic approach to the solution of the Boltzmann equation rather than the computationally intensive statistical Monte Carlo method. The program is written in FORTRAN.

  10. Model of mobile agents for sexual interactions networks

    NASA Astrophysics Data System (ADS)

    González, M. C.; Lind, P. G.; Herrmann, H. J.

    2006-02-01

    We present a novel model to simulate real social networks of complex interactions, based on a system of colliding particles (agents). The network is built by keeping track of the collisions and evolves in time with correlations that emerge due to the mobility of the agents. Therefore, statistical features are a consequence only of local collisions among individual agents. Agent dynamics is realized by an event-driven collision algorithm in which energy is gained, as opposed to physical systems, which are dissipative. The model reproduces empirical data from networks of sexual interactions, not previously obtained with other approaches.

  11. Zone clearance in an infinite TASEP with a step initial condition

    NASA Astrophysics Data System (ADS)

    Cividini, Julien; Appert-Rolland, Cécile

    2017-06-01

    The TASEP is a paradigmatic model of out-of-equilibrium statistical physics, for which many quantities have been computed, either exactly or by approximate methods. In this work we study two new kinds of observables that have some relevance in biological or traffic models. They represent the probability for a given clearance zone of the lattice to be empty (for the first time) at a given time, starting from a step density profile. Exact expressions are obtained for single-time quantities, while more involved history-dependent observables are studied by Monte Carlo simulation, and partially predicted by a phenomenological approach.
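
    The single-time clearance probability studied here lends itself to a very small Monte Carlo experiment. The Python sketch below simulates a TASEP with random-sequential updates from a step initial condition and estimates the probability that a chosen zone is empty at a fixed time; the lattice size, zone, and time horizon are arbitrary illustrative choices, and the history-dependent observables of the paper are not reproduced.

        import numpy as np

        def tasep_zone_empty_prob(L=200, zone=(102, 106), T=10.0, runs=400, seed=3):
            """Estimate P(clearance zone empty at time T) for a TASEP started from a
            step profile (left half of the lattice filled), using random-sequential
            Monte Carlo updates with unit hopping rate."""
            rng = np.random.default_rng(seed)
            z0, z1 = zone
            empty_count = 0
            for _ in range(runs):
                occ = np.zeros(L, dtype=int)
                occ[: L // 2] = 1                  # step initial condition
                t, dt = 0.0, 1.0 / L               # one sweep ~ one unit of time
                while t < T:
                    i = rng.integers(0, L - 1)     # pick a random bond (i, i+1)
                    if occ[i] == 1 and occ[i + 1] == 0:
                        occ[i], occ[i + 1] = 0, 1  # particle hops to the right
                    t += dt
                if occ[z0 : z1 + 1].sum() == 0:
                    empty_count += 1
            return empty_count / runs

        print(f"P(zone empty at time T) ~ {tasep_zone_empty_prob():.2f}")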

  12. Healthcare Information Systems for the epidemiologic surveillance within the community.

    PubMed

    Diomidous, Marianna; Pistolis, John; Mechili, Aggelos; Kolokathi, Aikaterini; Zimeras, Stelios

    2013-01-01

    Public health and health care are important issues for developing countries, and access to health care is a significant factor that contributes to a healthy population. In response to these issues, the World Health Organization (WHO) has been working on the development of methods and models for measuring physical accessibility to health care using several layers of information integrated in a GIS. This paper describes the methodological approach for the development of a real-time electronic health record, based on statistical and geographic information, for the identification of various diseases and accidents that can happen in a specific place.

  13. Collaborative Research. Damage and Burst Dynamics in Failure of Complex Geomaterials. A Statistical Physics Approach to Understanding the Complex Emergent Dynamics in Near Mean-Field Geological Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rundle, John B.; Klein, William

    We have carried out research to determine the dynamics of failure in complex geomaterials, specifically focusing on the role of defects, damage and asperities in the catastrophic failure processes (now popularly termed “Black Swan events”). We have examined fracture branching and flow processes using models for invasion percolation, focusing particularly on the dynamics of bursts in the branching process. We have achieved a fundamental understanding of the dynamics of nucleation in complex geomaterials, specifically in the presence of inhomogeneous structures.

  14. Eutrophication risk assessment in coastal embayments using simple statistical models.

    PubMed

    Arhonditsis, G; Eleftheriadou, M; Karydis, M; Tsirtsis, G

    2003-09-01

    A statistical methodology is proposed for assessing the risk of eutrophication in marine coastal embayments. The procedure followed was the development of regression models relating the levels of chlorophyll a (Chl) with the concentration of the limiting nutrient--usually nitrogen--and the renewal rate of the systems. The method was applied in the Gulf of Gera, Island of Lesvos, Aegean Sea and a surrogate for renewal rate was created using the Canberra metric as a measure of the resemblance between the Gulf and the oligotrophic waters of the open sea in terms of their physical, chemical and biological properties. The Chl-total dissolved nitrogen-renewal rate regression model was the most significant, accounting for 60% of the variation observed in Chl. Predicted distributions of Chl for various combinations of the independent variables, based on Bayesian analysis of the models, enabled comparison of the outcomes of specific scenarios of interest as well as further analysis of the system dynamics. The present statistical approach can be used as a methodological tool for testing the resilience of coastal ecosystems under alternative managerial schemes and levels of exogenous nutrient loading.

  15. Modeling and Classification of Kinetic Patterns of Dynamic Metabolic Biomarkers in Physical Activity

    PubMed Central

    Breit, Marc; Netzer, Michael

    2015-01-01

    The objectives of this work were the classification of dynamic metabolic biomarker candidates and the modeling and characterization of kinetic regulatory mechanisms in human metabolism with response to external perturbations by physical activity. Longitudinal metabolic concentration data of 47 individuals from 4 different groups were examined, obtained from a cycle ergometry cohort study. In total, 110 metabolites (within the classes of acylcarnitines, amino acids, and sugars) were measured through a targeted metabolomics approach, combining tandem mass spectrometry (MS/MS) with the concept of stable isotope dilution (SID) for metabolite quantitation. Biomarker candidates were selected by combined analysis of maximum fold changes (MFCs) in concentrations and P-values resulting from statistical hypothesis testing. Characteristic kinetic signatures were identified through a mathematical modeling approach utilizing polynomial fitting. Modeled kinetic signatures were analyzed for groups with similar behavior by applying hierarchical cluster analysis. Kinetic shape templates were characterized, defining different forms of basic kinetic response patterns, such as sustained, early, late, and other forms, that can be used for metabolite classification. Acetylcarnitine (C2), showing a late response pattern and having the highest values in MFC and statistical significance, was classified as late marker and ranked as strong predictor (MFC = 1.97, P < 0.001). In the class of amino acids, highest values were shown for alanine (MFC = 1.42, P < 0.001), classified as late marker and strong predictor. Glucose yields a delayed response pattern, similar to a hockey stick function, being classified as delayed marker and ranked as moderate predictor (MFC = 1.32, P < 0.001). These findings coincide with existing knowledge on central metabolic pathways affected in exercise physiology, such as β-oxidation of fatty acids, glycolysis, and glycogenolysis. The presented modeling approach demonstrates high potential for dynamic biomarker identification and the investigation of kinetic mechanisms in disease or pharmacodynamics studies using MS data from longitudinal cohort studies. PMID:26317529
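
    A toy version of the modelling pipeline (polynomial fitting of kinetic signatures followed by hierarchical clustering of the fitted coefficients) can be written in a few lines of Python. The metabolite profiles below are synthetic stand-ins shaped to mimic late and delayed response patterns; they are not the study's MS/MS measurements, and the polynomial degree and cluster count are illustrative choices.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        # Hypothetical longitudinal data: relative concentrations at fixed time points.
        rng = np.random.default_rng(4)
        times = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 4.0])  # hours
        profiles = {
            "acetylcarnitine": 1.0 + 0.9 * times / (1 + times) + rng.normal(0, 0.03, times.size),
            "alanine":         1.0 + 0.4 * times / (1 + times) + rng.normal(0, 0.03, times.size),
            "glucose":         1.0 + 0.3 * np.clip(times - 1.0, 0, None) + rng.normal(0, 0.03, times.size),
        }

        # Model each kinetic signature with a low-order polynomial in time.
        coeffs = np.array([np.polyfit(times, y, deg=2) for y in profiles.values()])

        # Group metabolites with similar fitted signatures by hierarchical clustering.
        labels = fcluster(linkage(coeffs, method="ward"), t=2, criterion="maxclust")
        for name, lab in zip(profiles, labels):
            print(f"{name:16s} cluster {lab}")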

  16. Predicting Energy Performance of a Net-Zero Energy Building: A Statistical Approach

    PubMed Central

    Kneifel, Joshua; Webb, David

    2016-01-01

    Performance-based building requirements have become more prevalent because they give freedom in building design while still maintaining or exceeding the energy performance required by prescriptive-based requirements. In order to determine if building designs reach target energy efficiency improvements, it is necessary to estimate the energy performance of a building using predictive models and different weather conditions. Physics-based whole building energy simulation modeling is the most common approach. However, these physics-based models include underlying assumptions and require significant amounts of information in order to specify the input parameter values. An alternative approach to test the performance of a building is to develop a statistically derived predictive regression model using post-occupancy data that can accurately predict energy consumption and production based on a few common weather-based factors, thus requiring less information than simulation models. A regression model based on measured data should be able to predict energy performance of a building for a given day as long as the weather conditions are similar to those during the data collection time frame. This article uses data from the National Institute of Standards and Technology (NIST) Net-Zero Energy Residential Test Facility (NZERTF) to develop and validate a regression model to predict the energy performance of the NZERTF using two weather variables aggregated to the daily level, applies the model to estimate the energy performance of hypothetical NZERTFs located in different cities in the Mixed-Humid climate zone, and compares these estimates to the results from already existing EnergyPlus whole building energy simulations. This regression model exhibits agreement with EnergyPlus predictive trends in energy production and net consumption, but differs greatly in energy consumption. The model can be used as a framework for alternative and more complex models based on the experimental data collected from the NZERTF. PMID:27956756
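
    As a hedged illustration of the statistical approach, the Python sketch below fits an ordinary least-squares model of daily net energy on two weather-derived predictors. The predictors (a degree-day-like temperature term and daily solar irradiance) and the simulated data are assumptions for demonstration; they are not the NZERTF measurements or the exact model form used in the article.

        import numpy as np

        # Hypothetical daily records: mean outdoor temperature (deg C) and daily solar
        # irradiance (kWh/m^2/day) versus measured net energy (kWh/day).
        rng = np.random.default_rng(5)
        n = 365
        temp = 12 + 14 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
        solar = np.clip(3.5 + 2.5 * np.sin(2 * np.pi * np.arange(n) / 365), 0.3, None)
        net_energy = 8.0 + 0.45 * np.abs(temp - 18) - 1.9 * solar + rng.normal(0, 1.5, n)

        # Ordinary least squares with an intercept, fit to the post-occupancy data.
        X = np.column_stack([np.ones(n), np.abs(temp - 18), solar])
        beta, *_ = np.linalg.lstsq(X, net_energy, rcond=None)
        pred = X @ beta
        print("coefficients:", np.round(beta, 2))
        print("RMSE (kWh/day):", round(float(np.sqrt(np.mean((net_energy - pred) ** 2))), 2))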

  17. Predicting Energy Performance of a Net-Zero Energy Building: A Statistical Approach.

    PubMed

    Kneifel, Joshua; Webb, David

    2016-09-01

    Performance-based building requirements have become more prevalent because they give freedom in building design while still maintaining or exceeding the energy performance required by prescriptive-based requirements. In order to determine if building designs reach target energy efficiency improvements, it is necessary to estimate the energy performance of a building using predictive models and different weather conditions. Physics-based whole building energy simulation modeling is the most common approach. However, these physics-based models include underlying assumptions and require significant amounts of information in order to specify the input parameter values. An alternative approach to test the performance of a building is to develop a statistically derived predictive regression model using post-occupancy data that can accurately predict energy consumption and production based on a few common weather-based factors, thus requiring less information than simulation models. A regression model based on measured data should be able to predict energy performance of a building for a given day as long as the weather conditions are similar to those during the data collection time frame. This article uses data from the National Institute of Standards and Technology (NIST) Net-Zero Energy Residential Test Facility (NZERTF) to develop and validate a regression model to predict the energy performance of the NZERTF using two weather variables aggregated to the daily level, applies the model to estimate the energy performance of hypothetical NZERTFs located in different cities in the Mixed-Humid climate zone, and compares these estimates to the results from already existing EnergyPlus whole building energy simulations. This regression model exhibits agreement with EnergyPlus predictive trends in energy production and net consumption, but differs greatly in energy consumption. The model can be used as a framework for alternative and more complex models based on the experimental data collected from the NZERTF.

  18. Statistical mechanics of high-density bond percolation

    NASA Astrophysics Data System (ADS)

    Timonin, P. N.

    2018-05-01

    High-density (HD) percolation describes the percolation of specific κ-clusters, which are compact sets of sites each connected to at least κ nearest filled sites. It takes place in the classical patterns of independently distributed sites or bonds in which the ordinary percolation transition also exists. Hence, the study of the series of κ-type HD percolations amounts to a description of the structure of classical clusters, for which κ-clusters constitute κ-cores nested one into another. Such data are needed for the description of a number of physical, biological, and information properties of complex systems on random lattices, graphs, and networks. They range from magnetic properties of semiconductor alloys to anomalies in supercooled water and clustering in biological and social networks. Here we present a statistical mechanics approach to studying HD bond percolation on an arbitrary graph. It is shown that the generating function for the κ-clusters' size distribution can be obtained from the partition function of a specific q-state Potts-Ising model in the q → 1 limit. Using this approach we find exact κ-clusters' size distributions for the Bethe lattice and the Erdos-Renyi graph. The application of the method to Euclidean lattices is also discussed.

  19. Estimating structural collapse fragility of generic building typologies using expert judgment

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.; Perkins, David M.; Aspinall, Willy P.; Kiremidjian, Anne S.

    2014-01-01

    The structured expert elicitation process proposed by Cooke (1991), hereafter referred to as Cooke's approach, is applied for the first time in the realm of structural collapse-fragility assessment for selected generic construction types. Cooke's approach works on the principle of objective calibration scoring of judgments coupled with hypothesis testing used in classical statistics. The performance-based scoring system reflects the combined measure of an expert's informativeness about variables in the problem area under consideration, and their ability to enumerate, in a statistically accurate way through expressing their true beliefs, the quantitative uncertainties associated with their assessments. We summarize the findings of an expert elicitation workshop in which a dozen earthquake-engineering professionals from around the world were engaged to estimate seismic collapse fragility for generic construction types. Development of seismic collapse fragility functions was accomplished by combining their judgments using weights derived from Cooke's method. Although substantial effort was needed to elicit the inputs of these experts successfully, we anticipate that the elicitation strategy described here will gain momentum in a wide variety of earthquake seismology and engineering hazard and risk analyses where physical model and data limitations are inherent and objective professional judgment can fill gaps.

  20. Estimating structural collapse fragility of generic building typologies using expert judgment

    USGS Publications Warehouse

    Jaiswal, Kishor S.; Wald, D.J.; Perkins, D.; Aspinall, W.P.; Kiremidjian, Anne S.; Deodatis, George; Ellingwood, Bruce R.; Frangopol, Dan M.

    2014-01-01

    The structured expert elicitation process proposed by Cooke (1991), hereafter referred to as Cooke’s approach, is applied for the first time in the realm of structural collapse-fragility assessment for selected generic construction types. Cooke’s approach works on the principle of objective calibration scoring of judgments coupled with hypothesis testing used in classical statistics. The performance-based scoring system reflects the combined measure of an expert’s informativeness about variables in the problem area under consideration, and their ability to enumerate, in a statistically accurate way through expressing their true beliefs, the quantitative uncertainties associated with their assessments. We summarize the findings of an expert elicitation workshop in which a dozen earthquake-engineering professionals from around the world were engaged to estimate seismic collapse fragility for generic construction types. Development of seismic collapse fragility functions was accomplished by combining their judgments using weights derived from Cooke’s method. Although substantial effort was needed to elicit the inputs of these experts successfully, we anticipate that the elicitation strategy described here will gain momentum in a wide variety of earthquake seismology and engineering hazard and risk analyses where physical model and data limitations are inherent and objective professional judgment can fill gaps.

  1. Long-memory and the sea level-temperature relationship: a fractional cointegration approach.

    PubMed

    Ventosa-Santaulària, Daniel; Heres, David R; Martínez-Hernández, L Catalina

    2014-01-01

    Through thermal expansion of oceans and melting of land-based ice, global warming is very likely contributing to the sea level rise observed during the 20th century. The amount by which further increases in global average temperature could affect sea level is known only with large uncertainties, owing to the limited capacity of physics-based models to predict sea levels from global surface temperatures. Semi-empirical approaches have been implemented to estimate the statistical relationship between these two variables, providing an alternative measure on which to base assessments of potentially disruptive impacts on coastal communities and ecosystems. However, only a few of these semi-empirical applications have addressed the spurious inference that is likely to be drawn when one nonstationary process is regressed on another. Furthermore, it has been shown that spurious effects are not eliminated by stationary processes when these possess strong long memory. Our results indicate that both global temperature and sea level indeed present the characteristics of long-memory processes. Nevertheless, we find that these variables are fractionally cointegrated when sea-ice extent is incorporated as an instrumental variable for temperature, which in our estimations has a statistically significant positive impact on global sea level.

  2. A model of strength

    USGS Publications Warehouse

    Johnson, Douglas H.; Cook, R.D.

    2013-01-01

    In her AAAS News & Notes piece "Can the Southwest manage its thirst?" (26 July, p. 362), K. Wren quotes Ajay Kalra, who advocates a particular method for predicting Colorado River streamflow "because it eschews complex physical climate models for a statistical data-driven modeling approach." A preference for data-driven models may be appropriate in this individual situation, but it is not so generally. Data-driven models often come with a warning against extrapolating beyond the range of the data used to develop the models. When the future is like the past, data-driven models can work well for prediction, but it is easy to over-model local or transient phenomena, often leading to predictive inaccuracy (1). Mechanistic models are built on established knowledge of the process that connects the response variables with the predictors, using information obtained outside of an extant data set. One may shy away from a mechanistic approach when the underlying process is judged to be too complicated, but good predictive models can be constructed with statistical components that account for ingredients missing in the mechanistic analysis. Models with sound mechanistic components are more generally applicable and robust than data-driven models.

  3. The role of physical activity and diabetes status as a moderator: functional disability among older Mexican Americans.

    PubMed

    Palmer, Raymond F; Espino, David V; Dergance, Jeannae M; Becho, Johanna; Markides, Kyriakos

    2012-11-01

    We investigate the temporal association between the rate of change in physical function and the rate of change in disability across four comparison groups: those with and without diabetes who report >30 min of physical activity per day, and those who report <30 min of physical activity per day. Six waves of longitudinal data from the Hispanic Established Population for Epidemiologic Studies of the Elderly were utilised. At baseline, there were a total of 3,050 older participants aged 65 years or greater. The longitudinal rates of change in disability and physical function were compared by diabetes status (ever versus none) and physical activity status (less than, or greater than or equal to, 30 min per day). Disability and physical function data were analysed using a latent growth curve modelling approach adjusted for relevant demographic/health-related covariates. There were statistically significant longitudinal declines in physical function and disability (P < 0.001) in all groups. Most notably, physical activity status was an important moderator: those with >30 min of activity demonstrated better baseline function and less disability, as well as better temporal trajectories, than those reporting <30 min of physical activity per day. Comparisons between diabetes statuses within the same physical activity groups showed worse disability trajectories among those with diabetes. A longitudinal decline in physical function and disability is moderated most notably by physical activity, and diabetes status further moderates the decline in function and disability over time. Increased physical activity appears to be protective against disability in general and may lessen the influence of diabetes-related disability in older Mexican Americans, particularly at the end of life.

  4. A comparison of ensemble post-processing approaches that preserve correlation structures

    NASA Astrophysics Data System (ADS)

    Schefzik, Roman; Van Schaeybroeck, Bert; Vannitsem, Stéphane

    2016-04-01

    Although ensemble forecasts address the major sources of uncertainty, they exhibit biases and dispersion errors and are therefore known to benefit from calibration or statistical post-processing. For instance, the ensemble model output statistics (EMOS) method, also known as the non-homogeneous regression approach (Gneiting et al., 2005), is known to strongly improve forecast skill. EMOS is based on fitting and adjusting a parametric probability density function (PDF). However, EMOS and other common post-processing approaches apply to a single weather quantity at a single location for a single look-ahead time. They are therefore unable to take into account spatial, inter-variable, and temporal dependence structures. Recently, many research efforts have been invested in designing post-processing methods that resolve this drawback, and also in verification methods that enable the detection of dependence structures. New verification methods are applied to two classes of post-processing methods, both generating physically coherent ensembles. A first class uses ensemble copula coupling (ECC), which starts from EMOS but adjusts the rank structure (Schefzik et al., 2013). The second class is a member-by-member post-processing (MBM) approach that maps each raw ensemble member to a corrected one (Van Schaeybroeck and Vannitsem, 2015). We compare variants of the EMOS-ECC and MBM classes and highlight a specific theoretical connection between them. All post-processing variants are applied in the context of the ensemble system of the European Centre for Medium-Range Weather Forecasts (ECMWF) and compared using multivariate verification tools including the energy score, the variogram score (Scheuerer and Hamill, 2015), and the band depth rank histogram (Thorarinsdottir et al., 2015). Gneiting, Raftery, Westveld, and Goldman, 2005: Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation. Mon. Wea. Rev., 133, 1098-1118. Scheuerer and Hamill, 2015: Variogram-based proper scoring rules for probabilistic forecasts of multivariate quantities. Mon. Wea. Rev., 143, 1321-1334. Schefzik, Thorarinsdottir, and Gneiting, 2013: Uncertainty quantification in complex simulation models using ensemble copula coupling. Statistical Science, 28, 616-640. Thorarinsdottir, Scheuerer, and Heinz, 2015: Assessing the calibration of high-dimensional ensemble forecasts using rank histograms. arXiv:1310.0236. Van Schaeybroeck and Vannitsem, 2015: Ensemble post-processing using member-by-member approaches: theoretical aspects. Q.J.R. Meteorol. Soc., 141, 807-818.
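
    For readers unfamiliar with EMOS/non-homogeneous regression, the sketch below fits the standard Gaussian EMOS form (predictive mean linear in the ensemble mean, predictive variance linear in the ensemble variance) by minimizing the mean CRPS, using the closed-form CRPS of a normal distribution. The synthetic ensemble and starting coefficients are illustrative assumptions; the ECC and MBM steps discussed above are not shown.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        def crps_gaussian(y, mu, sigma):
            """Closed-form CRPS of a Gaussian forecast N(mu, sigma^2) for observation y."""
            z = (y - mu) / sigma
            return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

        def fit_emos(ens, obs):
            """Fit EMOS coefficients: mu = a + b*mean(ens), var = c + d*var(ens)."""
            m, v = ens.mean(axis=1), ens.var(axis=1)

            def mean_crps(p):
                a, b, c, d = p
                sigma = np.sqrt(np.maximum(c + d * v, 1e-6))
                return crps_gaussian(obs, a + b * m, sigma).mean()

            return minimize(mean_crps, x0=[0.0, 1.0, 1.0, 1.0],
                            method="Nelder-Mead", options={"maxiter": 2000}).x

        # Hypothetical training sample: a 10-member ensemble with bias and dispersion error.
        rng = np.random.default_rng(6)
        truth = rng.normal(10, 3, size=400)
        ens = truth[:, None] + 1.5 + rng.normal(0, 2, size=(400, 10))
        a, b, c, d = fit_emos(ens, truth)
        print(f"a={a:.2f} b={b:.2f} c={c:.2f} d={d:.2f}")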

  5. Estimating the Risk of Tropical Cyclone Characteristics Along the United States Gulf of Mexico Coastline Using Different Statistical Approaches

    NASA Astrophysics Data System (ADS)

    Trepanier, J. C.; Ellis, K.; Jagger, T.; Needham, H.; Yuan, J.

    2017-12-01

    Tropical cyclones, with their high wind speeds, high rainfall totals, and deep storm surges, frequently strike the United States Gulf of Mexico coastline, affecting millions of people and disrupting offshore economic activities. Events such as Hurricane Katrina in 2005 and Hurricane Isaac in 2012 can be physically different yet still have detrimental effects because of where they strike. There are a wide variety of ways to estimate the risk of occurrence of extreme tropical cyclones. Here, the combined risk of tropical cyclone storm surge and nearshore wind speed is estimated with a statistical copula for 22 Gulf of Mexico coastal cities. Of the cities considered, Bay St. Louis, Mississippi has the shortest return period for a tropical cyclone with at least a 50 m s-1 nearshore wind speed and a three meter surge (19.5 years, 17.1-23.5). Additionally, a multivariate regression model is provided estimating the compound effects of tropical cyclone tracks, landfall central pressure, the amount of accumulated precipitation, and storm surge for five locations around Lake Pontchartrain in Louisiana. It is shown that the most intense tropical cyclones typically approach from the south and that a small change in rainfall or landfall central pressure leads to a large change in the final storm surge depth. Data are used from the National Hurricane Center, U-Surge, SURGEDAT, and the Cooperative Observer Program. The differences between the two statistical approaches are discussed, along with the advantages and limitations of each. The goal of combining the results of the two studies is to gain a better understanding of the most appropriate risk estimation technique for a given area.
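
    A minimal sketch of the copula-based calculation is given below in Python, assuming a Gaussian copula with Weibull and generalized-Pareto marginals and a hypothetical annual event rate; the marginal families, parameter values, thresholds, and rate are illustrative assumptions, not those estimated in the study.

        import numpy as np
        from scipy.stats import norm, multivariate_normal, genpareto, weibull_min

        # Hypothetical marginals and dependence: wind ~ Weibull, surge ~ GPD,
        # joined by a Gaussian copula with correlation rho; lam is the annual event rate.
        rho, lam = 0.6, 0.35
        wind_marg = weibull_min(c=2.0, scale=35.0)   # nearshore wind speed (m/s)
        surge_marg = genpareto(c=0.1, scale=1.2)     # storm surge (m)

        def joint_exceedance(wind, surge):
            """P(W >= wind and S >= surge) under the Gaussian copula."""
            u, v = wind_marg.cdf(wind), surge_marg.cdf(surge)
            biv = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]])
            c_uv = biv.cdf([norm.ppf(u), norm.ppf(v)])
            return 1 - u - v + c_uv

        p = joint_exceedance(50.0, 3.0)
        print(f"joint exceedance probability per event = {p:.4f}")
        print(f"return period ~ {1.0 / (lam * p):.1f} years")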

  6. 14 CFR 1275.101 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., biology, engineering and physical sciences (physics and chemistry). (h) Inquiry means the assessment of..., social sciences, statistics, and biological and physical research (ground based and microgravity...

  7. Seasonal rationalization of river water quality sampling locations: a comparative study of the modified Sanders and multivariate statistical approaches.

    PubMed

    Varekar, Vikas; Karmakar, Subhankar; Jha, Ramakar

    2016-02-01

    The design of surface water quality sampling locations is a crucial decision-making process for the rationalization of a monitoring network. The quantity, quality, and types of the available dataset (watershed characteristics and water quality data) may affect the selection of an appropriate design methodology. The modified Sanders approach and multivariate statistical techniques [particularly factor analysis (FA)/principal component analysis (PCA)] are well-accepted and widely used techniques for the design of sampling locations. However, their performance may vary significantly with the quantity, quality, and types of available data. In this paper, an attempt has been made to evaluate the performance of these techniques, accounting for the effect of seasonal variation, in a situation of limited water quality data but extensive watershed characteristics information, as continuous and consistent river water quality data are usually difficult to obtain, whereas watershed information may be made available through the application of geospatial techniques. A case study of the Kali River, Western Uttar Pradesh, India, is selected for the analysis. The monitoring was carried out at 16 sampling locations. The discrete and diffuse pollution loads at different sampling sites were estimated and accounted for using the modified Sanders approach, whereas the monitored physical and chemical water quality parameters were utilized as inputs for FA/PCA. The designed optimum numbers of sampling locations for the monsoon and non-monsoon seasons are eight and seven by the modified Sanders approach, and eleven and nine by FA/PCA, respectively. Little variation in the number and locations of designed sampling sites was obtained by either technique, which shows the stability of the results. A geospatial analysis has also been carried out to check the significance of the designed sampling locations with respect to river basin characteristics and land use of the study area. Both methods are efficient; however, the modified Sanders approach outperforms FA/PCA when limited water quality data and extensive watershed information are available. The available water quality dataset is limited, and the FA/PCA-based approach fails to identify monitoring locations with higher variation, as these multivariate statistical approaches are data-driven. The priority/hierarchy and number of sampling sites designed by the modified Sanders approach are well justified by the land use practices and observed river basin characteristics of the study area.
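
    The FA/PCA side of the comparison can be sketched briefly: standardize the site-by-parameter matrix, project it onto the leading principal components, and prioritize sites with the largest component scores. The Python example below uses a randomly generated matrix and hypothetical dimensions in place of the Kali River data.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # Hypothetical monitoring matrix: rows are sampling locations, columns are
        # measured water quality parameters (e.g. BOD, DO, nitrate, conductivity, pH).
        rng = np.random.default_rng(7)
        n_sites, n_params = 16, 5
        data = rng.normal(size=(n_sites, n_params)) @ rng.normal(size=(n_params, n_params))

        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(data))

        # Rank sites by their contribution to the retained components; sites with
        # extreme scores capture most of the spatial variability and are prioritised.
        priority = np.argsort(-np.abs(scores).sum(axis=1))
        print("site priority order:", priority + 1)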

  8. A Comparison of the Performance of Advanced Statistical Techniques for the Refinement of Day-ahead and Longer NWP-based Wind Power Forecasts

    NASA Astrophysics Data System (ADS)

    Zack, J. W.

    2015-12-01

    Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors including the limited space-time resolution of the NWP models and shortcomings in the model's representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There are an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble, which is a case-matching scheme. The presentation will provide (1) an overview of each method and the experimental design, (2) performance comparisons based on standard metrics such as bias, MAE and RMSE, (3) a summary of the performance characteristics of each approach and (4) a preview of further experiments to be conducted.
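
    A scaled-down version of such a MOS-method comparison is sketched below in Python (scikit-learn): several regression methods are trained on the same NWP-derived predictors and compared on held-out data using MAE and RMSE. The predictors, the synthetic power-curve relationship, and the model settings are illustrative assumptions, not the experiment's actual configuration.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.linear_model import LinearRegression
        from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
        from sklearn.svm import SVR
        from sklearn.metrics import mean_absolute_error, mean_squared_error

        # Hypothetical training set: raw NWP predictors (hub-height wind speed and
        # direction components) against observed hourly generation (normalised).
        rng = np.random.default_rng(8)
        n = 3000
        ws = rng.weibull(2.0, n) * 8.0
        wd = rng.uniform(0, 2 * np.pi, n)
        X = np.column_stack([ws, np.sin(wd), np.cos(wd)])
        power = np.clip((ws / 12.0) ** 3, 0, 1) + rng.normal(0, 0.05, n)

        X_tr, X_te, y_tr, y_te = train_test_split(X, power, test_size=0.3, random_state=0)
        models = {
            "linear regression": LinearRegression(),
            "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
            "gradient boosting": GradientBoostingRegressor(random_state=0),
            "support vector regression": SVR(C=1.0, epsilon=0.01),
        }
        for name, model in models.items():
            pred = model.fit(X_tr, y_tr).predict(X_te)
            mae = mean_absolute_error(y_te, pred)
            rmse = np.sqrt(mean_squared_error(y_te, pred))
            print(f"{name:26s} MAE={mae:.3f}  RMSE={rmse:.3f}")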

  9. National transportation statistics 2011

    DOT National Transportation Integrated Search

    2011-04-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety reco...

  10. National transportation statistics 2005

    DOT National Transportation Integrated Search

    2005-12-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics 2004 presents information on the U.S. transportation system, including its physical components, sa...

  11. National transportation statistics 2006

    DOT National Transportation Integrated Search

    2006-12-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics 2006 presents information on the U.S. transportation system, including its physical components, sa...

  12. National Transportation Statistics 2009

    DOT National Transportation Integrated Search

    2010-01-21

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...

  13. National transportation statistics 2004

    DOT National Transportation Integrated Search

    2005-01-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics 2004 presents information on the U.S. transportation system, including its physical components, sa...

  14. Transportation statistics annual report 1999

    DOT National Transportation Integrated Search

    1999-01-01

    The Bureau of Transportation Statistics (BTS) presents the sixth Transportation Statistics Annual Report. Mandated by Congress, the report discusses the U.S. transportation system, including its physical components, economic performance, safety...

  15. National Transportation Statistics 2007

    DOT National Transportation Integrated Search

    2007-04-12

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...

  16. National Transportation Statistics 2008

    DOT National Transportation Integrated Search

    2009-01-08

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...

  17. Framework for Evaluating Water Quality of the New England Crystalline Rock Aquifers

    USGS Publications Warehouse

    Harte, Philip T.; Robinson, Gilpin R.; Ayotte, Joseph D.; Flanagan, Sarah M.

    2008-01-01

    Little information exists on regional ground-water-quality patterns for the New England crystalline rock aquifers (NECRA). A systematic approach to facilitate regional evaluation is needed for several reasons. First, the NECRA are vulnerable to anthropogenic and natural contaminants such as methyl tert-butyl ether (MTBE), arsenic, and radon gas. Second, the physical characteristics of the aquifers, termed 'intrinsic susceptibility', can lead to variable and degraded water quality. A framework approach for characterizing the aquifer region into areas of similar hydrogeology is described in this report and is based on hypothesized relevant physical features and chemical conditions (collectively termed 'variables') that affect regional patterns of ground-water quality. A framework for comparison of water quality across the NECRA consists of a group of spatial variables related to aquifer properties, hydrologic conditions, and contaminant sources. These spatial variables are grouped under four general categories (features) that can be mapped across the aquifers: (1) geologic, (2) hydrophysiographic, (3) land-use land-cover, and (4) geochemical. On a regional scale, these variables represent indicators of natural and anthropogenic sources of contaminants, as well as generalized physical and chemical characteristics of the aquifer system that influence ground-water chemistry and flow. These variables can be used in varying combinations (depending on the contaminant) to categorize the aquifer into areas of similar hydrogeologic characteristics to evaluate variation in regional water quality through statistical testing.
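
    As a hedged illustration (not taken from the USGS report), the sketch below groups hypothetical sampling sites by one mapped framework variable, a geologic category, and applies a nonparametric test for differences in a water-quality measure across the categories. The category names, the choice of arsenic as the constituent, and all concentration values are invented for illustration.

```python
# Hedged sketch: testing whether a water-quality measure differs across
# framework categories, using synthetic data and a Kruskal-Wallis test.
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(1)
# Hypothetical arsenic concentrations (ug/L) for sites in three geologic categories.
groups = {
    "geologic unit A": rng.lognormal(mean=0.2, sigma=0.6, size=40),
    "geologic unit B": rng.lognormal(mean=0.8, sigma=0.6, size=40),
    "geologic unit C": rng.lognormal(mean=0.5, sigma=0.6, size=40),
}

stat, p_value = kruskal(*groups.values())  # nonparametric test across categories
print(f"Kruskal-Wallis H = {stat:.2f}, p = {p_value:.4f}")
for name, values in groups.items():
    print(f"{name:15s} median = {np.median(values):.2f} ug/L (n = {len(values)})")
```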

  18. Nonuniform Liouville transformers for quasi-homogeneous optical fields. Final technical report, September 25, 1989--January 22, 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jannson, T.

    1993-03-01

    During the last two decades, there have been dramatic improvements in the development of optical sources. Examples of this development range from semiconductor laser diodes to free electron beam lasers and synchrotron radiation. Before these developments, standards for the measurement of basic optical parameters (quantities) were less demanding. Now, however, there is a fundamental need for new, reliable methods for providing fast quantitative results for a very broad variety of optical systems and sources. This is particularly true for partially coherent optical beams, since all optical sources are either fully or partially spatially coherent (including Lambertian sources). Until now, there has been no satisfactory solution to this problem. During the last two decades, however, the foundations of physical radiometry have been developed by Walther, Wolf and co-workers. By integrating physical optics, statistical optics and conventional radiometry, this body of work provides the necessary tools for the evaluation of radiometric quantities for partially coherent optical beams propagating through optical systems. In this program, Physical Optics Corporation (POC) demonstrated the viability of such a radiometric approach for the specific case of generalized energy concentrators called Liouville transformers. We believe that this radiometric approach is necessary to fully characterize any type of optical system since it takes into account the partial coherence of radiation. 90 refs., 57 figs., 4 tabs.

  19. Spacewatch Survey of the Solar System

    NASA Technical Reports Server (NTRS)

    McMillan, Robert S.

    2000-01-01

    The purpose of the Spacewatch project is to explore the various populations of small objects throughout the solar system. Statistics on all classes of small bodies are needed to infer their physical and dynamical evolution. More Earth Approachers need to be found to assess the impact hazard. (We have adopted the term "Earth Approacher", EA, to include all those asteroids, nuclei of extinct short period comets, and short period comets that can approach close to Earth. The adjective "near" carries potential for confusion; as we have found in communicating with the media, it suggests that the objects are always near Earth, following it like a cloud.) Persistent and voluminous accumulation of astrometry of incidentally observed main belt asteroids (MBAs) will eventually permit the Minor Planet Center (MPC) to determine the orbits of large numbers (tens of thousands) of asteroids. Such a large body of information will ultimately allow better resolution of orbit classes and the determination of luminosity functions of the various classes. Comet and asteroid recoveries are essential services to planetary astronomy. Statistics of objects in the outer solar system (Centaurs, scattered-disk objects, and Trans-Neptunian Objects; TNOs) will ultimately tell part of the story of solar system evolution. Spacewatch led the development of sky surveying by electronic means and has acted as a responsible interface to the media and general public on this discipline and on the issue of the hazard from impacts by asteroids and comets.

  20. Calculation of the detection limit in radiation measurements with systematic uncertainties

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, J. M.; Russ, W.; Venkataraman, R.; Young, B. M.

    2015-06-01

    The detection limit (LD) or Minimum Detectable Activity (MDA) is an a priori evaluation of assay sensitivity intended to quantify the suitability of an instrument or measurement arrangement for the needs of a given application. Traditional approaches as pioneered by Currie rely on Gaussian approximations to yield simple, closed-form solutions, and neglect the effects of systematic uncertainties in the instrument calibration. These approximations are applicable over a wide range of applications, but are of limited use in low-count applications, when high confidence values are required, or when systematic uncertainties are significant. One proposed modification to the Currie formulation attempts to account for systematic uncertainties within a Gaussian framework. We have previously shown that this approach results in an approximation formula that works best only for small values of the relative systematic uncertainty, for which the modification of Currie's method is least necessary. For larger systematic uncertainties, where such a correction would be most useful, it significantly overestimates the detection limit or gives infinite or otherwise non-physical results. We have developed an alternative approach for calculating detection limits based on realistic statistical modeling of the counting distributions which accurately represents statistical and systematic uncertainties. Instead of a closed-form solution, numerical and iterative methods are used to evaluate the result. Accurate detection limits can be obtained by this method for the general case.
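
    A hedged sketch of the contrast described above is given below: Currie's closed-form Gaussian approximation of the detection limit versus a simple numerical search over Poisson counting statistics. This is not the authors' algorithm; the background level, confidence choices, and treatment of the critical level are illustrative assumptions, and systematic calibration uncertainties are not modeled here.

```python
# Hedged sketch: Currie's closed-form detection limit vs. a numerical search
# over Poisson counting statistics (illustrative assumptions only).
import numpy as np
from scipy.stats import norm, poisson

alpha = beta = 0.05            # false-positive / false-negative probabilities
k = norm.ppf(1 - alpha)        # ~1.645 for 95% one-sided confidence
B = 12.0                       # assumed mean background counts (paired blank)

# Currie's Gaussian approximations (net counts above background).
L_C = k * np.sqrt(2 * B)       # critical level
L_D_currie = k**2 + 2 * L_C    # detection limit, roughly 2.71 + 4.65*sqrt(B)

# Numerical alternative: smallest true net signal S such that a Poisson(S + B)
# measurement exceeds the critical gross count with probability >= 1 - beta.
gross_critical = B + L_C
S = 0.0
while poisson.sf(gross_critical, S + B) < 1 - beta:
    S += 0.01                  # coarse iterative search; a root finder would also work

print(f"Currie approximation: L_D = {L_D_currie:.2f} counts")
print(f"Numerical (Poisson):  L_D = {S:.2f} counts")
```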
