The contribution of statistical physics to evolutionary biology.
de Vladar, Harold P; Barton, Nicholas H
2011-08-01
Evolutionary biology shares many concepts with statistical physics: both deal with populations, whether of molecules or organisms, and both seek to simplify evolution in very many dimensions. Often, methodologies have undergone parallel and independent development, as with stochastic methods in population genetics. Here, we discuss aspects of population genetics that have embraced methods from physics: non-equilibrium statistical mechanics, travelling waves and Monte-Carlo methods, among others, have been used to study polygenic evolution, rates of adaptation and range expansions. These applications indicate that evolutionary biology can further benefit from interactions with other areas of statistical physics; for example, by following the distribution of paths taken by a population through time.
NASA Astrophysics Data System (ADS)
Oliveira, Sérgio C.; Zêzere, José L.; Lajas, Sara; Melo, Raquel
2017-07-01
Approaches used to assess shallow slide susceptibility at the basin scale are conceptually different depending on the use of statistical or physically based methods. The former are based on the assumption that the same causes are more likely to produce the same effects, whereas the latter are based on the comparison between forces which tend to promote movement along the slope and the counteracting forces that are resistant to motion. Within this general framework, this work tests two hypotheses: (i) although conceptually and methodologically distinct, the statistical and deterministic methods generate similar shallow slide susceptibility results regarding the model's predictive capacity and spatial agreement; and (ii) the combination of shallow slide susceptibility maps obtained with statistical and physically based methods, for the same study area, generates a more reliable susceptibility model for shallow slide occurrence. These hypotheses were tested at a small test site (13.9 km²) located north of Lisbon (Portugal), using a statistical method (the information value method, IV) and a physically based method (the infinite slope method, IS). The landslide susceptibility maps produced with the statistical and deterministic methods were combined into a new landslide susceptibility map. The latter was based on a set of integration rules defined by the cross tabulation of the susceptibility classes of both maps and analysis of the corresponding contingency tables. The results demonstrate a higher predictive capacity of the new shallow slide susceptibility map, which combines the independent results obtained with statistical and physically based models. Moreover, the combination of the two models allowed the identification of areas where the results of the information value and the infinite slope methods are contradictory. Thus, these areas were classified as uncertain and deserve additional investigation at a more detailed scale.
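As a minimal illustration of such an integration step, the sketch below cross-tabulates two hypothetical four-class susceptibility rasters and applies a simple rule set; the class labels, thresholds and random rasters are assumptions for demonstration, not the rules derived in the study.

```python
import numpy as np

# Hypothetical 4-class susceptibility rasters (0 = low ... 3 = very high),
# e.g. from an information value (IV) model and an infinite slope (IS) model.
rng = np.random.default_rng(0)
iv_classes = rng.integers(0, 4, size=(100, 100))
is_classes = rng.integers(0, 4, size=(100, 100))

# Contingency table: how often each (IV, IS) class pair co-occurs.
contingency = np.zeros((4, 4), dtype=int)
for i in range(4):
    for j in range(4):
        contingency[i, j] = np.sum((iv_classes == i) & (is_classes == j))
print(contingency)

# Illustrative integration rules: agreeing cells keep the common class,
# mildly disagreeing cells take the more pessimistic (higher) class, and
# strongly disagreeing cells (|IV - IS| >= 2) are flagged as "uncertain" (-1).
diff = np.abs(iv_classes - is_classes)
combined = np.maximum(iv_classes, is_classes)
combined[diff >= 2] = -1  # uncertain areas deserving detailed investigation

print("uncertain fraction:", np.mean(combined == -1))
```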
Physics-based statistical model and simulation method of RF propagation in urban environments
Pao, Hsueh-Yuan; Dvorak, Steven L.
2010-09-14
A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient closed-form parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields from which predictions of communications capability may be made.
Forecasting runout of rock and debris avalanches
Iverson, Richard M.; Evans, S.G.; Mugnozza, G.S.; Strom, A.; Hermanns, R.L.
2006-01-01
Physically based mathematical models and statistically based empirical equations each may provide useful means of forecasting runout of rock and debris avalanches. This paper compares the foundations, strengths, and limitations of a physically based model and a statistically based forecasting method, both of which were developed to predict runout across three-dimensional topography. The chief advantage of the physically based model results from its ties to physical conservation laws and well-tested axioms of soil and rock mechanics, such as the Coulomb friction rule and effective-stress principle. The output of this model provides detailed information about the dynamics of avalanche runout, at the expense of high demands for accurate input data, numerical computation, and experimental testing. In comparison, the statistical method requires relatively modest computation and no input data except identification of prospective avalanche source areas and a range of postulated avalanche volumes. Like the physically based model, the statistical method yields maps of predicted runout, but it provides no information on runout dynamics. Although the two methods differ significantly in their structure and objectives, insights gained from one method can aid refinement of the other.
NASA Astrophysics Data System (ADS)
Hartmann, Alexander K.; Weigt, Martin
2005-10-01
A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.
A new universality class in corpus of texts; A statistical physics study
NASA Astrophysics Data System (ADS)
Najafi, Elham; Darooneh, Amir H.
2018-05-01
Text can be regarded as a complex system. There are some methods in statistical physics which can be used to study this system. In this work, by means of statistical physics methods, we reveal new universal behaviors of texts associated with the fractality values of words in a text. The fractality measure indicates the importance of words in a text by considering the distribution pattern of words throughout the text. We observed a power law relation between the fractality of a text and its vocabulary size for texts and corpora. We also observed this behavior in biological data.
Statistical physics of hard combinatorial optimization: Vertex cover problem
NASA Astrophysics Data System (ADS)
Zhao, Jin-Hua; Zhou, Hai-Jun
2014-07-01
Typical-case computation complexity is a research topic at the boundary of computer science, applied mathematics, and statistical physics. In the last twenty years, the replica-symmetry-breaking mean field theory of spin glasses and the associated message-passing algorithms have greatly deepened our understanding of typical-case computation complexity. In this paper, we use the vertex cover problem, a basic nondeterministic-polynomial (NP)-complete combinatorial optimization problem of wide application, as an example to introduce the statistical physical methods and algorithms. We do not go into the technical details but emphasize mainly the intuitive physical meanings of the message-passing equations. An unfamiliar reader should be able to understand, to a large extent, the physics behind the mean field approaches and to adapt the mean field methods to solving other optimization problems.
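The paper's message-passing equations are not reproduced here, but a classical baseline from the same literature, the greedy leaf-removal heuristic for vertex cover, can be sketched as follows (the random graph and seed are arbitrary):

```python
import random
from collections import defaultdict

def leaf_removal_vertex_cover(edges):
    """Greedy leaf-removal heuristic: while a degree-1 vertex exists, put its
    unique neighbor into the cover and delete both; cover the remaining
    'core' greedily by highest degree."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    cover = set()

    def remove(x):
        for y in list(adj[x]):
            adj[y].discard(x)
        del adj[x]

    changed = True
    while changed:
        changed = False
        for x in list(adj):
            if x in adj and len(adj[x]) == 1:
                (nbr,) = adj[x]   # the unique neighbor must be in the cover
                cover.add(nbr)
                remove(nbr)
                remove(x)
                changed = True
    while any(adj[x] for x in adj):   # cover the leftover core
        x = max(adj, key=lambda v: len(adj[v]))
        cover.add(x)
        remove(x)
    return cover

random.seed(1)
edges = {tuple(sorted(random.sample(range(50), 2))) for _ in range(100)}
cover = leaf_removal_vertex_cover(edges)
assert all(u in cover or v in cover for u, v in edges)
print("cover size:", len(cover), "of 50 vertices")
```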
ERIC Educational Resources Information Center
Peterlin, Primoz
2010-01-01
Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared by analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…
Reconstructing Macroeconomics Based on Statistical Physics
NASA Astrophysics Data System (ADS)
Aoki, Masanao; Yoshikawa, Hiroshi
We believe that the time has come to integrate the new approach based on statistical physics, or econophysics, into macroeconomics. Toward this goal, there must be more dialogue between physicists and economists. In this paper, we argue that there is no reason why the methods of statistical physics, so successful in many fields of natural science, cannot be usefully applied to macroeconomics, which is meant to analyze the macroeconomy comprising a large number of economic agents. It is, in fact, weird to regard the macroeconomy as a homothetic enlargement of the representative micro agent. We trust the bright future of the new approach to macroeconomics based on statistical physics.
A pedagogical derivation of the matrix element method in particle physics data analysis
NASA Astrophysics Data System (ADS)
Sumowidagdo, Suharyo
2018-03-01
The matrix element method provides a direct connection between the underlying theory of particle physics processes and detector-level physical observables. I present a pedagogically oriented derivation of the matrix element method, drawing from elementary concepts in probability theory, statistics, and the process of experimental measurements. The level of treatment should be suitable for beginning research students in phenomenology and experimental high energy physics.
Applications of statistical physics methods in economics: Current state and perspectives
NASA Astrophysics Data System (ADS)
Lux, Thomas
2016-12-01
This note discusses the development of applications of statistical physics to economics since the beginning of the `econophysics' movement about twenty years ago. I attempt to assess which of these applications appear particularly valuable and successful, and where important overlaps exist between research conducted by economists and `econophysicists'.
The effects of modeling instruction on high school physics academic achievement
NASA Astrophysics Data System (ADS)
Wright, Tiffanie L.
The purpose of this study was to explore whether Modeling Instruction, compared to traditional lecturing, is an effective instructional method to promote academic achievement in selected high school physics classes at a rural middle Tennessee high school. This study used an ex post facto, quasi-experimental research methodology. The independent variables in this study were the instructional methods of teaching. The treatment variable was Modeling Instruction and the control variable was traditional lecture instruction. The Treatment Group consisted of participants in Physical World Concepts who received Modeling Instruction. The Control Group consisted of participants in Physical Science who received traditional lecture instruction. The dependent variable was gain scores on the Force Concept Inventory (FCI). The participants for this study were 133 students each in both the Treatment and Control Groups (n = 266), who attended a public high school in rural middle Tennessee. The participants were administered the Force Concept Inventory (FCI) prior to being taught the mechanics of physics. The FCI data were entered into the computer-based Statistical Package for the Social Sciences (SPSS). Two independent samples t-tests were conducted to answer the research questions. There was a statistically significant difference between the treatment and control groups concerning the instructional method. Modeling Instructional methods were found to be effective in increasing the academic achievement of students in high school physics. There was no statistically significant difference between FCI gain scores for gender. Gender was found to have no effect on the academic achievement of students in high school physics classes. However, even though there was not a statistically significant difference, female students' gain scores were higher than male students' gain scores when Modeling Instructional methods of teaching were used. Based on these findings, it is recommended that high school science teachers use Modeling Instructional methods of teaching daily in their classrooms. A recommendation for further research is to expand the Modeling Instructional methods of teaching into different content areas (i.e., reading and language arts) to explore academic achievement gains.
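A minimal sketch of the study's statistical test, an independent-samples t-test on FCI gain scores, might look as follows; the simulated scores and effect size are placeholders, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical FCI gain scores (posttest - pretest) for two groups of n = 133;
# the means and spreads are illustrative only.
modeling_gains = rng.normal(loc=6.0, scale=4.0, size=133)  # Modeling Instruction
lecture_gains = rng.normal(loc=3.0, scale=4.0, size=133)   # traditional lecture

# Independent-samples t-test, as used in the study (Welch's variant is safer
# when group variances may differ).
t, p = stats.ttest_ind(modeling_gains, lecture_gains, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4g}")
```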
Networking—a statistical physics perspective
NASA Astrophysics Data System (ADS)
Yeung, Chi Ho; Saad, David
2013-03-01
Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly more complex, the ever increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve for gaining better understanding of the properties of networking systems at the macroscopic level, as well as for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches as they have been developed specifically to deal with nonlinear large-scale systems. This review aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications.
Some past and present challenges of econophysics
NASA Astrophysics Data System (ADS)
Mantegna, R. N.
2016-12-01
We discuss the cultural background that was shared by some of the first econophysicists when they started to work on economic and financial problems with the methods and tools of statistical physics. In particular, we discuss the role of stylized facts and statistical physical laws in economics and statistical physics, respectively. As an example of the problems and potentials associated with the interaction of different communities of scholars dealing with problems observed in economic and financial systems, we briefly discuss the development and the perspectives of the use of tools and concepts of networks in econophysics, economics and finance.
Statistical physics inspired energy-efficient coded-modulation for optical communications.
Djordjevic, Ivan B; Xu, Lei; Wang, Ting
2012-04-15
Because Shannon's entropy can be obtained by Stirling's approximation of the thermodynamic entropy, statistical physics energy minimization methods are directly applicable to signal constellation design. We demonstrate that statistical physics inspired energy-efficient (EE) signal constellation designs, in combination with large-girth low-density parity-check (LDPC) codes, significantly outperform conventional LDPC-coded polarization-division multiplexed quadrature amplitude modulation schemes. We also describe an EE signal constellation design algorithm. Finally, we propose the discrete-time implementation of a D-dimensional transceiver and the corresponding EE polarization-division multiplexed system.
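The entropy link invoked in the first sentence can be made explicit; the following worked derivation (standard textbook material, not taken from the paper) shows how Stirling's approximation turns the Boltzmann entropy into Shannon's form:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
For $N$ symbols with occupation numbers $n_i$ ($\sum_i n_i = N$), the number of
distinguishable arrangements is $W = N!/\prod_i n_i!$, so the thermodynamic
entropy is $S = k_B \ln W$. Stirling's approximation
$\ln n! \approx n \ln n - n$ gives
\begin{align}
\ln W &\approx N\ln N - N - \sum_i \left( n_i \ln n_i - n_i \right)
       = -\sum_i n_i \ln\frac{n_i}{N}
       = -N \sum_i p_i \ln p_i ,
\end{align}
with $p_i = n_i/N$, so that $S \approx k_B N \bigl(-\sum_i p_i \ln p_i\bigr)$:
per symbol, and up to the constant $k_B$, the thermodynamic entropy reduces to
Shannon's entropy, which is why statistical-physics energy minimization
carries over to constellation design.
\end{document}
```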
NASA Astrophysics Data System (ADS)
Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.
2018-01-01
We describe the formalism of statistical irreversible thermodynamics constructed based on Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The first can be constructed either based on a heuristic approach or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.
Hart, Carl R; Reznicek, Nathan J; Wilson, D Keith; Pettit, Chris L; Nykaza, Edward T
2016-05-01
Many outdoor sound propagation models exist, ranging from highly complex physics-based simulations to simplified engineering calculations, and more recently, highly flexible statistical learning methods. Several engineering and statistical learning models are evaluated by using a particular physics-based model, namely, a Crank-Nicolson parabolic equation (CNPE), as a benchmark. Narrowband transmission loss values predicted with the CNPE, based upon a simulated data set of meteorological, boundary, and source conditions, act as simulated observations. In the simulated data set sound propagation conditions span from downward refracting to upward refracting, for acoustically hard and soft boundaries, and low frequencies. Engineering models used in the comparisons include the ISO 9613-2 method, Harmonoise, and Nord2000 propagation models. Statistical learning methods used in the comparisons include bagged decision tree regression, random forest regression, boosting regression, and artificial neural network models. Computed skill scores are relative to sound propagation in a homogeneous atmosphere over a rigid ground. Overall skill scores for the engineering noise models are 0.6%, -7.1%, and 83.8% for the ISO 9613-2, Harmonoise, and Nord2000 models, respectively. Overall skill scores for the statistical learning models are 99.5%, 99.5%, 99.6%, and 99.6% for bagged decision tree, random forest, boosting, and artificial neural network regression models, respectively.
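A sketch of how such skill scores might be computed, assuming the common mean-squared-error convention (the paper's exact definition may differ), with entirely synthetic transmission-loss values:

```python
import numpy as np

def skill_score(pred, obs, ref):
    """MSE skill score of a model against a reference prediction: 1 when
    perfect, 0 when no better than the reference, negative when worse
    (one common convention, e.g. Murphy 1988)."""
    mse_model = np.mean((pred - obs) ** 2)
    mse_ref = np.mean((ref - obs) ** 2)
    return 1.0 - mse_model / mse_ref

# Hypothetical transmission-loss values (dB): "observations" from a benchmark,
# a model prediction, and a homogeneous-atmosphere baseline as the reference.
rng = np.random.default_rng(0)
obs = rng.normal(60, 10, size=500)
ref = obs + rng.normal(0, 12, size=500)   # crude baseline prediction
pred = obs + rng.normal(0, 4, size=500)   # better model prediction
print(f"skill = {100 * skill_score(pred, obs, ref):.1f}%")
```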
BAYESIAN ESTIMATION OF THERMONUCLEAR REACTION RATES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iliadis, C.; Anderson, K. S.; Coc, A.
The problem of estimating non-resonant astrophysical S-factors and thermonuclear reaction rates, based on measured nuclear cross sections, is of major interest for nuclear energy generation, neutrino physics, and element synthesis. Many different methods have been applied to this problem in the past, almost all of them based on traditional statistics. Bayesian methods, on the other hand, are now in widespread use in the physical sciences. In astronomy, for example, Bayesian statistics is applied to the observation of extrasolar planets, gravitational waves, and Type Ia supernovae. However, nuclear physics, in particular, has been slow to adopt Bayesian methods. We present astrophysical S-factors and reaction rates based on Bayesian statistics. We develop a framework that incorporates robust parameter estimation, systematic effects, and non-Gaussian uncertainties in a consistent manner. The method is applied to the reactions d(p,γ)³He, ³He(³He,2p)⁴He, and ³He(α,γ)⁷Be, important for deuterium burning, solar neutrinos, and Big Bang nucleosynthesis.
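A drastically simplified sketch of Bayesian estimation of a constant S-factor, using a random-walk Metropolis sampler on synthetic data (the paper's framework, with systematic effects and non-Gaussian uncertainties, is far richer):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: measured S-factors (arbitrary units) scattered about a
# true constant value, with ~5% reported statistical errors.
s_true = 0.20
s_obs = s_true * np.exp(rng.normal(0, 0.05, size=20))
sigma = 0.05 * s_obs

def log_posterior(s):
    if s <= 0:
        return -np.inf
    # Gaussian likelihood for each point plus a weakly informative
    # log-uniform-like prior on the positive scale parameter.
    return -0.5 * np.sum(((s_obs - s) / sigma) ** 2) - np.log(s)

# Random-walk Metropolis sampler.
samples, s = [], 0.1
lp = log_posterior(s)
for _ in range(20000):
    s_prop = s + rng.normal(0, 0.005)
    lp_prop = log_posterior(s_prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        s, lp = s_prop, lp_prop
    samples.append(s)

post = np.array(samples[5000:])  # discard burn-in
print(f"S = {post.mean():.4f} +/- {post.std():.4f}")
```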
Measurement in Physical Education. 5th Edition.
ERIC Educational Resources Information Center
Mathews, Donald K.
Concepts of measurement in physical education are presented in this college-level text to enable the preservice physical education major to develop skills in determining pupil status, designing effective physical activity programs, and measuring student progress. Emphasis is placed upon discussion of essential statistical methods, test…
Theory of atomic spectral emission intensity
NASA Astrophysics Data System (ADS)
Yngström, Sten
1994-07-01
The theoretical derivation of a new spectral line intensity formula for atomic radiative emission is presented. The theory is based on first principles of quantum physics, electrodynamics, and statistical physics. Quantum rules lead to revision of the conventional principle of local thermal equilibrium of matter and radiation. Study of electrodynamics suggests absence of spectral emission from fractions of the numbers of atoms and ions in a plasma due to radiative inhibition caused by electromagnetic force fields. Statistical probability methods are extended by the statement: A macroscopic physical system develops in the most probable of all conceivable ways consistent with the constraining conditions for the system. The crucial role of statistical physics in transforming quantum logic into common sense logic is stressed. The theory is strongly supported by experimental evidence.
NASA Astrophysics Data System (ADS)
Suhir, E.
2014-05-01
The well known and widely used experimental reliability "passport" of a mass manufactured electronic or a photonic product — the bathtub curve — reflects the combined contribution of the statistics-related and reliability-physics (physics-of-failure)-related processes. When time progresses, the first process results in a decreasing failure rate, while the second process associated with the material aging and degradation leads to an increased failure rate. An attempt has been made in this analysis to assess the level of the reliability physics-related aging process from the available bathtub curve (diagram). It is assumed that the products of interest underwent the burn-in testing and therefore the obtained bathtub curve does not contain the infant mortality portion. It has also been assumed that the two random processes in question are statistically independent, and that the failure rate of the physical process can be obtained by deducting the theoretically assessed statistical failure rate from the bathtub curve ordinates. In the carried out numerical example, the Rayleigh distribution for the statistical failure rate was used, for the sake of a relatively simple illustration. The developed methodology can be used in reliability physics evaluations, when there is a need to better understand the roles of the statistics-related and reliability-physics-related irreversible random processes in reliability evaluations. The future work should include investigations on how powerful and flexible methods and approaches of the statistical mechanics can be effectively employed, in addition to reliability physics techniques, to model the operational reliability of electronic and photonic products.
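A minimal numerical sketch of the described decomposition, with an assumed synthetic bathtub curve and an assumed Rayleigh parameter, neither taken from the paper:

```python
import numpy as np

# Time grid (arbitrary units) and a synthetic post-burn-in bathtub curve:
# roughly flat useful-life region followed by wear-out growth.
t = np.linspace(0.01, 10, 200)
bathtub = 0.02 + 0.004 * t**2       # hypothetical observed failure rate

# Statistics-related failure rate modeled via the Rayleigh distribution,
# whose hazard rate is h(t) = t / sigma**2 (sigma is an assumed parameter).
sigma = 12.0
h_statistical = t / sigma**2

# Following the paper's independence assumption, the physics-of-failure
# (aging/degradation) rate is the remainder, floored at zero.
h_physics = np.clip(bathtub - h_statistical, 0.0, None)

print("late-life physics-related rate:", h_physics[-1])
```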
Assaraf, Roland
2014-12-01
We show that the recently proposed correlated sampling without reweighting procedure extends the locality (asymptotic independence of the system size) of a physical property to the statistical fluctuations of its estimator. This makes the approach potentially vastly more efficient for computing space-localized properties in large systems compared with standard correlated methods. A proof is given for a large collection of noninteracting fragments. Calculations on hydrogen chains suggest that this behavior holds not only for systems displaying short-range correlations, but also for systems with long-range correlations.
Statistical mechanics of complex neural systems and high dimensional data
NASA Astrophysics Data System (ADS)
Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya
2013-03-01
Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.
NASA Astrophysics Data System (ADS)
Ben Torkia, Yosra; Ben Yahia, Manel; Khalfaoui, Mohamed; Al-Muhtaseb, Shaheen A.; Ben Lamine, Abdelmottaleb
2014-01-01
The adsorption energy distribution (AED) function of a commercial activated carbon (BDH-activated carbon) was investigated. For this purpose, the integral equation is derived by using a purely analytical statistical physics treatment. The description of the heterogeneity of the adsorbent is significantly clarified by defining the parameter N(E). This parameter represents the energetic density of the spatial density of the effectively occupied sites. To solve the integral equation, a numerical method was used based on an adequate algorithm. The Langmuir model was adopted as the local adsorption isotherm. This model is developed by using the grand canonical ensemble, which allows defining the physico-chemical parameters involved in the adsorption process. The AED function is estimated by a normal Gaussian function. This method is applied to the adsorption isotherms of nitrogen, methane and ethane at different temperatures. The development of the AED using a statistical physics treatment provides an explanation of the behaviour of the gas molecules during the adsorption process and gives new physical interpretations at microscopic levels.
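The described inversion can be sketched numerically: discretize the integral equation with a Langmuir kernel and recover the AED by non-negative least squares, a simple stand-in for the paper's algorithm; all grids and parameters below are assumed.

```python
import numpy as np
from scipy.optimize import nnls

# Energy grid (kJ/mol) and pressure grid (kPa) -- illustrative ranges.
E = np.linspace(5, 40, 60)
p = np.logspace(-2, 2, 40)
RT = 8.314e-3 * 298.0  # kJ/mol at 298 K

# Local Langmuir isotherm theta(p, E) = p / (p + K(E)), with an assumed
# pre-exponential pressure scale p0.
p0 = 1.0e4
K = p0 * np.exp(-E / RT)
kernel = p[:, None] / (p[:, None] + K[None, :])  # shape (n_p, n_E)

# Synthetic "measured" isotherm generated from a Gaussian AED, matching the
# paper's Gaussian estimate, including the energy-grid weight dE.
dE = E[1] - E[0]
aed_true = np.exp(-0.5 * ((E - 20.0) / 4.0) ** 2)
q_meas = kernel @ (aed_true * dE)

# Recover the AED by non-negative least squares; Tikhonov regularization
# could be added for noisy experimental data.
aed_est, _ = nnls(kernel * dE, q_meas)
print("peak energy estimate:", E[np.argmax(aed_est)], "kJ/mol")
```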
NASA Astrophysics Data System (ADS)
Mazzitello, Karina I.; Candia, Julián
2012-12-01
In every country, public and private agencies allocate extensive funding to collect large-scale statistical data, which in turn are studied and analyzed in order to determine local, regional, national, and international policies regarding all aspects relevant to the welfare of society. One important aspect of that process is the visualization of statistical data with embedded geographical information, which most often relies on archaic methods such as maps colored according to graded scales. In this work, we apply nonstandard visualization techniques based on physical principles. We illustrate the method with recent statistics on homicide rates in Brazil and their correlation to other publicly available data. This physics-based approach provides a novel tool that can be used by interdisciplinary teams investigating statistics and model projections in a variety of fields such as economics and gross domestic product research, public health and epidemiology, sociodemographics, political science, business and marketing, and many others.
Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Udey, Ruth Norma
Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.
Learning physics: A comparative analysis between instructional design methods
NASA Astrophysics Data System (ADS)
Mathew, Easow
The purpose of this research was to determine if there were differences in academic performance between students who participated in traditional versus collaborative problem-based learning (PBL) instructional design approaches to physics curricula. This study utilized a quantitative quasi-experimental design methodology to determine the significance of differences in pre- and posttest introductory physics exam performance between students who participated in traditional (i.e., control group) versus collaborative problem solving (PBL) instructional design (i.e., experimental group) approaches to physics curricula over a college semester in 2008. There were 42 student participants (N = 42) enrolled in an introductory physics course at the research site in the Spring 2008 semester who agreed to participate in this study after reading and signing informed consent documents. A total of 22 participants were assigned to the experimental group (n = 22), who participated in a PBL-based teaching methodology along with traditional lecture methods. The other 20 students were assigned to the control group (n = 20), who participated in the traditional lecture teaching methodology. Both courses were taught by experienced professors with qualifications at the doctoral level. The results indicated statistically significant differences (p < .01) in academic performance between students who participated in traditional (i.e., lower physics posttest scores and lower differences between pre- and posttest scores) versus collaborative (i.e., higher physics posttest scores, and higher differences between pre- and posttest scores) instructional design approaches to physics curricula. Despite some slight differences in control group and experimental group demographic characteristics (gender, ethnicity, and age), there was a statistically significant (p = .04) difference in the control group, where female average academic improvement was much higher (~63%) than male average academic improvement, which may indicate that traditional teaching methods are more effective for female students; no significant difference between male and female participants was noted in the experimental group. There was a statistically significant and negative relationship (r = -.61, p = .01) between age and physics pretest scores in the control group. No statistical analyses yielded significantly different average academic performance values in either group as delineated by ethnicity.
Statistically Modeling I-V Characteristics of CNT-FET with LASSO
NASA Astrophysics Data System (ADS)
Ma, Dongsheng; Ye, Zuochang; Wang, Yan
2017-08-01
With the advent of the Internet of Things (IoT), the need for studying new materials and devices for various applications is increasing. Traditionally we build compact models for transistors on the basis of physics. But physical models are expensive and need a very long time to adjust for non-ideal effects. As the vision for the application of many novel devices is not certain or the manufacturing process is not mature, deriving generalized accurate physical models for such devices is very strenuous, whereas statistical modeling is becoming a potential method because of its data-oriented property and fast implementation. In this paper, one classical statistical regression method, LASSO, is used to model the I-V characteristics of a CNT-FET, and a pseudo-PMOS inverter simulation based on the trained model is implemented in Cadence. The normalized relative mean square prediction error of the trained model versus experimental sample data and the simulation results show that the model is acceptable for digital circuit static simulation. Such a modeling methodology can be extended to general devices.
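A minimal sketch of this modeling flow using scikit-learn's Lasso on synthetic I-V samples; the response surface, features and hyperparameters are placeholders, not the paper's setup:

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical I-V samples: gate and drain voltages with a smooth nonlinear
# drain-current response standing in for measured CNT-FET data.
vgs = rng.uniform(0, 1.2, 400)
vds = rng.uniform(0, 1.2, 400)
i_d = np.tanh(2 * vds) * np.maximum(vgs - 0.3, 0) ** 1.5 \
    + rng.normal(0, 1e-3, 400)

# LASSO regression on polynomial features of (Vgs, Vds); the L1 penalty
# prunes terms that do not help predict the current.
X = np.column_stack([vgs, vds])
model = make_pipeline(PolynomialFeatures(degree=5),
                      Lasso(alpha=1e-4, max_iter=50000))
model.fit(X, i_d)

# Normalized relative mean-square prediction error, as the paper reports.
pred = model.predict(X)
nrmse = np.mean((pred - i_d) ** 2) / np.var(i_d)
print(f"relative MSE: {nrmse:.3e}")
```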
Yue Xu, Selene; Nelson, Sandahl; Kerr, Jacqueline; Godbole, Suneeta; Patterson, Ruth; Merchant, Gina; Abramson, Ian; Staudenmayer, John; Natarajan, Loki
2018-04-01
Physical inactivity is a recognized risk factor for many chronic diseases. Accelerometers are increasingly used as an objective means to measure daily physical activity. One challenge in using these devices is missing data due to device nonwear. We used a well-characterized cohort of 333 overweight postmenopausal breast cancer survivors to examine missing data patterns of accelerometer outputs over the day. Based on these observed missingness patterns, we created pseudo-simulated datasets with realistic missing data patterns. We developed statistical methods to design imputation and variance weighting algorithms to account for missing data effects when fitting regression models. Bias and precision of each method were evaluated and compared. Our results indicated that not accounting for missing data in the analysis yielded unstable estimates in the regression analysis. Incorporating variance weights and/or subject-level imputation improved precision by >50%, compared to ignoring missing data. We recommend that these simple easy-to-implement statistical tools be used to improve analysis of accelerometer data.
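A toy version of the imputation-plus-variance-weighting idea, with simulated nonwear patterns and a weighted least-squares fit; the cohort, weights and model here are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(0)
n_subj, n_days = 200, 7

# Simulate daily activity (e.g., minutes of activity) and an outcome related
# to each subject's true mean activity level.
true_mean = rng.normal(30, 8, n_subj)
daily = true_mean[:, None] + rng.normal(0, 10, (n_subj, n_days))
outcome = 30 - 0.1 * true_mean + rng.normal(0, 1.5, n_subj)

# Impose nonwear: drop days at random, more heavily for some subjects.
p_miss = rng.uniform(0.0, 0.5, n_subj)
observed = rng.uniform(size=(n_subj, n_days)) > p_miss[:, None]
daily_obs = np.where(observed, daily, np.nan)

# Subject-level imputation: use each subject's observed-day mean; weight each
# subject by the precision of that mean (n_observed / within-subject variance).
n_obs = observed.sum(axis=1)
keep = n_obs >= 2                  # need >= 2 days for a variance estimate
d = daily_obs[keep]
x = np.nanmean(d, axis=1)
w = n_obs[keep] / np.nanvar(d, axis=1, ddof=1)
y = outcome[keep]

# Weighted least squares for outcome ~ activity.
X = np.column_stack([np.ones(len(x)), x])
beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
print("weighted slope estimate:", beta[1])
```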
PREFACE: Advanced many-body and statistical methods in mesoscopic systems
NASA Astrophysics Data System (ADS)
Anghel, Dragos Victor; Sabin Delion, Doru; Sorin Paraoanu, Gheorghe
2012-02-01
It has increasingly been realized in recent times that the borders separating various subfields of physics are largely artificial. This is the case for nanoscale physics, physics of lower-dimensional systems and nuclear physics, where the advanced techniques of many-body theory developed in recent times could provide a unifying framework for these disciplines under the general name of mesoscopic physics. Other fields, such as quantum optics and quantum information, are increasingly using related methods. The 6-day conference 'Advanced many-body and statistical methods in mesoscopic systems' that took place in Constanta, Romania, between 27 June and 2 July 2011 was, we believe, a successful attempt at bridging an impressive list of topical research areas: foundations of quantum physics, equilibrium and non-equilibrium quantum statistics/fractional statistics, quantum transport, phases and phase transitions in mesoscopic systems/superfluidity and superconductivity, quantum electromechanical systems, quantum dissipation, dephasing, noise and decoherence, quantum information, spin systems and their dynamics, fundamental symmetries in mesoscopic systems, phase transitions, exactly solvable methods for mesoscopic systems, various extensions of the random phase approximation, open quantum systems, clustering, decay and fission modes and systematic versus random behaviour of nuclear spectra. This event brought together participants from seventeen countries and five continents. Each of the participants brought considerable expertise in his/her field of research and, at the same time, was exposed to the newest results and methods coming from the other, seemingly remote, disciplines. The talks touched on subjects that are at the forefront of topical research areas and we hope that the resulting cross-fertilization of ideas will lead to new, interesting results from which everybody will benefit. We are grateful for the financial and organizational support from IFIN-HH, Ovidius University (where the conference took place), the Academy of Romanian Scientists and the Romanian National Authority for Scientific Research. This conference proceedings volume brings together some of the invited and contributed talks of the conference. The hope of the editors is that they will constitute reference material for applying many-body techniques to problems in mesoscopic and nuclear physics. We thank all the participants for their contribution to the success of this conference. D V Anghel and D S Delion, IFIN-HH, Bucharest, Romania; G S Paraoanu, Aalto University, Finland
Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework.
Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana
2014-06-01
Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method.
Micro-foundations for macroeconomics: New set-up based on statistical physics
NASA Astrophysics Data System (ADS)
Yoshikawa, Hiroshi
2016-12-01
Modern macroeconomics is built on "micro foundations": the optimization of micro agents such as consumers and firms is explicitly analyzed in the model. Toward this goal, the standard model presumes "the representative" consumer/firm, and analyzes its behavior in detail. However, the macroeconomy consists of 10⁷ consumers and 10⁶ firms. For the purpose of analyzing such a macro system, it is meaningless to pursue the micro behavior in detail. In this respect, there is no essential difference between economics and physics. The method of statistical physics can be usefully applied to the macroeconomy, and provides Keynesian economics with correct micro-foundations.
NASA Astrophysics Data System (ADS)
Kahn, Yoni; Anderson, Adam
2018-03-01
Preface; How to use this book; Resources; 1. Classical mechanics; 2. Electricity and magnetism; 3. Optics and waves; 4. Thermodynamics and statistical mechanics; 5. Quantum mechanics and atomic physics; 6. Special relativity; 7. Laboratory methods; 8. Specialized topics; 9. Special tips and tricks for the Physics GRE; Sample exams and solutions; References; Equation index; Subject index; Problems index.
An overview of the mathematical and statistical analysis component of RICIS
NASA Technical Reports Server (NTRS)
Hallum, Cecil R.
1987-01-01
Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.
NASA Astrophysics Data System (ADS)
Zhang, Zhu; Li, Hongbin; Tang, Dengping; Hu, Chen; Jiao, Yang
2017-10-01
Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for the key equipment in a smart substation, which needs on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line based on cyber-physics correlation analysis. Based on the electrical and physical properties of a substation operating in three-phase symmetry, the principal component analysis method is used to separate the metering deviation caused by the primary fluctuation from that caused by an EVT anomaly. The characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing the change in statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method demonstrates accurate on-line monitoring of the metering performance of an EVT without a standard voltage transformer.
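One way the PCA separation could look in code: in three-phase symmetry the primary fluctuation is a common mode across phases, so the leading principal component tracks it and the residual exposes a per-phase metering drift; all signals below are simulated assumptions, not substation data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000  # samples over time

# Per-unit secondary voltages of the three phases: a common primary
# fluctuation plus independent noise, with a slow drift injected into
# phase A to mimic a degrading EVT.
common = 1.0 + 0.02 * rng.standard_normal(n).cumsum() / np.sqrt(n)
noise = 1e-4 * rng.standard_normal((n, 3))
V = np.column_stack([common, common, common]) + noise
V[:, 0] += np.linspace(0, 5e-4, n)  # metering drift of the phase-A EVT

# PCA via the covariance eigendecomposition: the leading component tracks
# the common primary fluctuation; the residual isolates per-phase anomalies.
Vc = V - V.mean(axis=0)
eigval, eigvec = np.linalg.eigh(np.cov(Vc.T))
pc1 = eigvec[:, -1]                       # leading eigenvector
residual = Vc - np.outer(Vc @ pc1, pc1)   # remove common-mode component

# A drifting residual in one phase flags a metering anomaly.
t_idx = np.arange(n)
trends = [np.polyfit(t_idx, residual[:, k], 1)[0] for k in range(3)]
print("residual drift slope per phase:", trends)
```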
Application of econometric and ecology analysis methods in physics software
NASA Astrophysics Data System (ADS)
Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Ronchieri, Elisabetta; Saracco, Paolo
2017-10-01
Some data analysis methods typically used in econometric studies and in ecology have been evaluated and applied in physics software environments. They concern the evolution of observables through objective identification of change points and trends, and measurements of inequality, diversity and evenness across a data set. Within each analysis area, various statistical tests and measures have been examined. This conference paper summarizes a brief overview of some of these methods.
The applications of statistical quantification techniques in nanomechanics and nanoelectronics.
Mai, Wenjie; Deng, Xinwei
2010-10-08
Although nanoscience and nanotechnology have been developing for approximately two decades and have achieved numerous breakthroughs, the experimental results from nanomaterials with a higher noise level and poorer repeatability than those from bulk materials still remain as a practical issue, and challenge many techniques of quantification of nanomaterials. This work proposes a physical-statistical modeling approach and a global fitting statistical method to use all the available discrete data or quasi-continuous curves to quantify a few targeted physical parameters, which can provide more accurate, efficient and reliable parameter estimates, and give reasonable physical explanations. In the resonance method for measuring the elastic modulus of ZnO nanowires (Zhou et al 2006 Solid State Commun. 139 222-6), our statistical technique gives E = 128.33 GPa instead of the original E = 108 GPa, and unveils a negative bias adjustment f(0). The causes are suggested by the systematic bias in measuring the length of the nanowires. In the electronic measurement of the resistivity of a Mo nanowire (Zach et al 2000 Science 290 2120-3), the proposed new method automatically identified the importance of accounting for the Ohmic contact resistance in the model of the Ohmic behavior in nanoelectronics experiments. The 95% confidence interval of resistivity in the proposed one-step procedure is determined to be 3.57 ± 0.0274 × 10⁻⁵ Ω cm, which should be a more reliable and precise estimate. The statistical quantification technique should find wide applications in obtaining better estimations from various systematic errors and biased effects that become more significant at the nanoscale.
NASA Astrophysics Data System (ADS)
Beach, Shaun E.; Semkow, Thomas M.; Remling, David J.; Bradt, Clayton J.
2017-07-01
We have developed accessible methods to demonstrate fundamental statistics in several phenomena, in the context of teaching electronic signal processing in a physics-based college-level curriculum. A relationship between the exponential time-interval distribution and Poisson counting distribution for a Markov process with constant rate is derived in a novel way and demonstrated using nuclear counting. Negative binomial statistics is demonstrated as a model for overdispersion and justified by the effect of electronic noise in nuclear counting. The statistics of digital packets on a computer network are shown to be compatible with the fractal-point stochastic process leading to a power-law as well as generalized inverse Gaussian density distributions of time intervals between packets.
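The exponential/Poisson relationship demonstrated with nuclear counting can be reproduced in a few lines of simulation; the rate and window length here are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(7)
rate, t_total = 5.0, 10000.0  # events per unit time, total observation time

# Simulate a Markov (homogeneous Poisson) process by drawing exponential
# inter-arrival times and accumulating them into event times.
dt = rng.exponential(1.0 / rate, size=int(rate * t_total * 1.2))
times = np.cumsum(dt)
times = times[times < t_total]

# Count events in unit-length windows: the counts should follow a Poisson
# distribution with mean equal to `rate`.
counts = np.histogram(times, bins=np.arange(0, t_total + 1))[0]
print("mean count:", counts.mean(), " variance:", counts.var())
# For a Poisson distribution, mean == variance (both ~ rate).
```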
Applications of physical methods in high-frequency futures markets
NASA Astrophysics Data System (ADS)
Bartolozzi, M.; Mellen, C.; Chan, F.; Oliver, D.; Di Matteo, T.; Aste, T.
2007-12-01
In the present work we demonstrate the application of different physical methods to high-frequency or tick-by-tick financial time series data. In particular, we calculate the Hurst exponent and inverse statistics for the price time series taken from a range of futures indices. Additionally, we show that in a limit order book the relaxation times of an imbalanced book state with more demand or supply can be described by stretched exponential laws analogous to those seen in many physical systems.
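A compact rescaled-range (R/S) estimator of the Hurst exponent, of the kind applied to such futures series; the window sizes and averaging scheme are one common choice, not necessarily the authors':

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256, 512)):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    fit log(R/S) against log(window size)."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            y = np.cumsum(w - w.mean())      # cumulative deviations
            r = y.max() - y.min()            # range
            s = w.std(ddof=1)                # standard deviation
            if s > 0:
                rs.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs)))
    return np.polyfit(log_n, log_rs, 1)[0]  # slope = Hurst exponent

# Demo on hypothetical "returns": i.i.d. noise should give H close to 0.5,
# the benchmark against which persistence in futures data is judged.
rng = np.random.default_rng(0)
print("H =", hurst_rs(rng.standard_normal(4096)))
```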
Methods to Measure Physical Activity Behaviors in Health Education Research
ERIC Educational Resources Information Center
Fitzhugh, Eugene C.
2015-01-01
Regular physical activity (PA) is an important concept to measure in health education research. The health education researcher might need to measure physical activity because it is the primary measure of interest, or PA might be a confounding measure that needs to be controlled for in statistical analysis. The purpose of this commentary is to…
NASA Astrophysics Data System (ADS)
Nearing, G. S.
2014-12-01
Statistical models consistently out-perform conceptual models in the short term; however, to account for a nonstationary future (or an unobserved past) scientists prefer to base predictions on unchanging and commutable properties of the universe - i.e., physics. The problem with physically-based hydrology models is, of course, that they aren't really based on physics - they are based on statistical approximations of physical interactions, and we almost uniformly lack an understanding of the entropy associated with these approximations. Thermodynamics is successful precisely because entropy statistics are computable for homogeneous (well-mixed) systems, and ergodic arguments explain the success of Newton's laws to describe systems that are fundamentally quantum in nature. Unfortunately, similar arguments do not hold for systems like watersheds that are heterogeneous at a wide range of scales. Ray Solomonoff formalized the situation in 1968 by showing that given infinite evidence, simultaneously minimizing model complexity and entropy in predictions always leads to the best possible model. The open question in hydrology is about what happens when we don't have infinite evidence - for example, when the future will not look like the past, or when one watershed does not behave like another. How do we isolate stationary and commutable components of watershed behavior? I propose that one possible answer to this dilemma lies in a formal combination of physics and statistics. In this talk I outline my recent analogue (Solomonoff's theorem was digital) of Solomonoff's idea that allows us to quantify the complexity/entropy tradeoff in a way that is intuitive to physical scientists. I show how to formally combine "physical" and statistical methods for model development in a way that allows us to derive the theoretically best possible model given any given physics approximation(s) and available observations. Finally, I apply an analogue of Solomonoff's theorem to evaluate the tradeoff between model complexity and prediction power.
Estimating trends in the global mean temperature record
NASA Astrophysics Data System (ADS)
Poppick, Andrew; Moyer, Elisabeth J.; Stein, Michael L.
2017-06-01
Given uncertainties in physical theory and numerical climate simulations, the historical temperature record is often used as a source of empirical information about climate change. Many historical trend analyses appear to de-emphasize physical and statistical assumptions: examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for internal variability in nonparametric rather than parametric ways. However, given a limited data record and the presence of internal variability, estimating radiatively forced temperature trends in the historical record necessarily requires some assumptions. Ostensibly empirical methods can also involve an inherent conflict in assumptions: they require data records that are short enough for naive trend models to be applicable, but long enough for long-timescale internal variability to be accounted for. In the context of global mean temperatures, empirical methods that appear to de-emphasize assumptions can therefore produce misleading inferences, because the trend over the twentieth century is complex and the scale of temporal correlation is long relative to the length of the data record. We illustrate here how a simple but physically motivated trend model can provide better-fitting and more broadly applicable trend estimates and can allow for a wider array of questions to be addressed. In particular, the model allows one to distinguish, within a single statistical framework, between uncertainties in the shorter-term vs. longer-term response to radiative forcing, with implications not only on historical trends but also on uncertainties in future projections. We also investigate the consequence on inferred uncertainties of the choice of a statistical description of internal variability. While nonparametric methods may seem to avoid making explicit assumptions, we demonstrate how even misspecified parametric statistical methods, if attuned to the important characteristics of internal variability, can result in more accurate uncertainty statements about trends.
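A sketch of the contrast the authors draw: regress temperature on radiative forcing rather than on time, with parametric AR(1) errors for internal variability; the forcing series, sensitivity and noise below are synthetic assumptions, not the paper's data:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
years = np.arange(1900, 2021)

# Hypothetical radiative forcing (W m^-2) rising nonlinearly over the record,
# standing in for a published forcing series.
forcing = 2.5 * (np.exp((years - 1900) / 80.0) - 1) / (np.e**1.5 - 1)

# Synthetic global-mean temperature anomaly: a sensitivity times forcing
# plus AR(1) internal variability.
eps = np.zeros(len(years))
for i in range(1, len(years)):
    eps[i] = 0.6 * eps[i - 1] + rng.normal(0, 0.08)
temp = 0.4 * forcing + eps

# Regress temperature on forcing (not on time), handling AR(1) errors with
# GLSAR; the slope is a transient-sensitivity-like quantity.
X = sm.add_constant(forcing)
results = sm.GLSAR(temp, X, rho=1).iterative_fit(maxiter=10)
print(results.params, results.bse)
```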
Data Analysis Techniques for Physical Scientists
NASA Astrophysics Data System (ADS)
Pruneau, Claude A.
2017-10-01
Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.
Robust Statistical Detection of Power-Law Cross-Correlation.
Blythe, Duncan A J; Nikulin, Vadim V; Müller, Klaus-Robert
2016-06-02
We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.
Hidden Statistics of Schroedinger Equation
NASA Technical Reports Server (NTRS)
Zak, Michail
2011-01-01
Work was carried out to determine the mathematical origin of randomness in quantum mechanics and to create a hidden statistics of the Schrödinger equation; i.e., to expose the transitional stochastic process as a "bridge" to the quantum world. The governing equations of hidden statistics would preserve such properties of quantum physics as superposition, entanglement, and direct-product decomposability while allowing one to measure its state variables using classical methods.
Reentry survivability modeling
NASA Astrophysics Data System (ADS)
Fudge, Michael L.; Maher, Robert L.
1997-10-01
Statistical methods for expressing the impact risk posed to space systems in general [and the International Space Station (ISS) in particular] by other resident space objects have been examined. One of the findings of this investigation is that there are legitimate physical modeling reasons for the common statistical expression of the collision risk. A combination of statistical methods and physical modeling is also used to express the impact risk posed by re-entering space systems to objects of interest (e.g., people and property) on Earth. One of the largest uncertainties in expressing this risk is the estimation of the material which survives reentry to impact Earth's surface. This point was recently demonstrated in dramatic fashion by the impact of an intact expendable launch vehicle (ELV) upper stage near a private residence in the continental United States. Since approximately half of the missions supporting ISS will utilize ELVs, it is appropriate to examine the methods used to estimate the amount and physical characteristics of ELV debris surviving reentry to impact Earth's surface. This paper examines reentry survivability estimation methodology, including the specific methodology used by Caiman Sciences' 'Survive' model. Comparisons between empirical results (observations of objects which have been recovered on Earth after surviving reentry) and Survive estimates are presented for selected upper stage or spacecraft components and a Delta launch vehicle second stage.
Automated sampling assessment for molecular simulations using the effective sample size
Zhang, Xin; Bhatt, Divesh; Zuckerman, Daniel M.
2010-01-01
To quantify the progress in the development of algorithms and forcefields used in molecular simulations, a general method for the assessment of sampling quality is needed. Statistical mechanics principles suggest that the populations of physical states characterize equilibrium sampling in a fundamental way. We therefore develop an approach for analyzing the variances in state populations, which quantifies the degree of sampling in terms of the effective sample size (ESS). The ESS estimates the number of statistically independent configurations contained in a simulated ensemble. The method is applicable to both traditional dynamics simulations and more modern (e.g., multicanonical) approaches. Our procedure is tested in a variety of systems, from toy models to atomistic protein simulations. We also introduce a simple automated procedure to obtain approximate physical states from dynamic trajectories: this allows sample-size estimation in systems for which physical states are not known in advance. PMID:21221418
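A minimal sketch of the population-variance idea (our simplification, not the authors' code): split the trajectory of state labels into blocks, compare the observed variance of each state's population across blocks with the multinomial variance p(1-p)/n expected for independent samples, and solve for the effective n:

    import numpy as np

    def effective_sample_size(state_traj, n_states, n_blocks=10):
        """ESS from the variance of state populations across blocks.

        For n independent samples, a state's population varies as
        p(1 - p)/n; inverting that relation gives an effective block
        sample size (details simplified relative to the paper).
        """
        blocks = np.array_split(np.asarray(state_traj), n_blocks)
        pops = np.array([[np.mean(b == s) for s in range(n_states)]
                         for b in blocks])
        p = pops.mean(axis=0)
        var = pops.var(axis=0, ddof=1)
        mask = (p > 0) & (p < 1) & (var > 0)
        n_eff_block = np.mean(p[mask] * (1 - p[mask]) / var[mask])
        return n_eff_block * n_blocks   # total ESS over all blocks

    traj = np.random.default_rng(1).integers(0, 3, size=10_000)  # toy 3-state data
    print(effective_sample_size(traj, n_states=3))               # ~10,000 if independent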
NASA Astrophysics Data System (ADS)
Usowicz, Boguslaw; Marczewski, Wojciech; Usowicz, Jerzy B.; Łukowski, Mateusz; Lipiec, Jerzy; Stankiewicz, Krystyna
2013-04-01
Radiometric observations with SMOS rely on the Radiative Transfer Equations (RTE), which determine the Brightness Temperature (BT) in the two linear polarization components (H, V) satisfying the Fresnel principle of propagation in horizontally layered target media on the ground. The RTE contain variables that bind the equations, expressed in electromagnetic (EM) terms of the intensity BT, to the physical reality expressed by non-EM variables (Soil Moisture (SM), vegetation indexes, fractional coverage with many different properties, and boundary conditions such as optical thickness, layer definitions and roughness), bridging the EM domain to other physical aspects by means of the so-called tau-omega method. This method enables joining a variety of different models, including specific empirical estimates of physical properties in relation to the volumetric water content. The RTE are in effect expressed through propagation, reflection, and losses or attenuation along the considered propagation path. Electromagnetic propagation is described by the propagation constant, and for target media on the ground the dielectric constant is the decisive factor. Therefore, despite the many physical parameters involved, one must rely dominantly on the dielectric constant treated as a complex quantity: its real part represents the apparent shortening of the propagation path and the refraction, while the imaginary part accounts for attenuation or losses. This work applies statistical-physical modeling of soil properties, treating the medium as a mixture of solid grains and of gas or liquid filling the pores, with contact bridges between compounds treated statistically. The modeling provides an opportunity to characterize the porosity by general statistical means and is applicable to various physical properties (thermal and electrical conductivity, and dielectric properties) that depend on the composition of the compounds. The method was developed independently of the SMOS approach, but the two meet precisely in the RTE, at the dielectric constant. The dielectric constant is observed or retrieved by SMOS regardless of other properties such as soil porosity, and without a direct relation to the thermal properties of soils. Yet the relations between the thermal properties of soil and the water content are very consistent. We therefore introduce the effects of soil porosity and of soil thermal properties into the complex representation of the dielectric constant, gaining new abilities for capturing porosity effects with SMOS observations. At present we can show a few effects of the relations between thermal properties and soil moisture content, with examples from the Biebrza and Polesie wetlands in Poland, and we search for correlations between SM from SMOS and the moisture content known from the ground. The correlations are poor for SMOS L2 data processed with the retrieval version based on the Dobson model (501), but we expect better correlation for the version using the Mironov model (551). If this supposition is confirmed, it will encourage the use of statistical-physical modeling of the dielectric constant and thermal properties in the RTE and the tau-omega method. Treating soil porosity directly as a research target is less strongly motivated than exploiting its effects on the SM observable by SMOS.
Consistency of extreme flood estimation approaches
NASA Astrophysics Data System (ADS)
Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf
2017-04-01
Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodical procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in the different methods. In this study, the following three approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (the SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested on two different Swiss catchments. The results and some intermediate variables are used for assessing potential strengths and weaknesses of each method, as well as for evaluating the consistency of these methods.
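The purely statistical branch, ordinary extreme value statistics, typically amounts to fitting a generalized extreme value (GEV) distribution to annual maxima and reading design floods off its quantile function. A sketch with made-up discharge values (not data from the study):

    import numpy as np
    from scipy import stats

    # Annual maximum discharges (m^3/s) -- illustrative numbers only.
    ams = np.array([120., 95., 210., 160., 300., 180., 140., 250., 170., 130.,
                    220., 190., 110., 260., 150.])

    # Fit a GEV to the annual maxima and read off the T-year flood
    # from the quantile function.
    shape, loc, scale = stats.genextreme.fit(ams)
    q100 = stats.genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
    print(f"100-year flood estimate: {q100:.0f} m^3/s")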
Trends and associated uncertainty in the global mean temperature record
NASA Astrophysics Data System (ADS)
Poppick, A. N.; Moyer, E. J.; Stein, M.
2016-12-01
Physical models suggest that the Earth's mean temperature warms in response to changing CO2 concentrations (and hence increased radiative forcing); given physical uncertainties in this relationship, the historical temperature record is a source of empirical information about global warming. A persistent thread in many analyses of the historical temperature record, however, is the reliance on methods that appear to deemphasize both physical and statistical assumptions. Examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for natural variability in nonparametric rather than parametric ways. We show here that methods that deemphasize assumptions can limit the scope of analysis and can lead to misleading inferences, particularly in the setting considered here, where the data record is relatively short and the scale of temporal correlation is relatively long. A proposed model that is simple but physically informed provides a more reliable estimate of trends and allows a broader array of questions to be addressed. In accounting for uncertainty, we also illustrate how parametric statistical models that are attuned to the important characteristics of natural variability can be more reliable than ostensibly more flexible approaches.
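To illustrate the contrast the authors draw, a minimal physically informed alternative to a trend-in-time regression is to regress temperature on radiative forcing and model natural variability parametrically, e.g. as AR(1) noise. The sketch below is our toy construction, not the paper's model; it inflates the trend uncertainty by the usual effective-sample-size factor (1-rho)/(1+rho):

    import numpy as np

    def forced_trend(temp, forcing):
        """Regress temperature on forcing; adjust the slope standard error
        for lag-1 autocorrelation in the residuals (AR(1) approximation)."""
        X = np.column_stack([np.ones_like(forcing), forcing])
        beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
        resid = temp - X @ beta
        rho = np.corrcoef(resid[:-1], resid[1:])[0, 1]     # lag-1 autocorrelation
        n_eff = len(temp) * (1 - rho) / (1 + rho)          # effective sample size
        s2 = resid @ resid / (n_eff - 2)
        se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
        return beta[1], se

    # toy example: sensitivity 0.5 K per (W m^-2) plus red noise
    rng = np.random.default_rng(2)
    f = np.linspace(0.0, 2.5, 140)
    noise = np.zeros(140)
    for t in range(1, 140):
        noise[t] = 0.6 * noise[t - 1] + 0.1 * rng.standard_normal()
    sens, se = forced_trend(0.5 * f + noise, f)
    print(sens, se)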
Items New to the Collection - Betty Petersen Memorial Library
Symbolic-numeric Methods. Springer Verlag.
Ambaum MHP. 2010. Thermal Physics of the Atmosphere. Hoboken; Boston, Mass.: American Meteorological Society.
Tarantola A. 1987. Inverse Problem Theory: Methods for Data Fitting and Model Parameter Estimation. Wiley & Sons.
Wilks DS. 2010. Statistical Methods in the Atmospheric Sciences. Amsterdam: Elsevier.
Robust Strategy for Rocket Engine Health Monitoring
NASA Technical Reports Server (NTRS)
Santi, L. Michael
2001-01-01
Monitoring the health of rocket engine systems is essentially a two-phase process. The acquisition phase involves sensing physical conditions at selected locations, converting physical inputs to electrical signals, conditioning the signals as appropriate to establish scale or filter interference, and recording results in a form that is easy to interpret. The inference phase involves analysis of results from the acquisition phase, comparison of analysis results to established health measures, and assessment of health indications. A variety of analytical tools may be employed in the inference phase of health monitoring. These tools can be separated into three broad categories: statistical, rule based, and model based. Statistical methods can provide excellent comparative measures of engine operating health. They require well-characterized data from an ensemble of "typical" engines, or "golden" data from a specific test assumed to define the operating norm in order to establish reliable comparative measures. Statistical methods are generally suitable for real-time health monitoring because they do not deal with the physical complexities of engine operation. The utility of statistical methods in rocket engine health monitoring is hindered by practical limits on the quantity and quality of available data. This is due to the difficulty and high cost of data acquisition, the limited number of available test engines, and the problem of simulating flight conditions in ground test facilities. In addition, statistical methods incur a penalty for disregarding flow complexity and are therefore limited in their ability to define performance shift causality. Rule based methods infer the health state of the engine system based on comparison of individual measurements or combinations of measurements with defined health norms or rules. This does not mean that rule based methods are necessarily simple. Although binary yes-no health assessment can sometimes be established by relatively simple rules, the causality assignment needed for refined health monitoring often requires an exceptionally complex rule base involving complicated logical maps. Structuring the rule system to be clear and unambiguous can be difficult, and the expert input required to maintain a large logic network and associated rule base can be prohibitive.
Linhart, S. Mike; Nania, Jon F.; Christiansen, Daniel E.; Hutchinson, Kasey J.; Sanders, Curtis L.; Archfield, Stacey A.
2013-01-01
A variety of individuals, from water resource managers to recreational users, need streamflow information for planning and decisionmaking at locations where there are no streamgages. To address this problem, two statistically based methods, the Flow Duration Curve Transfer method and the Flow Anywhere method, were developed for statewide application, whereas the two physically based models, the Precipitation-Runoff Modeling System and the Soil and Water Assessment Tool, were developed only for the Cedar River Basin. Observed and estimated streamflows from the two methods and two models were compared for goodness of fit at 13 streamgages modeled in the Cedar River Basin by using the Nash-Sutcliffe and percent-bias efficiency values. Based on median and mean Nash-Sutcliffe values for the 13 streamgages, the Precipitation-Runoff Modeling System and Soil and Water Assessment Tool models appear to have performed similarly and better than the Flow Duration Curve Transfer and Flow Anywhere methods. Based on median and mean percent-bias values, the Soil and Water Assessment Tool model appears to have generally overestimated daily mean streamflows, whereas the Precipitation-Runoff Modeling System model and the statistical methods appear to have underestimated daily mean streamflows. The Flow Duration Curve Transfer method produced the lowest median and mean percent-bias values and appears to perform better than the other models.
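The two goodness-of-fit measures used in the comparison have compact standard definitions; the sketch below follows the usual forms (sign conventions for percent bias differ between reports, so the convention used here is stated in the comment):

    import numpy as np

    def nash_sutcliffe(obs, sim):
        """NSE = 1 - sum((obs-sim)^2) / sum((obs-mean(obs))^2); 1 is perfect."""
        obs, sim = np.asarray(obs), np.asarray(sim)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def percent_bias(obs, sim):
        """PBIAS > 0 here means the model overestimates total streamflow."""
        obs, sim = np.asarray(obs), np.asarray(sim)
        return 100.0 * np.sum(sim - obs) / np.sum(obs)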
A Complex Network Approach to Stylometry
Amancio, Diego Raphael
2015-01-01
Statistical methods have been widely employed to study the fundamental properties of language. In recent years, methods from complex and dynamical systems have proved useful for creating several language models. Despite the large number of studies devoted to representing texts with physical models, only a limited number of studies have shown how the properties of the underlying physical systems can be employed to improve the performance of natural language processing tasks. In this paper, I address this problem by devising complex network methods that are able to improve the performance of current statistical methods. Using a fuzzy classification strategy, I show that the topological properties extracted from texts complement the traditional textual description. In several cases, the performance obtained with hybrid approaches outperformed the results obtained when only traditional or networked methods were used. Because the proposed model is generic, the framework devised here could be straightforwardly used to study similar textual applications where the topology plays a pivotal role in the description of the interacting agents. PMID:26313921
Statistics, gymnastics and the origins of sport science in Belgium (and Europe).
Delheye, Pascal
2014-01-01
This paper analyses the introduction of statistics in the field of gymnastics and its effect on the institutionalisation of physical education as a fully fledged academic discipline. Soon after Belgian independence, Adolphe Quetelet's research already resulted in large-scale anthropometric statistics; indeed, he developed an index that is still being used and is better known under the name of the body mass index. His insights were applied by promoters of gymnastics who wanted to make physical education more scientific. Thus, Clément Lefébure, director of the Ecole Normale de Gymnastique et d'Escrime in Brussels, set up a comparative experiment (with pre- and post-test measurements) by which he intended to show that the 'rational' method of Swedish gymnastics produced much better results than the 'empirical' method of Belgian/German Turnen. Lefébure's experiment, which was cited internationally but also strongly contested by opponents, was one of the factors that led to Swedish gymnastics being officially institutionalised in 1908 at the newly founded Higher Institute of Physical Education of the State University of Ghent, the first institute in the world where students could obtain a doctoral degree in physical education. Although it actually rested on very weak scientific foundations, the bastion of Swedish gymnastics built in Belgium in that pre-war period collapsed only in the 1960s. From then on, sport science could develop fully within the institutes for physical education.
A New Approach to Monte Carlo Simulations in Statistical Physics
NASA Astrophysics Data System (ADS)
Landau, David P.
2002-08-01
Monte Carlo simulations [1] have become a powerful tool for the study of diverse problems in statistical/condensed matter physics. Standard methods sample the probability distribution for the states of the system, most often in the canonical ensemble, and over the past several decades enormous improvements have been made in performance. Nonetheless, difficulties arise near phase transitions, due to critical slowing down near 2nd order transitions and to metastability near 1st order transitions, and these complications limit the applicability of the method. We shall describe a new Monte Carlo approach [2] that uses a random walk in energy space to determine the density of states directly. Once the density of states is known, all thermodynamic properties can be calculated. This approach can be extended to multi-dimensional parameter spaces and should be effective for systems with complex energy landscapes, e.g., spin glasses, protein folding models, etc. Generalizations should produce a broadly applicable optimization tool. 1. A Guide to Monte Carlo Simulations in Statistical Physics, D. P. Landau and K. Binder (Cambridge U. Press, Cambridge, 2000). 2. Fugao Wang and D. P. Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E 64, 056101 (2001).
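The random walk in energy space referred to here is the Wang-Landau algorithm of reference 2. A compact sketch for the 2D Ising model (illustrative parameters; production runs use stricter flatness criteria and more sweeps between checks):

    import numpy as np

    def wang_landau_ising(L=8, flatness=0.8, lnf_final=1e-6, seed=0):
        """Wang-Landau random walk in energy space for the 2D Ising model,
        estimating ln g(E) up to an additive constant."""
        rng = np.random.default_rng(seed)
        spins = rng.choice([-1, 1], size=(L, L))
        N = L * L
        E_levels = np.arange(-2 * N, 2 * N + 1, 4)   # allowed energies (L even)
        idx = {E: i for i, E in enumerate(E_levels)}
        lng = np.zeros(len(E_levels))                # running estimate of ln g(E)
        H = np.zeros(len(E_levels))                  # visit histogram
        E = -np.sum(spins * (np.roll(spins, 1, 0) + np.roll(spins, 1, 1)))
        lnf = 1.0                                    # ln of modification factor
        while lnf > lnf_final:
            for _ in range(10_000):
                i, j = rng.integers(L, size=2)
                nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                      + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
                dE = 2 * spins[i, j] * nb
                # accept with probability min(1, g(E)/g(E'))
                if np.log(rng.random()) < lng[idx[E]] - lng[idx[E + dE]]:
                    spins[i, j] *= -1
                    E += dE
                lng[idx[E]] += lnf
                H[idx[E]] += 1
            visited = H > 0      # flatness checked on visited bins only;
            if H[visited].min() > flatness * H[visited].mean():
                H[:] = 0
                lnf /= 2.0       # refine the modification factor
        return E_levels, lng

Once ln g(E) has converged, canonical averages at any temperature follow by reweighting with the Boltzmann factor, which is what lets a single run yield all thermodynamic properties.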
NASA Astrophysics Data System (ADS)
Hockicko, Peter; Krišťák, Ľuboš; Němec, Miroslav
2015-03-01
Video analysis using the program Tracker (Open Source Physics) introduces a new creative method of teaching physics into the educational process and makes the natural sciences more interesting for students. This way of exploring the laws of nature can amaze students, because this illustrative and interactive educational software inspires them to think creatively, improves their performance and helps them in studying physics. This paper deals with increasing key competencies in engineering by analysing videos of real-life situations (physical problems) with the video analysis and modelling tools of the program Tracker, combined with simulations of physical phenomena from The Physics Education Technology (PhET™) Project (the VAS method of problem tasks). Statistical testing using the t-test confirmed the significance of the differences in knowledge between the experimental and control groups, which resulted from the application of the interactive method.
Mothers' physical interventions in toddler play in a low-income, African American sample.
Ispa, Jean M; Claire Cook, J; Harmeyer, Erin; Rudy, Duane
2015-11-01
This mixed method study examined 28 low-income African American mothers' physical interventions in their 14-month-old toddlers' play. Inductive methods were used to identify six physical intervention behaviors, the affect accompanying physical interventions, and apparent reasons for intervening. Nonparametric statistical analyses determined that toddlers experienced physical intervention largely in the context of positive maternal affect. Mothers of boys expressed highly positive affect while physically intervening more than mothers of girls. Most physically intervening acts seemed to be motivated by maternal intent to show or tell children how to play or to correct play deemed incorrect. Neutral affect was the most common toddler affect type following physical intervention, but boys were more likely than girls to be upset immediately after physical interventions. Physical interventions intended to protect health and safety seemed the least likely to elicit toddler upset. Copyright © 2015 Elsevier Inc. All rights reserved.
Precision Cosmology: The First Half Million Years
NASA Astrophysics Data System (ADS)
Jones, Bernard J. T.
2017-06-01
Cosmology seeks to characterise our Universe in terms of models based on well-understood and tested physics. Today we know our Universe with a precision that once would have been unthinkable. This book develops the entire mathematical, physical and statistical framework within which this has been achieved. It tells the story of how we arrive at our profound conclusions, starting from the early twentieth century and following developments up to the latest data analysis of big astronomical datasets. It provides an enlightening description of the mathematical, physical and statistical basis for understanding and interpreting the results of key space- and ground-based data. Subjects covered include general relativity, cosmological models, the inhomogeneous Universe, physics of the cosmic background radiation, and methods and results of data analysis. Extensive online supplementary notes, exercises, teaching materials, and exercises in Python make this the perfect companion for researchers, teachers and students in physics, mathematics, and astrophysics.
Dorfman, Kevin D
2018-02-01
The development of bright bisintercalating dyes for deoxyribonucleic acid (DNA) in the 1990s, most notably YOYO-1, revolutionized the field of polymer physics in the ensuing years. These dyes, in conjunction with modern molecular biology techniques, permit the facile observation of polymer dynamics via fluorescence microscopy and thus direct tests of different theories of polymer dynamics. At the same time, they have played a key role in advancing an emerging next-generation method known as genome mapping in nanochannels. The effect of intercalation on the bending energy of DNA as embodied by a change in its statistical segment length (or, alternatively, its persistence length) has been the subject of significant controversy. The precise value of the statistical segment length is critical for the proper interpretation of polymer physics experiments and controls the phenomena underlying the aforementioned genomics technology. In this perspective, we briefly review the model of DNA as a wormlike chain and a trio of methods (light scattering, optical or magnetic tweezers, and atomic force microscopy (AFM)) that have been used to determine the statistical segment length of DNA. We then outline the disagreement in the literature over the role of bisintercalation on the bending energy of DNA, and how a multiscale biomechanical approach could provide an important model for this scientifically and technologically relevant problem.
Effect of different mixing methods on the physical properties of Portland cement.
Shahi, Shahriar; Ghasemi, Negin; Rahimi, Saeed; Yavari, Hamidreza; Samiei, Mohammad; Jafari, Farnaz
2016-12-01
Portland cement is a hydrophilic cement; as a result, the powder-to-liquid ratio affects the properties of the final mix. In addition, the mixing technique affects hydration. The aim of this study was to evaluate the effect of different mixing techniques (conventional, amalgamator and ultrasonic) on selected physical properties of Portland cement. The physical properties were evaluated according to the ISO 6876:2001 specification. One hundred and sixty-two samples of Portland cement were prepared, covering the three mixing techniques for each physical property (6 samples each). Data were analyzed using descriptive statistics, one-way ANOVA and post hoc Tukey tests. Statistical significance was set at P < 0.05. The mixing technique had no significant effect on the compressive strength, film thickness and flow of Portland cement (P > 0.05). Dimensional changes (shrinkage), solubility and pH increased significantly with the amalgamator and ultrasonic mixing techniques (P < 0.05). The ultrasonic technique significantly decreased the working time, and the amalgamator and ultrasonic techniques significantly decreased the setting time (P < 0.05). The mixing technique exerted no significant effect on the flow, film thickness and compressive strength of Portland cement samples. Key words: Physical properties, Portland cement, mixing methods.
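A sketch of the reported analysis pipeline (one-way ANOVA followed by post hoc Tukey comparisons), using invented setting-time numbers rather than the study's data, and assuming the usual scipy/statsmodels entry points:

    import numpy as np
    from scipy import stats
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # Illustrative setting-time data (minutes), 6 samples per technique.
    conventional = np.array([38, 41, 40, 39, 42, 40])
    amalgamator  = np.array([33, 35, 34, 32, 36, 34])
    ultrasonic   = np.array([30, 31, 29, 32, 30, 31])

    F, p = stats.f_oneway(conventional, amalgamator, ultrasonic)
    print(f"one-way ANOVA: F = {F:.2f}, p = {p:.4f}")

    values = np.concatenate([conventional, amalgamator, ultrasonic])
    groups = ["conv"] * 6 + ["amal"] * 6 + ["ultra"] * 6
    print(pairwise_tukeyhsd(values, groups, alpha=0.05))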
RooStatsCms: A tool for analysis modelling, combination and statistical studies
NASA Astrophysics Data System (ADS)
Piparo, D.; Schott, G.; Quast, G.
2010-04-01
RooStatsCms is an object oriented statistical framework based on the RooFit technology. Its scope is to allow the modelling, statistical analysis and combination of multiple search channels for new phenomena in High Energy Physics. It provides a variety of methods described in literature implemented as classes, whose design is oriented to the execution of multiple CPU intensive jobs on batch systems or on the Grid.
ERIC Educational Resources Information Center
Cardina, Catherine E.; DeNysschen, Carol
2018-01-01
Purpose: This study described professional development (PD) among public school physical education (PE) teachers and compared PE teachers to teachers of other subjects. Method: Data were collected from a nationally representative sample of public school teachers in the United States. Descriptive statistics were used to describe teachers' support…
NASA Technical Reports Server (NTRS)
Lin, Shian-Jiann; DaSilva, Arlindo; Atlas, Robert (Technical Monitor)
2001-01-01
Toward the development of a finite-volume Data Assimilation System (fvDAS), a consistent finite-volume methodology is developed for interfacing the NASA/DAO's Physical Space Statistical Analysis System (PSAS) to the joint NASA/NCAR finite volume CCM3 (fvCCM3). To take advantage of the Lagrangian control-volume vertical coordinate of the fvCCM3, a novel "shaving" method is applied to the lowest few model layers to reflect the surface pressure changes as implied by the final analysis. Analysis increments (from PSAS) to the upper air variables are then consistently put onto the Lagrangian layers as adjustments to the volume-mean quantities during the analysis cycle. This approach is demonstrated to be superior to the conventional method of using independently computed "tendency terms" for surface pressure and upper air prognostic variables.
The mean time-limited crash rate of stock price
NASA Astrophysics Data System (ADS)
Li, Yun-Xian; Li, Jiang-Cheng; Yang, Ai-Jun; Tang, Nian-Sheng
2017-05-01
In this article we investigate the occurrence of stock market crashes in an economic cycle. A Bayesian approach, the Heston model and a statistical-physical method are combined. Specifically, the Heston model and an effective potential are employed to describe the dynamic changes of the stock price, the Bayesian approach is used to estimate the Heston model's unknown parameters, and the statistical-physical method is used to investigate the occurrence of stock market crashes by calculating the mean time-limited crash rate. Real financial data from the Shanghai Composite Index are analyzed with the proposed methods. The mean time-limited crash rate of the stock price describes the occurrence of stock market crashes in an economic cycle. Monotonic and nonmonotonic behaviors are observed in the mean time-limited crash rate versus the volatility of the stock for various values of the cross-correlation coefficient between volatility and price. A minimum occurrence of stock market crashes, matching an optimal volatility, is also discovered.
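A sketch of the dynamical ingredient, Euler-Maruyama paths of the Heston model with correlated noises, together with a crude time-limited crash rate (fraction of paths breaching a barrier within the horizon, per unit time). The effective potential, the Bayesian parameter estimation and the barrier value used in the paper are not reproduced here; all parameters below are arbitrary:

    import numpy as np

    def heston_crash_rate(s0=100., v0=0.04, mu=0.05, kappa=2.0, theta=0.04,
                          xi=0.3, rho=-0.5, barrier=0.7, T=1.0,
                          n_steps=2_000, n_paths=5_000, seed=0):
        """Euler-Maruyama paths of the Heston model,
            dS = mu S dt + sqrt(v) S dW1,  dv = kappa(theta - v) dt + xi sqrt(v) dW2,
        with corr(dW1, dW2) = rho, and a crude time-limited crash rate."""
        rng = np.random.default_rng(seed)
        dt = T / n_steps
        S = np.full(n_paths, s0)
        v = np.full(n_paths, v0)
        crashed = np.zeros(n_paths, dtype=bool)
        for _ in range(n_steps):
            z1 = rng.standard_normal(n_paths)
            z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
            S *= np.exp((mu - 0.5 * v) * dt + np.sqrt(v * dt) * z1)
            # reflection keeps the variance positive (one of several schemes)
            v = np.abs(v + kappa * (theta - v) * dt + xi * np.sqrt(v * dt) * z2)
            crashed |= S < barrier * s0
        return crashed.mean() / T

    print(heston_crash_rate(rho=-0.5), heston_crash_rate(rho=0.5))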
Effect of physical activity on musculoskeletal discomforts among handicraft workers
Shakerian, Mahnaz; Rismanchian, Masoud; Khalili, Pejman; Torki, Akram
2016-01-01
Introduction: Handicraft work appears to be a high-risk occupation with regard to work-related musculoskeletal disorders (WMSDs), which necessitates corrective interventions such as regular physical activity. This study aimed to investigate the impact of physical activity on WMSDs among craftsmen. Methods: This cross-sectional, analytical-descriptive study was carried out on 100 craftsmen working in Isfahan, Iran, in 2013. The sampling method was a census, and all workshops involved in this trade were included. Information on demographic parameters and physical activity was collected with demographic forms. Data on the workers' musculoskeletal discomforts were collected using the Cornell Musculoskeletal Discomfort Questionnaire. The data were analyzed using statistical tests including the independent t-test, chi-square and ANOVA, performed in SPSS 18. Results: The highest percentages of complaints of severe musculoskeletal discomfort were reported in the right shoulder (36%), right wrist (26%), neck (25%) and upper right arm (24%). A significant relationship was observed between physical activity and musculoskeletal discomforts of the left wrist (P = 0.012), lower back (P = 0.016) and neck (P = 0.006). Discussion and Conclusion: Based on the study results, it can be inferred that regular but not too heavy physical activity can have a positive impact on decreasing musculoskeletal discomforts. PMID:27512700
NASA Astrophysics Data System (ADS)
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Learning physical descriptors for materials science by compressed sensing
NASA Astrophysics Data System (ADS)
Ghiringhelli, Luca M.; Vybiral, Jan; Ahmetcik, Emre; Ouyang, Runhai; Levchenko, Sergey V.; Draxl, Claudia; Scheffler, Matthias
2017-02-01
The availability of big data in materials science offers new routes for analyzing materials properties and functions and achieving scientific understanding. Finding structure in these data that is not directly visible by standard tools and exploitation of the scientific information requires new and dedicated methodology based on approaches from statistical learning, compressed sensing, and other recent methods from applied mathematics, computer science, statistics, signal processing, and information science. In this paper, we explain and demonstrate a compressed-sensing based methodology for feature selection, specifically for discovering physical descriptors, i.e., physical parameters that describe the material and its properties of interest, and associated equations that explicitly and quantitatively describe those relevant properties. As showcase application and proof of concept, we describe how to build a physical model for the quantitative prediction of the crystal structure of binary compound semiconductors.
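As a toy version of compressed-sensing descriptor selection (the paper's actual procedure, LASSO followed by an l0-type search over candidate feature combinations, is more elaborate), plain l1 regularization already shows how a few relevant descriptors are recovered from many candidates:

    import numpy as np
    from sklearn.linear_model import Lasso
    from sklearn.preprocessing import StandardScaler

    # Toy stand-in for descriptor discovery: many candidate features,
    # few of which actually determine the target property.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((80, 50))          # 50 candidate descriptors
    y = 2.0 * X[:, 3] - 1.5 * X[:, 17] + 0.05 * rng.standard_normal(80)

    Xs = StandardScaler().fit_transform(X)
    model = Lasso(alpha=0.1).fit(Xs, y)
    selected = np.flatnonzero(model.coef_)
    print("selected descriptors:", selected)   # should recover columns 3 and 17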
Asymptotic formulae for likelihood-based tests of new physics
NASA Astrophysics Data System (ADS)
Cowan, Glen; Cranmer, Kyle; Gross, Eilam; Vitells, Ofer
2011-02-01
We describe likelihood-based statistical tests for use in high energy physics for the discovery of new phenomena and for construction of confidence intervals on model parameters. We focus on the properties of the test procedures that allow one to account for systematic uncertainties. Explicit formulae for the asymptotic distributions of test statistics are derived using results of Wilks and Wald. We motivate and justify the use of a representative data set, called the "Asimov data set", which provides a simple method to obtain the median experimental sensitivity of a search or measurement as well as fluctuations about this expectation.
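A widely quoted closed-form result of this paper is the median discovery significance of a counting experiment, obtained by evaluating the profile likelihood ratio on the Asimov data set:

    import numpy as np

    def asimov_discovery_significance(s, b):
        """Median discovery significance for expected signal s on background b
        (Cowan, Cranmer, Gross, Vitells, Eur. Phys. J. C 71 (2011) 1554):
            Z_A = sqrt(2 * [(s + b) * ln(1 + s/b) - s])."""
        return np.sqrt(2.0 * ((s + b) * np.log(1.0 + s / b) - s))

    print(asimov_discovery_significance(10.0, 100.0))   # ~0.98 sigma
    # For s << b this approaches the familiar s/sqrt(b):
    print(10.0 / np.sqrt(100.0))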
An Analysis Methodology for the Gamma-ray Large Area Space Telescope
NASA Technical Reports Server (NTRS)
Morris, Robin D.; Cohen-Tanugi, Johann
2004-01-01
The Large Area Telescope (LAT) instrument on the Gamma Ray Large Area Space Telescope (GLAST) has been designed to detect high-energy gamma rays and determine their direction of incidence and energy. We propose a reconstruction algorithm based on recent advances in statistical methodology. This method, alternative to the standard event analysis inherited from high energy collider physics experiments, incorporates more accurately the physical processes occurring in the detector, and makes full use of the statistical information available. It could thus provide a better estimate of the direction and energy of the primary photon.
NASA Astrophysics Data System (ADS)
Albert, Carlo; Ulzega, Simone; Stoop, Ruedi
2016-04-01
Measured time-series of both precipitation and runoff are known to exhibit highly non-trivial statistical properties. For making reliable probabilistic predictions in hydrology, it is therefore desirable to have stochastic models with output distributions that share these properties. When parameters of such models have to be inferred from data, we also need to quantify the associated parametric uncertainty. For non-trivial stochastic models, however, this latter step is typically very demanding, both conceptually and numerically, and almost never done in hydrology. Here, we demonstrate that methods developed in statistical physics make a large class of stochastic differential equation (SDE) models amenable to a full-fledged Bayesian parameter inference. For concreteness we demonstrate these methods by means of a simple yet non-trivial toy SDE model. We consider a natural catchment that can be described by a linear reservoir, at the scale of observation. All the neglected processes are assumed to happen at much shorter time-scales and are therefore modeled with a Gaussian white noise term, the standard deviation of which is assumed to scale linearly with the system state (water volume in the catchment). Even for constant input, the outputs of this simple non-linear SDE model show a wealth of desirable statistical properties, such as fat-tailed distributions and long-range correlations. Standard algorithms for Bayesian inference fail for models of this kind, because their likelihood functions are extremely high-dimensional intractable integrals over all possible model realizations. The use of Kalman filters is illegitimate due to the non-linearity of the model. Particle filters could be used but become increasingly inefficient with a growing number of data points. Hamiltonian Monte Carlo algorithms allow us to translate this inference problem into the problem of simulating the dynamics of a statistical mechanics system, and give us access to the most sophisticated methods that have been developed in the statistical physics community over the last few decades. We demonstrate that such methods, along with automated differentiation algorithms, allow us to perform a full-fledged Bayesian inference, for a large class of SDE models, in a highly efficient and largely automated manner. Furthermore, our algorithm is highly parallelizable. For our toy model, discretized with a few hundred points, a full Bayesian inference can be performed in a matter of seconds on a standard PC.
CARVALHO, Suzana Papile Maciel; BRITO, Liz Magalhães; de PAIVA, Luiz Airton Saavedra; BICUDO, Lucilene Arilho Ribeiro; CROSATO, Edgard Michel; de OLIVEIRA, Rogério Nogueira
2013-01-01
Validation studies of physical anthropology methods in the different population groups are extremely important, especially in cases in which the population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. Objective: This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995), previously used in a population sample from Northeast Brazil. Material and Methods: The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. Results: The results demonstrated that the application of the method of Oliveira, et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of the measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. Conclusion: It was concluded that methods involving physical anthropology present a high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for the different populations due to differences in ethnic patterns, which are directly related to the phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South), previous methodological adjustment is recommended, as demonstrated in this study. PMID:24037076
PHYSICS OF NON-GAUSSIAN FIELDS AND THE COSMOLOGICAL GENUS STATISTIC
DOE Office of Scientific and Technical Information (OSTI.GOV)
James, J. Berian, E-mail: berian@berkeley.edu
2012-05-20
We report a technique to calculate the impact of distinct physical processes inducing non-Gaussianity on the cosmological density field. A natural decomposition of the cosmic genus statistic into an orthogonal polynomial sequence allows complete expression of the scale-dependent evolution of the topology of large-scale structure, in which effects including galaxy bias, nonlinear gravitational evolution, and primordial non-Gaussianity may be delineated. The relationship of this decomposition to previous methods for analyzing the genus statistic is briefly considered and the following applications are made: (1) the expression of certain systematics affecting topological measurements, (2) the quantification of broad deformations from Gaussianity that appear in the genus statistic as measured in the Horizon Run simulation, and (3) the study of the evolution of the genus curve for simulations with primordial non-Gaussianity. These advances improve the treatment of flux-limited galaxy catalogs for use with this measurement and further the use of the genus statistic as a tool for exploring non-Gaussianity.
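As background for the decomposition (schematic form; the paper's amplitude and normalization conventions may differ), the genus curve of a Gaussian field as a function of the density threshold nu is the second Hermite polynomial times a Gaussian weight, and non-Gaussianity populates the remaining coefficients of the orthogonal sequence:

    g(\nu) \;=\; A\,(1-\nu^{2})\,e^{-\nu^{2}/2} \;=\; -A\,H_{2}(\nu)\,e^{-\nu^{2}/2},
    \qquad H_{2}(\nu) \;=\; \nu^{2}-1,

    g(\nu) \;\simeq\; e^{-\nu^{2}/2} \sum_{n} a_{n} H_{n}(\nu),

so galaxy bias, nonlinear growth and primordial non-Gaussianity each leave their signature in a distinct subset of the coefficients a_n.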
ERIC Educational Resources Information Center
Fish, Laurel J.; Halcoussis, Dennis; Phillips, G. Michael
2017-01-01
The Monte Carlo method and related multiple imputation methods are traditionally used in math, physics and science to estimate and analyze data and are now becoming standard tools in analyzing business and financial problems. However, few sources explain the application of the Monte Carlo method for individuals and business professionals who are…
NASA Astrophysics Data System (ADS)
Pavlis, Nikolaos K.
Geomatics is a trendy term that has been used in recent years to describe academic departments that teach and research theories, methods, algorithms, and practices used in processing and analyzing data related to the Earth and other planets. Naming trends aside, geomatics could be considered as the mathematical and statistical “toolbox” that allows Earth scientists to extract information about physically relevant parameters from the available data and accompany such information with some measure of its reliability. This book is an attempt to present the mathematical-statistical methods used in data analysis within various disciplines—geodesy, geophysics, photogrammetry and remote sensing—from a unifying perspective that inverse problem formalism permits. At the same time, it allows us to stretch the relevance of statistical methods in achieving an optimal solution.
Statistical Methods Applied to Gamma-ray Spectroscopy Algorithms in Nuclear Security Missions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fagan, Deborah K.; Robinson, Sean M.; Runkle, Robert C.
2012-10-01
In a wide range of nuclear security missions, gamma-ray spectroscopy is a critical research and development priority. One particularly relevant challenge is the interdiction of special nuclear material, for which gamma-ray spectroscopy supports the goals of detecting and identifying gamma-ray sources. This manuscript examines the existing set of spectroscopy methods, attempts to categorize them by the statistical methods on which they rely, and identifies methods that have yet to be considered. Our examination shows that current methods effectively estimate the effect of counting uncertainty but in many cases do not address larger sources of decision uncertainty, ones that are significantly more complex. We thus explore the premise that significantly improving algorithm performance requires greater coupling between the problem physics that drives data acquisition and the statistical methods that analyze such data. Untapped statistical methods, such as Bayesian model averaging and hierarchical and empirical Bayes methods, have the potential to reduce decision uncertainty by more rigorously and comprehensively incorporating all sources of uncertainty. We expect that application of such methods will demonstrate progress in meeting the needs of nuclear security missions by improving on the existing numerical infrastructure for which these analyses have not been conducted.
Statistical dependency in visual scanning
NASA Technical Reports Server (NTRS)
Ellis, Stephen R.; Stark, Lawrence
1986-01-01
A method to identify statistical dependencies in the positions of eye fixations is developed and applied to eye movement data from subjects who viewed dynamic displays of air traffic and judged future relative position of aircraft. Analysis of approximately 23,000 fixations on points of interest on the display identified statistical dependencies in scanning that were independent of the physical placement of the points of interest. Identification of these dependencies is inconsistent with random-sampling-based theories used to model visual search and information seeking.
Cocco, Simona; Leibler, Stanislas; Monasson, Rémi
2009-01-01
Complexity of neural systems often makes impracticable explicit measurements of all interactions between their constituents. Inverse statistical physics approaches, which infer effective couplings between neurons from their spiking activity, have been so far hindered by their computational complexity. Here, we present 2 complementary, computationally efficient inverse algorithms based on the Ising and “leaky integrate-and-fire” models. We apply those algorithms to reanalyze multielectrode recordings in the salamander retina in darkness and under random visual stimulus. We find strong positive couplings between nearby ganglion cells common to both stimuli, whereas long-range couplings appear under random stimulus only. The uncertainty on the inferred couplings due to limitations in the recordings (duration, small area covered on the retina) is discussed. Our methods will allow real-time evaluation of couplings for large assemblies of neurons. PMID:19666487
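The simplest member of the family of fast inverse-Ising approximations (not the authors' algorithms, which refine this considerably) reads effective couplings off the inverse correlation matrix of the binarized spike trains:

    import numpy as np

    def mean_field_couplings(spikes):
        """Mean-field inverse Ising: J_ij ~ -(C^{-1})_ij.
        spikes: (n_time, n_neurons) array of 0/1 spike indicators."""
        s = 2.0 * np.asarray(spikes, dtype=float) - 1.0   # map to +-1 spins
        C = np.cov(s, rowvar=False)
        J = -np.linalg.inv(C)
        np.fill_diagonal(J, 0.0)
        return J

    spikes = np.random.default_rng(4).random((5_000, 40)) < 0.1  # toy data
    J = mean_field_couplings(spikes)
    print(J.shape)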
Discrete approach to stochastic parametrization and dimension reduction in nonlinear dynamics.
Chorin, Alexandre J; Lu, Fei
2015-08-11
Many physical systems are described by nonlinear differential equations that are too complicated to solve in full. A natural way to proceed is to divide the variables into those that are of direct interest and those that are not, formulate solvable approximate equations for the variables of greater interest, and use data and statistical methods to account for the impact of the other variables. In the present paper we consider time-dependent problems and introduce a fully discrete solution method, which simplifies both the analysis of the data and the numerical algorithms. The resulting time series are identified by a NARMAX (nonlinear autoregression moving average with exogenous input) representation familiar from engineering practice. The connections with the Mori-Zwanzig formalism of statistical physics are discussed, as well as an application to the Lorenz 96 system.
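A minimal stand-in for the NARMAX identification step (ours; it keeps the nonlinear autoregression on the data but drops the moving-average noise term, and the lag order and features are illustrative):

    import numpy as np

    def fit_narx(z, x, p=2):
        """Least-squares fit of a discrete NARX-type closure
            z_n ~ a0 + sum_k a_k z_{n-k} + b x_n + c x_n^2.
        z: unresolved tendency to model, x: resolved variable, 1-D arrays."""
        rows = []
        for n in range(p, len(z)):
            rows.append([1.0, *z[n - p:n][::-1], x[n], x[n] ** 2])
        A = np.array(rows)
        coeffs, *_ = np.linalg.lstsq(A, z[p:], rcond=None)
        return coeffs

    # toy demo: recover a known autoregression from synthetic data
    rng = np.random.default_rng(5)
    x = rng.standard_normal(500)
    z = np.zeros(500)
    for n in range(2, 500):
        z[n] = 0.5 * z[n - 1] - 0.2 * z[n - 2] + 0.3 * x[n] \
               + 0.05 * rng.standard_normal()
    print(fit_narx(z, x, p=2))   # ~ [0, 0.5, -0.2, 0.3, 0]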
Jets and Metastability in Quantum Mechanics and Quantum Field Theory
NASA Astrophysics Data System (ADS)
Farhi, David
I give a high-level overview of the state of particle physics in the introduction, accessible without any background in the field. I discuss improvements of theoretical and statistical methods used for collider physics. These include telescoping jets, a statistical method which was claimed to allow jet searches to increase their sensitivity by considering several interpretations of each event. We find that indeed multiple interpretations extend the power of searches, for both simple counting experiments and powerful multivariate fitting experiments, at least for h → bb̄ at the LHC. Then I propose a method for automation of background calculations using SCET by appropriating the technology of Monte Carlo generators such as MadGraph. In the third chapter I change gears and discuss the future of the universe. It has long been known that our pocket of the standard model is unstable; there is a lower-energy configuration in a remote part of the configuration space, to which our universe will, eventually, decay. While the timescales involved are on the order of 10^400 years (depending on how exactly one counts) and thus of no immediate worry, I discuss the shortcomings of the standard methods and propose a more physically motivated derivation for the decay rate. I then make various observations about the structure of decays in quantum field theory.
Vibroacoustic optimization using a statistical energy analysis model
NASA Astrophysics Data System (ADS)
Culla, Antonio; D`Ambrogio, Walter; Fregolent, Annalisa; Milana, Silvia
2016-08-01
In this paper, an optimization technique for medium-high frequency dynamic problems based on Statistical Energy Analysis (SEA) method is presented. Using a SEA model, the subsystem energies are controlled by internal loss factors (ILF) and coupling loss factors (CLF), which in turn depend on the physical parameters of the subsystems. A preliminary sensitivity analysis of subsystem energy to CLF's is performed to select CLF's that are most effective on subsystem energies. Since the injected power depends not only on the external loads but on the physical parameters of the subsystems as well, it must be taken into account under certain conditions. This is accomplished in the optimization procedure, where approximate relationships between CLF's, injected power and physical parameters are derived. The approach is applied on a typical aeronautical structure: the cabin of a helicopter.
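For reference, the steady-state SEA power balance that such an optimization manipulates can be written compactly (standard textbook form; the paper's notation and its treatment of the injected power may differ). For subsystem i with total energy E_i, internal loss factor eta_i, coupling loss factors eta_ij, modal densities n_i, at angular frequency omega:

    P_i \;=\; \omega\,\eta_i\,E_i \;+\; \omega \sum_{j \neq i} \left( \eta_{ij} E_i - \eta_{ji} E_j \right),
    \qquad \eta_{ij}\, n_i \;=\; \eta_{ji}\, n_j .

The subsystem energies solve this linear system for given injected powers P_i, so the sensitivities of the energies to the CLFs (and, through them, to the physical parameters) follow by differentiating the system.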
Statistical Physics of Adaptation
2016-08-23
Nikolay Perunov, Robert A. Marsland, and Jeremy L. England, Department of Physics, Physics of Living Systems Group. ... Subject Areas: Biological Physics, Complex Systems, Statistical Physics. I. INTRODUCTION: It has long been understood that nonequilibrium driving can ... equilibrium may appear to have been specially selected for physical properties connected to their ability to absorb work from the particular driving environment.
Sajobi, Tolulope T; Lix, Lisa M; Singh, Gurbakhshash; Lowerison, Mark; Engbers, Jordan; Mayo, Nancy E
2015-03-01
Response shift (RS) is an important phenomenon that influences the assessment of longitudinal changes in health-related quality of life (HRQOL) studies. Given that RS effects are often small, missing data due to attrition or item non-response can contribute to failure to detect RS effects. Since missing data are often encountered in longitudinal HRQOL data, effective strategies to deal with missing data are important to consider. This study aims to compare the effects of different imputation methods on the detection of reprioritization RS in the HRQOL of caregivers of stroke survivors. Data were from a Canadian multi-center longitudinal study of caregivers of stroke survivors over a one-year period. The Stroke Impact Scale physical function score at baseline, with a cutoff of 75, was used to measure patient stroke severity for the reprioritization RS analysis. Mean imputation, likelihood-based expectation-maximization imputation, and multiple imputation methods were compared in test procedures based on changes in relative importance weights to detect RS in SF-36 domains over a 6-month period. Monte Carlo simulation methods were used to compare the statistical powers of relative importance test procedures for detecting RS in incomplete longitudinal data under different missing data mechanisms and imputation methods. Of the 409 caregivers, 15.9% and 31.3% of them had missing data at baseline and 6 months, respectively. There were no statistically significant changes in relative importance weights on any of the domains when complete-case analysis was adopted. But statistically significant changes were detected in the physical functioning and/or vitality domains when mean imputation or EM imputation was adopted. There were also statistically significant changes in relative importance weights for the physical functioning, mental health, and vitality domains when the multiple imputation method was adopted. Our simulations revealed that relative importance test procedures were least powerful under the complete-case analysis method and most powerful when a mean imputation or multiple imputation method was adopted for missing data, regardless of the missing data mechanism and proportion of missing data. Test procedures based on relative importance measures are sensitive to the type and amount of missing data and the imputation method. Relative importance test procedures based on mean imputation and multiple imputation are recommended for detecting RS in incomplete data.
NASA Astrophysics Data System (ADS)
Gao, Jike
2018-01-01
Using literature review, instrument measurement, questionnaires and mathematical statistics, this paper analyzes the current state of mass sports in the Tibetan plateau areas of Gansu Province. Experimental measurements of air pollutants and meteorological indices in these areas, evaluated against the relevant national standards and exercise science and analyzed statistically, are used to recommend scientific methods and appropriate times for physical exercise to the people of the Tibetan plateau areas of Gansu Province.
Physical activity in climacteric women: comparison between self-reporting and pedometer.
Colpani, Verônica; Spritzer, Poli Mara; Lodi, Ana Paula; Dorigo, Guilherme Gustavo; Miranda, Isabela Albuquerque Severo de; Hahn, Laiza Beck; Palludo, Luana Pedroso; Pietroski, Rafaela Lazzari; Oppermann, Karen
2014-04-01
To compare two methods of assessing physical activity in pre-, peri- and postmenopausal women. Cross-sectional study nested in a cohort of pre-, peri- and postmenopausal women in a city in Southern Brazil. The participants completed a questionnaire that included sociodemographic and clinical data. Physical activity was assessed using a digital pedometer and the International Physical Activity Questionnaire, short version. The participants were classified into strata of physical activity according to the instrument used. For statistical analysis, the Spearman correlation test, kappa index, concordance coefficient and Bland-Altman plots were used. The concordance (kappa = 0.110; p = 0.007) and the correlation (rho = 0.136, p = 0.02) between the International Physical Activity Questionnaire, short version, and the pedometer were weak. In the Bland-Altman plots, it was observed that the differences deviate from zero whether physical activity is minimal or more intense. Comparing the two methods, the frequency of inactive women is higher when assessed by pedometer than by the International Physical Activity Questionnaire, short version, and the opposite occurs for active women. Agreement between the methods was weak. Although easy to use, the International Physical Activity Questionnaire, short version, overestimates physical activity compared with assessment by pedometer.
Grbovic, Vesna; Jurisic-Skevin, Aleksandra; Djukic, Svetlana; Stefanović, Srdjan; Nurkovic, Jasmin
2016-01-01
[Purpose] Painful diabetic polyneuropathy occurs as a complication in 16% of all patients with diabetes mellitus. [Subjects and Methods] A clinical, prospective, open-label randomized intervention study was conducted on 60 adult patients with type 2 diabetes mellitus and distal sensorimotor diabetic neuropathy, divided into two groups of 30 patients. Patients in group A were treated with combined physical procedures, and patients in group B were treated with alpha lipoic acid. [Results] There were statistically significant improvements in terminal latency and in the amplitude of the action potential in group A patients, while group B patients showed statistically significant improvements in conduction velocity and terminal latency of n. peroneus. Group A patients showed statistically significant improvements in conduction velocity and terminal latency, while group B patients also showed statistically significant improvements in conduction velocity and terminal latency. This was reflected in significant improvements in the electrophysiological parameters (conduction velocity, amplitude and latency) of the motor and sensory nerves (n. peroneus, n. suralis). [Conclusion] These results present further evidence justifying the use of physical agents in the treatment of diabetic sensorimotor polyneuropathy. PMID:27065527
A generalization of random matrix theory and its application to statistical physics.
Wang, Duan; Zhang, Xin; Horvatic, Davor; Podobnik, Boris; Eugene Stanley, H
2017-02-01
To study the statistical structure of cross-correlations in empirical data, we generalize random matrix theory and propose a new method of cross-correlation analysis, known as autoregressive random matrix theory (ARRMT). ARRMT takes into account the influence of auto-correlations in the study of cross-correlations in multiple time series. We first determine, analytically and numerically, how auto-correlations affect the eigenvalue distribution of the correlation matrix. Then we introduce ARRMT with a detailed procedure for how to implement the method. Finally, we illustrate the method using two examples, taken from inflation rates and from air pressure data for 95 US cities.
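The non-autoregressive null hypothesis that ARRMT generalizes is the Marchenko-Pastur law for the eigenvalues of a correlation matrix of uncorrelated series; a sketch of that baseline comparison (ARRMT itself replaces the white-noise null with one that carries the measured auto-correlations):

    import numpy as np

    # Compare empirical correlation-matrix eigenvalues with the
    # Marchenko-Pastur band expected for uncorrelated data.
    rng = np.random.default_rng(0)
    T, N = 2_000, 95                  # time points, series (e.g., 95 cities)
    data = rng.standard_normal((T, N))
    C = np.corrcoef(data, rowvar=False)
    eig = np.linalg.eigvalsh(C)
    q = N / T
    lam_minus, lam_plus = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2
    print(f"{np.sum(eig > lam_plus)} eigenvalues above the MP edge {lam_plus:.2f}")

Eigenvalues escaping the band signal genuine cross-correlations under the white-noise null; auto-correlated series widen the band, which is why ignoring them overstates significance.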
Nariai, N; Kim, S; Imoto, S; Miyano, S
2004-01-01
We propose a statistical method to estimate gene networks from DNA microarray data and protein-protein interactions. Because physical interactions between proteins or multiprotein complexes are likely to regulate biological processes, using only mRNA expression data is not sufficient for estimating a gene network accurately. Our method adds knowledge about protein-protein interactions to the estimation method of gene networks under a Bayesian statistical framework. In the estimated gene network, a protein complex is modeled as a virtual node based on principal component analysis. We show the effectiveness of the proposed method through the analysis of Saccharomyces cerevisiae cell cycle data. The proposed method improves the accuracy of the estimated gene networks, and successfully identifies some biological facts.
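One concrete ingredient, modeling a protein complex as a virtual node via principal component analysis, can be sketched briefly. The snippet below is a hedged illustration with invented gene names and synthetic data, not the authors' implementation:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
members = ["CDC28", "CLB2", "CKS1"]   # hypothetical complex members
# 50 microarray samples x 3 member genes (synthetic expression values)
expr = np.column_stack([rng.standard_normal(50) for _ in members])

# The virtual node's "expression profile" is the first principal component
virtual_node = PCA(n_components=1).fit_transform(expr).ravel()
print("virtual-node profile, first 5 samples:", virtual_node[:5])
```

In the Bayesian network estimation, such a profile would stand in for the individual member profiles wherever the complex acts as a single regulatory unit.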
NASA Astrophysics Data System (ADS)
Adams, T.; Batra, P.; Bugel, L.; Camilleri, L.; Conrad, J. M.; de Gouvêa, A.; Fisher, P. H.; Formaggio, J. A.; Jenkins, J.; Karagiorgi, G.; Kobilarcik, T. R.; Kopp, S.; Kyle, G.; Loinaz, W. A.; Mason, D. A.; Milner, R.; Moore, R.; Morfín, J. G.; Nakamura, M.; Naples, D.; Nienaber, P.; Olness, F. I.; Owens, J. F.; Pate, S. F.; Pronin, A.; Seligman, W. G.; Shaevitz, M. H.; Schellman, H.; Schienbein, I.; Syphers, M. J.; Tait, T. M. P.; Takeuchi, T.; Tan, C. Y.; van de Water, R. G.; Yamamoto, R. K.; Yu, J. Y.
We extend the physics case for a new high-energy, ultra-high statistics neutrino scattering experiment, NuSOnG (Neutrino Scattering On Glass), to address a variety of issues including precision QCD measurements, extraction of structure functions, and the derived Parton Distribution Functions (PDFs). This experiment uses a Tevatron-based neutrino beam to obtain a sample of Deep Inelastic Scattering (DIS) events which is over two orders of magnitude larger than past samples. We outline an innovative method for fitting the structure functions using a parametrized energy shift which yields reduced systematic uncertainties. High statistics measurements, in combination with improved systematics, will enable NuSOnG to perform discerning tests of fundamental Standard Model parameters as we search for deviations which may hint at "Beyond the Standard Model" physics.
Statistical analysis and modeling of intermittent transport events in the tokamak scrape-off layer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Johan, E-mail: anderson.johan@gmail.com; Halpern, Federico D.; Ricci, Paolo
The turbulence observed in the scrape-off-layer of a tokamak is often characterized by intermittent events of bursty nature, a feature which raises concerns about the prediction of heat loads on the physical boundaries of the device. It appears thus necessary to delve into the statistical properties of turbulent physical fields such as density, electrostatic potential, and temperature, focusing on the mathematical expression of tails of the probability distribution functions. The method followed here is to generate statistical information from time-traces of the plasma density stemming from Braginskii-type fluid simulations and check this against a first-principles theoretical model. The analysis of the numerical simulations indicates that the probability distribution function of the intermittent process contains strong exponential tails, as predicted by the analytical theory.
Inverse statistical physics of protein sequences: a key issues review.
Cocco, Simona; Feinauer, Christoph; Figliuzzi, Matteo; Monasson, Rémi; Weigt, Martin
2018-03-01
In the course of evolution, proteins undergo important changes in their amino acid sequences, while their three-dimensional folded structure and their biological function remain remarkably conserved. Thanks to modern sequencing techniques, sequence data accumulate at unprecedented pace. This provides large sets of so-called homologous, i.e. evolutionarily related protein sequences, to which methods of inverse statistical physics can be applied. Using sequence data as the basis for the inference of Boltzmann distributions from samples of microscopic configurations or observables, it is possible to extract information about evolutionary constraints and thus protein function and structure. Here we give an overview over some biologically important questions, and how statistical-mechanics inspired modeling approaches can help to answer them. Finally, we discuss some open questions, which we expect to be addressed over the next years.
Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John
2018-03-07
DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis is employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied on single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrate a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
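The two information-theoretic quantities named above are standard and easy to compute once a probability distribution over methylation states is in hand. A toy sketch (not the authors' pipeline; the distributions here are randomly generated stand-ins for the Ising-model output):

```python
import numpy as np
from scipy.stats import entropy
from scipy.spatial.distance import jensenshannon

# Distributions over the 2^4 = 16 joint states of a 4-site methylation block
p_test = np.random.default_rng(2).dirichlet(np.ones(16))  # hypothetical test sample
p_ref = np.random.default_rng(3).dirichlet(np.ones(16))   # hypothetical reference

print("methylation entropy (test):", entropy(p_test, base=2), "bits")
print("Jensen-Shannon distance   :", jensenshannon(p_test, p_ref, base=2))
```

High entropy flags stochastic methylation within a sample; a large Jensen-Shannon distance flags differential methylation between samples.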
Solar Activity Heading for a Maunder Minimum?
NASA Astrophysics Data System (ADS)
Schatten, K. H.; Tobiska, W. K.
2003-05-01
Long-range (few years to decades) solar activity prediction techniques vary greatly in their methods. They range from examining planetary orbits, to spectral analyses (e.g. Fourier and wavelet analyses), to artificial intelligence methods, to simply using general statistical techniques. Rather than concentrate on statistical/mathematical/numerical methods, we discuss a class of methods which appears to have a "physical basis." This basis is rooted not only in "basic" physics (dynamo theory), but also in solar physics (Babcock dynamo theory). The class we discuss is referred to as "precursor methods," originally developed by Ohl, Brown and Williams and others, using geomagnetic observations. My colleagues and I have developed some understanding of how these methods work and have expanded the prediction methods using "solar dynamo precursor" methods, notably a "SODA" index (SOlar Dynamo Amplitude). These methods are now based upon an understanding of the Sun's dynamo processes, explaining a connection between how the Sun's fields are generated and how the Sun broadcasts its future activity levels to Earth. This has led to better monitoring of the Sun's dynamo fields and is leading to more accurate prediction techniques. Related to the Sun's polar and toroidal magnetic fields, we explain how these methods work, past predictions, the current cycle, and predictions of future solar activity levels for the next few solar cycles. The surprising result of these long-range predictions is a rapid decline in solar activity, starting with cycle #24. If this trend continues, we may see the Sun heading towards a "Maunder" type of solar activity minimum - an extensive period of reduced levels of solar activity. For the solar physicists, who enjoy studying solar activity, we hope this isn't so, but for NASA, which must place and maintain satellites in low earth orbit (LEO), it may help with reboost problems. Space debris and other aspects of objects in LEO will also be affected. This research is supported by the NSF and NASA.
An efficient surrogate-based simulation-optimization method for calibrating a regional MODFLOW model
NASA Astrophysics Data System (ADS)
Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.
2017-01-01
The simulation-optimization method entails a large number of model simulations, which is computationally intensive or even prohibitive if the model simulation is extremely time-consuming. Statistical models have been examined as surrogates of the high-fidelity physical model during the simulation-optimization process to tackle this problem. Among them, Multivariate Adaptive Regression Splines (MARS), a non-parametric adaptive regression method, is superior in overcoming problems of high dimensions and discontinuities of the data. Furthermore, the stability and accuracy of the MARS model can be improved by bootstrap aggregating methods, namely, bagging. In this paper, the Bagging MARS (BMARS) method is integrated into a surrogate-based simulation-optimization framework to calibrate a three-dimensional MODFLOW model, which is developed to simulate the groundwater flow in an arid hardrock-alluvium region in northwestern Oman. The physical MODFLOW model is replaced by a statistical surrogate developed using the BMARS algorithm. The surrogate model, which is fitted and validated using a training dataset generated by the physical model, can approximate solutions rapidly. An efficient Sobol' method is employed to calculate global sensitivities of head outputs to input parameters, which are used to analyze their importance for the model outputs spatiotemporally. Only sensitive parameters are included in the calibration process to further improve the computational efficiency. The normalized root mean square error (NRMSE) between measured and simulated heads at observation wells is used as the objective function to be minimized during optimization. The reasonable history match between the simulated and observed heads demonstrates the feasibility of this high-efficiency calibration framework.
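The overall loop (sample parameters, train a surrogate, then optimize an NRMSE objective on the surrogate) can be sketched compactly. In the sketch below, bagged regression trees stand in for BMARS and a cheap analytic function stands in for the MODFLOW simulator; both substitutions are assumptions for illustration only:

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

def simulated_head(theta):
    # Placeholder for an expensive groundwater simulation
    return np.sin(3 * theta[0]) + theta[1] ** 2

observed = simulated_head(np.array([0.7, 0.3]))   # synthetic "observed head"

# Train the surrogate on sampled parameter sets
X = rng.uniform(0, 1, size=(300, 2))
y = np.apply_along_axis(simulated_head, 1, X)
surrogate = BaggingRegressor(DecisionTreeRegressor(), n_estimators=50,
                             random_state=0).fit(X, y)

def nrmse(theta):
    # Normalized error between surrogate prediction and the observation
    pred = surrogate.predict(theta.reshape(1, -1))[0]
    return abs(pred - observed) / (np.ptp(y) + 1e-12)

result = differential_evolution(nrmse, bounds=[(0, 1), (0, 1)], seed=0)
print("calibrated parameters:", result.x, "NRMSE:", result.fun)
```

In the real framework, the surrogate would be refit and validated against fresh MODFLOW runs, and only the Sobol'-sensitive parameters would enter the optimization.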
SPIPS: Spectro-Photo-Interferometry of Pulsating Stars
NASA Astrophysics Data System (ADS)
Mérand, Antoine
2017-10-01
SPIPS (Spectro-Photo-Interferometry of Pulsating Stars) combines radial velocimetry, interferometry, and photometry to estimate physical parameters of pulsating stars, including the presence of infrared excess, color excess, Teff, and the distance/p-factor ratio. The global model-based parallax-of-pulsation method is implemented in Python. Derived parameters have a high level of confidence: statistical precision is improved compared to other methods due to the large number of data taken into account; accuracy is improved by using consistent physical modeling; and the reliability of the derived parameters is strengthened by redundancy in the data.
Applications of modern statistical methods to analysis of data in physical science
NASA Astrophysics Data System (ADS)
Wicker, James Eric
Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970's, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960's, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model, and problems with overfitting data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information-based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960's and 1970's respectively, formed the basis of multivariate cluster analysis methodology for many years. However, several shortcomings of these methods include strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values. In addition, we propose a generalization of the genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance structures. We then use this new algorithm in a genetic-algorithm-based Expectation-Maximization process that can accurately calculate parameters describing complex clusters in a mixture model routine. Using the accuracy of this GEM algorithm, we assign information scores to cluster calculations in order to best identify the number of mixture components in a multivariate data set. We will showcase how these algorithms can be used to process multivariate data from astronomical observations.
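The information-scoring idea for picking the number of mixture components can be illustrated with an off-the-shelf mixture model. This is a sketch in the spirit of the thesis, using BIC from scikit-learn's GaussianMixture rather than the author's GEM algorithm:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two synthetic clusters in 2D
X = np.vstack([rng.normal(0, 1.0, (100, 2)),
               rng.normal(5, 1.5, (150, 2))])

# Fit mixtures with k = 1..5 components and score each with BIC
scores = {k: GaussianMixture(k, covariance_type="full",
                             random_state=0).fit(X).bic(X)
          for k in range(1, 6)}
best_k = min(scores, key=scores.get)   # lowest BIC wins
print("BIC by k:", scores, "-> chosen k =", best_k)
```

The genetic-algorithm component of the thesis targets the remaining weakness, sensitivity of the EM fit itself to initialization, which an information score alone does not cure.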
Revision by means of computer-mediated peer discussions
NASA Astrophysics Data System (ADS)
Soong, Benson; Mercer, Neil; Er, Siew Shin
2010-05-01
In this article, we provide a discussion on our revision method (termed prescriptive tutoring) aimed at revealing students' misconceptions and misunderstandings by getting them to solve physics problems with an anonymous partner via the computer. It is currently being implemented and evaluated in a public secondary school in Singapore, and statistical analysis of our initial small-scale study shows that students in the experimental group significantly outperformed students in both the control and alternative intervention groups. In addition, students in the experimental group perceived that they had gained improved understanding of the physics concepts covered during the intervention, and reported that they would like to continue revising physics concepts using the intervention methods.
NASA Astrophysics Data System (ADS)
Röpke, G.
2018-01-01
One of the fundamental problems in physics that are not yet rigorously solved is the statistical mechanics of nonequilibrium processes. An important contribution to describing irreversible behavior starting from reversible Hamiltonian dynamics was given by D. N. Zubarev, who invented the method of the nonequilibrium statistical operator. We discuss this approach, in particular, the extended von Neumann equation, and as an example consider the electrical conductivity of a system of charged particles. We consider the selection of the set of relevant observables. We show the relation between kinetic theory and linear response theory. Using thermodynamic Green's functions, we present a systematic treatment of correlation functions, but the convergence needs investigation. We compare different expressions for the conductivity and list open questions.
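For reference, a common textbook rendering of Zubarev's extended von Neumann equation adds an infinitesimal source term that breaks time-reversal symmetry and selects the retarded (irreversible) solution; the notation below is ours and may differ from the article's:

$$ \frac{\partial \rho_\varepsilon}{\partial t} + \frac{i}{\hbar}\left[H, \rho_\varepsilon\right] = -\varepsilon \left(\rho_\varepsilon - \rho_{\mathrm{rel}}(t)\right), \qquad \varepsilon \to +0 \text{ after the thermodynamic limit}, $$

where $\rho_{\mathrm{rel}}(t)$ is the relevant statistical operator built from the chosen set of relevant observables.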
The Standard Model in the history of the Natural Sciences, Econometrics, and the social sciences
NASA Astrophysics Data System (ADS)
Fisher, W. P., Jr.
2010-07-01
In the late 18th and early 19th centuries, scientists appropriated Newton's laws of motion as a model for the conduct of any other field of investigation that would purport to be a science. This early form of a Standard Model eventually informed the basis of analogies for the mathematical expression of phenomena previously studied qualitatively, such as cohesion, affinity, heat, light, electricity, and magnetism. James Clerk Maxwell is known for his repeated use of a formalized version of this method of analogy in lectures, teaching, and the design of experiments. Economists transferring skills learned in physics made use of the Standard Model, especially after Maxwell demonstrated the value of conceiving it in abstract mathematics instead of as a concrete and literal mechanical analogy. Haavelmo's probability approach in econometrics and R. Fisher's Statistical Methods for Research Workers brought a statistical approach to bear on the Standard Model, quietly reversing the perspective of economics and the social sciences relative to that of physics. Where physicists, and Maxwell in particular, intuited scientific method as imposing stringent demands on the quality and interrelations of data, instruments, and theory in the name of inferential and comparative stability, statistical models and methods disconnected theory from data by removing the instrument as an essential component. New possibilities for reconnecting economics and the social sciences to Maxwell's sense of the method of analogy are found in Rasch's probabilistic models for measurement.
Experimental analysis of computer system dependability
NASA Technical Reports Server (NTRS)
Iyer, Ravishankar K.; Tang, Dong
1993-01-01
This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERRARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
Spectral-Lagrangian methods for collisional models of non-equilibrium statistical states
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamba, Irene M.; Tharkabhushanam, Sri Harsha
We propose a new spectral Lagrangian based deterministic solver for the non-linear Boltzmann transport equation (BTE) in d-dimensions for variable hard sphere (VHS) collision kernels with conservative or non-conservative binary interactions. The method is based on symmetries of the Fourier transform of the collision integral, where the complexity in its computation is reduced to a separate integral over the unit sphere S^(d-1). The conservation of moments is enforced by Lagrangian constraints. The resulting scheme, implemented in free space, is very versatile and adjusts in a very simple manner to several cases that involve energy dissipation due to local micro-reversibility (inelastic interactions) or elastic models of the slowing down process. Our simulations are benchmarked with available exact self-similar solutions, exact moment equations and analytical estimates for the homogeneous Boltzmann equation, both for elastic and inelastic VHS interactions. Benchmarking of the simulations involves the selection of a time self-similar rescaling of the numerical distribution function, which is performed using the continuous spectrum of the equation for Maxwell molecules, as studied first in Bobylev et al. [A.V. Bobylev, C. Cercignani, G. Toscani, Proof of an asymptotic property of self-similar solutions of the Boltzmann equation for granular materials, Journal of Statistical Physics 111 (2003) 403-417] and generalized to a wide range of related models in Bobylev et al. [A.V. Bobylev, C. Cercignani, I.M. Gamba, On the self-similar asymptotics for generalized non-linear kinetic Maxwell models, Communications in Mathematical Physics, in press].
Mean-field approximation for spacing distribution functions in classical systems.
González, Diego Luis; Pimpinelli, Alberto; Einstein, T L
2012-01-01
We propose a mean-field method to calculate approximately the spacing distribution functions p^(n)(s) in one-dimensional classical many-particle systems. We compare our method with two other commonly used methods, the independent interval approximation and the extended Wigner surmise. In our mean-field approach, p^(n)(s) is calculated from a set of Langevin equations, which are decoupled by using a mean-field approximation. We find that in spite of its simplicity, the mean-field approximation provides good results in several systems. We offer many examples illustrating that the three previously mentioned methods give a reasonable description of the statistical behavior of the system. The physical interpretation of each method is also discussed. © 2012 American Physical Society
Dynamic principle for ensemble control tools.
Samoletov, A; Vasiev, B
2017-11-28
Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
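One of the simplest schemes covered by such a principle is the Langevin thermostat, where friction and noise balance so that the canonical measure exp(-H/kT) is invariant. A minimal sketch (Euler-Maruyama discretization, harmonic oscillator; all parameter values assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
kT, gamma, m, k_spring, dt = 1.0, 1.0, 1.0, 1.0, 1e-3

x, p, samples = 0.0, 0.0, []
for step in range(400_000):
    force = -k_spring * x
    # Deterministic force + friction + noise (fluctuation-dissipation balance)
    p += (force - gamma * p) * dt \
         + np.sqrt(2 * gamma * m * kT * dt) * rng.standard_normal()
    x += (p / m) * dt
    if step % 100 == 0:
        samples.append(x)

# Under the canonical measure, <x^2> should approach kT / k_spring = 1
print("<x^2> =", np.var(samples))
```

Deterministic alternatives such as Nosé-Hoover replace the noise with auxiliary degrees of freedom, but, as the paper stresses, both routes must leave the same canonical measure invariant.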
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kubic, William Louis; Jenkins, Rhodri W.; Moore, Cameron M.
Chemical pathways for converting biomass into fuels produce compounds for which key physical and chemical property data are unavailable. We developed an artificial neural network based group contribution method for estimating cetane and octane numbers that captures the complex dependence of fuel properties of pure compounds on chemical structure and is statistically superior to current methods.
Student Solution Manual for Essential Mathematical Methods for the Physical Sciences
NASA Astrophysics Data System (ADS)
Riley, K. F.; Hobson, M. P.
2011-02-01
1. Matrices and vector spaces; 2. Vector calculus; 3. Line, surface and volume integrals; 4. Fourier series; 5. Integral transforms; 6. Higher-order ODEs; 7. Series solutions of ODEs; 8. Eigenfunction methods; 9. Special functions; 10. Partial differential equations; 11. Solution methods for PDEs; 12. Calculus of variations; 13. Integral equations; 14. Complex variables; 15. Applications of complex variables; 16. Probability; 17. Statistics.
Essential Mathematical Methods for the Physical Sciences
NASA Astrophysics Data System (ADS)
Riley, K. F.; Hobson, M. P.
2011-02-01
1. Matrices and vector spaces; 2. Vector calculus; 3. Line, surface and volume integrals; 4. Fourier series; 5. Integral transforms; 6. Higher-order ODEs; 7. Series solutions of ODEs; 8. Eigenfunction methods; 9. Special functions; 10. Partial differential equations; 11. Solution methods for PDEs; 12. Calculus of variations; 13. Integral equations; 14. Complex variables; 15. Applications of complex variables; 16. Probability; 17. Statistics; Appendices; Index.
Physical activity level and fall risk among community-dwelling older adults.
Low, Sok Teng; Balaraman, Thirumalaya
2017-07-01
[Purpose] To determine the physical activity level and fall risk among community-dwelling Malaysian older adults, and the correlation between them. [Subjects and Methods] A cross-sectional study was conducted in which physical activity level was evaluated using the Rapid Assessment of Physical Activity questionnaire and fall risk with the Fall Risk Assessment Tool. A total of 132 community-dwelling Malaysian older adults were recruited using convenience sampling. [Results] The majority of the participants fell into the under-active regular light-activities category, and most reported a low fall risk. Statistical analysis using Fisher's exact test did not show a significant correlation between physical activity level and fall risk. [Conclusion] The majority of community-dwelling Malaysian older adults perform some form of physical activity and are in the low fall-risk category. However, this study did not find a significant correlation between physical activity level and fall risk among community-dwelling older adults in Malaysia.
Accelerated battery-life testing - A concept
NASA Technical Reports Server (NTRS)
Mccallum, J.; Thomas, R. E.
1971-01-01
Test program, employing empirical, statistical and physical methods, determines service life and failure probabilities of electrochemical cells and batteries, and is applicable to testing mechanical, electrical, and chemical devices. Data obtained aids long-term performance prediction of battery or cell.
Wingate, Savanna; Sng, Eveleen; Loprinzi, Paul D.
2018-01-01
Background: The purpose of this study was to evaluate the extent, if any, that the association between socio-ecological parameters and physical activity may be influenced by common method bias (CMB). Methods: This study took place between February and May of 2017 at a Southeastern University in the United States. A randomized controlled experiment was employed among 119 young adults. Participants were randomized into either group 1 (the group we attempted to minimize CMB) or group 2 (control group). In group 1, CMB was minimized via various procedural remedies, such as separating the measurement of predictor and criterion variables by introducing a time lag (temporal; 2 visits several days apart), creating a cover story (psychological), and approximating measures to have data collected in different media (computer-based vs. paper and pencil) and different locations to control method variance when collecting self-report measures from the same source. Socio-ecological parameters (self-efficacy; friend support; family support) and physical activity were self-reported. Results: Exercise self-efficacy was significantly associated with physical activity. This association (β = 0.74, 95% CI: 0.33-1.1; P = 0.001) was only observed in group 2 (control), but not in group 1 (experimental group) (β = 0.03; 95% CI: -0.57-0.63; P = 0.91). The difference in these coefficients (i.e., β = 0.74 vs. β = 0.03) was statistically significant (P = 0.04). Conclusion: Future research in this field, when feasible, may wish to consider employing procedural and statistical remedies to minimize CMB. PMID:29423361
A new statistical method for characterizing the atmospheres of extrasolar planets
NASA Astrophysics Data System (ADS)
Henderson, Cassandra S.; Skemer, Andrew J.; Morley, Caroline V.; Fortney, Jonathan J.
2017-10-01
By detecting light from extrasolar planets, we can measure their compositions and bulk physical properties. The technologies used to make these measurements are still in their infancy, and a lack of self-consistency suggests that previous observations have underestimated their systematic errors. We demonstrate a statistical method, newly applied to exoplanet characterization, which uses a Bayesian formalism to account for underestimated error bars. We use this method to compare photometry of a substellar companion, GJ 758b, with custom atmospheric models. Our method produces a probability distribution of atmospheric model parameters, including temperature, gravity, cloud model (fsed), and chemical abundance, for GJ 758b. This distribution is less sensitive to highly variant data and appropriately reflects a greater uncertainty on parameter fits.
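A standard way to treat underestimated error bars in a Bayesian fit is to introduce a jitter parameter that inflates the quoted uncertainties and is inferred alongside the physical parameters. A hedged sketch of such a likelihood (our illustration, not necessarily the authors' exact parametrization):

```python
import numpy as np

def log_likelihood(model_flux, obs_flux, obs_err, log_s):
    # Quoted variances are inflated by an unknown jitter term s = exp(log_s)
    s2 = obs_err ** 2 + np.exp(2 * log_s)
    return -0.5 * np.sum((obs_flux - model_flux) ** 2 / s2
                         + np.log(2 * np.pi * s2))

# Toy usage with made-up photometry in three bands
obs = np.array([1.00, 0.80, 1.20])
err = np.array([0.05, 0.05, 0.05])
model = np.array([0.95, 0.85, 1.10])
print(log_likelihood(model, obs, err, log_s=np.log(0.1)))
```

Marginalizing over log_s lets the data demand extra uncertainty where the quoted error bars are too optimistic, which is what makes the resulting parameter distribution robust to highly variant points.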
Validation of Physics Standardized Test Items
NASA Astrophysics Data System (ADS)
Marshall, Jill
2008-10-01
The Texas Physics Assessment Team (TPAT) examined the Texas Assessment of Knowledge and Skills (TAKS) to determine whether it is a valid indicator of physics preparation for future course work and employment, and of the knowledge and skills needed to act as an informed citizen in a technological society. We categorized science items from the 2003 and 2004 10th and 11th grade TAKS by content area(s) covered, knowledge and skills required to select the correct answer, and overall quality. We also analyzed a 5000-student sample of item-level results from the 2004 11th grade exam using standard statistical methods employed by test developers (factor analysis and Item Response Theory). Triangulation of our results revealed strengths and weaknesses of the different methods of analysis. The TAKS was found to be only weakly indicative of physics preparation, and we make recommendations for increasing the validity of standardized physics testing.
Statistical physics of human cooperation
NASA Astrophysics Data System (ADS)
Perc, Matjaž; Jordan, Jillian J.; Rand, David G.; Wang, Zhen; Boccaletti, Stefano; Szolnoki, Attila
2017-05-01
Extensive cooperation among unrelated individuals is unique to humans, who often sacrifice personal benefits for the common good and work together to achieve what they are unable to execute alone. The evolutionary success of our species is indeed due, to a large degree, to our unparalleled other-regarding abilities. Yet, a comprehensive understanding of human cooperation remains a formidable challenge. Recent research in the social sciences indicates that it is important to focus on the collective behavior that emerges as the result of the interactions among individuals, groups, and even societies. Non-equilibrium statistical physics, in particular Monte Carlo methods and the theory of collective behavior of interacting particles near phase transition points, has proven to be very valuable for understanding counterintuitive evolutionary outcomes. By treating models of human cooperation as classical spin models, a physicist can draw on familiar settings from statistical physics. However, unlike pairwise interactions among particles that typically govern solid-state physics systems, interactions among humans often involve group interactions, and they also involve a larger number of possible states even for the most simplified description of reality. The complexity of solutions therefore often surpasses that observed in physical systems. Here we review experimental and theoretical research that advances our understanding of human cooperation, focusing on spatial pattern formation, on the spatiotemporal dynamics of observed solutions, and on self-organization that may either promote or hinder socially favorable states.
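As a concrete example of treating cooperation as a spin model, the sketch below runs a weak prisoner's dilemma on a square lattice with strategy imitation by the Fermi rule, a standard Monte Carlo setup in this literature (parameter values are assumed, and this is an illustration rather than any specific study's code):

```python
import numpy as np

rng = np.random.default_rng(0)
L, b, K = 50, 1.05, 0.1             # lattice size, temptation to defect, noise
strat = rng.integers(0, 2, (L, L))  # 1 = cooperator, 0 = defector
NBRS = ((1, 0), (-1, 0), (0, 1), (0, -1))

def payoff(s, i, j):
    # Weak prisoner's dilemma payoffs: R = 1, T = b, S = P = 0
    total = 0.0
    for di, dj in NBRS:
        n = s[(i + di) % L, (j + dj) % L]
        total += (1.0 if n else 0.0) if s[i, j] else (b if n else 0.0)
    return total

for sweep in range(200):
    for _ in range(L * L):
        i, j = rng.integers(L), rng.integers(L)
        di, dj = NBRS[rng.integers(4)]
        ni, nj = (i + di) % L, (j + dj) % L
        dP = payoff(strat, ni, nj) - payoff(strat, i, j)
        if rng.random() < 1.0 / (1.0 + np.exp(-dP / K)):  # Fermi imitation rule
            strat[i, j] = strat[ni, nj]

print("cooperator fraction:", strat.mean())
```

Sweeping b and K and watching the cooperator fraction vanish or survive is precisely the phase-transition viewpoint the review advocates.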
Wave propagation in a random medium
NASA Technical Reports Server (NTRS)
Lee, R. W.; Harp, J. C.
1969-01-01
A simple technique is used to derive statistical characterizations of the perturbations imposed upon a wave (plane, spherical or beamed) propagating through a random medium. The method is essentially physical rather than mathematical, and is probably equivalent to the Rytov method. The limitations of the method are discussed in some detail; in general they are restrictive only for optical paths longer than a few hundred meters, and for paths at the lower microwave frequencies. Situations treated include arbitrary path geometries, finite transmitting and receiving apertures, and anisotropic media. Results include, in addition to the usual statistical quantities, time-lagged functions, mixed functions involving amplitude and phase fluctuations, angle-of-arrival covariances, frequency covariances, and other higher-order quantities.
Information geometric methods for complexity
NASA Astrophysics Data System (ADS)
Felice, Domenico; Cafaro, Carlo; Mancini, Stefano
2018-03-01
Research on the use of information geometry (IG) in modern physics has witnessed significant advances recently. In this review article, we report on the utilization of IG methods to define measures of complexity in both classical and, whenever available, quantum physical settings. A paradigmatic example of a dramatic change in complexity is given by phase transitions (PTs). Hence, we review both global and local aspects of PTs described in terms of the scalar curvature of the parameter manifold and the components of the metric tensor, respectively. We also report on the behavior of geodesic paths on the parameter manifold used to gain insight into the dynamics of PTs. Going further, we survey measures of complexity arising in the geometric framework. In particular, we quantify complexity of networks in terms of the Riemannian volume of the parameter space of a statistical manifold associated with a given network. We are also concerned with complexity measures that account for the interactions of a given number of parts of a system that cannot be described in terms of a smaller number of parts of the system. Finally, we investigate complexity measures of entropic motion on curved statistical manifolds that arise from a probabilistic description of physical systems in the presence of limited information. The Kullback-Leibler divergence, the distance to an exponential family and volumes of curved parameter manifolds, are examples of essential IG notions exploited in our discussion of complexity. We conclude by discussing strengths, limits, and possible future applications of IG methods to the physics of complexity.
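For orientation, two of the IG notions recurring throughout such reviews are the Fisher-Rao metric on a statistical manifold and the Kullback-Leibler divergence (standard definitions, in our notation):

$$ g_{ij}(\theta) = \mathbb{E}_{p(x|\theta)}\!\left[\frac{\partial \ln p(x|\theta)}{\partial \theta^{i}}\,\frac{\partial \ln p(x|\theta)}{\partial \theta^{j}}\right], \qquad D_{\mathrm{KL}}(p\,\|\,q) = \int p(x)\,\ln\frac{p(x)}{q(x)}\,dx. $$

Curvature and geodesics of the metric g are what signal phase transitions in this framework, and volumes measured with it underlie the network-complexity measures.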
Liu, Lu; Wei, Jianrong; Zhang, Huishu; Xin, Jianhong; Huang, Jiping
2013-01-01
Because classical music has greatly affected our life and culture in its long history, it has attracted extensive attention from researchers seeking to understand the laws behind it. Based on statistical physics, here we use a different method to investigate classical music, namely, by analyzing cumulative distribution functions (CDFs) and autocorrelation functions of pitch fluctuations in compositions. We analyze 1,876 compositions of five representative classical music composers across 164 years, from Bach, to Mozart, to Beethoven, to Mendelssohn, and to Chopin. We report that the biggest pitch fluctuations of a composer gradually increase as time evolves from Bach's time to Mendelssohn/Chopin's time. In particular, for the compositions of a composer, the positive and negative tails of a CDF of pitch fluctuations are distributed not only in power laws (with the scale-free property), but also in symmetry (namely, the probability of a treble following a bass and that of a bass following a treble are basically the same for each composer). The power-law exponent decreases as time elapses. Further, we also calculate the autocorrelation function of the pitch fluctuation. The autocorrelation function shows a power-law distribution for each composer. Especially, the power-law exponents vary with the composers, indicating their different levels of long-range correlation of notes. This work not only suggests a way to understand and develop music from a viewpoint of statistical physics, but also enriches the realm of traditional statistical physics by analyzing music.
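Estimating a power-law tail exponent from pitch fluctuations is a one-liner once a tail threshold is chosen. A toy sketch on synthetic data (not the paper's corpus), using the continuous maximum-likelihood (Hill-type) estimator for the density exponent:

```python
import numpy as np

rng = np.random.default_rng(0)
pitch = np.cumsum(rng.standard_t(df=3, size=10_000))  # toy pitch series
fluct = np.diff(pitch)                                 # pitch fluctuations

# Positive tail: keep the 500 largest upward jumps; smallest of them is x_min
tail = np.sort(fluct[fluct > 0])[-500:]
alpha = 1 + len(tail) / np.sum(np.log(tail / tail[0]))  # MLE for p(x) ~ x^-alpha
print("positive-tail exponent estimate:", alpha)
```

Running the same estimator on the negative tail (of -fluct) tests the treble/bass symmetry the authors report.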
15 CFR 200.103 - Consulting and advisory services.
Code of Federal Regulations, 2013 CFR
2013-01-01
...., details of design and construction, operational aspects, unusual or extreme conditions, methods of statistical control of the measurement process, automated acquisition of laboratory data, and data reduction... group seminars on the precision measurement of specific types of physical quantities, offering the...
15 CFR 200.103 - Consulting and advisory services.
Code of Federal Regulations, 2011 CFR
2011-01-01
...., details of design and construction, operational aspects, unusual or extreme conditions, methods of statistical control of the measurement process, automated acquisition of laboratory data, and data reduction... group seminars on the precision measurement of specific types of physical quantities, offering the...
[Comparison among various software for LMS growth curve fitting methods].
Han, Lin; Wu, Wenhong; Wei, Qiuxia
2015-03-01
To explore methods of fitting skewness-median-coefficient of variation (LMS) growth curves with different software packages, and to identify the most suitable growth curve statistical method for grass-roots child and adolescent health workers. Regular physical examination data of head circumference for normal infants aged 3, 6, 9 and 12 months in Baotou City were analyzed. Statistical software packages SAS, R, STATA and SPSS were used to fit the LMS growth curves, and the results were evaluated on aspects such as user convenience, learning curve, user interface, display of results, and software updating and maintenance. All packages produced the same growth curve fitting results, but each had its own advantages and disadvantages. Taking all evaluation aspects into consideration, R excelled the others in LMS growth curve fitting. R therefore has the advantage over the other software packages for grass-roots child and adolescent health workers.
Didarloo, Alireza; Ardebili, Hassan Eftekhar; Niknami, Shamsaddin; Hajizadeh, Ebrahim; Alizadeh, Mohammad
2011-01-01
Background Findings of most studies indicate that the only way to control diabetes and prevent its debilitating effects is through the continuous performance of self-care behaviors. Physical activity is a non-pharmacological method of diabetes treatment and, because of its positive effects on diabetic patients, is being increasingly considered by researchers and practitioners. This study aimed at determining factors influencing physical activity among diabetic women in Iran, using the extended theory of reasoned action. Methods A sample of 352 women with type 2 diabetes, referring to a Diabetes Clinic in Khoy, Iran, participated in the study. Appropriate instruments were designed to measure the desired variables (knowledge of diabetes, personal beliefs, subjective norms, perceived self-efficacy, behavioral intention and physical activity behavior). The reliability and validity of the instruments were examined and approved. Statistical analyses were conducted with inferential statistical techniques (independent t-test, correlations and regressions) using the SPSS package. Results The findings of this investigation indicated that, among the constructs of the model, self-efficacy was the strongest predictor of intentions among women with type 2 diabetes and both directly and indirectly affected physical activity. In addition to self-efficacy, diabetic patients' physical activity was also influenced by other variables of the model and sociodemographic factors. Conclusion Our findings suggest that the high ability of the theory of reasoned action, extended by self-efficacy, in forecasting and explaining physical activity can be a base for educational intervention. Educational interventions based on the proposed model are necessary for improving diabetics' physical activity behavior and controlling disease. PMID:22111043
UNCERTAINTY ON RADIATION DOSES ESTIMATED BY BIOLOGICAL AND RETROSPECTIVE PHYSICAL METHODS.
Ainsbury, Elizabeth A; Samaga, Daniel; Della Monaca, Sara; Marrale, Maurizio; Bassinet, Celine; Burbidge, Christopher I; Correcher, Virgilio; Discher, Michael; Eakins, Jon; Fattibene, Paola; Güçlü, Inci; Higueras, Manuel; Lund, Eva; Maltar-Strmecki, Nadica; McKeever, Stephen; Rääf, Christopher L; Sholom, Sergey; Veronese, Ivan; Wieser, Albrecht; Woda, Clemens; Trompier, Francois
2018-03-01
Biological and physical retrospective dosimetry are recognised as key techniques to provide individual estimates of dose following unplanned exposures to ionising radiation. Whilst there has been a relatively large amount of recent development in the biological and physical procedures, development of statistical analysis techniques has failed to keep pace. The aim of this paper is to review the current state of the art in uncertainty analysis techniques across the 'EURADOS Working Group 10-Retrospective dosimetry' members, to give concrete examples of implementation of the techniques recommended in the international standards, and to further promote the use of Monte Carlo techniques to support characterisation of uncertainties. It is concluded that sufficient techniques are available and in use by most laboratories for acute, whole body exposures to highly penetrating radiation, but further work will be required to ensure that statistical analysis is always wholly sufficient for the more complex exposure scenarios.
Bayesian statistics in radionuclide metrology: measurement of a decaying source
NASA Astrophysics Data System (ADS)
Bochud, François O.; Bailat, Claude J.; Laedermann, Jean-Pascal
2007-08-01
The most intuitive way of defining a probability is perhaps through the frequency at which it appears when a large number of trials are realized in identical conditions. The probability derived from the obtained histogram characterizes the so-called frequentist or conventional statistical approach. In this sense, probability is defined as a physical property of the observed system. By contrast, in Bayesian statistics, a probability is not a physical property or a directly observable quantity, but a degree of belief or an element of inference. The goal of this paper is to show how Bayesian statistics can be used in radionuclide metrology and what its advantages and disadvantages are compared with conventional statistics. This is performed through the example of an yttrium-90 source typically encountered in environmental surveillance measurement. Because of the very low activity of this kind of source and the small half-life of the radionuclide, this measurement takes several days, during which the source decays significantly. Several methods are proposed to compute simultaneously the number of unstable nuclei at a given reference time, the decay constant and the background. Asymptotically, all approaches give the same result. However, Bayesian statistics produces coherent estimates and confidence intervals in a much smaller number of measurements. Apart from the conceptual understanding of statistics, the main difficulty that could deter radionuclide metrologists from using Bayesian statistics is the complexity of the computation.
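A compact grid-based version of such a computation, a posterior for the initial number of unstable nuclei from counts recorded while the source decays, might look as follows (all numbers invented except the yttrium-90 half-life of about 64 hours; background and efficiency are assumed known here, whereas the paper also infers them):

```python
import numpy as np
from scipy.stats import poisson

half_life_h = 64.1                   # yttrium-90 half-life, hours
lam = np.log(2) / half_life_h
eff, bkg, dt = 0.3, 2.0, 1.0         # assumed efficiency, bkg counts/h, 1 h bins

t = np.arange(0, 120, dt)            # five days of hourly counting
frac = (np.exp(-lam * t) - np.exp(-lam * (t + dt))) * eff  # detected decay fraction

N0_true = 1e4
counts = np.random.default_rng(0).poisson(N0_true * frac + bkg * dt)

# Flat prior on a grid of N0; Poisson likelihood over every time bin
N0_grid = np.linspace(5e3, 2e4, 400)
loglike = np.array([poisson.logpmf(counts, n0 * frac + bkg * dt).sum()
                    for n0 in N0_grid])
post = np.exp(loglike - loglike.max())
post /= post.sum()
print("posterior mean N0:", (N0_grid * post).sum())
```

The Bayesian machinery pays off exactly as the abstract says: coherent intervals for N0 emerge even when each bin holds only a handful of counts.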
Applications of statistical physics to technology price evolution
NASA Astrophysics Data System (ADS)
McNerney, James
Understanding how changing technology affects the prices of goods is a problem with both rich phenomenology and important policy consequences. Using methods from statistical physics, I model technology-driven price evolution. First, I examine a model for the price evolution of individual technologies. The price of a good often follows a power law equation when plotted against its cumulative production. This observation turns out to have significant consequences for technology policy aimed at mitigating climate change, where technologies are needed that achieve low carbon emissions at low cost. However, no theory adequately explains why technology prices follow power laws. To understand this behavior, I simplify an existing model that treats technologies as machines composed of interacting components. I find that the power law exponent of the price trajectory is inversely related to the number of interactions per component. I extend the model to allow for more realistic component interactions and make a testable prediction. Next, I conduct a case-study on the cost evolution of coal-fired electricity. I derive the cost in terms of various physical and economic components. The results suggest that commodities and technologies fall into distinct classes of price models, with commodities following martingales, and technologies following exponentials in time or power laws in cumulative production. I then examine the network of money flows between industries. This work is a precursor to studying the simultaneous evolution of multiple technologies. Economies resemble large machines, with different industries acting as interacting components with specialized functions. To begin studying the structure of these machines, I examine 20 economies with an emphasis on finding common features to serve as targets for statistical physics models. I find they share the same money flow and industry size distributions. I apply methods from statistical physics to show that industries cluster the same way according to industry type. Finally, I use these industry money flows to model the price evolution of many goods simultaneously, where network effects become important. I derive a prediction for which goods tend to improve most rapidly. The fastest-improving goods are those with the highest mean path lengths in the money flow network.
Theoretical approaches to the steady-state statistical physics of interacting dissipative units
NASA Astrophysics Data System (ADS)
Bertin, Eric
2017-02-01
The aim of this review is to provide a concise overview of some of the generic approaches that have been developed to deal with the statistical description of large systems of interacting dissipative ‘units’. The latter notion includes, e.g. inelastic grains, active or self-propelled particles, bubbles in a foam, low-dimensional dynamical systems like driven oscillators, or even spatially extended modes like Fourier modes of the velocity field in a fluid. We first review methods based on the statistical properties of a single unit, starting with elementary mean-field approximations, either static or dynamic, that describe a unit embedded in a ‘self-consistent’ environment. We then discuss how this basic mean-field approach can be extended to account for spatial dependences, in the form of space-dependent mean-field Fokker-Planck equations, for example. We also briefly review the use of kinetic theory in the framework of the Boltzmann equation, which is an appropriate description for dilute systems. We then turn to descriptions in terms of the full N-body distribution, starting from exact solutions of one-dimensional models, using a matrix-product ansatz method when correlations are present. Since exactly solvable models are scarce, we also present some approximation methods which can be used to determine the N-body distribution in a large system of dissipative units. These methods include the Edwards approach for dense granular matter and the approximate treatment of multiparticle Langevin equations with colored noise, which models systems of self-propelled particles. Throughout this review, emphasis is put on methodological aspects of the statistical modeling and on formal similarities between different physical problems, rather than on the specific behavior of a given system.
Review of surface steam sterilization for validation purposes.
van Doornmalen, Joost; Kopinga, Klaas
2008-03-01
Sterilization is an essential step in the process of producing sterile medical devices. To guarantee sterility, the process of sterilization must be validated. Because there is no direct way to measure sterility, the techniques applied to validate the sterilization process are based on statistical principles. Steam sterilization is the most frequently applied sterilization method worldwide and can be validated either by indicators (chemical or biological) or physical measurements. The steam sterilization conditions are described in the literature. Starting from these conditions, criteria for the validation of steam sterilization are derived and can be described in terms of physical parameters. Physical validation of steam sterilization appears to be an adequate and efficient validation method that could be considered as an alternative for indicator validation. Moreover, physical validation can be used for effective troubleshooting in steam sterilizing processes.
Compositional data analysis for physical activity, sedentary time and sleep research.
Dumuid, Dorothea; Stanford, Tyman E; Martin-Fernández, Josep-Antoni; Pedišić, Željko; Maher, Carol A; Lewis, Lucy K; Hron, Karel; Katzmarzyk, Peter T; Chaput, Jean-Philippe; Fogelholm, Mikael; Hu, Gang; Lambert, Estelle V; Maia, José; Sarmiento, Olga L; Standage, Martyn; Barreira, Tiago V; Broyles, Stephanie T; Tudor-Locke, Catrine; Tremblay, Mark S; Olds, Timothy
2017-01-01
The health effects of daily activity behaviours (physical activity, sedentary time and sleep) are widely studied. While previous research has largely examined activity behaviours in isolation, recent studies have adjusted for multiple behaviours. However, the inclusion of all activity behaviours in traditional multivariate analyses has not been possible due to the perfect multicollinearity of 24-h time budget data. The ensuing lack of adjustment for known effects on the outcome undermines the validity of study findings. We describe a statistical approach that enables the inclusion of all daily activity behaviours, based on the principles of compositional data analysis. Using data from the International Study of Childhood Obesity, Lifestyle and the Environment, we demonstrate the application of compositional multiple linear regression to estimate adiposity from children's daily activity behaviours expressed as isometric log-ratio coordinates. We present a novel method for predicting change in a continuous outcome based on relative changes within a composition, and for calculating associated confidence intervals to allow for statistical inference. The compositional data analysis presented overcomes the lack of adjustment that has plagued traditional statistical methods in the field, and provides robust and reliable insights into the health effects of daily activity behaviours.
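The central transformation is the isometric log-ratio (ilr), which maps a D-part composition to D-1 unconstrained coordinates that can enter ordinary regression. A minimal sketch for a three-part day (sleep, sedentary time, physical activity), using one standard sequential binary partition; this is our illustration, not the study's code:

```python
import numpy as np

def ilr(comp):
    # Partition: (sleep | sedentary, activity), then (sedentary | activity)
    x1, x2, x3 = comp
    z1 = np.sqrt(2.0 / 3.0) * np.log(x1 / np.sqrt(x2 * x3))
    z2 = np.sqrt(1.0 / 2.0) * np.log(x2 / x3)
    return np.array([z1, z2])

# Hypothetical day: 9 h sleep, 11 h sedentary, 4 h active, closed to sum to 1
day = np.array([9.0, 11.0, 4.0]) / 24.0
print("ilr coordinates:", ilr(day))
```

Because the coordinates are log-ratios, a fitted coefficient describes the effect of reallocating time between behaviours, which is exactly the "relative change within a composition" prediction the paper formalizes.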
Garriguet, Didier; Colley, Rachel C
2014-07-01
Systematic reviews and results of Statistics Canada surveys have shown a discrepancy between self-reported and measured physical activity. This study compares these two methods and examines specific activities to explain the limitations of each method. Data are from cycle 1 (2007 to 2009) and cycle 2 (2009 to 2011) of the Canadian Health Measures Survey. The survey involved an interview in the respondent's home and a visit to a mobile examination centre (MEC) for physical measurements. In a questionnaire, respondents were asked about 21 leisure-time physical activities. They were requested to wear an Actical accelerometer for seven days after the MEC visit. The analysis pertains to respondents aged 12 to 79 who wore the accelerometer for 10 or more hours on at least four days (n = 7,158). Averages of self-reported leisure-time physical activity and moderate-to-vigorous physical activity measured by accelerometer were within a couple of minutes of each other. However, at the individual level, the difference between estimates could exceed 37.5 minutes per day in one direction or the other, and around 40% of the population met physical activity thresholds according to one measurement method, but not according to the other. The disagreement is supported by weak observed correlations. The lack of a systematic trend in the relationship between the two methods of measuring physical activity precludes the creation of correction factors or being confident in using one method instead of the other. Accelerometers and questionnaires measure different aspects of physical activity.
Predicting Physical Interactions between Protein Complexes*
Clancy, Trevor; Rødland, Einar Andreas; Nygard, Ståle; Hovig, Eivind
2013-01-01
Protein complexes enact most biochemical functions in the cell. Dynamic interactions between protein complexes are frequent in many cellular processes. As they are often of a transient nature, they may be difficult to detect using current genome-wide screens. Here, we describe a method to computationally predict physical interactions between protein complexes, applied to both humans and yeast. We integrated manually curated protein complexes and physical protein interaction networks, and we designed a statistical method to identify pairs of protein complexes where the number of protein interactions between a complex pair is due to an actual physical interaction between the complexes. An evaluation against manually curated physical complex-complex interactions in yeast revealed that 50% of these interactions could be predicted in this manner. A community network analysis of the highest scoring pairs revealed a biologically sensible organization of physical complex-complex interactions in the cell. Such analyses of proteomes may serve as a guide to the discovery of novel functional cellular relationships. PMID:23438732
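The statistical core, asking whether two complexes share more interactions than random pairing would give, can be approximated with a hypergeometric tail test. A hedged sketch (the paper's exact null model may differ; all counts below are invented):

```python
from scipy.stats import hypergeom

n_a, n_b = 6, 8                    # proteins in complexes A and B
possible = n_a * n_b               # possible cross-complex protein pairs
observed = 9                       # observed interacting cross-complex pairs
network_pairs, network_edges = 10_000, 400   # hypothetical background network

# P(X >= observed) if the network's edges were placed uniformly among pairs
p = hypergeom.sf(observed - 1, network_pairs, network_edges, possible)
print("complex-complex enrichment p-value:", p)
```

Scoring every complex pair this way, then correcting for multiple testing, yields the kind of ranked complex-complex interaction list on which the community analysis is built.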
Absolute mass scale calibration in the inverse problem of the physical theory of fireballs.
NASA Astrophysics Data System (ADS)
Kalenichenko, V. V.
A method of absolute mass scale calibration is suggested for solving the inverse problem of the physical theory of fireballs. The method is based on data on the masses of fallen meteorites whose fireballs have been photographed in flight. The method may be applied to those fireballs whose bodies have not experienced considerable fragmentation during their destruction in the atmosphere and have kept their form well enough. Statistical analysis of the inverse problem solution for a sufficiently representative sample makes it possible to separate a subsample of such fireballs. The data on the Lost City and Innisfree meteorites are used to obtain calibration coefficients.
Physical Uncertainty Bounds (PUB)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vaughan, Diane Elizabeth; Preston, Dean L.
2015-03-19
This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.
Tilson, Julie K; Marshall, Katie; Tam, Jodi J; Fetters, Linda
2016-04-22
A primary barrier to the implementation of evidence based practice (EBP) in physical therapy is therapists' limited ability to understand and interpret statistics. Physical therapists demonstrate limited skills and report low self-efficacy for interpreting results of statistical procedures. While standards for physical therapist education include statistics, little empirical evidence is available to inform what should constitute such curricula. The purpose of this study was to conduct a census of the statistical terms and study designs used in physical therapy literature and to use the results to make recommendations for curricular development in physical therapist education. We conducted a bibliometric analysis of 14 peer-reviewed journals associated with the American Physical Therapy Association over 12 months (Oct 2011-Sept 2012). Trained raters recorded every statistical term appearing in identified systematic reviews, primary research reports, and case series and case reports. Investigator-reported study design was also recorded. Terms representing the same statistical test or concept were combined into a single, representative term. Cumulative percentage was used to identify the most common representative statistical terms. Common representative terms were organized into eight categories to inform curricular design. Of 485 articles reviewed, 391 met the inclusion criteria. These 391 articles used 532 different terms which were combined into 321 representative terms; 13.1 (sd = 8.0) terms per article. Eighty-one representative terms constituted 90% of all representative term occurrences. Of the remaining 240 representative terms, 105 (44%) were used in only one article. The most common study design was prospective cohort (32.5%). Physical therapy literature contains a large number of statistical terms and concepts for readers to navigate. However, in the year sampled, 81 representative terms accounted for 90% of all occurrences. These "common representative terms" can be used to inform curricula to promote physical therapists' skills, competency, and confidence in interpreting statistics in their professional literature. We make specific recommendations for curriculum development informed by our findings.
Scientific computations section monthly report, November 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckner, M.R.
1993-12-30
This progress report from the Savannah River Technology Center contains abstracts from papers from the computational modeling, applied statistics, applied physics, experimental thermal hydraulics, and packaging and transportation groups. Specific topics covered include: engineering modeling and process simulation, criticality methods and analysis, plutonium disposition.
Results of a joint NOAA/NASA sounder simulation study
NASA Technical Reports Server (NTRS)
Phillips, N.; Susskind, Joel; Mcmillin, L.
1988-01-01
This paper presents the results of a joint NOAA and NASA sounder simulation study in which the accuracies of atmospheric temperature profiles and surface skin temperature measurements retrieved from two sounders were compared: (1) the currently used IR temperature sounder HIRS2 (High-resolution Infrared Radiation Sounder 2); and (2) the recently proposed high-spectral-resolution IR sounder AMTS (Advanced Moisture and Temperature Sounder). Simulations were conducted for both clear and partial cloud conditions. Data were analyzed at NASA using a physical inversion technique and at NOAA using a statistical technique. Results show significant improvement of AMTS compared to HIRS2 for both clear and cloudy conditions. The improvements are indicated by both methods of data analysis, but the physical retrievals outperform the statistical retrievals.
Statistical Description of Associative Memory
NASA Astrophysics Data System (ADS)
Samengo, Inés
2003-03-01
The storage of memories, in the brain, induces some kind of modification in the structural and functional properties of a neural network. Here, a few neuropsychological and neurophysiological experiments are reviewed, suggesting that the plastic changes taking place during memory storage are governed, among other things, by the correlations in the activity of a set of neurons. The Hopfield model is briefly described, showing the way the methods of statistical physics can be useful to describe the storage and retrieval of memories.
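A minimal Hopfield-style sketch of correlation-driven storage and retrieval (Hebbian weights, sign updates) follows; the network size, number of patterns and corruption level are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 100, 5                                  # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian storage: weights built from pattern correlations, no self-coupling
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# retrieval: start from a corrupted version of pattern 0 and iterate
state = patterns[0] * np.where(rng.random(N) < 0.2, -1, 1)  # 20% of bits flipped
for _ in range(10):                            # synchronous sign updates
    state = np.sign(W @ state)
    state[state == 0] = 1
print("overlap with stored pattern:", (state @ patterns[0]) / N)  # ~1.0
```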
A Physics-Inspired Mechanistic Model of Migratory Movement Patterns in Birds.
Revell, Christopher; Somveille, Marius
2017-08-29
In this paper, we introduce a mechanistic model of migratory movement patterns in birds, inspired by ideas and methods from physics. Previous studies have shed light on the factors influencing bird migration but have mainly relied on statistical correlative analysis of tracking data. Our novel method offers a bottom-up explanation of population-level migratory movement patterns. It differs from previous mechanistic models of animal migration and enables predictions of pathways and destinations from a given starting location. We define an environmental potential landscape from environmental data and simulate bird movement within this landscape based on simple decision rules drawn from statistical mechanics. We explore the capacity of the model by qualitatively comparing simulation results to the non-breeding migration patterns of a seabird species, the Black-browed Albatross (Thalassarche melanophris). This minimal, two-parameter model was able to capture remarkably well the previously documented migration patterns of the Black-browed Albatross, with the best combination of parameter values conserved across multiple geographically separate populations. Our physics-inspired mechanistic model could be applied to other bird species and other highly mobile species, improving our understanding of the relative importance of various factors driving migration and making predictions that could be useful for conservation.
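The authors' exact decision rules are not given in the abstract; the sketch below shows the general idea of a Boltzmann-weighted walk on an environmental potential landscape, with an invented quadratic potential and an arbitrary temperature-like parameter.

```python
import numpy as np

rng = np.random.default_rng(3)
# hypothetical environmental potential: lower values = more suitable habitat
x, y = np.meshgrid(np.linspace(-3, 3, 60), np.linspace(-3, 3, 60))
potential = (x - 1.5) ** 2 + (y + 0.5) ** 2     # single attractive basin

def step(pos, T=0.5):
    """Move to a neighbouring cell (or stay) with Boltzmann-weighted probability."""
    i, j = pos
    neigh = [(i + di, j + dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
             if 0 <= i + di < 60 and 0 <= j + dj < 60]
    w = np.array([np.exp(-potential[a, b] / T) for a, b in neigh])
    return neigh[rng.choice(len(neigh), p=w / w.sum())]

pos = (5, 5)
for _ in range(400):
    pos = step(pos)
print(pos, potential[pos])                       # drifts toward the basin minimum
```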
From the necessary to the possible: the genesis of the spin-statistics theorem
NASA Astrophysics Data System (ADS)
Blum, Alexander
2014-12-01
The spin-statistics theorem, which relates the intrinsic angular momentum of a single particle to the type of quantum statistics obeyed by a system of many such particles, is one of the central theorems in quantum field theory and the physics of elementary particles. It was first formulated in 1939/40 by Wolfgang Pauli and his assistant Markus Fierz. This paper discusses the developments that led up to this first formulation, starting from early attempts in the late 1920s to explain why charged matter particles obey Fermi-Dirac statistics, while photons obey Bose-Einstein statistics. It is demonstrated how several important developments paved the way from such general philosophical musings to a general (and provable) theorem, most notably the use of quantum field theory, the discovery of new elementary particles, and the generalization of the notion of spin. It is also discussed how the attempts to prove a spin-statistics connection were driven by Pauli from formal to more physical arguments, culminating in Pauli's 1940 proof. This proof was a major success for the beleaguered theory of quantum field theory and the methods Pauli employed proved essential for the renaissance of quantum field theory and the development of renormalization techniques in the late 1940s.
High-performance parallel computing in the classroom using the public goods game as an example
NASA Astrophysics Data System (ADS)
Perc, Matjaž
2017-07-01
The use of computers in statistical physics is common because the sheer number of equations that describe the behaviour of an entire system particle by particle often makes it impossible to solve them exactly. Monte Carlo methods form a particularly important class of numerical methods for solving problems in statistical physics. Although these methods are simple in principle, their proper use requires a good command of statistical mechanics, as well as considerable computational resources. The aim of this paper is to demonstrate how the usage of widely accessible graphics cards on personal computers can elevate the computing power in Monte Carlo simulations by orders of magnitude, thus allowing live classroom demonstration of phenomena that would otherwise be out of reach. As an example, we use the public goods game on a square lattice where two strategies compete for common resources in a social dilemma situation. We show that the second-order phase transition to an absorbing phase in the system belongs to the directed percolation universality class, and we compare the time needed to arrive at this result by means of the main processor and by means of a suitable graphics card. Parallel computing on graphics processing units has been developed actively during the last decade, to the point where today the learning curve for entry is anything but steep for those familiar with programming. The subject is thus ripe for inclusion in graduate and advanced undergraduate curricula, and we hope that this paper will facilitate this process in the realm of physics education. To that end, we provide a documented source code for an easy reproduction of presented results and for further development of Monte Carlo simulations of similar systems.
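A single-CPU toy version of such a simulation (the paper's GPU kernels are beyond an abstract) might look like the following: a square-lattice public goods game with groups of five, Fermi-rule imitation, and invented parameter values; the sweep scheme is simplified relative to standard random sequential updating.

```python
import numpy as np

rng = np.random.default_rng(4)
L, r, K = 50, 3.8, 0.5                       # lattice size, multiplier, noise
S = rng.integers(0, 2, (L, L))               # 1 = cooperator, 0 = defector
shifts = [(0, 0), (1, 0), (-1, 0), (1, 1), (-1, 1)]  # (shift, axis) pairs

def payoffs(S):
    """Public-goods payoff: every site plays in the 5 groups (G = 5)
    centred on itself and its four von Neumann neighbours."""
    nc = sum(np.roll(S, sh, ax) for sh, ax in shifts)  # cooperators per group
    share = r * nc / 5.0                     # what each group pays its members
    P = sum(np.roll(share, sh, ax) for sh, ax in shifts)
    return P - 5 * S                         # cooperators contribute 1 per group

moves = np.array([(0, 1), (0, -1), (1, 0), (-1, 0)])
for sweep in range(200):                     # simplified Monte Carlo sweeps
    P = payoffs(S)
    i, j = rng.integers(0, L, (2, L * L))    # random focal players
    di, dj = moves[rng.integers(0, 4, L * L)].T
    ni, nj = (i + di) % L, (j + dj) % L      # random neighbours
    adopt = rng.random(L * L) < 1 / (1 + np.exp((P[i, j] - P[ni, nj]) / K))
    S[i[adopt], j[adopt]] = S[ni[adopt], nj[adopt]]
print("cooperator fraction:", S.mean())
```

On a GPU, the payoff and imitation steps map naturally onto per-site kernels, which is where the orders-of-magnitude speedup described in the paper comes from.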
Quantifying economic fluctuations by adapting methods of statistical physics
NASA Astrophysics Data System (ADS)
Plerou, Vasiliki
2001-09-01
The first focus of this thesis is the investigation of cross-correlations between the price fluctuations of different stocks using the conceptual framework of random matrix theory (RMT), developed in physics to describe the statistical properties of energy-level spectra of complex nuclei. RMT makes predictions for the statistical properties of matrices that are universal, i.e., do not depend on the interactions between the elements comprising the system. In physical systems, deviations from the predictions of RMT provide clues regarding the mechanisms controlling the dynamics of a given system, so this framework is of potential value if applied to economic systems. This thesis compares the statistics of the cross-correlation matrix of stock price fluctuations against the predictions of RMT.
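The basic RMT comparison can be sketched as follows: the eigenvalues of an empirical correlation matrix built from pure-noise "returns" are checked against the Marchenko-Pastur band, outside of which genuine correlations would appear in real market data. The dimensions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(5)
N, T = 100, 400                       # stocks, time points; Q = T/N = 4
returns = rng.normal(size=(T, N))     # i.i.d. "returns": pure-noise benchmark
returns -= returns.mean(0)
returns /= returns.std(0)

C = returns.T @ returns / T           # empirical cross-correlation matrix
eig = np.linalg.eigvalsh(C)

Q = T / N                             # Marchenko-Pastur bounds for pure noise
lmin, lmax = (1 - np.sqrt(1 / Q)) ** 2, (1 + np.sqrt(1 / Q)) ** 2
print("RMT noise band: [%.3f, %.3f]" % (lmin, lmax))
print("eigenvalues outside band:", np.sum((eig < lmin) | (eig > lmax)))
# for real market data, eigenvalues above lmax carry genuine correlations
```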
2015-07-15
[Fragmentary records; only the titles are recoverable: "Long-term effects on cancer survivors' quality of life of physical training versus physical training combined with cognitive-behavioral therapy" and "Comparison of Neural Network and Linear Regression Models in Statistically Predicting Mental and Physical Health Status of Breast Cancer Survivors".]
Statistical methods for biodosimetry in the presence of both Berkson and classical measurement error
NASA Astrophysics Data System (ADS)
Miller, Austin
In radiation epidemiology, the true dose received by those exposed cannot be assessed directly. Physical dosimetry uses a deterministic function of the source term, distance and shielding to estimate dose. For the atomic bomb survivors, the physical dosimetry system is well established. The classical measurement errors plaguing the location and shielding inputs to the physical dosimetry system are well known. Adjusting for the associated biases requires an estimate for the classical measurement error variance, for which no data-driven estimate exists. In this case, an instrumental variable solution is the most viable option to overcome the classical measurement error indeterminacy. Biological indicators of dose may serve as instrumental variables. Specification of the biodosimeter dose-response model requires identification of the radiosensitivity variables, for which we develop statistical definitions and variables. More recently, researchers have recognized Berkson error in the dose estimates, introduced by averaging assumptions for many components in the physical dosimetry system. We show that Berkson error induces a bias in the instrumental variable estimate of the dose-response coefficient, and then address the estimation problem. This model is specified by developing an instrumental variable mixed measurement error likelihood function, which is then maximized using a Monte Carlo EM Algorithm. These methods produce dose estimates that incorporate information from both physical and biological indicators of dose, as well as the first instrumental variable based data-driven estimate for the classical measurement error variance.
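The core distinction can be demonstrated in a few lines: classical error (measurement = truth + noise) attenuates a regression slope, while Berkson error (truth = assigned value + noise) leaves it unbiased in the linear case. The dose scale and error variances below are invented.

```python
import numpy as np

rng = np.random.default_rng(6)
n, beta = 100_000, 2.0
X = rng.normal(10, 2, n)                       # true dose
Y = beta * X + rng.normal(0, 1, n)             # linear dose-response

def slope(x, y):
    return np.cov(x, y)[0, 1] / np.var(x)

W_classical = X + rng.normal(0, 2, n)          # measured = true + error
print("classical:", slope(W_classical, Y))     # attenuated, ~ beta/2 here

W_assigned = rng.normal(10, 2, n)              # e.g. a group-average dose
X_berkson = W_assigned + rng.normal(0, 2, n)   # true = assigned + error
Y_b = beta * X_berkson + rng.normal(0, 1, n)
print("Berkson:", slope(W_assigned, Y_b))      # unbiased, ~ beta
```

The attenuation factor in the classical case is var(X) / (var(X) + var(U)), which is exactly the bias an instrumental variable estimate can correct once the classical error variance is pinned down.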
Spectral Analysis of B Stars: An Application of Bayesian Statistics
NASA Astrophysics Data System (ADS)
Mugnes, J.-M.; Robert, C.
2012-12-01
To better understand the processes involved in stellar physics, it is necessary to obtain accurate stellar parameters (effective temperature, surface gravity, abundances…). Spectral analysis is a powerful tool for investigating stars, but it is also vital to reduce uncertainties at a decent computational cost. Here we present a spectral analysis method based on a combination of Bayesian statistics and grids of synthetic spectra obtained with TLUSTY. This method simultaneously constrains the stellar parameters by using all the lines accessible in observed spectra and thus greatly reduces uncertainties and improves the overall spectrum fitting. Preliminary results are shown using spectra from the Observatoire du Mont-Mégantic.
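A toy version of a grid-based Bayesian fit (standing in for the TLUSTY grids, which are not reproduced here) might look like this, with an invented two-line "synthetic spectrum" whose line depths depend on effective temperature; all lines in the observed range contribute to the likelihood simultaneously.

```python
import numpy as np

rng = np.random.default_rng(13)
wl = np.linspace(4000, 4100, 300)            # wavelength grid (angstrom)

def synth(teff):
    """Toy synthetic spectrum: two lines whose depths vary with Teff."""
    d1, d2 = 0.5 * (teff / 20000), 0.6 * (1 - teff / 40000)
    return (1 - d1 * np.exp(-0.5 * ((wl - 4026) / 2) ** 2)
              - d2 * np.exp(-0.5 * ((wl - 4089) / 2) ** 2))

obs = synth(18000) + rng.normal(0, 0.01, wl.size)   # "observed" spectrum

grid = np.arange(12000, 26000, 100.0)        # grid of precomputed models
chi2 = np.array([np.sum((obs - synth(t)) ** 2) / 0.01 ** 2 for t in grid])
post = np.exp(-0.5 * (chi2 - chi2.min()))    # flat prior over the grid
post /= post.sum()
best = grid[post.argmax()]
err = np.sqrt(np.sum(post * (grid - best) ** 2))  # rough 1-sigma spread
print("Teff = %.0f +/- %.0f K" % (best, err))
```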
Statistical mechanics of broadcast channels using low-density parity-check codes.
Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David
2003-03-01
We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.
Detector noise statistics in the non-linear regime
NASA Technical Reports Server (NTRS)
Shopbell, P. L.; Bland-Hawthorn, J.
1992-01-01
The statistical behavior of an idealized linear detector in the presence of threshold and saturation levels is examined. It is assumed that the noise is governed by the statistical fluctuations in the number of photons emitted by the source during an exposure. Since physical detectors cannot have infinite dynamic range, our model illustrates that all devices have non-linear regimes, particularly at high count rates. The primary effect is a decrease in the statistical variance about the mean signal due to a portion of the expected noise distribution being removed via clipping. Higher order statistical moments are also examined, in particular, skewness and kurtosis. In principle, the expected distortion in the detector noise characteristics can be calibrated using flatfield observations with count rates matched to the observations. For this purpose, some basic statistical methods that utilize Fourier analysis techniques are described.
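The clipping effect on the moments is easy to reproduce: simulate Poisson counts, impose a saturation level, and compare the variance-to-mean ratio, skewness and kurtosis; the count rate and saturation level are arbitrary and chosen to exaggerate the effect.

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(7)
mean_counts = 1000
raw = rng.poisson(mean_counts, 200_000)        # ideal photon-counting noise

saturation = mean_counts + 30                  # clip close to the mean
clipped = np.minimum(raw, saturation)

for name, d in [("linear", raw), ("saturated", clipped)]:
    print(name, "var/mean = %.3f" % (d.var() / d.mean()),
          "skew = %.3f" % skew(d), "kurtosis = %.3f" % kurtosis(d))
# clipping removes the upper tail: the variance drops below the Poisson
# expectation (var/mean = 1) and the skewness turns negative
```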
ELEVEN BROADCASTING EXPERIMENTS.
ERIC Educational Resources Information Center
PERRATON, HILARY D.
A review is made of experimental courses combining the use of radio, television, and correspondence study and given by the National Extension College in England. Courses included English, mathematics, social work, physics, statistics, and computers. Two methods of linking correspondence courses to broadcasts were used--in mathematics and social…
Treatment of Chemical Equilibrium without Using Thermodynamics or Statistical Mechanics.
ERIC Educational Resources Information Center
Nelson, P. G.
1986-01-01
Discusses the conventional approaches to teaching about chemical equilibrium in advanced physical chemistry courses. Presents an alternative approach to the treatment of this concept by using Boltzmann's distribution law. Lists five advantages to using this method as compared with the other approaches. (TW)
NASA Astrophysics Data System (ADS)
Roberts, B. M.; Blewitt, G.; Dailey, C.; Derevianko, A.
2018-04-01
We analyze the prospects of employing a distributed global network of precision measurement devices as a dark matter and exotic physics observatory. In particular, we consider the atomic clocks of the global positioning system (GPS), consisting of a constellation of 32 medium-Earth orbit satellites equipped with either Cs or Rb microwave clocks and a number of Earth-based receiver stations, some of which employ highly-stable H-maser atomic clocks. High-accuracy timing data is available for almost two decades. By analyzing the satellite and terrestrial atomic clock data, it is possible to search for transient signatures of exotic physics, such as "clumpy" dark matter and dark energy, effectively transforming the GPS constellation into a 50 000 km aperture sensor array. Here we characterize the noise of the GPS satellite atomic clocks, describe the search method based on Bayesian statistics, and test the method using simulated clock data. We present the projected discovery reach using our method, and demonstrate that it can surpass the existing constraints by several orders of magnitude for certain models. Our method is not limited in scope to GPS or atomic clock networks, and can also be applied to other networks of precision measurement devices.
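A highly simplified stand-in for such a transient search (not the authors' Bayesian pipeline): scan simulated white clock-bias residuals for a persistent step using a log-likelihood-ratio score, with the pre-event mean assumed known and zero. All numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(8)
n, sigma = 2880, 0.1                    # epochs (30 s each), noise level (ns)
bias = rng.normal(0, sigma, n)          # white clock-bias residuals
bias[1700:] += 0.08                     # injected transient: persistent 0.08 ns jump

def log_lr(t0):
    """Log-likelihood ratio of 'best-fit step at t0' vs 'no step',
    assuming known sigma and zero mean before the step."""
    s = bias[t0:].mean()                # MLE of the step size
    return (n - t0) * s ** 2 / (2 * sigma ** 2)

scores = np.array([log_lr(t) for t in range(1, n - 1)])
print("best epoch:", scores.argmax() + 1, "score:", scores.max())
```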
Is generic physical activity or specific exercise associated with motor abilities?
Rinne, Marjo; Pasanen, Matti; Miilunpalo, Seppo; Mälkiä, Esko
2010-09-01
Evidence of the effect of leisure time physical activity (LTPA) modes on the motor abilities of a mature population is scarce. The purpose of this study was to compare the motor abilities of physically active and inactive men and women and to examine the associations of different exercise modes and former and recent LTPA (R-LTPA) with motor ability and various physical tests. The LTPA of the participants (men n = 69, women n = 79; aged 41-47 yr) was ascertained by a modified Physical Activity Readiness Questionnaire, including questions on the frequency, duration, and intensity of R-LTPA and former LTPA and on exercise modes. Motor abilities in terms of balance, agility, and coordination were assessed with a battery of nine tests supplemented with five physical fitness tests. Multiple statistical methods were used in analyses that were conducted separately for men and women. The MET-hours per week of R-LTPA correlated statistically significantly with the tests of agility and static balance (rs = -0.28, P = 0.022; rs = -0.25, P = 0.043, respectively) among men and with static balance (rs = 0.41), 2-km walking (rs = 0.36), step squat (rs = 0.36) (P < or = 0.001, respectively), and static back endurance (rs = 0.25, P = 0.024) among women. In the stepwise regression among men, the most frequent statistically significant predictor was the playing of several games. For women, a history of LTPA for more than 3 yr was the strongest predictor of good results in almost all tests. Participants with long-term and regular LTPA had better motor performance, and a variety of games in particular improved components of motor ability. Diverse, regular, and long-term exercise that includes both specific training and general activity develops both motor abilities and physical fitness.
A Bayesian approach for parameter estimation and prediction using a computationally intensive model
Higdon, Dave; McDonnell, Jordan D.; Schunck, Nicolas; ...
2015-02-05
Bayesian methods have been successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements $y$ are modeled as the best fit of a physics-based model $\eta(\theta)$, where $\theta$ denotes the uncertain, best input setting. Hence the statistical model is of the form $y = \eta(\theta) + \epsilon$, where $\epsilon$ accounts for measurement, and possibly other, error sources. When nonlinearity is present in $\eta(\cdot)$, the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and nonstandard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. Although generally applicable, MCMC requires thousands (or even millions) of evaluations of the physics model $\eta(\cdot)$. This requirement is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we present an approach adapted from Bayesian model calibration. This approach combines output from an ensemble of computational model runs with physical measurements, within a statistical formulation, to carry out inference. A key component of this approach is a statistical response surface, or emulator, estimated from the ensemble of model runs. We demonstrate this approach with a case study in estimating parameters for a density functional theory model, using experimental mass/binding energy measurements from a collection of atomic nuclei. Lastly, we also demonstrate how this approach produces uncertainties in predictions for recent mass measurements obtained at Argonne National Laboratory.
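A toy version of the emulator-based calibration, with a cheap stand-in for the expensive model and scikit-learn's Gaussian process as the response surface; the kernel, noise levels, grid and "model" are illustrative choices, not those of the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(9)
eta = lambda t: np.sin(3 * t) + 0.5 * t        # stand-in "expensive" model

theta_runs = np.linspace(0, 2, 12)[:, None]    # small ensemble of model runs
emulator = GaussianProcessRegressor(kernel=RBF(0.5), alpha=1e-8)
emulator.fit(theta_runs, eta(theta_runs[:, 0]))

y_obs, sig_eps = eta(1.3) + 0.05, 0.1          # one noisy physical measurement

grid = np.linspace(0, 2, 500)[:, None]         # posterior over theta on a grid
mu, sd = emulator.predict(grid, return_std=True)
var = sig_eps ** 2 + sd ** 2                   # measurement + emulator variance
log_post = -0.5 * (y_obs - mu) ** 2 / var - 0.5 * np.log(var)  # flat prior
post = np.exp(log_post - log_post.max())
print("posterior mode:", grid[post.argmax(), 0])
```

The essential point is the second-to-last line: the emulator's own predictive variance enters the likelihood, so the twelve model runs replace the thousands of evaluations MCMC would otherwise demand.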
What can we learn from noise? — Mesoscopic nonequilibrium statistical physics —
KOBAYASHI, Kensuke
2016-01-01
Mesoscopic systems — small electric circuits working in the quantum regime — offer us a unique experimental stage to explore quantum transport in a tunable and precise way. The purpose of this Review is to show how they can contribute to statistical physics. We introduce the significance of fluctuation, or equivalently noise, as noise measurement enables us to address the fundamental aspects of a physical system. The significance of the fluctuation theorem (FT) in statistical physics is noted. We explain what information can be deduced from the current noise measurement in mesoscopic systems. As an important application of the noise measurement to statistical physics, we describe our experimental work on the current and current noise in an electron interferometer, which is the first experimental test of FT in the quantum regime. Our attempt will shed new light on the research field of mesoscopic quantum statistical physics. PMID:27477456
Modelling 1-minute directional observations of the global irradiance.
NASA Astrophysics Data System (ADS)
Thejll, Peter; Pagh Nielsen, Kristian; Andersen, Elsa; Furbo, Simon
2016-04-01
Direct and diffuse irradiances from the sky have been collected at 1-minute intervals for about a year from the experimental station at the Technical University of Denmark for the IEA project "Solar Resource Assessment and Forecasting". These data were gathered by pyrheliometers tracking the Sun, as well as with apertured pyranometers gathering 1/8th and 1/16th of the light from the sky in 45 degree azimuthal ranges pointed around the compass. The data are gathered in order to develop detailed models of the potentially available solar energy and its variations at high temporal resolution in order to gain a more detailed understanding of the solar resource. This is important for a better understanding of the sub-grid scale cloud variation that cannot be resolved with climate and weather models. It is also important for optimizing the operation of active solar energy systems such as photovoltaic plants and thermal solar collector arrays, and for passive solar energy and lighting to buildings. We present regression-based modelling of the observed data, and focus, here, on the statistical properties of the model fits. Using models based on the one hand on what is found in the literature and on physical expectations, and on the other hand on purely statistical models, we find solutions that can explain up to 90% of the variance in global radiation. The models leaning on physical insights include terms for the direct solar radiation, a term for the circum-solar radiation, a diffuse term and a term for the horizon brightening/darkening. The purely statistical model is found using data- and formula-validation approaches picking model expressions from a general catalogue of possible formulae. The method allows nesting of expressions, and the results found are dependent on and heavily constrained by the cross-validation carried out on statistically independent testing and training data-sets. Slightly better fits -- in terms of variance explained -- are found using the purely statistical fitting/searching approach. We describe the methods applied and the results found, and discuss the different potentials of the physics-based and statistics-only model searches.
Quantum Mechanics and the Principle of Least Radix Economy
NASA Astrophysics Data System (ADS)
Garcia-Morales, Vladimir
2015-03-01
A new variational method, the principle of least radix economy, is formulated. The mathematical and physical relevance of the radix economy, also called digit capacity, is established, showing how physical laws can be derived from this concept in a unified way. The principle reinterprets and generalizes the principle of least action yielding two classes of physical solutions: least action paths and quantum wavefunctions. A new physical foundation of the Hilbert space of quantum mechanics is then accomplished and it is used to derive the Schrödinger and Dirac equations and the breaking of the commutativity of spacetime geometry. The formulation provides an explanation of how determinism and random statistical behavior coexist in spacetime and a framework is developed that allows dynamical processes to be formulated in terms of chains of digits. These methods lead to a new (pre-geometrical) foundation for Lorentz transformations and special relativity. The Parker-Rhodes combinatorial hierarchy is encompassed within our approach and this leads to an estimate of the interaction strength of the electromagnetic and gravitational forces that agrees with the experimental values to an error of less than one thousandth. Finally, it is shown how the principle of least-radix economy naturally gives rise to Boltzmann's principle of classical statistical thermodynamics. A new expression for a general (path-dependent) nonequilibrium entropy is proposed satisfying the Second Law of Thermodynamics.
NASA Technical Reports Server (NTRS)
1981-01-01
The application of statistical methods to recorded ozone measurements is discussed. A long-term depletion of ozone at the magnitudes predicted by the NAS would be harmful to most forms of life. Empirical prewhitening filters, whose derivation is independent of the underlying physical mechanisms, were analyzed. Statistical analysis performs a checks-and-balances role. Time-series filtering separates variations into systematic and random parts, errors are uncorrelated, and significant phase-lag dependencies are identified. The use of time-series modeling to enhance the capability of detecting trends is discussed.
Statistical Mechanics of Combinatorial Auctions
NASA Astrophysics Data System (ADS)
Galla, Tobias; Leone, Michele; Marsili, Matteo; Sellitto, Mauro; Weigt, Martin; Zecchina, Riccardo
2006-09-01
Combinatorial auctions are formulated as frustrated lattice gases on sparse random graphs, allowing the determination of the optimal revenue by methods of statistical physics. Transitions between computationally easy and hard regimes are found and interpreted in terms of the geometric structure of the space of solutions. We introduce an iterative algorithm to solve intermediate and large instances, and discuss competing states of optimal revenue and maximal number of satisfied bidders. The algorithm can be generalized to the hard phase and to more sophisticated auction protocols.
Eye on the Ball: Table Tennis as a Pro-Health Form of Leisure-Time Physical Activity
Biernat, Elżbieta; Buchholtz, Sonia; Krzepota, Justyna
2018-01-01
Background: The article is devoted to an analysis of leisure-time (amateur) table tennis in Poland, its practitioners and the regularities of their activity. Methods: The study examined 12,406 persons in 4689 households (representative for the population). We used binary logistic regression and descriptive statistics in order to identify the patterns and determinants of table-tennis practice in Poland. Results: Table tennis is practised by 2.8% of population, and by 6.6% of physically active Poles. Among adults it is predominantly an occasional recreational game, not performed as a sport per se. Among children, it is often the part of physical education (PE) classes. Statistically significant predictors of contact with table tennis are: gender, age, income, place of residence, children in the household and being a student. Conclusions: Due to the undeniable benefits of table tennis (health, pleasure, personal and social development), the sport is recommended for use as a tool in increasing the (overall low) physical activity of Poles. Its popularization requires promotion in the media (as a health-oriented activity) and using various channels, including public places, the workplace (as part of corporate social responsibility) and physical education classes at school. PMID:29649160
2016-10-31
[Fragmentary record on a stochastic level-set method in statistical physics and engineering design optimisation; recoverable content: Sec. IV gives examples of the stochastic method, including matching a shape to a fixed design; re-initialization of the level set can lead to small but significant boundary movements; many relevant applications involve non-convex optimisation problems with multiple locally optimal solutions.]
NASA Astrophysics Data System (ADS)
Ibrahim, Hyatt Abdelhaleem
The effect of Guided Constructivism (Interactivity-Based Learning Environment) and Traditional Expository instructional methods on students' misconceptions about concepts of Newtonian Physics was investigated. Four groups comprising 79 University of Central Florida students enrolled in Physics 2048 participated in the study. A quasi-experimental design of nonrandomized, nonequivalent control and experimental groups was employed. The experimental group was exposed to the Guided Constructivist teaching method, while the control group was taught using the Traditional Expository teaching approach. The data collection instruments included the Force Concept Inventory Test (FCI), the Mechanics Baseline Test (MBT), and the Maryland Physics Expectation Survey (MPEX). The Guided Constructivist group had significantly higher means than the Traditional Expository group on the criterion variables of: (1) conceptions of Newtonian Physics, (2) achievement in Newtonian Physics, and (3) beliefs about the content of Physics knowledge, beliefs about the role of Mathematics in learning Physics, and overall beliefs about learning/teaching/appropriate roles of learners and teachers/nature of Physics. Further, significant relationships were found between (1) achievement, conceptual structures, beliefs about the content of Physics knowledge, and beliefs about the role of Mathematics in learning Physics; and (2) changes in misconceptions about the physical phenomena and changes in beliefs about the content of Physics knowledge. No statistically significant difference was found between the two teaching methods on achievement of males and females. These findings suggest that the nature of the teaching method produces real differences in conceptual learning. Furthermore, greater conceptual learning is fostered when teachers use interactivity-based teaching strategies to train students to link everyday experience in the real physical world to formal school concepts. The moderate effect size and power of the study suggest that the effect may not be subtle, but reliable. Physics teachers can use these results to inform their decisions about structuring the learning environment when conceptual learning is important.
Complex network problems in physics, computer science and biology
NASA Astrophysics Data System (ADS)
Cojocaru, Radu Ionut
There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no comparably close relation between physics and computer science. Moreover, only recently have biologists started to use methods and tools from statistical physics to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science in biology, and we also use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to get predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases and the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that some problems are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well-known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set, and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground state problems in disordered systems. We discuss the cavity method initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of a spin glass on the Bethe lattice at zero temperature and then apply this formalism to the K-SAT problem defined in Chapter 1. The phase transition which physicists study often corresponds to a change in the computational complexity of the corresponding computer science problem. Chapter 3 presents phase transitions which are specific to the problems discussed in Chapter 1 and also known results for the K-SAT problem. We discuss the replica method and experimental evidence of replica symmetry breaking. The physics approach to hard problems is based on replica methods, which are difficult to understand. In Chapter 4 we develop novel methods for studying hard problems using methods similar to the message passing techniques that were discussed in Chapter 2. Although we concentrated on the symmetric case, cavity methods show promise for generalizing our methods to the asymmetric case. As has been highlighted by John Hopfield, several key features of biological systems are not shared by physical systems. Although living entities follow the laws of physics and chemistry, the fact that organisms adapt and reproduce introduces an essential ingredient that is missing in the physical sciences. Many algorithms have been developed to extract information from networks. In Chapter 5 we apply polynomial algorithms such as minimum spanning tree to study and construct gene regulatory networks from experimental data. As future work we propose the use of algorithms such as min-cut/max-flow and Dijkstra's algorithm for understanding key properties of these networks.
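As a sketch of the Chapter 5 idea, under the simplifying assumption that co-expression distance is one minus absolute correlation, a minimum spanning tree backbone can be extracted from simulated expression data as follows; the data dimensions are invented.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(10)
expr = rng.normal(size=(40, 12))          # 40 genes x 12 conditions (hypothetical)

corr = np.corrcoef(expr)                  # gene-gene co-expression
dist = 1.0 - np.abs(corr)                 # strong correlation -> short edge
np.fill_diagonal(dist, 0.0)

mst = minimum_spanning_tree(dist)         # sparse (40, 40) backbone network
edges = np.transpose(mst.nonzero())
print(len(edges), "edges in the backbone")  # n - 1 = 39 edges for 40 genes
print(edges[:5])
```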
Daikoku, Tatsuya; Takahashi, Yuji; Futagami, Hiroko; Tarumoto, Nagayoshi; Yasuda, Hideki
2017-02-01
In real-world auditory environments, humans are exposed to overlapping auditory information, such as the sounds of human voices and musical instruments, even during routine physical activities such as walking and cycling. The present study investigated how concurrent physical exercise affects incidental and intentional learning of overlapping auditory streams, and whether physical fitness modulates learning performance. Participants were divided into two groups of 11 with lower and higher fitness, based on their VO2 max values. They were presented with simultaneous auditory sequences, each with a distinct statistical regularity (i.e. statistical learning), both while pedaling on a stationary bike and while sitting on the bike at rest. In experiment 1, they were instructed to attend to one of the two sequences and ignore the other. In experiment 2, they were instructed to attend to both sequences. After exposure to the sequences, learning effects were evaluated by a familiarity test. In experiment 1, statistical learning of the ignored sequences during concurrent pedaling was better in participants with high physical fitness than in those with low fitness, whereas for the attended sequence there was no significant difference in statistical learning between the high- and low-fitness groups. Furthermore, there was no significant effect of physical fitness on learning while resting. In experiment 2, participants with both high and low physical fitness could intentionally learn the statistics of the two simultaneous sequences in both the exercise and rest sessions. Higher physical fitness might thus facilitate incidental, but not intentional, statistical learning of simultaneous auditory sequences during concurrent physical exercise.
Classification without labels: learning from mixed samples in high energy physics
NASA Astrophysics Data System (ADS)
Metodiev, Eric M.; Nachman, Benjamin; Thaler, Jesse
2017-10-01
Modern machine learning techniques can be used to construct powerful models for difficult collider physics problems. In many applications, however, these models are trained on imperfect simulations due to a lack of truth-level information in the data, which risks the model learning artifacts of the simulation. In this paper, we introduce the paradigm of classification without labels (CWoLa) in which a classifier is trained to distinguish statistical mixtures of classes, which are common in collider physics. Crucially, neither individual labels nor class proportions are required, yet we prove that the optimal classifier in the CWoLa paradigm is also the optimal classifier in the traditional fully-supervised case where all label information is available. After demonstrating the power of this method in an analytical toy example, we consider a realistic benchmark for collider physics: distinguishing quark- versus gluon-initiated jets using mixed quark/gluon training samples. More generally, CWoLa can be applied to any classification problem where labels or class proportions are unknown or simulations are unreliable, but statistical mixtures of the classes are available.
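A minimal CWoLa demonstration under invented Gaussian "signal" and "background" distributions: the classifier is trained only on mixture membership, yet scores well against the hidden true labels.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(11)
def sample(n, f_sig):
    """Mixed sample: fraction f_sig of 'signal' (shifted Gaussian)."""
    is_sig = rng.random(n) < f_sig
    x = np.where(is_sig[:, None], rng.normal(1.0, 1.0, (n, 2)),
                 rng.normal(-1.0, 1.0, (n, 2)))
    return x, is_sig

x1, lab1 = sample(20_000, 0.8)            # mixture 1: signal-rich
x2, lab2 = sample(20_000, 0.2)            # mixture 2: background-rich

X = np.vstack([x1, x2])                   # train only on WHICH MIXTURE each
m = np.r_[np.ones(20_000), np.zeros(20_000)]  # event came from, never on truth
clf = LogisticRegression().fit(X, m)

scores = clf.predict_proba(X)[:, 1]       # evaluate against the hidden truth
print("AUC vs true labels:", roc_auc_score(np.r_[lab1, lab2], scores))
```

The trick works because the likelihood ratio between the two mixtures is monotonically related to the signal/background likelihood ratio, so the mixture classifier and the fully supervised classifier share the same optimal decision boundary.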
Impact of work environment and work-related stress on turnover intention in physical therapists.
Lee, Byoung-Kwon; Seo, Dong-Kwon; Lee, Jang-Tae; Lee, A-Ram; Jeon, Ha-Neul; Han, Dong-Uk
2016-08-01
[Purpose] This study was conducted to provide basic data for solutions to reduce the turnover rate of physical therapists. It should help create efficient personnel and organization management by exploring the impact of the work environment and work-related stress on turnover intention and analyzing the correlations between them. [Subjects and Methods] A survey was conducted with 236 physical therapists working at medical institutions in the Daejeon and Chungcheong areas. The collected data were analyzed with correlational and linear regression analyses using the SPSS 18.0 program and Cronbach's alpha coefficient. [Results] The results showed a statistically significant positive correlation between turnover intention and work-related stress, and a statistically significant negative correlation between turnover intention and work environment. Work-related stress (β=0.415) had a significant positive impact on turnover intention, and work environment (β=-0.387) had a significant negative impact on turnover intention. [Conclusion] To increase physical therapists' satisfaction with both the profession and the workplace, improvement of the work environment is the most necessary first step.
The Practicality of Statistical Physics Handout Based on KKNI and the Constructivist Approach
NASA Astrophysics Data System (ADS)
Sari, S. Y.; Afrizon, R.
2018-04-01
Evaluation of statistical physics lectures shows that: 1) the performance of lecturers, the social climate, students' competence, and the soft skills needed at work are in the adequate category; 2) students find the statistical physics lectures difficult to follow because the material is abstract; 3) 40.72% of students need reinforcement in the form of repetition, practice questions and structured tasks; and 4) the depth of the statistical physics material needs to be improved gradually and in a structured way. This indicates that learning materials aligned with the Indonesian National Qualification Framework, or Kerangka Kualifikasi Nasional Indonesia (KKNI), together with an appropriate learning approach, are needed to help lecturers and students in lectures. The author has designed statistical physics handouts that meet the very valid criterion (90.89%) according to expert judgment. In addition, the practical level of the handouts designed also needs to be considered so that they are easy to use, interesting and efficient in lectures. The purpose of this research is to determine the practicality of the statistical physics handout based on KKNI and a constructivist approach. This research is part of a research-and-development effort following the 4-D model developed by Thiagarajan and has reached the development-test phase of the Development stage. Data were collected using a questionnaire distributed to lecturers and students and analyzed with descriptive techniques in the form of percentages. The analysis of the questionnaire shows that the statistical physics handout meets the very practical criteria. The conclusion of this study is that statistical physics handouts based on KKNI and the constructivist approach are practical for use in lectures.
Teachers' approaches to teaching physics
NASA Astrophysics Data System (ADS)
2012-12-01
Benjamin Franklin said, "Tell me, and I forget. Teach me, and I remember. Involve me, and I learn." He would not be surprised to learn that research in physics pedagogy has consistently shown that the traditional lecture is the least effective method for teaching physics. We asked high school physics teachers which teaching activities they used in their classrooms. While almost all teachers still lecture sometimes, two-thirds use something other than lecture most of the time. The five most often-used activities are shown in the table below. In the January issue, we will look at the 2013 Nationwide Survey of High School Physics teachers. Susan White is Research Manager in the Statistical Research Center at the American Institute of Physics; she directs the Nationwide Survey of High School Physics Teachers. If you have any questions, please contact Susan at swhite@aip.org.
Petruseviciene, Daiva; Krisciūnas, Aleksandras; Sameniene, Jūrate
2002-01-01
In this article we analyze the influence of rehabilitation methods in the treatment of arm lymphedema. Sixty women who had undergone surgery for breast cancer were examined at Kaunas oncological hospital. The objective was to evaluate the efficiency of rehabilitation methods in the treatment of arm lymphedema and their effect on the movement amplitude of the shoulder joint. Women were divided into two groups depending on when rehabilitation started. The same rehabilitation methods were applied to both groups: physical therapy, electrostimulation, massage, and apparatus lymph drainage. Our study indicated that women who started rehabilitation early (at 3 months) showed statistically significantly (p < 0.01) better results in the increase of movement amplitude of the shoulder joint. However, treatment of arm lymphedema was equally successful in both groups: results were not statistically significantly better than in women who started rehabilitation after 12 months (p > 0.05).
NASA Astrophysics Data System (ADS)
Zhang, Weijia; Fuller, Robert G.
1998-05-01
A demographic database for the 139 Nobel prize winners in physics from 1901 to 1990 has been created from a variety of sources. The results of our statistical study are discussed in the light of the implications for physics teaching.
Active Learning in the Physics Classroom
NASA Astrophysics Data System (ADS)
Naron, Carol
Many students enter physics classes filled with misconceptions about physics concepts. Students tend to retain these misconceptions into their adult lives, even after physics instruction. Constructivist researchers have found that students gain understanding through their experiences. Researchers have also found that active learning practices increase conceptual understanding of introductory physics students. This project study sought to examine whether incorporating active learning practices in an advanced placement physics classroom increased conceptual understanding as measured by the force concept inventory (FCI). Physics students at the study site were given the FCI as both a pre- and posttest. Test data were analyzed using two different methods---a repeated-measures t test and the Hake gain method. The results of this research project showed that test score gains were statistically significant, as measured by the t test. The Hake gain results indicated a low (22.5%) gain for the class. The resulting project was a curriculum plan for teaching the mechanics portion of Advanced Placement (AP) physics B as well as several active learning classroom practices supported by the research. This project will allow AP physics teachers an opportunity to improve their curricular practices. Locally, the results of this project study showed that research participants gained understanding of physics concepts. Social change may occur as teachers implement active learning strategies, thus creating improved student understanding of physics concepts.
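For reference, the two analyses named above can be reproduced on hypothetical pre/post scores in a few lines; Hake's class-average normalized gain for percentage scores is g = (post_mean - pre_mean) / (100 - pre_mean).

```python
import numpy as np
from scipy.stats import ttest_rel

pre = np.array([35., 42., 28., 50., 61., 38., 45., 30.])    # hypothetical FCI %
post = np.array([48., 55., 36., 62., 70., 52., 58., 41.])

t, p = ttest_rel(post, pre)                  # repeated-measures t test
g = (post.mean() - pre.mean()) / (100.0 - pre.mean())  # Hake class gain
print("t = %.2f, p = %.4f, <g> = %.3f" % (t, p, g))
# Hake's convention: g < 0.3 "low", 0.3-0.7 "medium", > 0.7 "high"
```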
Advanced quantitative measurement methodology in physics education research
NASA Astrophysics Data System (ADS)
Wang, Jing
The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods and the interpretation of the results obtained by these methods should be connected to the educational context. In this connecting process, the issues of educational models are often raised. Many widely used statistical methods do not make assumptions on the mental structure of subjects, nor do they provide explanations tailored to the educational audience. There are also other methods that consider the mental structure and are tailored to provide strong connections between statistics and education. These methods often involve model assumption and parameter estimation, and are complicated mathematically. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and the pure mathematical methods. The comparison is based on the performance of the two types of methods under physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments. The dissertation includes three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions. The two theories are both applied to Force Concept Inventory data obtained from students at The Ohio State University. Effort was made to examine the similarities and differences between the two theories, and possible explanations for the differences. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on the measure of association for binary data. In quantitative assessment, binary data is often encountered because of its simplicity. The current popular measures of association fail under some extremely unbalanced conditions. However, the occurrence of these conditions is not rare in educational data. Two popular association measures, Pearson's correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail.
Results show that all the methods have their advantages and disadvantages, and that special attention to the test and data conditions is necessary. The last part of the dissertation focuses on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed, and typical misunderstandings and misuses of EFA are explored. EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment of scientific reasoning skills, and the analysis yields reasoning-ability structures for U.S. and Chinese students at different educational levels. A final discussion of the advanced quantitative assessment methodology versus the purely mathematical methodology is presented at the end.
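For readers who want the contrast made concrete, here is a minimal sketch, with invented counts, of the two association measures named above: Pearson's correlation for binary data (which reduces to the phi coefficient) and a maximum-likelihood tetrachoric correlation that assumes a dichotomized latent bivariate normal. This is an illustration only, not the dissertation's model-based method.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal
from scipy.optimize import minimize_scalar

# Hypothetical 2x2 table of binary item scores: counts for (0,0), (0,1), (1,0), (1,1)
a, b, c, d = 40, 10, 5, 45
n = a + b + c + d

# Pearson correlation of two binary variables is the phi coefficient
phi = (a * d - b * c) / np.sqrt((a + b) * (c + d) * (a + c) * (b + d))

# Tetrachoric: assume a latent bivariate normal dichotomized at these thresholds
t1 = norm.ppf((a + b) / n)   # threshold for item 1
t2 = norm.ppf((a + c) / n)   # threshold for item 2

def neg_log_lik(rho):
    bvn = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]])
    p00 = bvn.cdf([t1, t2])              # P(X <= t1, Y <= t2)
    p01 = norm.cdf(t1) - p00
    p10 = norm.cdf(t2) - p00
    p11 = 1 - p00 - p01 - p10
    probs = np.clip([p00, p01, p10, p11], 1e-12, 1)
    return -np.dot([a, b, c, d], np.log(probs))

rho_tet = minimize_scalar(neg_log_lik, bounds=(-0.99, 0.99), method="bounded").x
print(f"phi = {phi:.3f}, tetrachoric = {rho_tet:.3f}")
```

For highly unbalanced tables (one cell near zero) the two estimates diverge sharply, which is the failure mode the dissertation examines.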
Yang, Jaw-Yen; Yan, Chih-Yuan; Diaz, Manuel; Huang, Juan-Chen; Li, Zhihui; Zhang, Hanxin
2014-01-08
The ideal quantum gas dynamics as manifested by the semiclassical ellipsoidal-statistical (ES) equilibrium distribution derived in Wu et al. (2012 Proc. R. Soc. A 468, 1799-1823 (doi:10.1098/rspa.2011.0673)) is numerically studied for particles obeying each of the three statistics. This anisotropic ES equilibrium distribution was derived using the maximum entropy principle and conserves mass, momentum and energy, but differs from the standard Fermi-Dirac or Bose-Einstein distribution. The present numerical method combines the discrete velocity (or momentum) ordinate method in momentum space with the high-resolution shock-capturing method in physical space. A decoding procedure to obtain the parameters needed to determine the ES distribution is also devised. Computations of two-dimensional Riemann problems are presented, and various contours of the quantities unique to this ES model are illustrated. The main flow features, such as shock waves, expansion waves and slip lines and their complex nonlinear interactions, are depicted and found to be consistent with existing calculations for a classical gas.
Violence against Educated Women by Intimate Partners in Urban Karnataka, India
Kundapur, Rashmi; Shetty, Shruthi M.; Kempaller, Vinayak J.; Kumar, Ashwini; Anurupa, M.
2017-01-01
Background: Initially viewed as a human rights issue, partner violence is increasingly seen as an important public health problem of international concern. Objectives: To assess the extent of physical, sexual, psychological, and controlling behavior of intimate partners against women in an educated society, and to examine associations with age, age at marriage, years married, and the educational status of the women and their partners. Materials and Methods: An assumed prevalence of 15% was used to compute the sample size; the final sample was 200 after allowing for loss to follow-up. Statistical Methods: Proportion, Z-test, Chi-square test. Results: The prevalence of intimate partner violence among educated women in a South Indian city was found to be 40.5%. Physical assault was highest among women aged 30-50 years and increased with duration of marriage, from 5.5% at 5 years to 33.3% at 10-20 years of married life. Sexual and psychological assault likewise increased from <5 years of married life to 35% and 47.6%, respectively, at 10-20 years of marriage; these increases were statistically significant. Sexual and psychological assault showed a bimodal presentation. Less educated women and their partners were found to report more violence, which was statistically significant. Conclusion: Violence against women is not uncommon in the educated society. PMID:28852277
Rand, R.S.; Clark, R.N.; Livo, K.E.
2011-01-01
The Deepwater Horizon oil spill covered a very large geographical area in the Gulf of Mexico, creating potentially serious environmental impacts on both marine life and the coastal shorelines. Knowing the oil's areal extent and thickness, as well as denoting different categories of the oil's physical state, is important for assessing these impacts. High-spectral-resolution data from hyperspectral imagery (HSI) sensors such as the Airborne Visible and Infrared Imaging Spectrometer (AVIRIS) provide a valuable source of information that can be used by semi-automatic methods for tracking an oil spill's areal extent, oil thickness, and oil categories. However, the spectral behavior of oil in water is an inherently non-linear and variable phenomenon that changes depending on oil thickness and oil/water ratios. For certain oil thicknesses there are well-defined absorption features, whereas for very thin films there are sometimes almost no observable features. Feature-based imaging spectroscopy methods are particularly effective at classifying materials that exhibit specific, well-defined spectral absorption features; statistical methods are effective at classifying materials whose spectra exhibit considerable variability and do not necessarily show well-defined absorption features. This study investigates feature-based and statistical methods for analyzing oil spills using hyperspectral imagery. The appropriate use of each approach is investigated, and a combined feature-based and statistical method is proposed.
A Localized Ensemble Kalman Smoother
NASA Technical Reports Server (NTRS)
Butala, Mark D.
2012-01-01
Numerous geophysical inverse problems prove difficult because the available measurements are indirectly related to the underlying unknown dynamic state, and the physics governing the system may involve imperfect models or unobserved parameters. Data assimilation addresses these difficulties by combining the measurements and physical knowledge. The main challenge in such problems usually lies in their high dimensionality, which renders standard statistical methods computationally intractable. This paper develops, and addresses the theoretical convergence of, a new high-dimensional Monte-Carlo approach called the localized ensemble Kalman smoother.
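As a rough illustration of the flavor of such methods, the sketch below performs a single localized ensemble Kalman analysis step on synthetic data: the sample covariance is tapered (Schur product) with a distance-based localization matrix before the gain is computed. This is a generic textbook construction, not Butala's smoother, and every dimension and parameter here is invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, N = 50, 10, 20          # state dim, obs dim, ensemble size
H = np.zeros((m, n)); H[np.arange(m), np.arange(0, n, 5)] = 1.0  # observe every 5th point
R = 0.1 * np.eye(m)           # observation error covariance

def taper(dist, L):
    # simple compactly supported correlation taper (placeholder for Gaspari-Cohn)
    return np.clip(1.0 - dist / L, 0.0, 1.0)

x_grid = np.arange(n)
loc = taper(np.abs(x_grid[:, None] - x_grid[None, :]), L=10.0)

X = rng.normal(size=(n, N))                  # forecast ensemble (columns = members)
y = rng.normal(size=m)                       # observation vector
A = (X - X.mean(axis=1, keepdims=True)) / np.sqrt(N - 1)
P_loc = loc * (A @ A.T)                      # localized sample covariance
K = P_loc @ H.T @ np.linalg.inv(H @ P_loc @ H.T + R)   # Kalman gain

# perturbed-observation update of each ensemble member
Y = y[:, None] + rng.multivariate_normal(np.zeros(m), R, size=N).T
Xa = X + K @ (Y - H @ X)                     # analysis ensemble
print(Xa.shape)
```

A smoother applies the same kind of localized update across a window of past states rather than the current state alone.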
Introducing Students to Plant Geography: Polar Ordination Applied to Hanging Gardens.
ERIC Educational Resources Information Center
Malanson, George P.; And Others
1993-01-01
Reports on a research study in which college students used a statistical ordination method to reveal relationships among plant community structures and physical, disturbance, and spatial variables. Concludes that polar ordination helps students understand the methodology of plant geography and encourages further student research. (CFR)
McIver, Kerry L.; Brown, William H.; Pfeiffer, Karin A.; Dowda, Marsha; Pate, Russell R.
2016-01-01
Purpose This study describes the development and pilot testing of the Observational System for Recording Physical Activity-Elementary School (OSRAC-E) version. Methods This system was developed to observe and document the levels and types of physical activity and physical and social contexts of physical activity in elementary school students during the school day. Inter-observer agreement scores and summary data were calculated. Results All categories had Kappa statistics above 0.80, with the exception of the activity initiator category. Inter-observer agreement scores were 96% or greater. The OSRAC-E was shown to be a reliable observation system that allows researchers to assess physical activity behaviors, the contexts of those behaviors, and the effectiveness of physical activity interventions in the school environment. Conclusion The OSRAC-E can yield data with high interobserver reliability and provide relatively extensive contextual information about physical activity of students in elementary schools. PMID:26889587
[Physical activity, obesity and self-esteem in chilean schoolchildren].
Zurita-Ortega, Félix; Castro-Sánchez, Manuel; Rodríguez-Fernández, Sonia; Cofré-Boladós, Cristian; Chacón-Cuberos, Ramón; Martínez-Martínez, Asunción; Muros-Molina, José Joaquín
2017-03-01
Obesity is a worldwide epidemic and a problem for Chilean society. The aim of this study was to analyze the relationship between physical condition, body mass index (BMI), level of physical activity and self-esteem. Material and Methods: Questionnaires assessing self-esteem (Rosemberg scale) and levels of physical activity (Physical Activity Questionnaire for older Children, PAQ-C) were answered by 515 children aged 10.5 ± 0.5 years from 27 schools of Santiago de Chile. BMI was calculated, the Course-Navette test was carried out, and vertical jump and hand dynamometry were measured. Structural equation modeling was used for the statistical analysis, and an acceptable goodness of fit for the models was found. There was a positive relationship between BMI and hand dynamometry, as well as negative relationships between BMI and maximal oxygen consumption, jumping ability, physical activity and self-esteem. Finally, self-esteem was positively related to engagement in physical activity. In these children, self-esteem was related to physical activity variables.
ecode - Electron Transport Algorithm Testing v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franke, Brian C.; Olson, Aaron J.; Bruss, Donald Eugene
2016-10-05
ecode is a Monte Carlo code used for testing algorithms related to electron transport. The code can read basic physics parameters, such as energy-dependent stopping powers and screening parameters. The code permits simple planar geometries of slabs or cubes. Parallelization consists of domain replication, with work distributed at the start of the calculation and statistical results gathered at the end of the calculation. Some basic routines (such as input parsing, random number generation, and statistics processing) are shared with the Integrated Tiger Series codes. A variety of algorithms for uncertainty propagation are incorporated based on the stochastic collocation and stochastic Galerkin methods. These permit uncertainty only in the total and angular scattering cross sections. The code contains algorithms for simulating stochastic mixtures of two materials. The physics is approximate, ranging from mono-energetic and isotropic scattering to screened Rutherford angular scattering and Rutherford energy-loss scattering (simple electron transport models). No production of secondary particles is implemented, and no photon physics is implemented.
Physics First: Impact on SAT Math Scores
NASA Astrophysics Data System (ADS)
Bouma, Craig E.
Improving science, technology, engineering, and mathematics (STEM) education has become a national priority, and the call to modernize secondary science has been heard. A Physics First (PF) program with the curriculum sequence of physics, chemistry, and biology (PCB), driven by inquiry- and project-based learning, offers a viable alternative to the traditional curricular sequence (BCP) and methods of teaching, but requires more empirical evidence. This study determined the impact of a PF program (PF-PCB) on math achievement (SAT math scores) after the first two cohorts of students completed the PF-PCB program at Matteo Ricci High School (MRHS), providing more quantitative data to inform the PF debate and advance secondary science education. Statistical analysis (ANCOVA) determined the influence of covariates and revealed that the PF-PCB program had a significant (p < .05) impact on SAT math scores in the second cohort at MRHS. Statistically adjusted, the SAT math means for PF students were 21.4 points higher than those of their non-PF counterparts when controlling for prior math achievement (HSTP math), socioeconomic status (SES), and ethnicity/race.
Equilibrium statistical-thermal models in high-energy physics
NASA Astrophysics Data System (ADS)
Tawfik, Abdel Nasser
2014-05-01
We review some recent highlights from the applications of statistical-thermal models to different experimental measurements and to lattice QCD thermodynamics over the last decade. We start with a short review of the historical milestones on the path to constructing statistical-thermal models for heavy-ion physics. We find that Heinz Koppe formulated, in 1948, an almost complete recipe for the statistical-thermal models. In 1950, Enrico Fermi generalized this statistical approach: he started with a general cross-section formula and inserted into it simplifying assumptions about the matrix element of the interaction process, assumptions that likely reflect many features of high-energy reactions dominated by the density in the phase space of final states. In 1964, Hagedorn systematically analyzed high-energy phenomena using all the tools of statistical physics and introduced the concept of a limiting temperature based on the statistical bootstrap model. It turns out that many-particle systems can quite often be studied with the help of statistical-thermal methods. The analysis of yield multiplicities in high-energy collisions gives overwhelming evidence for chemical equilibrium in the final state. The strange particles might be an exception, as they are suppressed at lower beam energies; however, their relative yields fulfill statistical equilibrium as well. We review the equilibrium statistical-thermal models for particle production, fluctuations and collective flow in heavy-ion experiments, and review their reproduction of lattice QCD thermodynamics at vanishing and finite chemical potential. During the last decade, five conditions have been suggested to describe the universal behavior of the chemical freeze-out parameters. The higher-order moments of multiplicity have also been discussed; they offer deep insights into particle production and critical fluctuations, and we therefore use them to describe the freeze-out parameters and to suggest the location of the QCD critical endpoint. Various extensions have been proposed in order to take into consideration possible deviations from the ideal hadron gas; we highlight various types of interactions, dissipative properties and location dependences (spatial rapidity). Furthermore, we review three models combining hadronic with partonic phases: the quasi-particle model, the linear sigma model with Polyakov potentials, and the compressible bag model.
Multivariate Statistical Analysis of Cigarette Design Feature Influence on ISO TNCO Yields.
Agnew-Heard, Kimberly A; Lancaster, Vicki A; Bravo, Roberto; Watson, Clifford; Walters, Matthew J; Holman, Matthew R
2016-06-20
The aim of this study is to explore how differences in cigarette physical design parameters influence tar, nicotine, and carbon monoxide (TNCO) yields in mainstream smoke (MSS) using the International Organization for Standardization (ISO) smoking regimen. Standardized smoking methods were used to evaluate 50 U.S. domestic brand cigarettes and a reference cigarette representing a range of TNCO yields in MSS collected from linear smoking machines using a nonintense smoking regimen. Multivariate statistical methods were used to form clusters of cigarettes based on their ISO TNCO yields and then to explore the relationship between the ISO-generated TNCO yields and the nine cigarette physical design parameters, between and within each cluster simultaneously. The ISO-generated TNCO yields in MSS are 1.1-17.0 mg tar/cigarette, 0.1-2.2 mg nicotine/cigarette, and 1.6-17.3 mg CO/cigarette. Cluster analysis divided the 51 cigarettes into five discrete clusters based on their ISO TNCO yields. No one physical parameter dominated across all clusters. Predicting ISO machine-generated TNCO yields from these nine physical design parameters is complex due to the correlation among and between the nine physical design parameters and the TNCO yields. From these analyses, it is estimated that approximately 20% of the variability in the ISO-generated TNCO yields comes from other parameters (e.g., filter material, filter type, inclusion of expanded or reconstituted tobacco, and tobacco blend composition, along with differences in tobacco leaf origin and stalk positions and added ingredients). A future article will examine the influence of these physical design parameters on TNCO yields under a Canadian Intense (CI) smoking regimen. Together, these papers will provide a more robust picture of the design features that contribute to TNCO exposure across the range of real-world smoking patterns.
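A minimal sketch of the first analysis step described above, clustering products by their TNCO yields, might look as follows. The yields here are randomly generated within the reported ranges, and the scikit-learn pipeline is an assumption for illustration, not the authors' software.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
tnco = np.column_stack([
    rng.uniform(1.1, 17.0, 51),   # tar, mg/cigarette (synthetic)
    rng.uniform(0.1, 2.2, 51),    # nicotine, mg/cigarette (synthetic)
    rng.uniform(1.6, 17.3, 51),   # CO, mg/cigarette (synthetic)
])

z = StandardScaler().fit_transform(tnco)      # put the three yields on one scale
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(z)

for k in range(5):                            # cluster sizes and mean yields
    print(k, (labels == k).sum(), tnco[labels == k].mean(axis=0).round(2))
```

The within- and between-cluster relationships to design parameters would then be explored separately, as the abstract describes.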
Assessment of Electronic Government Information Products
1999-03-30
[Abstract reduced to extraction fragments: a table of EPA electronic information products (Center for Environmental Info. & Statistics; Consumer Handbook for Reducing Solid Waste; Envirofacts Warehouse; EPA Online Library System (OLS); Hazardous Waste Site Query (CERCLIS Data); Surf Your Watershed; Test Methods for Evaluating Solid Waste: Physical/Chemical Methods (SW-846)), followed by a remark that SGML is considered "intelligent data" that can automatically generate other formats (e.g., web, BBS, fax-on-demand) through templates.]
WE-A-201-02: Modern Statistical Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niemierko, A.
Chris Marshall: Memorial Introduction. Donald Edmonds Herbert Jr., or Don to his colleagues and friends, exemplified the "big tent" vision of medical physics, specializing in Applied Statistics and Dynamical Systems theory. He saw, more clearly than most, that "Making models is the difference between doing science and just fooling around [ref Woodworth, 2004]". Don developed an interest in chemistry at school by "reading a book" - a recurring theme in his story. He was awarded a Westinghouse Science scholarship and attended the Carnegie Institute of Technology (later Carnegie Mellon University), where his interest turned to physics and led to a BS in Physics after transfer to Northwestern University. After (voluntary) service in the Navy he earned his MS in Physics from the University of Oklahoma, which led him to Johns Hopkins University in Baltimore to pursue a PhD. The early death of his wife led him to take a salaried position in the Physics Department of Colorado College in Colorado Springs so as to better care for their young daughter. There, a chance invitation from Dr. Juan del Regato to teach physics to residents at the Penrose Cancer Hospital introduced him to Medical Physics, and he decided to enter the field. He received his PhD from the University of London (UK) under Prof. Joseph Rotblat, where I first met him, and where he taught himself statistics. He returned to Penrose as a clinical medical physicist, also largely self-taught. In 1975 he formalized an evolving interest in statistical analysis as Professor of Radiology and Head of the Division of Physics and Statistics at the College of Medicine of the University of South Alabama in Mobile, AL, where he remained for the rest of his career. He also served as the first Director of their Bio-Statistics and Epidemiology Core Unit, working in part on sickle-cell disease. After retirement he remained active as Professor Emeritus. Don served for several years as a consultant to the Nuclear Regulatory Commission and may be remembered for his critique of the National Academy of Sciences BEIR III report (stating that their methodology "imposes a Delphic quality to the ... risk estimates"). This led to his appointment as a member of the BEIR V committee. Don presented refresher courses at the AAPM, ASTRO and RSNA meetings and was active in the AAPM as a member or chair of several committees. He was the principal author of AAPM Report 43, which is essentially a critique of established clinical studies prior to 1992. He was co-editor of the Proceedings of many symposia on Time, Dose and Fractionation held in Madison, Wisconsin. He received the AAPM Lifetime Achievement Award in 2004. Don's second wife of 46 years, Ann, predeceased him, and he is survived by daughters Hillary and Emily, son John and two grandsons. Don was a true gentleman with a unique and erudite writing style illuminated by pithy quotations. If he had a fault it was, perhaps, that he did not realize how much smarter he was than the rest of us. This presentation draws heavily on a biography and video interview in the History and Heritage section of the AAPM website. The quote is his own. Andrzej Niemierko: Statistical modeling plays an essential role in modern medicine for the quantitative evaluation of the effect of treatment. This session will feature an overview of statistical modeling techniques used for analyzing the many types of research data and an exploration of recent advances in new statistical modeling methodologies.
Learning Objectives: To learn the basics of statistical modeling methodology; to discuss statistical models that are frequently used in radiation oncology; to discuss advanced modern statistical modeling methods and applications.
Vieira, Rute; McDonald, Suzanne; Araújo-Soares, Vera; Sniehotta, Falko F; Henderson, Robin
2017-09-01
N-of-1 studies are based on repeated observations of an individual or unit over time and are acknowledged as an important research method for generating scientific evidence about the health or behaviour of an individual. Statistical analyses of n-of-1 data require accurate modelling of the outcome, accounting for its distribution, time-related trend and error structure (e.g., autocorrelation), as well as reporting readily usable, contextualised effect sizes for decision-making. A number of statistical approaches have been documented, but no consensus exists on which method is most appropriate for which type of n-of-1 design. We discuss the statistical considerations for analysing n-of-1 studies and briefly review some currently used methodologies. We describe dynamic regression modelling as a flexible and powerful approach, adaptable to different types of outcomes and capable of dealing with the different challenges inherent in n-of-1 statistical modelling. Dynamic modelling borrows ideas from longitudinal and event-history methodologies, which explicitly incorporate the role of time and the influence of the past on the future. We also present an illustrative example of the use of dynamic regression for monitoring physical activity during the retirement transition. Dynamic modelling has the potential to expand researchers' access to robust and user-friendly statistical methods for individualised studies.
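A minimal sketch of the kind of dynamic regression described here, under the assumption of a single lagged-outcome term to absorb autocorrelation plus an intervention indicator and a linear trend; the simulated daily step-count data and all coefficients are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
T = 120
t = np.arange(T)
intervention = (t >= 60).astype(float)        # hypothetical retirement/intervention day

# simulate an autocorrelated single-subject outcome
steps = np.empty(T)
steps[0] = 8000
for i in range(1, T):
    steps[i] = 2000 + 0.7 * steps[i - 1] - 800 * intervention[i] + rng.normal(0, 300)

# dynamic regression: today's outcome on yesterday's outcome, intervention, trend
y = steps[1:]
X = sm.add_constant(np.column_stack([steps[:-1], intervention[1:], t[1:]]))
fit = sm.OLS(y, X).fit()
print(fit.params)   # constant, lagged-outcome, intervention, and trend coefficients
```

The lagged term makes the residuals approximately independent, so the intervention coefficient can be read as a short-run effect conditional on the past.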
Visell, Yon
2015-04-01
This paper proposes a fast, physically accurate method for synthesizing multimodal, acoustic and haptic, signatures of distributed fracture in quasi-brittle heterogeneous materials, such as wood, granular media, or other fiber composites. Fracture processes in these materials are challenging to simulate with existing methods, due to the prevalence of large numbers of disordered, quasi-random spatial degrees of freedom, representing the complex physical state of a sample over the geometric volume of interest. Here, I develop an algorithm for simulating such processes, building on a class of statistical lattice models of fracture that have been widely investigated in the physics literature. This algorithm is enabled through a recently published mathematical construction based on the inverse transform method of random number sampling. It yields a purely time domain stochastic jump process representing stress fluctuations in the medium. The latter can be readily extended by a mean field approximation that captures the averaged constitutive (stress-strain) behavior of the material. Numerical simulations and interactive examples demonstrate the ability of these algorithms to generate physically plausible acoustic and haptic signatures of fracture in complex, natural materials interactively at audio sampling rates.
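The inverse transform construction mentioned above can be conveyed with a toy example: draw jump magnitudes by inverting a power-law CDF and accumulate them into a time-domain impulse train at an audio rate. The exponent, event rate, and signal layout are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_power_law(u, s_min=1.0, alpha=2.5):
    # invert F(s) = 1 - (s_min / s)**(alpha - 1)  =>  s = s_min * (1 - u)**(-1/(alpha-1))
    return s_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

rate = 200.0                       # mean jumps per second (illustrative)
fs = 44100                         # audio sampling rate (Hz)
duration = 1.0                     # seconds

n_events = rng.poisson(rate * duration)
times = np.sort(rng.uniform(0, duration, n_events))
sizes = sample_power_law(rng.uniform(size=n_events))

signal = np.zeros(int(fs * duration))
signal[(times * fs).astype(int)] += sizes      # impulsive stress-drop train
print(n_events, signal.max())
```

Convolving such a train with a resonator response is a common way to turn stress-fluctuation impulses into an audible crackling signature.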
NASA Astrophysics Data System (ADS)
Singh Pradhan, Ananta Man; Kang, Hyo-Sub; Kim, Yun-Tae
2016-04-01
This study uses a physically based approach to evaluate the factor of safety of hillslopes under different hydrological conditions in Mt Umyeon, south of Seoul. The hydrological conditions were determined using the rainfall intensity and duration associated with a known landslide inventory covering the whole of Korea. Quantile regression was used to ascertain different probability warning levels on the basis of rainfall thresholds. Physically based models are easily interpreted and have high predictive capabilities, but rely on spatially explicit and accurate parameterization, which is commonly not possible. Statistical probabilistic methods can include other causative factors that influence slope stability, such as forest, soil and geology, but rely on good landslide inventories for the site. This study describes a hybrid approach that combines physically based landslide susceptibility for different hydrological conditions. A presence-only maximum entropy model was used to build the hybrid and to analyze the relation of landslides to the conditioning factors. About 80% of the landslides were listed among the unstable sites identified by the proposed model, demonstrating its effectiveness and accuracy in determining unstable areas and areas that require evacuation. The cumulative rainfall thresholds provide a valuable reference to guide disaster prevention authorities in the issuance of warning levels, with the ability to reduce losses and save lives.
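The quantile-regression thresholding step can be sketched as follows, assuming the usual power-law intensity-duration form I = a * D^b fitted in log-log space at several quantiles to define graded warning levels; the rainfall data here are synthetic.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
D = rng.uniform(1, 72, 300)                            # duration (h) of triggering rainfall
I = 15 * D ** -0.6 * np.exp(rng.normal(0, 0.3, 300))   # intensity (mm/h), synthetic
df = pd.DataFrame({"logD": np.log(D), "logI": np.log(I)})

# low quantiles give conservative thresholds, high quantiles near-certain triggering
for q in (0.05, 0.5, 0.95):
    fit = smf.quantreg("logI ~ logD", df).fit(q=q)
    a, b = np.exp(fit.params["Intercept"]), fit.params["logD"]
    print(f"q={q}: I = {a:.1f} * D^{b:.2f}")
```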
NASA Astrophysics Data System (ADS)
Mercer, Gary J.
This quantitative study examined the relationship between secondary students' math anxiety and physics performance in an inquiry-based constructivist classroom. The Revised Math Anxiety Rating Scale was used to evaluate math anxiety levels, and the results were compared to performance on a standardized physics final examination. A simple correlation analysis was performed, followed by a multivariate regression analysis to examine effects based on gender and prior math background. The correlation between math anxiety and physics performance was statistically significant. The regression analysis showed statistical significance for the relationship among math anxiety, physics performance, and prior math background, but not for the relationship among math anxiety, physics performance, and gender.
AGR-1 Thermocouple Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Einerson
2012-05-01
This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain the fuel target temperature in future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of the physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of the statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in the presentation of AGR-1 measured data (Chapter 2) and the interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.
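The regression-based control idea in Chapter 4 can be caricatured in a few lines: fit a linear regression of the simulated peak fuel temperature on a thermocouple reading and a control parameter, then invert the fitted relationship to choose the control setting for a target temperature. The variables, coefficients, and targets below are invented, not AGR-1 values.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
tc = rng.normal(1000, 30, n)            # thermocouple reading (C), synthetic
gas = rng.uniform(0, 1, n)              # hypothetical gas-mix control fraction
T_peak = 150 + 0.9 * tc - 120 * gas + rng.normal(0, 5, n)  # simulated peak fuel temp

# regression relationship: T_peak ~ b0 + b1*tc + b2*gas
X = np.column_stack([np.ones(n), tc, gas])
beta, *_ = np.linalg.lstsq(X, T_peak, rcond=None)

# invert the fit: choose the gas setting holding T_peak at a target, given a reading
target, reading = 1000.0, 1020.0
gas_setting = (target - beta[0] - beta[1] * reading) / beta[2]
print(round(gas_setting, 3))            # roughly 0.57 with these synthetic coefficients
```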
Estimation of Global Network Statistics from Incomplete Data
Bliss, Catherine A.; Danforth, Christopher M.; Dodds, Peter Sheridan
2014-01-01
Complex networks underlie an enormous variety of social, biological, physical, and virtual systems. A profound complication for the science of complex networks is that in most cases, observing all nodes and all network interactions is impossible. Previous work addressing the impacts of partial network data is surprisingly limited, focuses primarily on missing nodes, and suggests that network statistics derived from subsampled data are not suitable estimators for the same network statistics describing the overall network topology. We generate scaling methods to predict true network statistics, including the degree distribution, from only partial knowledge of nodes, links, or weights. Our methods are transparent and do not assume a known generating process for the network, thus enabling prediction of network statistics for a wide variety of applications. We validate analytical results on four simulated network classes and empirical data sets of various sizes. We perform subsampling experiments by varying proportions of sampled data and demonstrate that our scaling methods can provide very good estimates of true network statistics while acknowledging limits. Lastly, we apply our techniques to a set of rich and evolving large-scale social networks, Twitter reply networks. Based on 100 million tweets, we use our scaling techniques to propose a statistical characterization of the Twitter Interactome from September 2008 to November 2008. Our treatment allows us to find support for Dunbar's hypothesis in detecting an upper threshold for the number of active social contacts that individuals maintain over the course of one week. PMID:25338183
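A toy version of the underlying problem (not the authors' estimators) is easy to reproduce: an induced subgraph on a fraction p of nodes retains each edge with probability about p^2, so the naive subsample mean degree underestimates the truth by a factor of roughly p.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(6)
G = nx.barabasi_albert_graph(10000, 3, seed=6)
true_mean_deg = 2 * G.number_of_edges() / G.number_of_nodes()

p = 0.2                                             # fraction of nodes observed
nodes = rng.choice(list(G.nodes()), size=int(p * G.number_of_nodes()), replace=False)
S = G.subgraph(nodes)
naive = 2 * S.number_of_edges() / S.number_of_nodes()

# edges survive node sampling with probability ~p^2, nodes with probability p,
# so the induced-subgraph mean degree is biased low by a factor ~p
rescaled = naive / p
print(true_mean_deg, naive, rescaled)
```

The paper's scaling methods generalize this kind of correction to the full degree distribution and to link- and weight-sampled data.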
Statistical variances of diffusional properties from ab initio molecular dynamics simulations
NASA Astrophysics Data System (ADS)
He, Xingfeng; Zhu, Yizhou; Epstein, Alexander; Mo, Yifei
2018-12-01
Ab initio molecular dynamics (AIMD) simulation is widely employed in studying diffusion mechanisms and in quantifying diffusional properties of materials. However, AIMD simulations are often limited to a few hundred atoms and a short, sub-nanosecond physical timescale, which leads to models that include only a limited number of diffusion events. As a result, the diffusional properties obtained from AIMD simulations are often plagued by poor statistics. In this paper, we re-examine the process to estimate diffusivity and ionic conductivity from the AIMD simulations and establish the procedure to minimize the fitting errors. In addition, we propose methods for quantifying the statistical variance of the diffusivity and ionic conductivity from the number of diffusion events observed during the AIMD simulation. Since an adequate number of diffusion events must be sampled, AIMD simulations should be sufficiently long and can only be performed on materials with reasonably fast diffusion. We chart the ranges of materials and physical conditions that can be accessible by AIMD simulations in studying diffusional properties. Our work provides the foundation for quantifying the statistical confidence levels of diffusion results from AIMD simulations and for correctly employing this powerful technique.
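As a sketch of the basic estimation step, the following computes a diffusivity from the mean squared displacement via the Einstein relation, D = MSD(t)/(2 d t), and attaches a crude statistical variance by block averaging over trajectory segments; a random walk stands in for the AIMD trajectory, and the block count is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(7)
dt, nsteps, natoms, dim = 1e-3, 20000, 32, 3       # ps per step, steps, atoms, dims
steps = rng.normal(0, 0.05, (nsteps, natoms, dim))
traj = np.cumsum(steps, axis=0)                    # unwrapped positions (Angstrom)

def diffusivity(x, dt, dim):
    disp = x - x[0]
    msd = (disp ** 2).sum(axis=2).mean(axis=1)     # average over atoms
    t = np.arange(len(msd)) * dt
    # fit the linear regime, skipping the earliest (non-diffusive) portion
    slope = np.polyfit(t[len(t)//10:], msd[len(t)//10:], 1)[0]
    return slope / (2 * dim)

blocks = np.array_split(traj, 5)                   # block averaging for the variance
D = np.array([diffusivity(b, dt, dim) for b in blocks])
print(f"D = {D.mean():.4f} +/- {D.std(ddof=1)/np.sqrt(len(D)):.4f} A^2/ps")
```

The spread across blocks is exactly the kind of statistical variance the paper argues must accompany any AIMD diffusivity.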
NASA Astrophysics Data System (ADS)
Gillam, Thomas P. S.; Lester, Christopher G.
2014-11-01
We consider current and alternative approaches to setting limits on new physics signals having backgrounds from misidentified objects; for example, jets misidentified as leptons, b-jets or photons. Many ATLAS and CMS analyses have used a heuristic "matrix method" for estimating the background contribution from such sources. We demonstrate that the matrix method suffers from statistical shortcomings that can adversely affect its ability to set robust limits. A rigorous alternative method is discussed and is seen to produce fake-rate estimates and limits with better properties, but is found to be too costly to use. Having investigated the nature of the approximations used to derive the matrix method, we propose a third strategy that is seen to marry the speed of the matrix method to the performance and physicality of the more rigorous approach.
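For concreteness, the heuristic matrix method under discussion can be written, in its simplest single-object form, as the inversion of a 2x2 system relating the observed loose and tight counts to the unknown real and fake contributions; the efficiencies and counts below are illustrative.

```python
import numpy as np

r, f = 0.90, 0.15                 # tight-selection efficiencies for real and fake objects
N_loose, N_tight = 1000.0, 600.0  # observed event counts in the loose and tight samples

# N_loose = N_real + N_fake
# N_tight = r*N_real + f*N_fake
A = np.array([[1.0, 1.0],
              [r,   f  ]])
N_real, N_fake = np.linalg.solve(A, [N_loose, N_tight])

fake_in_tight = f * N_fake        # estimated fake background in the tight sample
print(round(N_real), round(N_fake), round(fake_in_tight))   # 600, 400, 60 here
```

The statistical shortcoming criticized in the paper arises because this inversion can return negative or wildly fluctuating yields when the counts are small or r and f are close.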
Kate, Rohit J.; Swartz, Ann M.; Welch, Whitney A.; Strath, Scott J.
2016-01-01
Wearable accelerometers can be used to objectively assess physical activity. However, the accuracy of this assessment depends on the underlying method used to process the time-series data obtained from the accelerometers. Several methods have been proposed that use these data to identify the type of physical activity and estimate its energy cost. Most of the newer methods employ some machine learning technique along with suitable features to represent the time-series data. This paper experimentally compares several of these techniques and features on a large dataset of 146 subjects performing eight different physical activities while wearing an accelerometer on the hip. Besides features based on statistics, distance-based features and simple discrete features taken straight from the time series were also evaluated. On the physical activity type identification task, the results show that using more features significantly improves results, and the choice of machine learning technique is also important. On the energy cost estimation task, however, the choice of features and machine learning technique was found to be less influential. On that task, separate energy cost estimation models trained specifically for each type of physical activity were found to be more accurate than a single model trained for all types of physical activities. PMID:26862679
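A minimal sketch of the common pipeline compared in the paper, windowed statistical features fed to a classifier, might look like this; the synthetic signals, window length, feature set, and choice of random forest are assumptions for illustration, not the paper's configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(8)

def windows(signal, width=300):                    # e.g., 10 s windows at 30 Hz
    k = len(signal) // width
    return signal[:k * width].reshape(k, width)

def features(w):                                   # simple statistical features per window
    return np.column_stack([w.mean(1), w.std(1),
                            np.abs(np.diff(w, axis=1)).mean(1),   # mean abs jerk
                            np.percentile(w, 90, axis=1)])

walk = rng.normal(1.0, 0.8, 30000)                 # synthetic vertical-axis signal
sit = rng.normal(0.1, 0.1, 30000)
X = np.vstack([features(windows(walk)), features(windows(sit))])
y = np.array([1] * 100 + [0] * 100)                # activity labels per window

print(cross_val_score(RandomForestClassifier(n_estimators=100), X, y, cv=5).mean())
```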
Learning style and teaching method preferences of Saudi students of physical therapy
Al Maghraby, Mohamed A.; Alshami, Ali M.
2013-01-01
Context: To the researchers’ knowledge, there are no published studies that have investigated the learning styles and preferred teaching methods of physical therapy students in Saudi Arabia. Aim: The study was conducted to determine the learning styles and preferred teaching methods of Saudi physical therapy students. Settings and Design: A cross-sectional study design. Materials and Methods: Fifty-three Saudis studying physical therapy (21 males and 32 females) participated in the study. The principal researcher gave an introductory lecture to explain the different learning styles and common teaching methods. Upon completion of the lecture, questionnaires were distributed, and were collected on completion. Statistical Analysis Used: Percentages were calculated for the learning styles and teaching methods. Pearson’s correlations were performed to investigate the relationship between them. Results: More than 45 (85%) of the students rated hands-on training as the most preferred teaching method. Approximately 30 (57%) students rated the following teaching methods as the most preferred methods: “Advanced organizers,” “demonstrations,” and “multimedia activities.” Although 31 (59%) students rated the concrete-sequential learning style the most preferred, these students demonstrated mixed styles on the other style dimensions: Abstract-sequential, abstract-random, and concrete-random. Conclusions: The predominant concrete-sequential learning style is consistent with the most preferred teaching method (hands-on training). The high percentage of physical therapy students whose responses were indicative of mixed learning styles suggests that they can accommodate multiple teaching methods. It is recommended that educators consider the diverse learning styles of the students and utilize a variety of teaching methods in order to promote an optimal learning environment for the students. PMID:24672278
Profile of Prospective Physics Teachers on Assessment Literacy
NASA Astrophysics Data System (ADS)
Efendi, R.; Rustaman, N. Y.; Kaniawati, I.
2017-02-01
A study of the assessment literacy of prospective physics teachers was conducted with 45 prospective physics teachers. Data were collected using a test covering seven competencies. The profile of prospective physics teachers on assessment literacy was determined with descriptive statistics, in the form of respondents' average values. The findings show that prospective physics teachers were weak in all competency areas. The highest average value was for choosing assessment methods appropriate for instructional decisions, and the lowest was for communicating assessment results to students, parents, other lay audiences, and other educators. An in-depth study to identify the reasons underlying these results was still in progress, and another aspect was planned for administration in the next semester.
PREFACE: New trends in Computer Simulations in Physics and not only in physics
NASA Astrophysics Data System (ADS)
Shchur, Lev N.; Krashakov, Serge A.
2016-02-01
In this volume we have collected papers based on the presentations given at the International Conference on Computer Simulations in Physics and beyond (CSP2015), held in Moscow, September 6-10, 2015. We hope that this volume will be helpful and scientifically interesting for readers. The Conference was organized for the first time through the joint efforts of the Moscow Institute for Electronics and Mathematics (MIEM) of the National Research University Higher School of Economics, the Landau Institute for Theoretical Physics, and the Science Center in Chernogolovka. The name of the Conference emphasizes the multidisciplinary nature of computational physics, whose methods are applied to a broad range of current research in science and society. The choice of venue was motivated by the multidisciplinary character of the MIEM, a formerly independent university that has recently become part of the National Research University Higher School of Economics. The Conference on Computer Simulations in Physics and beyond (CSP) is planned to be organized biannually. This year's Conference featured 99 presentations, including 21 plenary and invited talks, ranging from the analysis of Irish myths with recent methods of statistical physics to computing with the novel D-Wave and D-Wave 2 quantum computers. This volume covers various areas of computational physics and emerging subjects within the computational physics community. Each section was preceded by invited talks presenting the latest algorithms and methods in computational physics, as well as new scientific results. Both parallel and poster sessions paid special attention to numerical methods, applications and results. For all the abstracts presented at the conference, please follow the link http://csp2015.ac.ru/files/book5x.pdf
Benjamin, Sara E; Neelon, Brian; Ball, Sarah C; Bangdiwala, Shrikant I; Ammerman, Alice S; Ward, Dianne S
2007-01-01
Background Few assessment instruments have examined the nutrition and physical activity environments in child care, and none are self-administered. Given the emerging focus on child care settings as a target for intervention, a valid and reliable measure of the nutrition and physical activity environment is needed. Methods To measure inter-rater reliability, 59 child care center directors and 109 staff completed the self-assessment concurrently, but independently. Three weeks later, a repeat self-assessment was completed by a sub-sample of 38 directors to assess test-retest reliability. To assess criterion validity, a researcher-administered environmental assessment was conducted at 69 centers and was compared to a self-assessment completed by the director. A weighted kappa test statistic and percent agreement were calculated to assess agreement for each question on the self-assessment. Results For inter-rater reliability, kappa statistics ranged from 0.20 to 1.00 across all questions. Test-retest reliability of the self-assessment yielded kappa statistics that ranged from 0.07 to 1.00. The inter-quartile kappa statistic ranges for inter-rater and test-retest reliability were 0.45 to 0.63 and 0.27 to 0.45, respectively. When percent agreement was calculated, questions ranged from 52.6% to 100% for inter-rater reliability and 34.3% to 100% for test-retest reliability. Kappa statistics for validity ranged from -0.01 to 0.79, with an inter-quartile range of 0.08 to 0.34. Percent agreement for validity ranged from 12.9% to 93.7%. Conclusion This study provides estimates of criterion validity, inter-rater reliability and test-retest reliability for an environmental nutrition and physical activity self-assessment instrument for child care. Results indicate that the self-assessment is a stable and reasonably accurate instrument for use with child care interventions. We therefore recommend the Nutrition and Physical Activity Self-Assessment for Child Care (NAP SACC) instrument to researchers and practitioners interested in conducting healthy weight intervention in child care. However, a more robust, less subjective measure would be more appropriate for researchers seeking an outcome measure to assess intervention impact. PMID:17615078
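The agreement statistics reported above can be reproduced in a few lines; the sketch below computes a linearly weighted Cohen's kappa and raw percent agreement for invented ordinal ratings, using scikit-learn's implementation as a stand-in for the authors' procedure.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# hypothetical ratings of one ordinal self-assessment item (1 = worst, 3 = best)
director = np.array([3, 2, 3, 1, 2, 3, 2, 1, 3, 2])
observer = np.array([3, 2, 2, 1, 2, 3, 3, 1, 3, 2])

kappa = cohen_kappa_score(director, observer, weights="linear")  # weighted kappa
percent_agreement = (director == observer).mean() * 100
print(f"weighted kappa = {kappa:.2f}, agreement = {percent_agreement:.0f}%")
```

Weighted kappa credits near-misses on the ordinal scale, which is why it is preferred over raw agreement for instruments like this.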
NASA Astrophysics Data System (ADS)
Hapca, Simona
2015-04-01
Many soil properties and functions emerge from interactions of physical, chemical and biological processes at microscopic scales, and can be understood only by integrating techniques that have traditionally been developed within separate disciplines. While recent advances in imaging techniques, such as X-ray computed tomography (X-ray CT), offer the possibility of reconstructing the 3D physical structure at fine resolutions, existing methods for the distribution of chemicals in soil, based on scanning electron microscopy (SEM) and energy-dispersive X-ray detection (EDX), allow characterization of the chemical composition only on 2D surfaces. At present, direct 3D measurement techniques are still lacking; sequential sectioning of soil, followed by 2D mapping of chemical elements and interpolation to 3D, is the alternative explored in this study. Specifically, we develop an integrated experimental and theoretical framework that combines the 3D X-ray CT imaging technique with 2D SEM-EDX and uses spatial statistics methods to map the chemical composition of soil in 3D. The procedure involves three stages: 1) scanning a resin-impregnated soil cube by X-ray CT, followed by precision cutting to produce parallel thin slices, the surfaces of which are scanned by SEM-EDX; 2) alignment of the 2D chemical maps within the internal 3D structure of the soil cube; and 3) development of spatial statistics methods to predict the chemical composition of the 3D soil from the observed 2D chemical and 3D physical data. Three statistical models, a regression tree, regression-tree kriging and a cokriging model, were used to predict the 3D spatial distribution of carbon, silicon, iron and oxygen in soil, these chemical elements showing good spatial agreement between the X-ray grayscale intensities and the corresponding 2D SEM-EDX data. Owing to the spatial correlation between the physical and chemical data, the regression-tree model showed great potential for predicting chemical composition, in particular for iron, which is generally sparsely distributed in soil. For carbon, silicon and oxygen, which are more densely distributed, the additional kriging of the regression-tree residuals improved the prediction significantly, whereas prediction based on cokriging was less consistent across replicates, underperforming regression-tree kriging. The present study shows the great potential of integrating geostatistical methods with imaging techniques to unveil the 3D chemical structure of soil at very fine scales; the framework is suitable for further application to other types of imaging data, such as images of biological thin sections for the characterization of microbial distributions. Key words: X-ray CT, SEM-EDX, segmentation techniques, spatial correlation, 3D soil images, 2D chemical maps.
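A schematic of the regression-tree kriging model (stage 3) might look as follows, with a Gaussian process standing in for the kriging of residuals; the coordinates, grayscale values, and carbon fractions are entirely synthetic.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(9)
n = 500
coords = rng.uniform(0, 1, (n, 2))                 # 2D slice coordinates (mm)
gray = rng.uniform(0, 255, n)                      # X-ray CT grayscale intensity
carbon = 0.2 * (gray < 100) + 0.05 \
         + 0.1 * np.sin(3 * coords[:, 0]) \
         + rng.normal(0, 0.01, n)                  # synthetic SEM-EDX carbon fraction

# step 1: regression tree captures the grayscale-to-chemistry trend
tree = DecisionTreeRegressor(max_depth=4).fit(gray.reshape(-1, 1), carbon)
resid = carbon - tree.predict(gray.reshape(-1, 1))

# step 2: model spatially correlated residuals (a stand-in for kriging)
gp = GaussianProcessRegressor(kernel=RBF(0.2) + WhiteKernel(1e-4)).fit(coords, resid)

# prediction at a new voxel = tree trend + kriged residual
new_xy, new_gray = np.array([[0.5, 0.5]]), np.array([[80.0]])
print(tree.predict(new_gray) + gp.predict(new_xy))
```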
Soderlund, Patricia Davern
2018-03-01
Objectives This review examines the effectiveness of motivational interviewing for physical activity self-management in adults diagnosed with type 2 diabetes mellitus. Motivational interviewing is a patient-centered, individually tailored counseling intervention that aims to elicit a patient's own motivation for health behavior change. Review questions include: (a) How have motivational interviewing methods been applied to physical activity interventions for adults with type 2 diabetes mellitus? (b) What motivational interviewing approaches are associated with successful physical activity outcomes? Methods Database searches used PubMed, CINAHL, and PsycINFO for the years 2000 to 2016. Criteria for inclusion were: motivational interviewing used as the principal intervention in the tradition of Miller and Rollnick, measurement of physical activity, statistical significance reported for physical activity outcomes, quantitative research, and articles written in English. Results A total of nine studies met the review criteria, and four included motivational interviewing interventions associated with significant physical activity outcomes. Discussion Findings suggest motivational interviewing sessions should target a minimal number of self-management behaviors, be delivered by counselors proficient in motivational interviewing, and use motivational interviewing protocols with an emphasis placed on either the duration or the frequency of sessions.
Dunlay, Shannon M.; Gheorghiade, Mihai; Reid, Kimberly J.; Allen, Larry A.; Chan, Paul S.; Hauptman, Paul J.; Zannad, Faiez; Maggioni, Aldo P.; Swedberg, Karl; Konstam, Marvin A.; Spertus, John A.
2010-01-01
Aims Hospitalized heart failure (HF) patients are at high risk for death and readmission. We examined the incremental value of data obtained 1 week after HF hospital discharge in predicting mortality and readmission. Methods and results In the Efficacy of Vasopressin Antagonism in Heart Failure Outcome Study with tolvaptan, 1528 hospitalized patients (ejection fraction ≤40%) with a physical examination, laboratories, and health status [Kansas City Cardiomyopathy Questionnaire (KCCQ)] assessments 1 week after discharge were included. The ability to predict 1 year cardiovascular rehospitalization and mortality was assessed with Cox models, c-statistics, and the integrated discrimination improvement (IDI). Not using a beta-blocker, rales, pedal oedema, hyponatraemia, lower creatinine clearance, higher brain natriuretic peptide, and worse health status were independent risk factors for rehospitalization and death. The c-statistic for the base model (history and medications) was 0.657. The model improved with physical examination, laboratory, and KCCQ results, with IDI increases of 4.9, 7.0, and 3.2%, respectively (P < 0.001 each). The combination of all three offered the greatest incremental gain (c-statistic 0.749; IDI increase 10.8%). Conclusion Physical examination, laboratories, and KCCQ assessed 1 week after discharge offer important prognostic information, suggesting that all are critical components of outpatient evaluation after HF hospitalization. PMID:20197265
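A minimal sketch of the incremental-value comparison, assuming the lifelines package and simulated data: fit a Cox model on a baseline covariate, add a hypothetical one-week post-discharge marker, and compare the concordance indices (c-statistics). All variables and effect sizes are invented.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(10)
n = 500
df = pd.DataFrame({
    "age": rng.normal(70, 10, n),                 # baseline covariate
    "bnp_week1": rng.lognormal(6, 1, n),          # 1-week brain natriuretic peptide
})
risk = 0.03 * df["age"] + 0.4 * np.log(df["bnp_week1"])
df["T"] = rng.exponential(np.exp(-risk + 5))      # follow-up time
df["E"] = (rng.uniform(size=n) < 0.7).astype(int) # event indicator

base = CoxPHFitter().fit(df[["age", "T", "E"]], "T", "E")   # history only
full = CoxPHFitter().fit(df, "T", "E")                      # + 1-week marker
print(base.concordance_index_, full.concordance_index_)     # incremental discrimination
```

The paper's IDI statistic serves the same comparative purpose as this concordance gain, but on the scale of predicted risks rather than rank agreement.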
Influence of pilates training on the quality of life of chronic stroke patients.
Yun, Seok-Min; Park, Sang-Kyoon; Lim, Hee Sung
2017-10-01
[Purpose] This study was conducted to observe the influence of Pilates training on the quality of life of chronic stroke patients. [Subjects and Methods] Forty chronic stroke patients participated in this study. They were divided equally into an experimental group (EG) and a control group (CG). The EG participated in a 60-min Pilates training program, twice a week for 12 weeks, while the CG participated in general occupational therapy without any exercise-related activities for the same period. The MMSE-K was then administered before and after the Pilates training to observe its influence on the quality of life of chronic stroke patients. [Results] Statistically significant improvement in the physical, social, and psychological domains was found in the EG after the training, whereas no statistically significant difference was found in any of the three quality of life domains for the CG. The EG experienced statistically significant improvement in all quality of life domains compared with the CG. [Conclusion] Participation in Pilates training was therefore found to effectively improve the quality of life of stroke patients. Pilates training involves low- and intermediate-intensity resistance and repetition matched to the patient's physical ability, and can serve as a remedial exercise program that improves physical ability and influences quality of life.
Statistical physics of crime: a review.
D'Orsogna, Maria R; Perc, Matjaž
2015-03-01
Containing the spread of crime in urban societies remains a major challenge. Empirical evidence suggests that, if left unchecked, crimes may be recurrent and proliferate. On the other hand, eradicating a culture of crime may be difficult, especially under extreme social circumstances that impair the creation of a shared sense of social responsibility. Although our understanding of the mechanisms that drive the emergence and diffusion of crime is still incomplete, recent research highlights applied mathematics and methods of statistical physics as valuable theoretical resources that may help us better understand criminal activity. We review different approaches aimed at modeling and improving our understanding of crime, focusing on the nucleation of crime hotspots using partial differential equations, self-exciting point process and agent-based modeling, adversarial evolutionary games, and the network science behind the formation of gangs and large-scale organized crime. We emphasize that statistical physics of crime can relevantly inform the design of successful crime prevention strategies, as well as improve the accuracy of expectations about how different policing interventions should impact malicious human activity that deviates from social norms. We also outline possible directions for future research, related to the effects of social and coevolving networks and to the hierarchical growth of criminal structures due to self-organization. Copyright © 2014 Elsevier B.V. All rights reserved.
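One ingredient named above, the self-exciting point process, is easy to sketch: the simulation below draws a Hawkes process by Ogata thinning, in which each crime event temporarily raises the rate of follow-on events. The background rate, excitation strength, and decay constant are illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)
mu, alpha, beta, T = 0.5, 0.8, 1.2, 100.0  # background rate, excitation, decay, horizon

def intensity(t, events):
    past = np.asarray([s for s in events if s < t])
    return mu + alpha * np.exp(-beta * (t - past)).sum()

events, t = [], 0.0
while True:
    lam_bar = intensity(t, events) + alpha      # upper bound just after an event
    t += rng.exponential(1.0 / lam_bar)         # candidate next event time
    if t > T:
        break
    if rng.uniform() <= intensity(t, events) / lam_bar:   # thinning acceptance
        events.append(t)

print(len(events), "events; stationary mean rate =", mu / (1 - alpha / beta))
```

The clustering of accepted events in time mirrors the empirical observation that crimes, if left unchecked, are recurrent and proliferate.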
Student Understanding of Taylor Series Expansions in Statistical Mechanics
ERIC Educational Resources Information Center
Smith, Trevor I.; Thompson, John R.; Mountcastle, Donald B.
2013-01-01
One goal of physics instruction is to have students learn to make physical meaning of specific mathematical expressions, concepts, and procedures in different physical settings. As part of research investigating student learning in statistical physics, we are developing curriculum materials that guide students through a derivation of the Boltzmann…
Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization
NASA Astrophysics Data System (ADS)
Eroglu, Sertac
2014-10-01
The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed the statistical mechanical Menzerath-Altmann model. The derived model allows the model parameters to be interpreted in terms of physical concepts. We also propose that many organizations exhibiting Menzerath-Altmann behavior, whether linguistic or not, can be methodically examined with the transformed distribution model through a properly defined structure-dependent parameter and the associated energy states.
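For orientation, the Menzerath-Altmann law is commonly written y(x) = a * x^b * exp(-c*x), with y the mean constituent size and x the construct size; the sketch below fits this standard form to invented word-length data, not to the paper's transformed model.

```python
import numpy as np
from scipy.optimize import curve_fit

# construct size x (e.g., word length in syllables) vs. mean constituent size y
x = np.arange(1, 11)
y = np.array([3.1, 2.6, 2.3, 2.15, 2.0, 1.9, 1.85, 1.8, 1.78, 1.75])  # invented

def ma_law(x, a, b, c):
    return a * x ** b * np.exp(-c * x)

(a, b, c), _ = curve_fit(ma_law, x, y, p0=(3.0, -0.3, 0.01))
print(f"a={a:.2f}, b={b:.2f}, c={c:.3f}")
```

The paper's contribution is to give parameters like b and c a statistical mechanical reading rather than leaving them as purely empirical fit constants.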
Self-organization and feedback effects in the shock compressed media
NASA Astrophysics Data System (ADS)
Khantuleva, Tatyana
2005-07-01
A new theoretical approach to transport in condensed matter far from equilibrium combines methods of statistical mechanics and cybernetic physics in order to construct a closed mathematical model of a system with self-organization and self-regulation. Mesoscopic effects are considered as a result of structure formation and feedback effects in an open system under dynamic loading. Nonequilibrium state equations are invoked to incorporate the velocity dispersion. Integrodifferential balance equations describe both wave and dissipative transport properties. Boundary conditions determine the internal scale spectra. The model is completed by a feedback that introduces the structure evolution, based on the methods of cybernetic physics. The obtained results open wide prospects for control methods in applications to new technologies, intelligent systems and the prediction of catastrophic phenomena.
Sculpting bespoke mountains: Determining free energies with basis expansions
NASA Astrophysics Data System (ADS)
Whitmer, Jonathan K.; Fluitt, Aaron M.; Antony, Lucas; Qin, Jian; McGovern, Michael; de Pablo, Juan J.
2015-07-01
The intriguing behavior of a wide variety of physical systems, ranging from amorphous solids or glasses to proteins, is a direct manifestation of underlying free energy landscapes riddled with local minima separated by large barriers. Exploring such landscapes has arguably become one of statistical physics's great challenges. A new method is proposed here for uniform sampling of rugged free energy surfaces. The method, which relies on special Green's functions to approximate the Dirac delta function, improves significantly on existing simulation techniques by providing a boundary-agnostic approach that is capable of mapping complex features in multidimensional free energy surfaces. The usefulness of the proposed approach is established in the context of a simple model glass former and model proteins, demonstrating improved convergence and accuracy over existing methods.
Biological evolution and statistical physics
NASA Astrophysics Data System (ADS)
Drossel, Barbara
2001-03-01
This review is an introduction to theoretical models and mathematical calculations for biological evolution, aimed at physicists. The methods in the field are naturally very similar to those used in statistical physics, although the majority of publications have appeared in biology journals. The review has three parts, which can be read independently. The first part deals with evolution in fitness landscapes and includes Fisher's theorem, adaptive walks, quasispecies models, effects of finite population sizes, and neutral evolution. The second part studies models of coevolution, including evolutionary game theory, kin selection, group selection, sexual selection, speciation, and coevolution of hosts and parasites. The third part discusses models for networks of interacting species and their extinction avalanches. Throughout the review, attention is paid to giving the necessary biological information, and to pointing out the assumptions underlying the models, and their limits of validity.
NASA Astrophysics Data System (ADS)
Smith, A.; Siegel, Edward Carl-Ludwig
2011-03-01
Numbers: primality/indivisibility/non-factorization versus compositeness/divisibility/factorization, often in tandem but not always, in provocatively close analogy to nuclear physics: (2+1)=(fusion)=3; (3+1)=(fission)=4 [=2 x 2]; (4+1)=(fusion)=5; (5+1)=(fission)=6 [=2 x 3]; (6+1)=(fusion)=7; (7+1)=(fission)=8 [=2 x 4 = 2 x 2 x 2]; (8+1)=(neither fission nor fusion)=9 [=3 x 3]; then only composites' islands of fusion-instability: 8, 9, 10; then 14, 15, 16, ... Could inter-digit Feshbach resonances exist? Possible applications abound: quantum-information/computing non-Shor factorization; the millennium-problem Riemann-hypothesis proof as Goodkin BEC intersection with the graph-theory "short-cut" method: Rayleigh(1870)-Polya(1922)-"Anderson"(1958)-localization; the Goldbach conjecture; financial auditing/accounting as quantum statistical physics. Watkins [www.secamlocal.ex.ac.uk/people/staff/mrwatkin/] "Number-Theory in Physics" lists many interconnections of "pure"-maths number theory to physics, including Siegel [AMS Joint Mtg. (2002), Abs. #973-60-124] inversion of statistics-on-average digits' Newcomb(1881)-Weyl(1914-16)-Benford(1938) law to reveal both the quantum and BEQS (digits = bosons = "spinEless-boZos"): 1881 < 1885 < 1901 < 1905 < 1925 < 1927, altering quantum-theory history!
WE-A-201-01: Memorial Introduction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, C.
Chris Marshall: Memorial Introduction. Donald Edmonds Herbert Jr., or Don to his colleagues and friends, exemplified the "big tent" vision of medical physics, specializing in applied statistics and dynamical systems theory. He saw, more clearly than most, that "Making models is the difference between doing science and just fooling around" [ref Woodworth, 2004]. Don developed an interest in chemistry at school by "reading a book" - a recurring theme in his story. He was awarded a Westinghouse Science scholarship and attended the Carnegie Institute of Technology (later Carnegie Mellon University), where his interest turned to physics and led to a BS in Physics after transfer to Northwestern University. After (voluntary) service in the Navy he earned his MS in Physics from the University of Oklahoma, which led him to Johns Hopkins University in Baltimore to pursue a PhD. The early death of his wife led him to take a salaried position in the Physics Department of Colorado College in Colorado Springs so as to better care for their young daughter. There, a chance invitation from Dr. Juan del Regato to teach physics to residents at the Penrose Cancer Hospital introduced him to medical physics, and he decided to enter the field. He received his PhD from the University of London (UK) under Prof. Joseph Rotblat, where I first met him, and where he taught himself statistics. He returned to Penrose as a clinical medical physicist, also largely self-taught. In 1975 he formalized an evolving interest in statistical analysis as Professor of Radiology and Head of the Division of Physics and Statistics at the College of Medicine of the University of South Alabama in Mobile, AL, where he remained for the rest of his career. He also served as the first Director of their Bio-Statistics and Epidemiology Core Unit, working in part on sickle-cell disease. After retirement he remained active as Professor Emeritus. Don served for several years as a consultant to the Nuclear Regulatory Commission and may be remembered for his critique of the National Academy of Sciences BEIR III report (stating that their methodology "imposes a Delphic quality to the ... risk estimates"). This led to his appointment as a member of the BEIR V committee. Don presented refresher courses at the AAPM, ASTRO and RSNA meetings and was active in the AAPM as a member or chair of several committees. He was the principal author of AAPM Report 43, which is essentially a critique of established clinical studies prior to 1992. He was co-editor of the proceedings of many symposia on Time, Dose and Fractionation held in Madison, Wisconsin. He received the AAPM Lifetime Achievement Award in 2004. Don's second wife of 46 years, Ann, predeceased him, and he is survived by daughters Hillary and Emily, son John and two grandsons. Don was a true gentleman with a unique and erudite writing style illuminated by pithy quotations. If he had a fault it was, perhaps, that he did not realize how much smarter he was than the rest of us. This presentation draws heavily on a biography and video interview in the History and Heritage section of the AAPM website. The quote is his own. Andrzej Niemierko: Statistical modeling plays an essential role in modern medicine for quantitative evaluation of the effect of treatment. This session will feature an overview of statistical modeling techniques used for analyzing the many types of research data and an exploration of recent advances in new statistical modeling methodologies.
Learning Objectives: To learn the basics of statistical modeling methodology. To discuss statistical models that are frequently used in radiation oncology. To discuss advanced modern statistical modeling methods and applications.
A baker's dozen of new particle flows for nonlinear filters, Bayesian decisions and transport
NASA Astrophysics Data System (ADS)
Daum, Fred; Huang, Jim
2015-05-01
We describe a baker's dozen of new particle flows to compute Bayes' rule for nonlinear filters, Bayesian decisions and learning as well as transport. Several of these new flows were inspired by transport theory, but others were inspired by physics or statistics or Markov chain Monte Carlo methods.
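[Illustrative note] The best-known member of this family is the exact flow for a linear-Gaussian measurement z = Hx + v derived in earlier work by the same authors. The Python sketch below is a reconstruction under that linear-Gaussian assumption, not code from the paper: particles drawn from the prior are migrated in a pseudo-time lambda from 0 to 1 by integrating dx/dlambda = A(lambda)x + b(lambda).

import numpy as np

def exact_flow_step(particles, m, P, H, R, z, lam, dlam):
    # One Euler step of the linear-Gaussian exact flow (assumed form from the literature).
    S = lam * H @ P @ H.T + R
    A = -0.5 * P @ H.T @ np.linalg.solve(S, H)
    b = (np.eye(len(m)) + 2 * lam * A) @ (
        (np.eye(len(m)) + lam * A) @ P @ H.T @ np.linalg.solve(R, z) + A @ m)
    return particles + dlam * (particles @ A.T + b)

rng = np.random.default_rng(0)
m, P = np.zeros(2), np.eye(2)             # prior mean and covariance
H, R = np.array([[1.0, 0.0]]), np.eye(1)  # scalar measurement model
z = np.array([0.8])
particles = rng.multivariate_normal(m, P, size=500)
for k in range(100):                      # integrate in pseudo-time lambda
    particles = exact_flow_step(particles, m, P, H, R, z, k / 100, 1 / 100)
print(particles.mean(axis=0))             # approaches the Kalman posterior mean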
Reduced Order Modeling Methods for Turbomachinery Design
2009-03-01
[Fragmentary references recovered from the report: ... and Materials Conference, May 2006; [45] A. Gelman, J. B. Carlin, H. S. Stern, and D. B. Rubin, Bayesian Data Analysis, New York, NY: Chapman & Hall; ... Macian-Juan, and R. Chawla, "A statistical methodology for quantification of uncertainty in best estimate code physical models," Annals of Nuclear Energy (truncated).]
Statistical Physics of Population Genetics in the Low Population Size Limit
NASA Astrophysics Data System (ADS)
Atwal, Gurinder
The understanding of evolutionary processes lends itself naturally to theory and computation, and the entire field of population genetics has benefited greatly from the influx of methods from applied mathematics for decades. However, in spite of all this effort, there are a number of key dynamical models of evolution that have resisted analytical treatment. In addition, modern DNA sequencing technologies have magnified the amount of genetic data available, revealing an excess of rare genetic variants in human genomes, challenging the predictions of conventional theory. Here I will show that methods from statistical physics can be used to model the distribution of genetic variants, incorporating selection and spatial degrees of freedom. In particular, a functional path-integral formulation of the Wright-Fisher process maps exactly to the dynamics of a particle in an effective potential, beyond the mean field approximation. In the small population size limit, the dynamics are dominated by instanton-like solutions which determine the probability of fixation in short timescales. These results are directly relevant for understanding the unusual genetic variant distribution at moving frontiers of populations.
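[Illustrative note] The Wright-Fisher process underlying the path-integral treatment above resamples allele frequencies binomially each generation, with selection biasing the sampling probability. A minimal sketch (parameter values illustrative, not from the talk):

import numpy as np

def wright_fisher(N=50, s=0.02, p0=0.1, generations=1000, seed=1):
    # Track an allele frequency until fixation or loss in a population of size N.
    rng = np.random.default_rng(seed)
    p = p0
    for t in range(generations):
        p_sel = p * (1 + s) / (1 + s * p)  # selection shifts the sampling probability
        p = rng.binomial(N, p_sel) / N     # genetic drift: binomial resampling
        if p in (0.0, 1.0):
            return t, p
    return generations, p

print(wright_fisher())  # (generations elapsed, final frequency 0 or 1)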
Statistical physics of vaccination
NASA Astrophysics Data System (ADS)
Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei
2016-12-01
Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination-one of the most important preventive measures of modern times-is of great interest both theoretically and empirically. And in contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.
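[Illustrative note] As a point of reference for the classical end of this developmental arc, a homogeneously mixed SIR model with a pre-vaccinated fraction v takes only a few lines; the parameter values here are illustrative only:

import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    # Mean-field SIR dynamics: S susceptible, I infectious, R removed.
    S, I, R = y
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

beta, gamma, v = 0.3, 0.1, 0.5            # illustrative rates and vaccine coverage
y0 = [(1 - v) * 0.999, 0.001, v]          # vaccinated individuals start in R
t = np.linspace(0, 300, 1000)
S, I, R = odeint(sir, y0, t, args=(beta, gamma)).T
print("final epidemic size:", R[-1] - v)  # infections beyond the vaccinated pool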
Evaluation on the use of cerium in the NBL Titrimetric Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zebrowski, J.P.; Orlowicz, G.J.; Johnson, K.D.
An alternative to potassium dichromate as titrant in the New Brunswick Laboratory Titrimetric Method for uranium analysis was sought, since chromium in the waste makes disposal difficult. Substitution of a ceric-based titrant was statistically evaluated. Analysis of the data indicated statistically equivalent precisions for the two methods, but a significant overall bias of +0.035% for the ceric titrant procedure. The cause of the bias was investigated, alterations to the procedure were made, and a second statistical study was performed. This second study revealed no statistically significant bias, nor any analyst-to-analyst variation in the ceric titration procedure. A statistically significant day-to-day variation was detected, but this was physically small (0.015%) and was only detected because of the within-day precision of the method. The estimated standard deviation of the %RD for a single measurement was found to be 0.031%. A comparison with quality control blind dichromate titration data again indicated similar overall precision. The effects of ten elements on the ceric titration's performance were determined: Co, Ti, Cu, Ni, Na, Mg, Gd, Zn, Cd, and Cr; in previous work at NBL these impurities did not interfere with the potassium dichromate titrant. This study indicated similar results for the ceric titrant, with the exception of Ti. All the elements (excluding Ti and Cr) caused no statistically significant bias in uranium measurements at levels of 10 mg impurity per 20-40 mg uranium. The presence of Ti was found to cause a bias of -0.05%; this is attributed to the presence of sulfate ions, resulting in precipitation of titanium sulfate and occlusion of uranium. A negative bias of 0.012% was also statistically observed in the samples containing chromium impurities.
NASA Astrophysics Data System (ADS)
Tenenbaum, Joel
This thesis applies statistical physics concepts and methods to quantitatively analyze complex systems. The thesis is separated into four parts: (i) characteristics of earthquake systems, (ii) memory and volatility in data time series, (iii) the application of part (ii) to world financial markets, and (iv) statistical observations on the evolution of word usage. In Part I, we observe statistical patterns in the occurrence of earthquakes. We select a 14-year earthquake catalog covering the archipelago of Japan. We find that regions traditionally thought of as being too distant from one another for causal contact display remarkably high correlations, and the networks that result have a tendency to link highly connected areas with other highly connected areas. In Part II, we introduce and apply the concept of "volatility asymmetry", the primary use of which is in financial data. We explain the relation between memory and "volatility asymmetry" in terms of an asymmetry parameter lambda. We define a litmus test for determining whether lambda is statistically significant, propose a stochastic model based on this parameter, and use the model to further explain empirical data. In Part III, we expand on volatility asymmetry. Importing the concepts of time dependence and universality from physics, we explore the aspects of emerging (or "transition") economies in Eastern Europe as they relate to asymmetry. We find that these emerging markets in some instances behave like developed markets and in other instances do not, and that the distinction is a matter both of country and of time period, with crisis periods showing different asymmetry characteristics than "healthy" periods. In Part IV, we take note of a series of findings in econophysics showing statistical growth similarities between a variety of different areas that all take place in environments that are both (i) competing and (ii) dynamic. We show that this same growth distribution can be reproduced by observing the growth rates of the usage of individual words: just as companies compete for sales in a zero-sum marketing game, so do words compete for usage within a limited amount of reader man-hours.
Is the use of physical discipline associated with aggressive behaviors in young children?
Thompson, Richard; Kaczor, Kim; Lorenz, Douglas J.; Bennett, Berkeley L.; Meyers, Gabriel; Pierce, Mary Clyde
2016-01-01
Objectives: To determine the association between the use of physical discipline and parental report of physically aggressive child behaviors in a cohort of young children without indicators of current or past physical abuse. Methods: The data for this study were analyzed from an initial cohort of patients enrolled in a prospective, observational, multi-center PED-based study investigating bruising and familial psychosocial characteristics of children less than four years of age. Over a seven-month period, structured parental interviews were conducted regarding disciplinary practices, reported child behaviors, and familial psychosocial risk factors. Children with suspected physical abuse were excluded from this study. Trained study staff collected data using standardized questions. Consistent with grounded theory, qualitative coding by two independent individuals was performed using domains rooted in the data. Inter-rater reliability of the coding process was evaluated using the kappa statistic. Descriptive statistics were calculated and multiple logistic regression modeling was performed. Results: 372 parental interviews were conducted. Parents who reported using physical discipline were 2.8 [95% CI 1.7, 4.5] times more likely to report aggressive child behaviors of hitting/kicking and throwing. Physical discipline was used on 38% of children overall, and was 2.4 [95% CI 1.4, 4.1] times more likely to be used in families with any of the psychosocial risk factors examined. Conclusions: Our findings indicate that the use of physical discipline was associated with higher rates of reported physically aggressive behaviors in early childhood, as well as with the presence of familial psychosocial risk factors. PMID:26924534
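[Illustrative note] The analysis behind odds ratios like the 2.8 reported above is a multiple logistic regression. A hedged Python sketch of the general technique follows; the DataFrame, its column names, and the random data are hypothetical, so the fitted odds ratios will not reproduce the paper's:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "aggressive": rng.integers(0, 2, 372),          # reported hitting/kicking/throwing
    "physical_discipline": rng.integers(0, 2, 372), # parent reports physical discipline
    "psychosocial_risk": rng.integers(0, 2, 372),   # any familial risk factor
})
model = smf.logit("aggressive ~ physical_discipline + psychosocial_risk", df).fit()
print(np.exp(model.params))      # odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals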
Pugh, Aaron L.
2014-01-01
Users of streamflow information often require streamflow statistics and basin characteristics at various locations along a stream. The USGS periodically calculates and publishes streamflow statistics and basin characteristics for streamflow-gaging stations and partial-record stations, but these data commonly are scattered among many reports that may or may not be readily available to the public. The USGS also provides and periodically updates regional analyses of streamflow statistics that include regression equations and other prediction methods for estimating statistics for ungaged and unregulated streams across the State. Use of these regional predictions for a stream can be complex and often requires the user to determine a number of basin characteristics that may require interpretation. Basin characteristics may include drainage area, classifiers for physical properties, climatic characteristics, and other inputs. Obtaining these input values for gaged and ungaged locations traditionally has been time consuming and subjective, and can lead to inconsistent results.
NASA Technical Reports Server (NTRS)
Gardner, Adrian
2010-01-01
National Aeronautics and Space Administration (NASA) weather and atmospheric environmental organizations are insatiable consumers of geophysical, hydrometeorological and solar weather statistics. The expanding array of internet-worked sensors producing targeted physical measurements has generated an almost factorial explosion of near real-time inputs to topical statistical datasets. Normalizing and value-based parsing of such statistical datasets in support of time-constrained weather and environmental alerts and warnings is essential, even with dedicated high-performance computational capabilities. What are the optimal indicators for advanced decision making? How do we recognize the line between sufficient statistical sampling and excessive, mission-destructive sampling? How do we assure that the normalization and parsing process, when interpolated through numerical models, yields accurate and actionable alerts and warnings? This presentation will address the integrated means and methods to achieve desired outputs for NASA and consumers of its data.
Frank, Lawrence D; Fox, Eric H; Ulmer, Jared M; Chapman, James E; Kershaw, Suzanne E; Sallis, James F; Conway, Terry L; Cerin, Ester; Cain, Kelli L; Adams, Marc A; Smith, Graham R; Hinckson, Erica; Mavoa, Suzanne; Christiansen, Lars B; Hino, Adriano Akira F; Lopes, Adalberto A S; Schipperijn, Jasper
2017-01-23
Advancements in geographic information systems over the past two decades have increased the specificity by which an individual's neighborhood environment may be spatially defined for physical activity and health research. This study investigated how different types of street network buffering methods compare in measuring a set of commonly used built environment measures (BEMs) and tested their performance on associations with physical activity outcomes. An internationally developed set of objective BEMs using three different spatial buffering techniques was used to evaluate the relative differences in resulting explanatory power on self-reported physical activity outcomes. BEMs were developed in five countries using 'sausage', 'detailed-trimmed', and 'detailed' network buffers at a distance of 1 km around participant household addresses (n = 5883). BEM values were significantly different (p < 0.05) for 96% of sausage versus detailed-trimmed buffer comparisons and 89% of sausage versus detailed network buffer comparisons. Results showed that BEM coefficients in physical activity models did not differ significantly across buffering methods, and in most cases BEM associations with physical activity outcomes had the same level of statistical significance across buffer types. However, BEM coefficients differed in significance for 9% of the sausage versus detailed models, which may warrant further investigation. Results of this study inform the selection of spatial buffering methods to estimate physical activity outcomes using an internationally consistent set of BEMs. Using three different network-based buffering methods, the findings indicate significant variation among BEM values; however, associations with physical activity outcomes were similar across each buffering technique. The study advances knowledge by presenting consistently assessed relationships between three different network buffer types and utilitarian travel, sedentary behavior, and leisure-oriented physical activity outcomes.
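[Illustrative note] A 'sausage' buffer is built by buffering the reachable street segments themselves and taking their union, rather than tracing a detailed polygon around the service area. A hedged Python sketch with shapely (toy coordinates; a real study would first extract the segments reachable within 1 km of network distance from the address):

from shapely.geometry import LineString
from shapely.ops import unary_union

street_segments = [
    LineString([(0, 0), (500, 0)]),      # coordinates in metres
    LineString([(500, 0), (500, 400)]),
    LineString([(500, 0), (900, 0)]),
]
# Buffer every segment with round caps and dissolve into a single polygon.
sausage = unary_union([seg.buffer(1000, cap_style=1) for seg in street_segments])
print(round(sausage.area / 1e6, 2), "km^2")  # buffer area over which BEMs are computed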
NASA Technical Reports Server (NTRS)
Petty, Grant W.
1990-01-01
A reasonably rigorous basis for understanding and extracting the physical information content of Special Sensor Microwave/Imager (SSM/I) satellite images of the marine environment is provided. To this end, a comprehensive algebraic parameterization is developed for the response of the SSM/I to a set of nine atmospheric and ocean surface parameters. The brightness temperature model includes a closed-form approximation to microwave radiative transfer in a non-scattering atmosphere and fitted models for surface emission and scattering based on geometric optics calculations for the roughened sea surface. The combined model is empirically tuned using suitable sets of SSM/I data and coincident surface observations. The brightness temperature model is then used to examine the sensitivity of the SSM/I to realistic variations in the scene being observed and to evaluate the theoretical maximum precision of global SSM/I retrievals of integrated water vapor, integrated cloud liquid water, and surface wind speed. A general minimum-variance method for optimally retrieving geophysical parameters from multichannel brightness temperature measurements is outlined, and several global statistical constraints of the type required by this method are computed. Finally, a unified set of efficient statistical and semi-physical algorithms is presented for obtaining fields of surface wind speed, integrated water vapor, cloud liquid water, and precipitation from SSM/I brightness temperature data. Features include: a semi-physical method for retrieving integrated cloud liquid water at 15 km resolution and with rms errors as small as approximately 0.02 kg/sq m; a 3-channel statistical algorithm for integrated water vapor which was constructed so as to have improved linear response to water vapor and reduced sensitivity to precipitation; and two complementary indices of precipitation activity (based on 37 GHz attenuation and 85 GHz scattering, respectively), each of which are relatively insensitive to variations in other environmental parameters.
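[Illustrative note] The general minimum-variance retrieval outlined above reduces, in its linear form, to x_hat = x_mean + C_xy C_yy^{-1} (y - y_mean). A hedged Python sketch in which synthetic statistics stand in for the global constraints the paper computes:

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(5000, 3))                 # geophysical parameters (e.g. vapor, cloud, wind)
A = rng.normal(size=(7, 3))
y = x @ A.T + 0.1 * rng.normal(size=(5000, 7)) # 7 SSM/I channels, linearized response

x_mean, y_mean = x.mean(0), y.mean(0)
C_xy = (x - x_mean).T @ (y - y_mean) / len(x)  # cross-covariance of parameters and channels
C_yy = (y - y_mean).T @ (y - y_mean) / len(y)  # channel covariance
W = C_xy @ np.linalg.inv(C_yy)                 # minimum-variance retrieval weights

y_obs = y[0]
x_hat = x_mean + W @ (y_obs - y_mean)          # retrieved parameters for one scene
print(x_hat, x[0])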
Statistical mechanics of competitive resource allocation using agent-based models
NASA Astrophysics Data System (ADS)
Chakraborti, Anirban; Challet, Damien; Chatterjee, Arnab; Marsili, Matteo; Zhang, Yi-Cheng; Chakrabarti, Bikas K.
2015-01-01
Demand outstrips available resources in most situations, which gives rise to competition, interaction and learning. In this article, we review a broad spectrum of multi-agent models of competition (the El Farol Bar problem, the Minority Game, the Kolkata Paise Restaurant problem, the Stable Marriage problem, the Parking Space problem and others) and the methods used to understand them analytically. We emphasize the power of concepts and tools from statistical mechanics to understand and explain fully collective phenomena such as phase transitions and long memory, and the mapping between agent heterogeneity and physical disorder. As these methods can be applied to any large-scale model of competitive resource allocation made up of heterogeneous adaptive agents with non-linear interactions, they provide a prospective unifying paradigm for many scientific disciplines.
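[Illustrative note] The Minority Game is simple enough to state in code. Below is a minimal standard implementation in Python (parameter values illustrative): N agents each hold S random lookup-table strategies over the last M public outcomes, play their best-scoring strategy, and win when they end up on the minority side.

import numpy as np

rng = np.random.default_rng(0)
N, S, M, T = 101, 2, 3, 2000                          # agents, strategies, memory, rounds
strategies = rng.choice([-1, 1], size=(N, S, 2**M))   # an action for each possible history
scores = np.zeros((N, S))
history = 0                                           # integer encoding of last M outcomes
attendance = []

for t in range(T):
    best = scores.argmax(axis=1)                      # each agent plays its best strategy
    actions = strategies[np.arange(N), best, history]
    A = actions.sum()
    attendance.append(A)
    winning = -np.sign(A)                             # the minority side wins
    scores += (strategies[:, :, history] == winning)  # reward strategies that chose it
    history = ((history << 1) | (winning > 0)) % 2**M # slide the public history window
print("volatility sigma^2/N:", np.var(attendance) / N)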
A statistical theory for sound radiation and reflection from a duct
NASA Technical Reports Server (NTRS)
Cho, Y. C.
1979-01-01
A new analytical method is introduced for the study of sound radiation and reflection from the open end of a duct. The sound is treated as an aggregation of quasiparticles (phonons). The motion of the latter is described in terms of a statistical distribution, which is derived from the classical wave theory. The results are in good agreement with the solutions obtained using the Wiener-Hopf technique when the latter is applicable, but the new method is simpler and provides a straightforward physical interpretation of the problem. Furthermore, it is applicable to a problem involving a duct in which modes are difficult to determine or cannot be defined at all, whereas the Wiener-Hopf technique is not.
NASA Astrophysics Data System (ADS)
Vallianatos, Filippos
2015-04-01
Despite the extreme complexity that characterizes the earthquake generation process, simple phenomenology seems to apply to the collective properties of seismicity. The best known is the Gutenberg-Richter relation. Short- and long-term clustering, power-law scaling and scale-invariance have been exhibited in the spatio-temporal evolution of seismicity, providing evidence for earthquakes as a nonlinear dynamic process. Regarding the physics of "many" earthquakes and how this can be derived from first principles, one may wonder: how can the collective properties of the set formed by all earthquakes in a given region be derived, and how does the structure of seismicity depend on its elementary constituents - the earthquakes? What are these properties? The physics of many earthquakes has to be studied with a different approach than the physics of one earthquake, making the use of statistical physics necessary to understand the collective properties of earthquakes. A natural question then arises: what type of statistical physics is appropriate to describe effects ranging from the microscale and crack-opening level to the level of large earthquakes? An answer could be non-extensive statistical physics, introduced by Tsallis (1988), as the appropriate methodological tool to describe entities with (multi)fractal distributions of their elements and where long-range interactions or intermittency are important, as in fracturing phenomena and earthquakes. In the present work, we review some fundamental properties of earthquake physics and how these are derived by means of non-extensive statistical physics. The aim is to understand aspects of the underlying physics that lead to the evolution of the earthquake phenomenon, introducing the new topic of non-extensive statistical seismology. This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project. References: F. Vallianatos, "A non-extensive approach to risk assessment", Nat. Hazards Earth Syst. Sci., 9, 211-216, 2009. F. Vallianatos and P. Sammonds, "Is plate tectonics a case of non-extensive thermodynamics?", Physica A: Statistical Mechanics and its Applications, 389(21), 4989-4993, 2010. F. Vallianatos, G. Michas, G. Papadakis and P. Sammonds, "A non-extensive statistical physics view to the spatiotemporal properties of the June 1995, Aigion earthquake (M6.2) aftershock sequence (West Corinth rift, Greece)", Acta Geophysica, 60(3), 758-768, 2012. F. Vallianatos and L. Telesca, "Statistical mechanics in earth physics and natural hazards" (editorial), Acta Geophysica, 60(3), 499-501, 2012. F. Vallianatos, G. Michas, G. Papadakis and A. Tzanis, "Evidence of non-extensivity in the seismicity observed during the 2011-2012 unrest at the Santorini volcanic complex, Greece", Nat. Hazards Earth Syst. Sci., 13, 177-185, 2013. F. Vallianatos and P. Sammonds, "Evidence of non-extensive statistical physics of the lithospheric instability approaching the 2004 Sumatran-Andaman and 2011 Honshu mega-earthquakes", Tectonophysics, 590, 52-58, 2013. G. Papadakis, F. Vallianatos and P. Sammonds, "Evidence of nonextensive statistical physics behavior of the Hellenic Subduction Zone seismicity", Tectonophysics, 608, 1037-1048, 2013. G. Michas, F. Vallianatos and P. Sammonds, "Non-extensivity and long-range correlations in the earthquake activity at the West Corinth rift (Greece)", Nonlin. Processes Geophys., 20, 713-724, 2013.
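[Illustrative note] The workhorse of non-extensive statistics is the q-exponential, exp_q(x) = [1 + (1-q)x]_+^{1/(1-q)}, which recovers the ordinary exponential as q approaches 1 and decays as a power law for q > 1. A minimal Python sketch (values illustrative, not fitted to any catalog):

import numpy as np

def q_exponential(x, q):
    # exp_q(x) = [1 + (1 - q) x]_+^{1/(1-q)}, with exp_1(x) = exp(x).
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0, np.power(np.maximum(base, 1e-12), 1.0 / (1.0 - q)), 0.0)

x = np.linspace(0, 10, 5)
print(q_exponential(-x, 1.5))  # q > 1: fat-tailed, power-law decay
print(q_exponential(-x, 1.0))  # q = 1: ordinary exponential decay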
MO-G-12A-01: Quantitative Imaging Metrology: What Should Be Assessed and How?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giger, M; Petrick, N; Obuchowski, N
The first two symposia in the Quantitative Imaging Track focused on 1) the introduction of quantitative imaging (QI) challenges and opportunities, and QI efforts of agencies and organizations such as the RSNA, NCI, FDA, and NIST, and 2) the techniques, applications, and challenges of QI, with specific examples from CT, PET/CT, and MR. This third symposium in the QI Track will focus on metrology and its importance in successfully advancing the QI field. While the specific focus will be on QI, many of the concepts presented are more broadly applicable to many areas of medical physics research and applications. As such, the topics discussed should be of interest to medical physicists involved in imaging as well as therapy. The first talk of the session will focus on the introduction to metrology and why it is critically important in QI. The second talk will focus on appropriate methods for technical performance assessment. The third talk will address statistically valid methods for algorithm comparison, a common problem not only in QI but also in other areas of medical physics. The final talk in the session will address strategies for publication of results that will allow statistically valid meta-analyses, which is critical for combining results of individual studies with typically small sample sizes in a manner that can best inform decisions and advance the field. Learning Objectives: Understand the importance of metrology in QI efforts. Understand appropriate methods for technical performance assessment. Understand methods for comparing algorithms with or without reference data (i.e., "ground truth"). Understand the challenges and importance of reporting results in a manner that allows for statistically valid meta-analyses.
On Ruch's Principle of Decreasing Mixing Distance in classical statistical physics
NASA Astrophysics Data System (ADS)
Busch, Paul; Quadt, Ralf
1990-10-01
Ruch's Principle of Decreasing Mixing Distance is reviewed as a statistical physical principle, and its basic support and geometric interpretation, the Ruch-Schranner-Seligman theorem, is generalized to be applicable to a large representative class of classical statistical systems.
Verbal and physical abuse against nurses in Turkey.
Celik, S S; Celik, Y; Ağirbaş, I; Uğurluoğlu, O
2007-12-01
This study of verbal and physical abuse against nurses in Turkey aimed to describe its prevalence, sources, important effects on the work, family and social life of the nurses, coping methods and related factors. A sample of 622 nurses working in eight hospitals located in the capital city of Turkey was surveyed using verbal and physical abuse questionnaires. The prevalence of verbal and physical abuse against nurses in this sample was found to be 91.1% and 33.0%, respectively. Colleagues were found to be the most important source of verbally abusive behaviours, while patients and patients' relatives were the most important sources of physically abusive behaviours. Disturbed mental health, decreased job performance and headache were the most frequently reported negative effects of verbal and physical abuse on nurses. The most common reactions to abusive behaviours were anger, helplessness, humiliation and depression. It is striking that 'did nothing' was the most frequently reported method of coping with verbal abuse. The findings also suggested that working in inpatient units and increasing work experience in the nursing profession were statistically significant variables increasing the likelihood of being abused physically. All the results on sources, negative effects, feelings and coping methods regarding verbally and physically abusive behaviours lead us to conclude that the low working status and power of nurses at work, poor working conditions in healthcare settings, and insufficient administrative mechanisms as well as laws and regulations against abusers are important factors forcing nurses to work in an inappropriate work environment in Turkey.
Physics Teachers and Students: A Statistical and Historical Analysis of Women
NASA Astrophysics Data System (ADS)
Gregory, Amanda
2009-10-01
Historically, women have been denied an education comparable to that available to men. Since women have been allowed into institutions of higher learning, they have been studying and earning physics degrees. The aim of this poster is to discuss the statistical relationship between the number of women enrolled in university physics programs and the number of female physics faculty members. Special care has been given to examining the statistical data in the context of the social climate at the time that these women were teaching or pursuing their education.
Health-Related Quality-of-Life Findings for the Prostate Cancer Prevention Trial
2012-01-01
Background The Prostate Cancer Prevention Trial (PCPT)—a randomized placebo-controlled study of the efficacy of finasteride in preventing prostate cancer—offered the opportunity to prospectively study effects of finasteride and other covariates on the health-related quality of life of participants in a multiyear trial. Methods We assessed three health-related quality-of-life domains (measured with the Health Survey Short Form–36: Physical Functioning, Mental Health, and Vitality scales) via questionnaires completed by PCPT participants at enrollment (3 months before randomization), at 6 months after randomization, and annually for 7 years. Covariate data obtained at enrollment from patient-completed questionnaires were included in our model. Mixed-effects model analyses and a cross-sectional presentation at three time points began at 6 months after randomization. All statistical tests were two-sided. Results For the physical function outcome (n = 16 077), neither the finasteride main effect nor the finasteride interaction with time were statistically significant. The effects of finasteride on physical function were minor and accounted for less than a 1-point difference over time in Physical Functioning scores (mixed-effect estimate = 0.07, 95% confidence interval [CI] = −0.28 to 0.42, P = .71). Comorbidities such as congestive heart failure (estimate = −5.64, 95% CI = −7.96 to −3.32, P < .001), leg pain (estimate = −2.57, 95% CI = −3.04 to −2.10, P < .001), and diabetes (estimate = −1.31, 95% CI = −2.04 to −0.57, P < .001) had statistically significant negative effects on physical function, as did current smoking (estimate = −2.34, 95% CI = −2.97 to −1.71, P < .001) and time on study (estimate = −1.20, 95% CI = −1.36 to −1.03, P < .001). Finasteride did not have a statistically significant effect on the other two dependent variables, mental health and vitality, either in the mixed-effects analyses or in the cross-sectional analysis at any of the three time points. Conclusion Finasteride did not negatively affect SF–36 Physical Functioning, Mental Health, or Vitality scores. PMID:22972968
Non-equilibrium statistical mechanics theory for the large scales of geophysical flows
NASA Astrophysics Data System (ADS)
Eric, S.; Bouchet, F.
2010-12-01
The aim of any theory of turbulence is to understand the statistical properties of the velocity field. As a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject, from the statistical mechanics basis of the theory up to applications to Jupiter's troposphere and ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean-field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations is discussed. We also present recent results for non-equilibrium situations, for which forces and dissipation are in a statistical balance. As an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed. References: F. Bouchet and E. Simonnet, "Random changes of flow topology in two-dimensional and geophysical turbulence", Physical Review Letters, 102(9), 094504, 2009. F. Bouchet and J. Sommeria, "Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures", Journal of Fluid Mechanics, 464, 165-207, 2002. A. Venaille and F. Bouchet, "Ocean rings and jets as statistical equilibrium states", submitted to JPO. F. Bouchet and A. Venaille, "Statistical mechanics of two-dimensional and geophysical flows", submitted to Physics Reports. [Figure caption: Non-equilibrium phase transitions for the 2D Navier-Stokes equations with stochastic forces; time series and probability density functions (PDFs) of the modulus of the largest-scale Fourier component, showing bistability between dipole and unidirectional flows. This bistability is predicted by statistical mechanics.]
NASA Astrophysics Data System (ADS)
Shvartsburg, Alexandre A.; Siu, K. W. Michael
2001-06-01
Modeling the delayed dissociation of clusters has been, over the last decade, a frontline development area in chemical physics. It is of fundamental interest how statistical kinetics methods previously validated for regular molecules and atomic nuclei may apply to clusters, as this would help us understand the transferability of statistical models for the disintegration of complex systems across various classes of physical objects. From a practical perspective, accurate simulation of unimolecular decomposition is critical for the extraction of true thermochemical values from measurements on the decay of energized clusters. Metal clusters are particularly challenging because of the multitude of low-lying electronic states that are coupled to vibrations. This has previously been accounted for by assuming the average electronic structure of a conducting cluster, approximated by the levels of an electron in a cavity. While this provides a reasonable time-averaged description, it ignores the distribution of instantaneous electronic structures in a "boiling" cluster around that average. Here we set up a new treatment that incorporates the statistical distribution of electronic levels around the average picture using random matrix theory. This approach faithfully reflects the completely chaotic "vibronic soup" nature of hot metal clusters. We found that the consideration of electronic level statistics significantly promotes electronic excitation and thus increases the magnitude of its effect. As this excitation always depresses the decay rates, the inclusion of level statistics results in slower dissociation of metal clusters.
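[Illustrative note] Random matrix theory enters through ensembles such as the Gaussian Orthogonal Ensemble (GOE), whose level-spacing statistics show the level repulsion characteristic of chaotic spectra. A hedged Python sketch of sampling GOE spectra (a generic illustration, not the paper's computation):

import numpy as np

rng = np.random.default_rng(0)
n, samples, spacings = 50, 200, []
for _ in range(samples):
    a = rng.normal(size=(n, n))
    goe = (a + a.T) / 2                        # real symmetric Gaussian matrix
    levels = np.linalg.eigvalsh(goe)
    s = np.diff(levels[n // 4 : 3 * n // 4])   # central part of the spectrum
    spacings.extend(s / s.mean())              # rescale to unit mean spacing
spacings = np.array(spacings)
print("P(s < 0.1) =", (spacings < 0.1).mean())  # level repulsion: small spacings suppressed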
NASA Astrophysics Data System (ADS)
Zhao, Runchen; Ientilucci, Emmett J.
2017-05-01
Hyperspectral remote sensing systems provide spectral data composed of hundreds of narrow spectral bands. Spectral remote sensing systems can be used, for example, to identify targets without physical interaction. Often it is of interest to characterize the spectral variability of targets or objects. The purpose of this paper is to identify and characterize the LWIR spectral variability of targets based on an improved earth-observing statistical performance model, known as the Forecasting and Analysis of Spectroradiometric System Performance (FASSP) model. FASSP contains three basic modules: a scene model, a sensor model and a processing model. Instead of using mean surface reflectance only as input to the model, FASSP transfers user-defined statistical characteristics of a scene through the image chain (i.e., from source to sensor). The radiative transfer model MODTRAN is used to simulate the radiative transfer based on user-defined atmospheric parameters. To retrieve class emissivity and temperature statistics, or temperature/emissivity separation (TES), an LWIR atmospheric compensation method is necessary. The FASSP model has a method to transform statistics in the visible (i.e., ELM) but currently lacks an LWIR TES algorithm. This paper addresses the implementation of such a TES algorithm and its associated transformation of statistics.
Revealing physical interaction networks from statistics of collective dynamics
Nitzan, Mor; Casadiego, Jose; Timme, Marc
2017-01-01
Revealing physical interactions in complex systems from observed collective dynamics constitutes a fundamental inverse problem in science. Current reconstruction methods require access to a system’s model or dynamical data at a level of detail often not available. We exploit changes in invariant measures, in particular distributions of sampled states of the system in response to driving signals, and use compressed sensing to reveal physical interaction networks. Dynamical observations following driving suffice to infer physical connectivity even if they are temporally disordered, are acquired at large sampling intervals, and stem from different experiments. Testing various nonlinear dynamic processes emerging on artificial and real network topologies indicates high reconstruction quality for existence as well as type of interactions. These results advance our ability to reveal physical interaction networks in complex synthetic and natural systems. PMID:28246630
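[Illustrative note] The compressed-sensing step can be shown with a toy linearized version of the problem: each row of a sparse interaction matrix is recovered from fewer driven-response samples than there are nodes via L1-regularized regression. This is a hedged Python sketch of the general idea, not the authors' algorithm:

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, m = 20, 15                                  # nodes, experiments (m < n: underdetermined)
J = np.zeros((n, n))
mask = rng.random((n, n)) < 0.1                # sparse physical connectivity
J[mask] = rng.normal(size=mask.sum())

X = rng.normal(size=(m, n))                    # sampled states under different drivings
Y = X @ J.T + 0.01 * rng.normal(size=(m, n))   # linearized responses

# Recover each row of J by sparse (L1) regression despite m < n.
J_hat = np.vstack([Lasso(alpha=0.01).fit(X, Y[:, i]).coef_ for i in range(n)])
print("sign pattern recovered:",
      (np.sign(J_hat.round(2)) == np.sign(J.round(2))).mean())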
Sangeux, Morgan; Mahy, Jessica; Graham, H Kerr
2014-01-01
Informed clinical decision making for femoral and/or tibial de-rotation osteotomies requires accurate measurement of patient function through gait analysis and of anatomy through physical examination of bony torsions. The validity of gait analysis has been extensively studied; however, controversy remains regarding the accuracy of physical examination measurements of femoral and tibial torsion. Comparisons between CT-scan and physical examination measurements of femoral neck anteversion (FNA) and external tibial torsion (ETT) were retrospectively obtained for 98 (FNA) and 64 (ETT) patients who attended a tertiary hospital for instrumented gait analysis between 2007 and 2010. The physical examination methods studied were, for femoral neck anteversion, the trochanteric prominence angle test (TPAT) and the maximum hip rotation arc midpoint (Arc midpoint), and, for external tibial torsion, the transmalleolar axis (TMA). Results showed that all physical examination measurements differed statistically from the CT-scans (bias (standard deviation): -2 (14) for TPAT, -10 (12) for Arc midpoint and -16 (9) for TMA). Bland-Altman plots showed that method disagreements increased with increasing bony torsion in all cases, but notably for TPAT. Regression analysis showed that only the TMA and the CT-scan measurement of external tibial torsion demonstrated good correlation (R^2 = 57%). Correlations of both TPAT (R^2 = 14%) and Arc midpoint (R^2 = 39%) with CT-scan measurements of FNA were limited. We conclude that physical examination should be considered a screening technique rather than a definitive measurement method for FNA and ETT. Further research is required to develop more accurate measurement methods to accompany instrumented gait analysis. Copyright © 2013. Published by Elsevier B.V.
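[Illustrative note] A Bland-Altman analysis of the kind reported above reduces to the bias of paired differences, its standard deviation, and the 95% limits of agreement. A hedged Python sketch with invented data chosen to mimic the TPAT numbers:

import numpy as np

rng = np.random.default_rng(0)
ct = rng.normal(30, 10, 98)           # CT-scan reference values (degrees)
exam = ct + rng.normal(-2, 14, 98)    # physical examination (bias -2, SD 14)

diff = exam - ct
bias, sd = diff.mean(), diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement
print(f"bias {bias:.1f}, SD {sd:.1f}, limits of agreement {loa}")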
Yang, Jaw-Yen; Yan, Chih-Yuan; Diaz, Manuel; Huang, Juan-Chen; Li, Zhihui; Zhang, Hanxin
2014-01-01
The ideal quantum gas dynamics as manifested by the semiclassical ellipsoidal-statistical (ES) equilibrium distribution derived in Wu et al. (Wu et al. 2012 Proc. R. Soc. A 468, 1799–1823 (doi:10.1098/rspa.2011.0673)) is numerically studied for particles of three statistics. This anisotropic ES equilibrium distribution was derived using the maximum entropy principle and conserves the mass, momentum and energy, but differs from the standard Fermi–Dirac or Bose–Einstein distribution. The present numerical method combines the discrete velocity (or momentum) ordinate method in momentum space and the high-resolution shock-capturing method in physical space. A decoding procedure to obtain the necessary parameters for determining the ES distribution is also devised. Computations of two-dimensional Riemann problems are presented, and various contours of the quantities unique to this ES model are illustrated. The main flow features, such as shock waves, expansion waves and slip lines and their complex nonlinear interactions, are depicted and found to be consistent with existing calculations for a classical gas. PMID:24399919
Statistical Moments in Variable Density Incompressible Mixing Flows
2015-08-28
[Fragmentary references and method text recovered from the report: a front-tracking method, verified and applied to simulation of the primary breakup of a liquid jet, SIAM J. Sci. Comput., 33, 1505-1524, 2011; an elliptic pressure solve, with the Generalized Minimal Residual (GMRES) method [78] used as a fallback on failure, followed by an update of face velocities; Proceedings of the ACM Solid and Physical Modeling Symposium, pages 159-170, 2008; [51] D. D. Joseph, fluid dynamics of two miscible liquids with diffusion (truncated).]
Didarloo, Alireza; Shojaeizadeh, Davoud; Ardebili, Hassan Eftekhar; Niknami, Shamsaddin; Hajizadeh, Ebrahim; Alizadeh, Mohammad
2011-10-01
Findings of most studies indicate that the only way to control diabetes and prevent its debilitating effects is the continuous performance of self-care behaviors. Physical activity is a non-pharmacological method of diabetes treatment and, because of its positive effects on diabetic patients, it is being increasingly considered by researchers and practitioners. This study aimed at determining factors influencing physical activity among diabetic women in Iran, using the extended theory of reasoned action. A sample of 352 women with type 2 diabetes, referred to a diabetes clinic in Khoy, Iran, participated in the study. Appropriate instruments were designed to measure the desired variables (knowledge of diabetes, personal beliefs, subjective norms, perceived self-efficacy, behavioral intention and physical activity behavior). The reliability and validity of the instruments were examined and approved. Statistical analyses were conducted with inferential statistical techniques (independent t-test, correlations and regressions) using the SPSS package. The findings indicated that, among the constructs of the model, self-efficacy was the strongest predictor of intention among women with type 2 diabetes and affected physical activity both directly and indirectly. In addition to self-efficacy, diabetic patients' physical activity was also influenced by the other variables of the model and by sociodemographic factors. Our findings suggest that the strong ability of the theory of reasoned action, extended by self-efficacy, to forecast and explain physical activity can serve as a basis for educational intervention. Educational interventions based on the proposed model are necessary for improving diabetic patients' physical activity behavior and controlling the disease.
NASA Astrophysics Data System (ADS)
Vavylonis, Dimitrios
2009-03-01
I will describe my experience in developing an interdisciplinary biophysics course addressed to students at the upper undergraduate and graduate level, in collaboration with colleagues in physics and biology. The students had backgrounds in physics, biology and engineering, and for many the course was their first exposure to interdisciplinary topics. The course did not depend on a formal knowledge of equilibrium statistical mechanics. Instead, the approach was based on dynamics. I used diffusion as a universal "long time" law to illustrate scaling concepts. The importance of statistics and proper counting of states/paths was introduced by calculating the maximum accuracy with which bacteria can measure the concentration of diffuse chemicals. The use of quantitative concepts and methods was introduced through specific biological examples, focusing on model organisms and extreme cases at the cell level. Examples included microtubule dynamic instability, the search-and-capture model, molecular motor cooperativity in muscle cells, mitotic spindle oscillations in C. elegans, polymerization forces and propulsion of pathogenic bacteria, Brownian ratchets, bacterial cell division and MinD oscillations.
Lee, Han Suk; Park, Jeung Hun
2015-08-01
[Purpose] This study investigated the effects of Nordic walking on physical function and depression in frail people aged 70 years and above. [Subjects] Twenty frail elderly individuals ≥70 years old were assigned to either a Nordic walking group (n=8) or a general exercise group (n=10). [Methods] The duration of intervention was equal in both groups (3 sessions/week for 12 weeks, 60 min/session). Physical function (balance, upper extremity strength, lower extremity strength, weakness) and depression were examined before and after the interventions. [Results] In the Nordic walking group, all measures except upper extremity muscle strength (lower extremity strength, weakness, balance, and depression) demonstrated statistically significant improvement. In the general exercise group, only balance demonstrated a statistically significant improvement after the intervention. There were significant differences between the groups in the changes in lower extremity muscle strength, weakness and depression. [Conclusion] Nordic walking was more effective than general exercise. Therefore, we suggest that Nordic walking may be an attractive option for significant functional improvement in frail people over 70 years old.
Decompression Mechanisms and Decompression Schedule Calculations.
1984-01-20
[Fragmentary text recovered from the report: a reference to "Physiology: the effects of altitude", in Handbook of Physiology, Section 3: Respiration, Vol. II, W. O. Fenn and H. Rahn, eds., Washington, D.C.: American Physiological Society; a mention of decompression studies from other laboratories; METHODS: ten experienced and physically qualified divers (ages 22-42) were compressed at a rate of 60 (truncated); and the header of a subject-statistics table listing experiment, N, age (yr), height (cm), weight (kg), and body fat.]
Correlation Between University Students' Kinematic Achievement and Learning Styles
NASA Astrophysics Data System (ADS)
Çirkinoǧlu, A. G.; Demirci, N.
2007-04-01
In the literature, research on kinematics has revealed that students have many difficulties in connecting graphs and physics. Research has also shown that the method used in the classroom affects students' further learning. In this study, the correlation between university students' kinematics achievement and learning styles is investigated. For this purpose, a Kinematics Achievement Test and a Learning Style Inventory were administered to 573 students enrolled in General Physics 1 courses at Balikesir University in the fall semester of 2005-2006. The Kinematics Achievement Test, consisting of 12 multiple-choice and 6 open-ended questions, was developed by the researchers to assess students' understanding, interpretation, and drawing of graphs. The Learning Style Inventory, a 24-item instrument covering visual, auditory, and kinesthetic learning styles, was developed and used by Barsch. The data obtained in this study were analyzed with the necessary statistical calculations (t-test, correlation, ANOVA, etc.) using the SPSS statistical program. Based on the research findings, tentative recommendations are made.
Physics of Electronic Materials
NASA Astrophysics Data System (ADS)
Rammer, Jørgen
2017-03-01
1. Quantum mechanics; 2. Quantum tunneling; 3. Standard metal model; 4. Standard conductor model; 5. Electric circuit theory; 6. Quantum wells; 7. Particle in a periodic potential; 8. Bloch currents; 9. Crystalline solids; 10. Semiconductor doping; 11. Transistors; 12. Heterostructures; 13. Mesoscopic physics; 14. Arithmetic, logic and machines; Appendix A. Principles of quantum mechanics; Appendix B. Dirac's delta function; Appendix C. Fourier analysis; Appendix D. Classical mechanics; Appendix E. Wave function properties; Appendix F. Transfer matrix properties; Appendix G. Momentum; Appendix H. Confined particles; Appendix I. Spin and quantum statistics; Appendix J. Statistical mechanics; Appendix K. The Fermi-Dirac distribution; Appendix L. Thermal current fluctuations; Appendix M. Gaussian wave packets; Appendix N. Wave packet dynamics; Appendix O. Screening by symmetry method; Appendix P. Commutation and common eigenfunctions; Appendix Q. Interband coupling; Appendix R. Common crystal structures; Appendix S. Effective mass approximation; Appendix T. Integral doubling formula; Bibliography; Index.
Statistical Physics Approaches to RNA Editing
NASA Astrophysics Data System (ADS)
Bundschuh, Ralf
2012-02-01
The central dogma of molecular biology states that DNA is transcribed base by base into RNA, which is in turn translated into proteins. However, some organisms edit their RNA before translation by inserting, deleting, or substituting individual or short stretches of bases. In many instances the mechanisms by which an organism recognizes the positions at which to edit, or by which it performs the actual editing, are unknown. One model system that stands out, by its very high rate of on average one out of 25 bases being edited, is the Myxomycetes, a class of slime molds. In this talk we will show how computational methods and concepts from statistical physics can be used to analyze DNA and protein sequence data to predict editing sites in these slime molds and to guide experiments that identified previously unknown types of editing as well as the complete set of editing events in the slime mold Physarum polycephalum.
Conceptual developments of non-equilibrium statistical mechanics in the early days of Japan
NASA Astrophysics Data System (ADS)
Ichiyanagi, Masakazu
1995-11-01
This paper reviews the research in nonequilibrium statistical mechanics made in Japan in the period between 1930 and 1960. Nearly thirty years have passed since the discovery of the exact formula for the electrical conductivity. With the rise of the linear response theory, the methods and results of which are quickly grasped by anyone, its rationale was pushed aside, and even at the stage where the formulation was still incomplete some authors hurried to make physical applications. Such an attitude robbed it of most of its interest for the average physicist, who would approach an understanding of a basic concept not through abstract and logical analysis but by simply accumulating technical experience with the concept. The purpose of this review is to rescue the linear response theory from being labeled a mathematical tool and to show that it has considerable physical content. Many key papers, originally written in Japanese, are reproduced.
Evolutionary dynamics of group interactions on structured populations: a review
Perc, Matjaž; Gómez-Gardeñes, Jesús; Szolnoki, Attila; Floría, Luis M.; Moreno, Yamir
2013-01-01
Interactions among living organisms, from bacteria colonies to human societies, are inherently more complex than interactions among particles and non-living matter. Group interactions are a particularly important and widespread class, representative of which is the public goods game. In addition, methods of statistical physics have proved valuable for studying pattern formation, equilibrium selection and self-organization in evolutionary games. Here, we review recent advances in the study of evolutionary dynamics of group interactions on top of structured populations, including lattices, complex networks and coevolutionary models. We also compare these results with those obtained on well-mixed populations. The review particularly highlights that the study of the dynamics of group interactions, like several other important equilibrium and non-equilibrium dynamical processes in biological, economical and social sciences, benefits from the synergy between statistical physics, network science and evolutionary game theory. PMID:23303223
Colloquium: Hierarchy of scales in language dynamics
NASA Astrophysics Data System (ADS)
Blythe, Richard A.
2015-11-01
Methods and insights from statistical physics are finding an increasing variety of applications where one seeks to understand the emergent properties of a complex interacting system. One such area concerns the dynamics of language at a variety of levels of description, from the behaviour of individual agents learning simple artificial languages from each other, up to changes in the structure of languages shared by large groups of speakers over historical timescales. In this Colloquium, we survey a hierarchy of scales at which language and linguistic behaviour can be described, along with the main progress in understanding that has been made at each of them - much of which has come from the statistical physics community. We argue that future developments may arise by linking the different levels of the hierarchy together in a more coherent fashion, in particular where this allows more effective use of rich empirical data sets.
Between disorder and order: A case study of power law
NASA Astrophysics Data System (ADS)
Cao, Yong; Zhao, Youjie; Yue, Xiaoguang; Xiong, Fei; Sun, Yongke; He, Xin; Wang, Lichao
2016-08-01
Power law is an important feature of phenomena exhibiting long-memory behavior. Zipf famously found a power law in the distribution of word frequencies. In physics, the terms order and disorder are originally concepts of thermodynamics and statistical physics, and much research has focused on the self-organization of the disordered ingredients of simple physical systems. What drives the disorder-order transition is an interesting question. We devise an experiment-based method using random symbolic sequences to investigate regular patterns in the transition between disorder and order. The experimental results reveal that power law is indeed an important regularity in the transition from disorder to order. A preliminary study and analysis of these results has been carried out to explain the underlying reasons.
Pedzikiewicz, J; Sobiech, K A
1995-01-01
Nine men were examined during three weeks of training requiring considerable physical effort. They were given a nutrient, "LIVEX", enriched with iron. Hematological parameters as well as concentrations of erythrocyte ATP and 2,3-DPG were determined before and after the experiment. Hematological parameters were determined using standard methods, while Boehringer's test (Germany) was used for determining ATP and 2,3-DPG. The level of reticular cells was statistically significantly higher after the experiment, while the increase in ATP and 2,3-DPG concentrations was insignificant. A positive adaptation of energy metabolism after exogenous iron administration during physical effort is discussed.
Analysis of entropy extraction efficiencies in random number generation systems
NASA Astrophysics Data System (ADS)
Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu
2016-05-01
Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
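A flavor of the simplest arrival-time extraction schemes can be given in a few lines. The sketch below is a generic illustration, not any of the reviewed systems: it compares successive pairs of (simulated, exponentially distributed) photon inter-arrival times and emits one unbiased bit per pair, discarding ties.

```python
import numpy as np

def extract_bits(intervals):
    """Von Neumann-style extraction: compare non-overlapping pairs of
    inter-arrival times; emit 1 if the first is longer, else 0.
    For i.i.d. continuous intervals the output bits are unbiased."""
    n = len(intervals) // 2
    t1, t2 = intervals[0:2 * n:2], intervals[1:2 * n:2]
    keep = t1 != t2                      # ties carry no entropy
    return (t1[keep] > t2[keep]).astype(np.uint8)

rng = np.random.default_rng(0)
# Stand-in for measured photon arrivals: exponential inter-arrival times
intervals = rng.exponential(scale=1.0, size=100_000)
bits = extract_bits(intervals)
print(f"{bits.size} bits extracted, mean = {bits.mean():.4f}")  # ~0.5
```

This comparison scheme is robust to a slowly drifting photon rate but yields at most 0.5 bits per detection; the higher-efficiency schemes compared in the paper digitize each arrival time into several bits before post-processing.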
Modern morphometry: new perspectives in physical anthropology.
Mantini, Simone; Ripani, Maurizio
2009-06-01
In the past one hundred years, physical anthropology has had recourse to increasingly efficient methods, which have provided much new information regarding human evolution and biology. Apart from the molecular approach, the introduction of new computer-assisted techniques gave rise to a new concept of morphometry. Computed tomography and 3D imaging allow anatomical description of external and internal structures, overcoming the problems encountered with traditional morphometric methods. Furthermore, geometric morphometrics allows the creation of geometric models to investigate morphological variation in terms of evolution, ontogeny and variability. The integration of these new tools gave rise to virtual anthropology and to a new image of the anthropologist, in which anatomical, biological, mathematical, statistical and data-processing information is fused in a multidisciplinary approach.
Quantifying fluctuations in economic systems by adapting methods of statistical physics
NASA Astrophysics Data System (ADS)
Stanley, H. E.; Gopikrishnan, P.; Plerou, V.; Amaral, L. A. N.
2000-12-01
The emerging subfield of econophysics explores the degree to which certain concepts and methods from statistical physics can be appropriately modified and adapted to provide new insights into questions that have been the focus of interest in the economics community. Here we give a brief overview of two examples of research topics that are receiving recent attention. A first topic is the characterization of the dynamics of stock price fluctuations. For example, we investigate the relation between trading activity - measured by the number of transactions N_Δt - and the price change G_Δt for a given stock, over a time interval [t, t+Δt]. We relate the time-dependent standard deviation of price fluctuations - volatility - to two microscopic quantities: the number of transactions N_Δt in Δt and the variance W_Δt^2 of the price changes for all transactions in Δt. Our work indicates that while the pronounced tails in the distribution of price fluctuations arise from W_Δt, the long-range correlations found in |G_Δt| are largely due to N_Δt. We also investigate the relation between price fluctuations and the number of shares Q_Δt traded in Δt. We find that the distribution of Q_Δt is consistent with a stable Lévy distribution, suggesting a Lévy scaling relationship between Q_Δt and N_Δt, which would provide one explanation for volume-volatility co-movement. A second topic concerns cross-correlations between the price fluctuations of different stocks. We adapt a conceptual framework, random matrix theory (RMT), first used in physics to interpret statistical properties of nuclear energy spectra. RMT makes predictions for the statistical properties of matrices that are universal, that is, do not depend on the interactions between the elements comprising the system. In physical systems, deviations from the predictions of RMT provide clues regarding the mechanisms controlling the dynamics of a given system, so this framework can be of potential value if applied to economic systems. We discuss a systematic comparison between the statistics of the cross-correlation matrix C - whose elements C_ij are the correlation coefficients between the returns of stocks i and j - and those of a random matrix having the same symmetry properties. Our work suggests that RMT can be used to distinguish random and non-random parts of C; the non-random part of C, which deviates from RMT results, provides information regarding genuine cross-correlations between stocks.
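The RMT comparison lends itself to a compact numerical sketch. The code below is illustrative only (synthetic i.i.d. "returns" rather than market data, arbitrary sizes): eigenvalues of the empirical correlation matrix falling outside the Marchenko-Pastur band would signal genuine cross-correlations.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 100, 1000                          # stocks, time points (illustrative)
returns = rng.standard_normal((T, N))     # i.i.d. stand-in for real returns

C = np.corrcoef(returns, rowvar=False)    # N x N cross-correlation matrix
eigvals = np.linalg.eigvalsh(C)

# Marchenko-Pastur bounds for a purely random correlation matrix
q = N / T
lam_lo, lam_hi = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2
outside = np.sum((eigvals < lam_lo) | (eigvals > lam_hi))
print(f"RMT band [{lam_lo:.3f}, {lam_hi:.3f}], eigenvalues outside: {outside}")
```

On pure noise essentially all eigenvalues fall inside the band; with real returns the largest eigenvalue (the "market mode") and a few others typically sit far outside and carry the non-random correlation structure.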
Skurdenyte, Vaida; Surkiene, Gene; Stukas, Rimantas; Zagminas, Kestutis; Giedraitis, Vincentas
2015-01-01
Background Evaluation of eating habits and physical activity is very important for health interventions. Our aim in this study was to assess the eating and physical activity characteristics of 6-7th grade schoolchildren in the city of Vilnius, Lithuania, as well as the association between dietary habits and physical activity. Methods The study was conducted within the project "Education of healthy diets and physical activity in schools". The sample consisted of 1008 schoolchildren from 22 schools in the city of Vilnius, and was based on empirical methods, including a questionnaire poll and comparative analysis. Statistical software Stata v.12.1 (StataCorp LP) was used to analyze the data. Results Our study showed that less than half (37.1%) of study participants had physically active leisure time. Boys were significantly more physically active than girls. More than half (61.4%) of children ate breakfast every day. Girls were more likely to eat vegetables and sweets. Schoolchildren who ate vegetables and dairy products, as well as those who got enough information about physical activity and spoke about it with their family members, were more physically active. Conclusions The results of the study confirmed that schoolchildren were not sufficiently physically active. It was found that low physical activity is related to dietary and other factors, such as lack of information about physical activity and its benefits. PMID:28352688
NASA Technical Reports Server (NTRS)
Djorgovski, George
1993-01-01
The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.
2013-01-01
Background Early education on the foundations of evidence based practice (EBP) is advocated as a potent intervention toward enhancing EBP uptake among physical therapists. Little is known about the extent to which EBP is integrated in educational curricula in developing countries, where the benefits of EBP are more acutely needed. This study sought to describe EBP education in Philippine physical therapy schools, including the challenges encountered by educators in teaching EBP. Methods A national survey of higher education institutions offering an undergraduate degree program in physical therapy was conducted from August 2011 through January 2012. A 35-item questionnaire was developed to gather data on whether or not EBP was taught, specific EBP content covered and courses in which content was covered, teaching and evaluation methods, and challenges in teaching EBP. Data were analyzed descriptively. Results The study had a response rate of 55.7% (34/61). The majority of the participating educational institutions (82%, 28/34) reported teaching EBP by incorporating EBP content in the professional courses. Among those that did not teach EBP, inadequate educator competence was the leading barrier. Courses commonly used to teach EBP were those on research (78.6%, 22/28), therapy planning (71.4%, 20/28), treatment skills (57.1-64.3%, 16-18/28), and the undergraduate thesis (60.7%, 17/28). Various EBP contents were covered, with statistical concepts taught more frequently than critical EBP content. Lectures and journal reports were the usual teaching methods (96.4%, 27/28 and 89.3%, 25/28, respectively), while written examinations, completion of an undergraduate thesis, and oral reports (82.1%, 23/28; 78.6%, 22/28; and 78.6%, 22/28, respectively) were often used in evaluation. Students' inadequate knowledge of statistics and lack of curricular structure for EBP were identified as the leading challenges to teaching (75%, 21/28 and 50%, 14/28, respectively). Conclusions Many physical therapy faculties across the Philippines are incorporating EBP content in teaching. However, there is arbitrary and fragmented coverage of EBP content and inadequate emphasis on clinically oriented teaching-learning and assessment methods. These findings suggest the need to design appropriate entry-level educational programs on EBP. Effective 'educating the educators' strategies are urgently needed and can have far-reaching positive repercussions on EBP uptake in physical therapist practice. PMID:24267512
Practice of Iranian Adolescents with Hemophilia in Prevention of Complications of Hemophilia
Valizadeh, Leila; Hosseini, Fahimeh Alsadat; Zamanzadeh, Vahid; Heidarnezhad, Fatemeh; Jasemi, Madineh; Lankarani, Kamran Bagheri
2015-01-01
Background: A prerequisite for management of a chronic disease is knowledge about its complications and their prevention. Hemophilia in adolescents influences all aspects of their lives and their performance. Objectives: The present study aimed to determine the performance of Iranian hemophilic adolescents in prevention of disease complications. Patients and Methods: In this descriptive-analytical study, 108 adolescents with hemophilia were selected through convenience sampling. Their performance in preventing the complications of hemophilia was evaluated by sending a semi-structured questionnaire to their addresses throughout Iran. The data were then analysed using the Statistical Package for the Social Sciences (SPSS) software (v. 13), and descriptive and inferential statistics were used. Results: Overall, 32.1% of the participants controlled bleeding during the first hour. Inaccessibility of coagulation products was mainly responsible for inhibiting timely and proper bleeding control. To relieve bleeding-associated pain, only 39.0% of the adolescents used analgesics, while 19.8% of the subjects used nonpharmacological methods to relieve pain. The majority of the adolescents did not participate in sport activities (65.4%), and most allocated less than 5 hours a week to physical activities (70.5%). In addition, the participants did not have favorable dietary patterns, exercise habits, or dental care. The results showed a significant relationship between the adolescents' preventive practice and coagulation disorders and utilization of pharmacological pain relief methods. Significant relationships were also found between severity of the disease, participation in physical activities, number of hours of physical activity, and disease complications. Conclusions: Iranian adolescents did not exhibit favorable practices towards complication prevention. PMID:26600702
Retrieving cloudy atmosphere parameters from RPG-HATPRO radiometer data
NASA Astrophysics Data System (ADS)
Kostsov, V. S.
2015-03-01
An algorithm for simultaneously determining both tropospheric temperature and humidity profiles and cloud liquid water content from ground-based measurements of microwave radiation is presented. A special feature of this algorithm is that it combines different types of measurements and different a priori information on the sought parameters. The features of its use in processing RPG-HATPRO radiometer data obtained in the course of atmospheric remote sensing experiments carried out by specialists from the Faculty of Physics of St. Petersburg State University are discussed. The results of a comparison of both temperature and humidity profiles obtained using a ground-based microwave remote sensing method with those obtained from radiosonde data are analyzed. It is shown that this combined algorithm is comparable (in accuracy) to the classical method of statistical regularization in determining temperature profiles; however, this algorithm demonstrates better accuracy (when compared to the method of statistical regularization) in determining humidity profiles.
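For orientation, the classical statistical-regularization retrieval the authors compare against is the linear minimum-variance estimator. The sketch below implements it on synthetic data; the weighting-function matrix, covariances, and dimensions are hypothetical and not those of the RPG-HATPRO instrument.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 14, 30                                  # channels, profile levels (toy)
K = 0.1 * rng.standard_normal((m, n))          # hypothetical weighting functions
S_a = np.exp(-np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / 5.0)
S_e = 0.01 * np.eye(m)                         # radiometric noise covariance
x_a = np.zeros(n)                              # a priori mean profile

x_true = np.linalg.cholesky(S_a) @ rng.standard_normal(n)
y = K @ x_true + rng.multivariate_normal(np.zeros(m), S_e)

# Linear minimum-variance (statistical-regularization) retrieval
G = S_a @ K.T @ np.linalg.inv(K @ S_a @ K.T + S_e)     # gain matrix
x_hat = x_a + G @ (y - K @ x_a)
print("retrieval RMS error:", np.sqrt(np.mean((x_hat - x_true) ** 2)).round(3))
print("a priori RMS error:", np.sqrt(np.mean(x_true ** 2)).round(3))
```

The combined algorithm described in the entry augments this basic scheme by mixing different measurement types and different a priori constraints, which is where its reported advantage for humidity profiles comes from.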
Near-equilibrium dumb-bell-shaped figures for cohesionless small bodies
NASA Astrophysics Data System (ADS)
Descamps, Pascal
2016-02-01
In a previous paper (Descamps, P. [2015]. Icarus 245, 64-79), we developed a specific method aimed at retrieving the main physical characteristics (shape, density, surface scattering properties) of highly elongated bodies from their rotational lightcurves through the use of dumb-bell-shaped equilibrium figures. The present work is a test of this method. For that purpose we introduce near-equilibrium dumb-bell-shaped figures, which are base dumb-bell equilibrium shapes modulated by lognormal statistics. Such synthetic irregular models are used to generate lightcurves to which our method is successfully applied. Shape statistical parameters of such near-equilibrium dumb-bell-shaped objects are in good agreement with those calculated, for example, for the asteroid (216) Kleopatra from its dog-bone radar model. This suggests that such bilobed and elongated asteroids can be approximated by equilibrium figures perturbed by the interplay with a substantial internal friction, modeled by a Gaussian random sphere.
Huh, Jung-Bo; Lee, Jeong-Yeol; Jeon, Young-Chan; Shin, Sang-Wan; Ahn, Jin-Soo; Ryu, Jae-Jun
2013-05-01
The aim of this study was to evaluate the stability of arginine-glycine-aspartic acid (RGD) peptide coatings on implants by measuring the amount of peptide remaining after installation. Fluorescein isothiocyanate (FITC)-labeled RGD peptide was coated onto anodized titanium implants (width 4 mm, length 10 mm) using a physical adsorption method (P) or a chemical grafting method (C). Solid Rigid Polyurethane Foam (SRPF) was classified as either hard bone (H) or soft bone (S) according to its density. Two pieces of artificial bone were fixed in a customized jig, and coated implants were installed at the center of the boundary between the two pieces of artificial bone. The test groups were classified as P-H, P-S, C-H, or C-S. After each installation, implants were removed from the SRPF, and the residual amounts and rates of RGD peptide on the implants were measured by fluorescence spectrometry. The Kruskal-Wallis test was used for the statistical analysis (α=0.05). Peptide coating was identified by fluorescence microscopy and XPS. The total coating amount was higher for physical adsorption than for chemical grafting. The residual rate of peptide was significantly larger in the P-S group than in the other three groups (P<.05). The results of this study suggest that coating doses depend on the coating method. Residual amounts of RGD peptide were greater for the physical adsorption method than for the chemical grafting method.
Statistical Analysis of the Uncertainty in Pre-Flight Aerodynamic Database of a Hypersonic Vehicle
NASA Astrophysics Data System (ADS)
Huh, Lynn
The objective of the present research was to develop a new method to derive the aerodynamic coefficients and the associated uncertainties for flight vehicles via post-flight inertial navigation analysis using data from the inertial measurement unit. Statistical estimates of vehicle state and aerodynamic coefficients are derived using Monte Carlo simulation. Trajectory reconstruction using the inertial navigation system (INS) is a simple and well-used method. However, deriving realistic uncertainties in the reconstructed state and any associated parameters is not so straightforward. Extended Kalman filters, batch minimum variance estimation and other approaches have been used. However, these methods generally depend on assumed physical models, assumed statistical distributions (usually Gaussian) or have convergence issues for non-linear problems. The approach here assumes no physical models, is applicable to any statistical distribution, and does not have any convergence issues. The new approach obtains the statistics directly from a sufficient number of Monte Carlo samples using only the generally well known gyro and accelerometer specifications, and could be applied to systems of non-linear form and non-Gaussian distribution. When redundant data are available, the set of Monte Carlo simulations is constrained to satisfy the redundant data within the uncertainties specified for the additional data. The proposed method was applied to validate the uncertainty in the pre-flight aerodynamic database of the X-43A Hyper-X research vehicle. In addition to gyro and acceleration data, the actual flight data include redundant measurements of position and velocity from the global positioning system (GPS). Criteria derived from the blend of GPS and INS accuracy were used to select valid trajectories for statistical analysis. The aerodynamic coefficients were derived from the selected trajectories either by a direct extraction method based on the equations of dynamics, or by inquiry of the pre-flight aerodynamic database. After the application of the proposed method to the case of the X-43A Hyper-X research vehicle, it was found that 1) there were consistent differences in the aerodynamic coefficients from the pre-flight aerodynamic database and the post-flight analysis, 2) the pre-flight estimation of the pitching moment coefficients was significantly different from the post-flight analysis, 3) the type of distribution of the states from the Monte Carlo simulation was affected by that of the perturbation parameters, 4) the uncertainties in the pre-flight model were overestimated, 5) the range where the aerodynamic coefficients from the pre-flight aerodynamic database and post-flight analysis are in closest agreement is between Mach *.* and *.*, and more data points may be needed between Mach * and ** in the pre-flight aerodynamic database, 6) the selection criterion for valid trajectories from the Monte Carlo simulations was mostly driven by the horizontal velocity error, 7) the selection criterion must be based on a reasonable model to ensure the validity of the statistics from the proposed method, and 8) the results from the proposed method applied to two different flights with identical geometry and similar flight profiles were consistent.
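The core of the approach, sampling sensor errors from their specifications and reading statistics off the ensemble of reconstructed trajectories, can be caricatured in one dimension. Everything below (bias and noise magnitudes, the acceleration profile) is made up for illustration; the real analysis integrates full vehicle dynamics and applies GPS-based selection criteria.

```python
import numpy as np

rng = np.random.default_rng(3)
dt, n = 0.01, 5000                                  # 50 s toy trajectory
a_true = np.sin(np.linspace(0.0, 10.0, n))          # stand-in true acceleration

def reconstruct_final_position(bias_sd=2e-3, noise_sd=5e-3):
    """One Monte Carlo trial: integrate accelerometer readings carrying a
    random constant bias plus white noise (spec-sheet style errors)."""
    a_meas = a_true + rng.normal(0.0, bias_sd) + rng.normal(0.0, noise_sd, n)
    v = np.cumsum(a_meas) * dt
    return np.sum(v) * dt                            # final position

finals = np.array([reconstruct_final_position() for _ in range(2000)])
x_exact = np.sum(np.cumsum(a_true) * dt) * dt
err = finals - x_exact
# The empirical ensemble needs no Gaussian assumption: quote percentiles
print(f"bias {err.mean():+.4f} m, 5th-95th pct [{np.percentile(err, 5):.4f}, "
      f"{np.percentile(err, 95):.4f}] m")
```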
Statistical scaling of geometric characteristics in stochastically generated pore microstructures
Hyman, Jeffrey D.; Guadagnini, Alberto; Winter, C. Larrabee
2015-05-21
In this study, we analyze the statistical scaling of structural attributes of virtual porous microstructures that are stochastically generated by thresholding Gaussian random fields. Characterization of the extent to which randomly generated pore spaces can be considered representative of a particular rock sample depends on the metrics employed to compare the virtual sample against its physical counterpart. Typically, comparisons against features and/or patterns of geometric observables, e.g., porosity and specific surface area, flow-related macroscopic parameters, e.g., permeability, or autocorrelation functions are used to assess the representativeness of a virtual sample, and thereby the quality of the generation method. Here, we rely on manifestations of statistical scaling of geometric observables which were recently observed in real millimeter scale rock samples [13] as additional relevant metrics by which to characterize a virtual sample. We explore the statistical scaling of two geometric observables, namely porosity (Φ) and specific surface area (SSA), of porous microstructures generated using the method of Smolarkiewicz and Winter [42] and Hyman and Winter [22]. Our results suggest that the method can produce virtual pore space samples displaying the symptoms of statistical scaling observed in real rock samples. Order q sample structure functions (statistical moments of absolute increments) of Φ and SSA scale as a power of the separation distance (lag) over a range of lags, and extended self-similarity (linear relationship between log structure functions of successive orders) appears to be an intrinsic property of the generated media. The width of the range of lags where power-law scaling is observed and the Hurst coefficient associated with the variables we consider can be controlled by the generation parameters of the method.
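A compact illustration of both halves of the analysis (generating a virtual pore space by thresholding a correlated Gaussian field, then probing statistical scaling through structure functions of local porosity) is sketched below. The smoothing length, window size, and lags are arbitrary choices, not those of the cited generation method.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(4)
# Virtual pore space: threshold a smoothed Gaussian random field
field = gaussian_filter(rng.standard_normal((256, 256)), sigma=3.0)
pores = (field < np.quantile(field, 0.3)).astype(float)   # ~30% porosity

def structure_function(img, lags, q=2, window=16):
    """Order-q structure function of window-averaged porosity vs lag."""
    h, w = img.shape
    phi = img[: h // window * window, : w // window * window] \
        .reshape(h // window, window, w // window, window).mean(axis=(1, 3))
    return np.array([np.mean(np.abs(phi[:, s:] - phi[:, :-s]) ** q)
                     for s in lags])

lags = np.array([1, 2, 4, 8])
S2 = structure_function(pores, lags)
# Power-law scaling appears as a straight line in log-log coordinates
slope = np.polyfit(np.log(lags), np.log(S2), 1)[0]
print("S2:", np.round(S2, 5), "| scaling exponent ~", round(slope, 2))
```

Extended self-similarity would then be checked by plotting log structure functions of successive orders against each other and looking for linearity.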
Thompson, Kris; Coon, Jill; Handford, Leandrea
2011-01-01
With the move to the doctor of physical therapy (DPT) degree and increasing tuition costs, there is concern about financing entry-level education. The purposes of this study were to identify how students finance their DPT education and to describe the financial impact after graduation. A written survey was used to collect data on financing DPT education, student debt, and the financial impact on graduates. There were 92 subjects who had graduated from one program. Frequencies as well as nonparametric statistics using cross-tabulations and chi-squared statistics were calculated. The response rate was 55%. Of the respondents, 86% had student loans, 66% worked during school, 57% received some family assistance, and 21% had some scholarship support. The amount of monthly loan repayment was not statistically related to the ability to save for a house, the ability to obtain a loan for a house or car, or the decision to have children. Saving for the future (p = 0.016) and lifestyle choices (p = 0.035) were related to the amount of monthly loan repayment. Major sources of funding were student loans, employment income, and/or family assistance. Respondent's ability to save for the future and lifestyle choices were negatively impacted when loan debt increased. Physical therapist education programs should consider offering debt planning and counseling.
NASA Astrophysics Data System (ADS)
Ma, Chuang; Chen, Han-Shuang; Lai, Ying-Cheng; Zhang, Hai-Feng
2018-02-01
Complex networks hosting binary-state dynamics arise in a variety of contexts. In spite of previous works, to fully reconstruct the network structure from observed binary data remains challenging. We articulate a statistical inference based approach to this problem. In particular, exploiting the expectation-maximization (EM) algorithm, we develop a method to ascertain the neighbors of any node in the network based solely on binary data, thereby recovering the full topology of the network. A key ingredient of our method is the maximum-likelihood estimation of the probabilities associated with actual or nonexistent links, and we show that the EM algorithm can distinguish the two kinds of probability values without any ambiguity, insofar as the length of the available binary time series is reasonably long. Our method does not require any a priori knowledge of the detailed dynamical processes, is parameter-free, and is capable of accurate reconstruction even in the presence of noise. We demonstrate the method using combinations of distinct types of binary dynamical processes and network topologies, and provide a physical understanding of the underlying reconstruction mechanism. Our statistical inference based reconstruction method contributes an additional piece to the rapidly expanding "toolbox" of data based reverse engineering of complex networked systems.
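The step that makes the reconstruction unambiguous, separating the maximum-likelihood link probabilities into an "actual link" and a "nonexistent link" group, can be imitated with a textbook two-component Gaussian-mixture EM in one dimension. The scores below are fabricated, and this generic mixture fit is only a stand-in for the paper's likelihood formulation.

```python
import numpy as np

rng = np.random.default_rng(5)
# Fabricated per-candidate link scores: nonexistent links pile up near 0,
# actual links near 0.8, mimicking two well-separated probability groups
scores = np.concatenate([rng.normal(0.05, 0.03, 300),
                         rng.normal(0.80, 0.10, 40)])

def e_step(x, mu, sig, w):
    """Responsibilities of each component for each score."""
    pdf = np.exp(-0.5 * ((x[:, None] - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
    r = w * pdf
    return r / r.sum(axis=1, keepdims=True)

mu, sig, w = np.array([0.1, 0.5]), np.array([0.2, 0.2]), np.array([0.5, 0.5])
for _ in range(100):
    r = e_step(scores, mu, sig, w)                 # E-step
    nk = r.sum(axis=0)                             # M-step: refit parameters
    mu = (r * scores[:, None]).sum(axis=0) / nk
    sig = np.sqrt((r * (scores[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-9
    w = nk / scores.size

r = e_step(scores, mu, sig, w)
links = r[:, np.argmax(mu)] > 0.5                  # high-mean component = links
print("inferred links:", links.sum(), "(40 planted)")
```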
May, Philip A.; Tabachnick, Barbara G.; Gossage, J. Phillip; Kalberg, Wendy O.; Marais, Anna-Susan; Robinson, Luther K.; Manning, Melanie A.; Blankenship, Jason; Buckley, David; Hoyme, H. Eugene; Adnams, Colleen M.
2013-01-01
Objective To provide an analysis of multiple predictors of cognitive and behavioral traits for children with fetal alcohol spectrum disorders (FASD). Method Multivariate correlation techniques were employed with maternal and child data from epidemiologic studies in a community in South Africa. Data on 561 first grade children with fetal alcohol syndrome (FAS), partial FAS (PFAS), and not FASD and their mothers were analyzed by grouping 19 maternal variables into categories (physical, demographic, childbearing, and drinking) and employed in structural equation models (SEM) to assess correlates of child intelligence (verbal and non-verbal) and behavior. Results A first SEM utilizing only seven maternal alcohol use variables to predict cognitive/behavioral traits was statistically significant (B = 3.10, p < .05), but explained only 17.3% of the variance. The second model incorporated multiple maternal variables and was statistically significant explaining 55.3% of the variance. Significantly correlated with low intelligence and problem behavior were demographic (B = 3.83, p < .05) (low maternal education, low socioeconomic status (SES), and rural residence) and maternal physical characteristics (B = 2.70, p < .05) (short stature, small head circumference, and low weight). Childbearing history and alcohol use composites were not statistically significant in the final complex model, and were overpowered by SES and maternal physical traits. Conclusions While other analytic techniques have amply demonstrated the negative effects of maternal drinking on intelligence and behavior, this highly-controlled analysis of multiple maternal influences reveals that maternal demographics and physical traits make a significant enabling or disabling contribution to child functioning in FASD. PMID:23751886
The effect of physical activity homework on physical activity among college students.
Claxton, David; Wells, Gayle M
2009-03-01
This study examined the effect of using physical activity homework on physical activity levels of college students. Students in randomly assigned sections of a university health course were assigned 30 minutes of physical activity homework 3 days a week or no homework for 12 weeks. Participants completed self-reports of physical activity before the homework intervention and again at the conclusion of the 12 weeks of physical activity homework. Participants in all course sections reported significant increases in the number of days per week of moderate and vigorous physical activity. Participants in homework sections additionally showed significant increases in the days they engaged in muscular strength/endurance training and activities to manage weight. Participants in sections without homework showed a significant increase in the number of days engaged in flexibility training. Comparison of gain scores showed statistically significant increases by the homework group in the days they participated in activities designed to manage weight. Physical activity homework was deemed to be an effective method of increasing college students' levels of physical activity.
Lee, Ji-Hyun; Lee, Sangyong; Choi, SeokJoo; Choi, Yoon-Hee; Lee, Kwansub
2017-03-01
[Purpose] The purpose of this study was to identify the effects of extracorporeal shock wave therapy on the pain and function of patients with degenerative knee arthritis. [Subjects and Methods] Twenty patients with degenerative knee arthritis were divided into a conservative physical therapy group (n=10) and an extracorporeal shock wave therapy group (n=10). Both groups received general conservative physical therapy, and the extracorporeal shock wave therapy group was additionally treated with extracorporeal shock wave therapy after receiving conservative physical therapy. Both groups were treated three times a week over a four-week period. The visual analogue scale was used to evaluate pain in the knee joints of the subjects, and the Korean Western Ontario and McMaster Universities Osteoarthritis Index was used to evaluate their function. [Results] Comparison of the visual analogue scale and Korean Western Ontario and McMaster Universities Osteoarthritis Index scores within each group before and after treatment showed statistically significant declines in both the conservative physical therapy group and the extracorporeal shock wave therapy group. A between-group comparison after treatment showed statistically significant differences in these scores between the extracorporeal shock wave therapy group and the conservative physical therapy group. [Conclusion] Extracorporeal shock wave therapy may be a useful nonsurgical intervention for reducing the pain of patients with degenerative knee arthritis and improving these patients' function.
MR Guided PET Image Reconstruction
Bai, Bing; Li, Quanzheng; Leahy, Richard M.
2013-01-01
The resolution of PET images is limited by the physics of positron-electron annihilation and instrumentation for photon coincidence detection. Model based methods that incorporate accurate physical and statistical models have produced significant improvements in reconstructed image quality when compared to filtered backprojection reconstruction methods. However, it has often been suggested that by incorporating anatomical information, the resolution and noise properties of PET images could be improved, leading to better quantitation or lesion detection. With the recent development of combined MR-PET scanners, it is possible to collect intrinsically co-registered MR images. It is therefore now possible to routinely make use of anatomical information in PET reconstruction, provided appropriate methods are available. In this paper we review research efforts over the past 20 years to develop these methods. We discuss approaches based on the use of both Markov random field priors and joint information or entropy measures. The general framework for these methods is described and their performance and longer term potential and limitations discussed. PMID:23178087
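A minimal example of the model-based framework the review covers is a one-step-late MAP-EM iteration with a quadratic smoothness penalty, shown below on a toy 1-D Poisson problem with a random system matrix. Anatomically guided variants would reduce the neighbor coupling across boundaries taken from the MR image; everything here is schematic.

```python
import numpy as np

rng = np.random.default_rng(6)
n_pix, n_det = 32, 64
A = rng.random((n_det, n_pix))
A /= A.sum(axis=0)                      # toy system matrix, unit sensitivity
x_true = np.zeros(n_pix)
x_true[10:20] = 4.0                     # hot region
y = rng.poisson(A @ x_true)             # Poisson-distributed projections

beta, x = 0.3, np.ones(n_pix)
for _ in range(200):
    # Gradient of a quadratic MRF penalty, sum over neighbor pairs of
    # (x_j - x_k)^2; periodic boundary via np.roll for brevity. An
    # anatomical prior would down-weight couplings across MR boundaries.
    grad = 2.0 * (2.0 * x - np.roll(x, 1) - np.roll(x, -1))
    ratio = A.T @ (y / np.maximum(A @ x, 1e-12))
    denom = np.maximum(A.sum(axis=0) + beta * grad, 1e-6)   # one-step-late
    x = x * ratio / denom

print(f"reconstructed peak {x.max():.2f} vs true {x_true.max():.1f}")
```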
Physical Activity and Alcohol Use Disorders
Lisha, Nadra E.; Sussman, Steve; Leventhal, Adam M.
2013-01-01
Background Prior research has documented a counterintuitive positive association between physical activity and indices of alcohol consumption frequency and heaviness. Objectives To investigate whether this relation extends to alcohol use disorder and clarify whether this association is non-linear. Methods This is a cross-sectional, correlational population-based study of US adults (N = 34,653). The Alcohol Use Disorder and Associated Disabilities Interview Schedule was used to classify past-year DSM-IV alcohol use disorder and self-reported federal government-recommended weekly physical activity cutoffs. Results After statistically controlling for confounds, alcohol abuse but not dependence was associated with greater prevalence of physical activity. Number of alcohol use disorder symptoms exhibited a curvilinear relationship with meeting physical activity requirements, such that the positive association degraded with high symptom counts. Conclusion There is a positive association between physical activity and less severe forms of alcohol use disorder in US adults. More severe forms of alcohol use disorder are not associated with physical activity. PMID:22992050
Robust approaches to quantification of margin and uncertainty for sparse data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hund, Lauren; Schroeder, Benjamin B.; Rumsey, Kelin
Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low probability, high consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.
Experimental Determination of Dynamical Lee-Yang Zeros
NASA Astrophysics Data System (ADS)
Brandner, Kay; Maisi, Ville F.; Pekola, Jukka P.; Garrahan, Juan P.; Flindt, Christian
2017-05-01
Statistical physics provides the concepts and methods to explain the phase behavior of interacting many-body systems. Investigations of Lee-Yang zeros—complex singularities of the free energy in systems of finite size—have led to a unified understanding of equilibrium phase transitions. The ideas of Lee and Yang, however, are not restricted to equilibrium phenomena. Recently, Lee-Yang zeros have been used to characterize nonequilibrium processes such as dynamical phase transitions in quantum systems after a quench or dynamic order-disorder transitions in glasses. Here, we experimentally realize a scheme for determining Lee-Yang zeros in such nonequilibrium settings. We extract the dynamical Lee-Yang zeros of a stochastic process involving Andreev tunneling between a normal-state island and two superconducting leads from measurements of the dynamical activity along a trajectory. From the short-time behavior of the Lee-Yang zeros, we predict the large-deviation statistics of the activity which is typically difficult to measure. Our method paves the way for further experiments on the statistical mechanics of many-body systems out of equilibrium.
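Numerically, the dynamical Lee-Yang zeros are the complex roots of the moment generating function, which for an integer-valued activity k is a polynomial in z = e^s with coefficients P(k). The sketch below finds them for a simulated (Poissonian) activity histogram; it conveys the idea only and does not reproduce the cumulant-based extraction used in the experiment.

```python
import numpy as np

rng = np.random.default_rng(7)
# Stand-in for the measured distribution of the dynamical activity
counts = rng.poisson(8, size=50_000)
P = np.bincount(counts) / counts.size

# Z(s) = sum_k P(k) z^k with z = e^s: a polynomial whose complex
# roots are the (dynamical) Lee-Yang zeros
roots = np.roots(P[::-1])                   # coefficients, highest power first
roots = roots[np.abs(roots) > 1e-12]        # guard against z = 0 artifacts
s_zeros = np.log(roots.astype(complex))
leading = s_zeros[np.argmin(np.abs(s_zeros))]
print("Lee-Yang zero closest to s = 0:", np.round(leading, 3))
```

The motion of the leading pair of zeros toward or away from s = 0 as the observation time grows is what encodes the large-deviation statistics of the activity.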
Gridded Calibration of Ensemble Wind Vector Forecasts Using Ensemble Model Output Statistics
NASA Astrophysics Data System (ADS)
Lazarus, S. M.; Holman, B. P.; Splitt, M. E.
2017-12-01
A computationally efficient method is developed that performs gridded post-processing of ensemble wind vector forecasts. An expansive set of idealized WRF model simulations is generated to provide physically consistent high-resolution winds over a coastal domain characterized by an intricate land/water mask. Ensemble model output statistics (EMOS) is used to calibrate the ensemble wind vector forecasts at observation locations. The local EMOS predictive parameters (mean and variance) are then spread throughout the grid utilizing flow-dependent statistical relationships extracted from the downscaled WRF winds. Using data withdrawal and 28 east central Florida stations, the method is applied to one year of 24 h wind forecasts from the Global Ensemble Forecast System (GEFS). Compared to the raw GEFS, the approach improves both the deterministic and probabilistic forecast skill. Analysis of multivariate rank histograms indicates the post-processed forecasts are calibrated. Two downscaling case studies are presented: a quiescent easterly flow event and a frontal passage. Strengths and weaknesses of the approach are presented and discussed.
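A single-variable Gaussian EMOS fit is easy to state: the predictive mean and variance are taken affine in the ensemble mean and variance, and the four coefficients are chosen to minimize the closed-form CRPS of a Gaussian. The sketch below uses synthetic training pairs and omits the wind-vector and flow-dependent spreading aspects of the study.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(8)
n = 500                                           # synthetic training cases
ens_mean = rng.normal(5.0, 2.0, n)
ens_var = rng.gamma(2.0, 1.0, n)
obs = 1.0 + 0.9 * ens_mean + rng.normal(0.0, np.sqrt(1.0 + 0.5 * ens_var))

def crps_gaussian(mu, sigma, y):
    """Closed-form CRPS of a N(mu, sigma^2) forecast against y."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z)
                    - 1 / np.sqrt(np.pi))

def mean_crps(params):
    a, b, c, d = params
    mu = a + b * ens_mean
    sigma = np.sqrt(np.maximum(c + d * ens_var, 1e-6))
    return crps_gaussian(mu, sigma, obs).mean()

fit = minimize(mean_crps, x0=[0.0, 1.0, 1.0, 1.0], method="Nelder-Mead")
print("EMOS coefficients a, b, c, d:", np.round(fit.x, 3))
```

Spreading the fitted predictive mean and variance across the grid using the downscaled WRF statistics is the step specific to this work.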
The Gaussian CLs method for searches of new physics
Qian, X.; Tan, A.; Ling, J. J.; ...
2016-04-23
Here we describe a method based on the CLs approach to present results in searches of new physics, under the condition that the relevant parameter space is continuous. Our method relies on a class of test statistics developed for non-nested hypotheses testing problems, denoted by ΔT, which has a Gaussian approximation to its parent distribution when the sample size is large. This leads to a simple procedure of forming exclusion sets for the parameters of interest, which we call the Gaussian CLs method. Our work provides a self-contained mathematical proof for the Gaussian CLs method that explicitly outlines the required conditions. These conditions are milder than those required by Wilks' theorem to set confidence intervals (CIs). We illustrate the Gaussian CLs method in an example of searching for a sterile neutrino, where the CLs approach was rarely used before. We also compare data analysis results produced by the Gaussian CLs method and various CI methods to showcase their differences.
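Once the parent distribution of the test statistic is approximated as Gaussian under both hypotheses, the CLs quantity reduces to a ratio of Gaussian tail probabilities. The sketch below adopts one common orientation convention (larger observed values more background-like); the means and widths are placeholders, and the paper itself defines the precise ΔT construction.

```python
from scipy.stats import norm

def gaussian_cls(t_obs, mu_bkg, sig_bkg, mu_sig, sig_sig):
    """CLs as a ratio of same-side tail probabilities under Gaussian
    approximations to the test-statistic distribution; here larger
    t_obs is taken to be more background-like."""
    p_signal = norm.sf(t_obs, mu_sig, sig_sig)   # P(T >= t_obs | signal)
    p_bkg = norm.sf(t_obs, mu_bkg, sig_bkg)      # P(T >= t_obs | background)
    return p_signal / p_bkg

# A parameter point is excluded at 95% CL when CLs < 0.05
cls = gaussian_cls(t_obs=6.0, mu_bkg=9.0, sig_bkg=3.0, mu_sig=0.0, sig_sig=2.0)
print(f"CLs = {cls:.4f}")
```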
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elwell, M.R.
1996-03-01
Contents: Introduction (Physical and Chemical Properties, Production, Use, and Exposure, Disposition and Metabolism, Toxicity, Study Rationale and Design); Materials and Methods (Procurement and Characterization of o-Nitrotoluene and o-Toluidine Hydrochloride, Preparation and Analysis of Dose Formulations, Preparation of Antibiotic Mixture, Toxicity Study Designs, Statistical Methods, Quality Assurance); Results (26-Week Feed Studies in Male F344/N Rats).
An Integrated, Statistical Molecular Approach to the Physical Chemistry Curriculum
ERIC Educational Resources Information Center
Cartier, Stephen F.
2009-01-01
As an alternative to the "thermodynamics first" or "quantum first" approaches to the physical chemistry curriculum, the statistical definition of entropy and the Boltzmann distribution are introduced in the first days of the course and the entire two-semester curriculum is then developed from these concepts. Once the tools of statistical mechanics…
Maintained Physical Activity Induced Changes in Delay Discounting.
Sofis, Michael J; Carrillo, Ale; Jarmolowicz, David P
2017-07-01
Those who discount the subjective value of delayed rewards less steeply are more likely to engage in physical activity. There is limited research, however, showing whether physical activity can change rates of delay discounting. In a two-experiment series, treatment and maintenance effects of a novel, effort-paced physical activity intervention on delay discounting were evaluated with multiple baseline designs. Using a lap-based method, participants were instructed to exercise at individualized high and low effort levels and to track their own perceived effort. The results suggest that treatment-induced changes in discounting were maintained at follow-up for 13 of 16 participants. In Experiment 2, there were statistically significant group-level improvements in physical activity and delay discounting when comparing baseline with both treatment and maintenance phases. Percentage change in delay discounting was significantly correlated with session attendance and relative pace (min/mile) improvement over the course of the 7-week treatment. Implications for future research are discussed.
A new statistical approach to climate change detection and attribution
NASA Astrophysics Data System (ADS)
Ribes, Aurélien; Zwiers, Francis W.; Azaïs, Jean-Marc; Naveau, Philippe
2017-01-01
We propose here a new statistical approach to climate change detection and attribution that is based on additive decomposition and simple hypothesis testing. Most current statistical methods for detection and attribution rely on linear regression models where the observations are regressed onto expected response patterns to different external forcings. These methods do not use physical information provided by climate models regarding the expected response magnitudes to constrain the estimated responses to the forcings. Climate modelling uncertainty is difficult to take into account with regression based methods and is almost never treated explicitly. As an alternative to this approach, our statistical model is only based on the additivity assumption; the proposed method does not regress observations onto expected response patterns. We introduce estimation and testing procedures based on likelihood maximization, and show that climate modelling uncertainty can easily be accounted for. Some discussion is provided on how to practically estimate the climate modelling uncertainty based on an ensemble of opportunity. Our approach is based on the "models are statistically indistinguishable from the truth" paradigm, where the difference between any given model and the truth has the same distribution as the difference between any pair of models, but other choices might also be considered. The properties of this approach are illustrated and discussed based on synthetic data. Lastly, the method is applied to the linear trend in global mean temperature over the period 1951-2010. Consistent with the last IPCC assessment report, we find that most of the observed warming over this period (+0.65 K) is attributable to anthropogenic forcings (+0.67 ± 0.12 K, 90% confidence range), with a very limited contribution from natural forcings (-0.01 ± 0.02 K).
Quantum theory of multiscale coarse-graining.
Han, Yining; Jin, Jaehyeok; Wagner, Jacob W; Voth, Gregory A
2018-03-14
Coarse-grained (CG) models serve as a powerful tool to simulate molecular systems at much longer temporal and spatial scales. Previously, CG models and methods have been built upon classical statistical mechanics. The present paper develops a theory and numerical methodology for coarse-graining in quantum statistical mechanics, by generalizing the multiscale coarse-graining (MS-CG) method to quantum Boltzmann statistics. A rigorous derivation of the sufficient thermodynamic consistency condition is first presented via imaginary time Feynman path integrals. It identifies the optimal choice of CG action functional and effective quantum CG (qCG) force field to generate a quantum MS-CG (qMS-CG) description of the equilibrium system that is consistent with the quantum fine-grained model projected onto the CG variables. A variational principle then provides a class of algorithms for optimally approximating the qMS-CG force fields. Specifically, a variational method based on force matching, which was also adopted in the classical MS-CG theory, is generalized to quantum Boltzmann statistics. The qMS-CG numerical algorithms and practical issues in implementing this variational minimization procedure are also discussed. Then, two numerical examples are presented to demonstrate the method. Finally, as an alternative strategy, a quasi-classical approximation for the thermal density matrix expressed in the CG variables is derived. This approach provides an interesting physical picture for coarse-graining in quantum Boltzmann statistical mechanics in which the consistency with the quantum particle delocalization is obviously manifest, and it opens up an avenue for using path integral centroid-based effective classical force fields in a coarse-graining methodology.
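In the classical MS-CG scheme that this work generalizes, force matching becomes a linear least-squares problem once the CG force field is expanded in a basis, e.g. a linear spline in the pair distance. The toy below fits such a spline to noisy reference forces from a made-up pair potential; basis, grid and data are all illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(9)
# Reference data: pair distances r and "fine-grained" forces to match
# (a Lennard-Jones-like force plus noise stands in for atomistic samples)
r = rng.uniform(0.8, 2.5, 2000)
f_ref = 24.0 * (2.0 / r**13 - 1.0 / r**7) + rng.normal(0.0, 0.05, r.size)

knots = np.linspace(0.8, 2.5, 20)          # spline grid for the CG force

def basis(r):
    """Hat-function (linear spline) basis evaluated at distances r."""
    B = np.zeros((r.size, knots.size))
    idx = np.clip(np.searchsorted(knots, r) - 1, 0, knots.size - 2)
    t = (r - knots[idx]) / (knots[idx + 1] - knots[idx])
    B[np.arange(r.size), idx] = 1.0 - t
    B[np.arange(r.size), idx + 1] = t
    return B

# Force matching = linear least squares in the knot values of the pair force
coef, *_ = np.linalg.lstsq(basis(r), f_ref, rcond=None)
print("fitted CG force at the first knots:", np.round(coef[:4], 2))
```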
Inactivation of dust mites, dust mite allergen, and mold from carpet.
Ong, Kee-Hean; Lewis, Roger D; Dixit, Anupma; MacDonald, Maureen; Yang, Mingan; Qian, Zhengmin
2014-01-01
Carpet is known to be a reservoir for biological contaminants, such as dust mites, dust mite allergen, and mold, if it is not kept clean. The accumulation of these contaminants in carpet might trigger allergies or asthma symptoms in both children and adults. The purpose of this study is to compare methods for removal of dust mites, dust mite allergens, and mold from carpet. Carpets were artificially worn to simulate 1 to 2 years of wear in a four-person household. The worn carpets were inoculated with a common indoor mold (Cladosporium species) and house dust mites and incubated for 6 weeks to allow time for dust mite growth on the carpet. The carpets were randomly assigned to one of four treatment groups. Available treatment regimens for controlling carpet contaminants were evaluated through a literature review and experimentation. Four moderately low-hazard, nondestructive methods were selected as treatments: vacuuming, steam vapor, Neem oil (a natural tree extract), and benzalkonium chloride (a quaternary ammonium compound). The steam vapor treatment demonstrated the greatest dust mite population reduction (p < 0.05) when compared to the other methods. The two physical methods, steam vapor and vacuuming, showed no statistically significant efficacy in inactivating dust mite allergens (p = 0.084), but higher efficacy than the chemical methods (p = 0.002). There is no statistically significant difference between the physical and chemical methods in efficacy for reducing mold in carpet (p > 0.05). The steam vapor treatment effectively killed dust mites and denatured dust mite allergen in the laboratory environment.
Sports practice is related to parasympathetic activity in adolescents
Cayres, Suziane Ungari; Vanderlei, Luiz Carlos Marques; Rodrigues, Aristides Machado; Coelho e Silva, Manuel João; Codogno, Jamile Sanches; Barbosa, Maurício Fregonesi; Fernandes, Rômulo Araújo
2015-01-01
OBJECTIVE: To analyze the relationship among sports practice, physical education classes, habitual physical activity and cardiovascular risk in adolescents. METHODS: Cross-sectional study with 120 schoolchildren (mean age: 11.7±0.7 years) with no regular use of medicines. Sports practice and physical education classes were assessed through face-to-face interviews, while habitual physical activity was assessed by pedometers. Body weight, height and trunk-cephalic height were used to estimate maturation. The following variables were measured: body fatness, blood pressure, resting heart rate, blood flow velocity, intima-media thickness (carotid and femoral) and heart rate variability (the mean interval between consecutive heartbeats, and RMSSD, a time-domain statistical index of parasympathetic autonomic activity computed as the root mean square of differences between adjacent normal R-R intervals). Statistical treatment used Spearman correlation adjusted by sex, ethnicity, age, body fatness and maturation. RESULTS: Independently of potential confounders, sports practice was positively related to parasympathetic autonomic activity (β=0.039 [0.01; 0.76]). On the other hand, the relationship between sports practice and the mean interval between consecutive heartbeats (β=0.031 [-0.01; 0.07]) was significantly mediated by biological maturation. CONCLUSIONS: Sports practice was related to higher heart rate variability at rest. PMID:25887927
Effects of preprocessing Landsat MSS data on derived features
NASA Technical Reports Server (NTRS)
Parris, T. M.; Cicone, R. C.
1983-01-01
Important to the use of multitemporal Landsat MSS data for earth resources monitoring, such as agricultural inventories, is the ability to minimize the effects of varying atmospheric and satellite viewing conditions while extracting physically meaningful features from the data. In general, approaches to the preprocessing problem have been derived from either physical or statistical models. This paper compares three proposed algorithms: XSTAR haze correction, Color Normalization, and Multiple Acquisition Mean Level Adjustment. These techniques represent physical, statistical, and hybrid physical-statistical models, respectively. The comparisons are made in the context of three feature extraction techniques: the Tasseled Cap, the Cate Color Cube, and Normalized Difference.
Gao, Bin; Li, Xiaoqing; Woo, Wai Lok; Tian, Gui Yun
2018-05-01
Thermographic inspection has been widely applied to non-destructive testing and evaluation with the capabilities of rapid, contactless, and large surface area detection. Image segmentation is considered essential for identifying and sizing defects. To attain a high-level performance, specific physics-based models that describe defects generation and enable the precise extraction of target region are of crucial importance. In this paper, an effective genetic first-order statistical image segmentation algorithm is proposed for quantitative crack detection. The proposed method automatically extracts valuable spatial-temporal patterns from unsupervised feature extraction algorithm and avoids a range of issues associated with human intervention in laborious manual selection of specific thermal video frames for processing. An internal genetic functionality is built into the proposed algorithm to automatically control the segmentation threshold to render enhanced accuracy in sizing the cracks. Eddy current pulsed thermography will be implemented as a platform to demonstrate surface crack detection. Experimental tests and comparisons have been conducted to verify the efficacy of the proposed method. In addition, a global quantitative assessment index F-score has been adopted to objectively evaluate the performance of different segmentation algorithms.
A Bayesian Approach to Evaluating Consistency between Climate Model Output and Observations
NASA Astrophysics Data System (ADS)
Braverman, A. J.; Cressie, N.; Teixeira, J.
2010-12-01
Like other scientific and engineering problems that involve physical modeling of complex systems, climate models can be evaluated and diagnosed by comparing their output to observations of similar quantities. Though the global remote sensing data record is relatively short by climate research standards, these data offer opportunities to evaluate model predictions in new ways. For example, remote sensing data are spatially and temporally dense enough to provide distributional information that goes beyond simple moments, allowing quantification of temporal and spatial dependence structures. In this talk, we propose a new method for exploiting these rich data sets using a Bayesian paradigm. For a collection of climate models, we calculate the posterior probability that each member best represents the physical system it seeks to reproduce. The posterior probability is based on the likelihood that a chosen summary statistic, computed from observations, would be obtained when the model's output is considered as a realization from a stochastic process. By exploring how posterior probabilities change with different statistics, we may paint a more quantitative and complete picture of the strengths and weaknesses of the models relative to the observations. We demonstrate our method using model output from the CMIP archive and observations from NASA's Atmospheric Infrared Sounder.
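A minimal sketch of the Bayesian comparison described here, under the assumption that each model supplies a predictive distribution for the observed summary statistic; the Gaussian likelihoods, model names, and numbers are all illustrative assumptions, not the authors' setup.

```python
from scipy.stats import norm

# Observed summary statistic (hypothetical value)
t_obs = 2.1

# Each model's predictive distribution for the statistic, treating the
# model output as a realization from a stochastic process (assumed here)
models = {"model_A": norm(2.0, 0.3),
          "model_B": norm(1.2, 0.5),
          "model_C": norm(2.6, 0.4)}

prior = {name: 1.0 / len(models) for name in models}   # uniform prior
likelihood = {name: d.pdf(t_obs) for name, d in models.items()}

evidence = sum(prior[n] * likelihood[n] for n in models)
posterior = {n: prior[n] * likelihood[n] / evidence for n in models}
for n, p in posterior.items():
    print(f"P({n} | data) = {p:.3f}")
```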
Statistical physics approach to quantifying differences in myelinated nerve fibers
Comin, César H.; Santos, João R.; Corradini, Dario; Morrison, Will; Curme, Chester; Rosene, Douglas L.; Gabrielli, Andrea; da F. Costa, Luciano; Stanley, H. Eugene
2014-01-01
We present a new method to quantify differences in myelinated nerve fibers. These differences range from morphologic characteristics of individual fibers to differences in macroscopic properties of collections of fibers. Our method uses statistical physics tools to improve on traditional measures, such as fiber size and packing density. As a case study, we analyze cross-sectional electron micrographs from the fornix of young and old rhesus monkeys using a semi-automatic detection algorithm to identify and characterize myelinated axons. We then apply a feature selection approach to identify the features that best distinguish between the young and old age groups, achieving a maximum accuracy of 94% when assigning samples to their age groups. This analysis shows that the best discrimination is obtained using the combination of two features: the fraction of occupied axon area and the effective local density. The latter is a modified calculation of axon density, which reflects how closely axons are packed. Our feature analysis approach can be applied to characterize differences that result from biological processes such as aging, damage from trauma or disease or developmental differences, as well as differences between anatomical regions such as the fornix and the cingulum bundle or corpus callosum. PMID:24676146
A Comparison of Student Understanding of Seasons Using Inquiry and Didactic Teaching Methods
NASA Astrophysics Data System (ADS)
Ashcraft, Paul G.
2006-02-01
Student performance on open-ended questions concerning the seasons in a university physical science content course was examined to note differences between classes that experienced inquiry using a 5-E lesson planning model and those that experienced the same content in a traditional, didactic lesson. The course examined is a required content course for elementary education majors, and understanding the seasons is part of the elementary science standards of the university's state. The two self-selected groups of students showed no statistically significant differences in pre-test scores, while there were statistically significant differences between the groups' post-test scores, with those who participated in inquiry-based activities scoring higher. There were no statistically significant differences between the pre-test and the post-test for the students who experienced didactic teaching, while there were statistically significant improvements for the students who experienced the 5-E lesson.
Physics in Perspective Volume II, Part C, Statistical Data.
ERIC Educational Resources Information Center
National Academy of Sciences - National Research Council, Washington, DC. Physics Survey Committee.
Statistical data relating to the sociology and economics of the physics enterprise are presented and explained. The data are divided into three sections: manpower data, data on funding and costs, and data on the literature of physics. Each section includes numerous studies, with notes on the sources and types of data, gathering procedures, and…
Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu; Lu, Jianping
2015-09-15
Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.
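As a hedged, toy-scale sketch of the general idea behind statistical IR with a quadratic neighbor prior (a plain gradient-descent stand-in, not the authors' optimization-transfer algorithm), with a random matrix standing in for the system geometry:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((60, 32))                          # toy system matrix
x_true = rng.random(32)
y = A @ x_true + 0.01 * rng.standard_normal(60)   # noisy projections

lam, step = 0.1, 1e-3                             # prior weight, step size
x = np.zeros(32)
for _ in range(500):
    # data-fidelity gradient plus quadratic penalty on neighboring voxel pairs
    grad_fid = A.T @ (A @ x - y)
    grad_pen = np.zeros_like(x)
    grad_pen[:-1] += x[:-1] - x[1:]
    grad_pen[1:]  += x[1:] - x[:-1]
    x -= step * (grad_fid + lam * grad_pen)

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```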
Health-related quality of life and related factors of military police officers
2014-01-01
Purpose The present study aimed to determine the effect of demographic characteristics, occupation, anthropometric indices, and leisure-time physical activity levels on coronary risk and health-related quality of life among military police officers from the State of Santa Catarina, Brazil. Methods The sample included 165 military police officers who fulfilled the study’s inclusion criteria. The International Physical Activity Questionnaire and the Short Form Health Survey were used, in addition to a spreadsheet of socio-demographic, occupational and anthropometric data. Statistical analyses were performed using descriptive analysis followed by Spearman correlation and multiple linear regression analysis using the backward method. Results The waist-to-height ratio was identified as a risk factor for low health-related quality of life. In addition, the conicity index, fat percentage, years of service in the military police, minutes of work per day and leisure-time physical activity levels were identified as risk factors for coronary disease among police officers. Conclusions These findings suggest that the Military Police Department should adopt an institutional policy that allows police officers to practice regular physical activity in order to maintain and improve their physical fitness, health, job performance, and quality of life. PMID:24766910
Lindemann histograms as a new method to analyse nano-patterns and phases
NASA Astrophysics Data System (ADS)
Makey, Ghaith; Ilday, Serim; Tokel, Onur; Ibrahim, Muhamet; Yavuz, Ozgun; Pavlov, Ihor; Gulseren, Oguz; Ilday, Omer
The detection, observation, and analysis of material phases and atomistic patterns are of great importance for understanding systems exhibiting both equilibrium and far-from-equilibrium dynamics. As such, there is intense research on phase transitions and pattern dynamics in soft matter, statistical and nonlinear physics, and polymer physics. In order to identify phases and nano-patterns, the pair correlation function is commonly used. However, this approach is limited in terms of recognizing competing patterns in dynamic systems, and lacks visualisation capabilities. In order to solve these limitations, we introduce Lindemann histogram quantification as an alternative method to analyse solid, liquid, and gas phases, along with hexagonal, square, and amorphous nano-pattern symmetries. We show that the proposed approach based on Lindemann parameter calculated per particle maps local number densities to material phase or particles pattern. We apply the Lindemann histogram method on dynamical colloidal self-assembly experimental data and identify competing patterns.
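A minimal sketch of a per-particle Lindemann parameter and its histogram, under the assumption that it is computed from the relative fluctuation of interparticle distances over time; the jittered-crystal trajectory below is synthetic, not the authors' colloidal data.

```python
import numpy as np

def lindemann_per_particle(traj):
    """traj: array (T, N, dim) of particle positions over T frames.
    Returns one Lindemann-like parameter per particle: the average over
    other particles j of std(r_ij) / mean(r_ij) across frames."""
    T, N, _ = traj.shape
    # pairwise distances in every frame: shape (T, N, N)
    d = np.linalg.norm(traj[:, :, None, :] - traj[:, None, :, :], axis=-1)
    mean_d, std_d = d.mean(axis=0), d.std(axis=0)
    mask = ~np.eye(N, dtype=bool)                   # exclude self-distances
    ratio = np.where(mask, std_d / np.where(mean_d > 0, mean_d, 1.0), 0.0)
    return ratio.sum(axis=1) / (N - 1)

rng = np.random.default_rng(1)
lattice = np.stack(np.meshgrid(np.arange(5.0), np.arange(5.0)), -1).reshape(-1, 2)
traj = lattice[None] + 0.05 * rng.standard_normal((200, 25, 2))  # jittered crystal
delta = lindemann_per_particle(traj)
hist, edges = np.histogram(delta, bins=20)          # the "Lindemann histogram"
print("mean Lindemann parameter:", delta.mean())
```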
Statistics of dislocation pinning at localized obstacles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dutta, A.; Bhattacharya, M., E-mail: mishreyee@vecc.gov.in; Barat, P.
2014-10-14
Pinning of dislocations at nanosized obstacles like precipitates, voids, and bubbles is a crucial mechanism in the context of phenomena like hardening and creep. The interaction between such an obstacle and a dislocation is often studied at fundamental level by means of analytical tools, atomistic simulations, and finite element methods. Nevertheless, the information extracted from such studies cannot be utilized to its maximum extent on account of insufficient information about the underlying statistics of this process comprising a large number of dislocations and obstacles in a system. Here, we propose a new statistical approach, where the statistics of pinning of dislocations by idealized spherical obstacles is explored by taking into account the generalized size-distribution of the obstacles along with the dislocation density within a three-dimensional framework. Starting with a minimal set of material parameters, the framework employs the method of geometrical statistics with a few simple assumptions compatible with the real physical scenario. The application of this approach, in combination with the knowledge of fundamental dislocation-obstacle interactions, has successfully been demonstrated for dislocation pinning at nanovoids in neutron irradiated type 316-stainless steel in regard to the non-conservative motion of dislocations. An interesting phenomenon of transition from rare pinning to multiple pinning regimes with increasing irradiation temperature is revealed.
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Gentili, Stefania
2017-04-01
Identification and statistical characterization of seismic clusters may provide useful insights about the features of seismic energy release and their relation to physical properties of the crust within a given region. Moreover, a number of studies based on spatio-temporal analysis of main-shock occurrence require preliminary declustering of the earthquake catalogs. Since various methods, relying on different physical/statistical assumptions, may lead to diverse classifications of earthquakes into main events and related events, we aim to investigate the classification differences among different declustering techniques. Accordingly, a formal selection and comparative analysis of earthquake clusters is carried out for the most relevant earthquakes in North-Eastern Italy, as reported in the local OGS-CRS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics since 1977. The comparison is then extended to selected earthquake sequences associated with a different seismotectonic setting, namely to events that occurred in the region struck by the recent Central Italy destructive earthquakes, making use of INGV data. Various techniques, ranging from classical space-time window methods to ad hoc manual identification of aftershocks, are applied for detection of earthquake clusters. In particular, a statistical method based on nearest-neighbor distances of events in the space-time-energy domain is considered. Results from cluster identification by the nearest-neighbor method turn out to be quite robust with respect to the time span of the input catalogue, as well as to the minimum magnitude cutoff. The identified clusters for the largest events reported in North-Eastern Italy since 1977 are well consistent with those reported in earlier studies, which were aimed at detailed manual aftershock identification. The study shows that the data-driven approach, based on the nearest-neighbor distances, can be satisfactorily applied to decompose the seismic catalog into background seismicity and individual sequences of earthquake clusters, also in areas characterized by moderate seismic activity, where the standard declustering techniques may turn out to be rather gross approximations. With these results acquired, the main statistical features of seismic clusters are explored, including the complex interdependence of related events, with the aim to characterize the space-time patterns of earthquake occurrence in North-Eastern Italy and capture their basic differences with the Central Italy sequences.
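As a hedged sketch of the nearest-neighbor cluster-identification step, a Zaliapin-type space-time-magnitude distance eta_ij = dt * r^d_f * 10^(-b*m_i) can be computed for each event against all earlier candidate parents; the parameter values and the synthetic catalog below are placeholders, not OGS-CRS or INGV data.

```python
import numpy as np

def nearest_neighbor_distances(t, x, y, m, b=1.0, d_f=1.6):
    """For each event j, the minimal eta_ij = dt * r**d_f * 10**(-b*m_i)
    over all earlier events i (Zaliapin-style space-time-magnitude metric)."""
    n = len(t)
    eta, parent = np.full(n, np.inf), np.full(n, -1)
    for j in range(1, n):
        dt = t[j] - t[:j]                            # inter-event times
        r = np.hypot(x[j] - x[:j], y[j] - y[:j])     # epicentral distances
        eta_ij = dt * np.maximum(r, 1e-3) ** d_f * 10.0 ** (-b * m[:j])
        k = np.argmin(eta_ij)
        eta[j], parent[j] = eta_ij[k], k
    return eta, parent

# Placeholder catalog: times (days), coordinates (km), magnitudes
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 365, 100))
x, y = rng.uniform(0, 100, (2, 100))
m = rng.exponential(0.5, 100) + 2.0
eta, parent = nearest_neighbor_distances(t, x, y, m)
# small eta -> clustered (aftershock-like); large eta -> background
print("median log10(eta):", np.median(np.log10(eta[1:])))
```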
Wood, T J; Beavis, A W; Saunderson, J R
2013-01-01
Objective: The purpose of this study was to examine the correlation between the quality of visually graded patient (clinical) chest images and a quantitative assessment of chest phantom (physical) images acquired with a computed radiography (CR) imaging system. Methods: The results of a previously published study, in which four experienced image evaluators graded computer-simulated postero-anterior chest images using a visual grading analysis scoring (VGAS) scheme, were used for the clinical image quality measurement. Contrast-to-noise ratio (CNR) and effective dose efficiency (eDE) were used as physical image quality metrics measured in a uniform chest phantom. Although optimal values of these physical metrics for chest radiography were not derived in this work, their correlation with VGAS in images acquired without an antiscatter grid across the diagnostic range of X-ray tube voltages was determined using Pearson’s correlation coefficient. Results: Clinical and physical image quality metrics increased with decreasing tube voltage. Statistically significant correlations between VGAS and CNR (R=0.87, p<0.033) and eDE (R=0.77, p<0.008) were observed. Conclusion: Medical physics experts may use the physical image quality metrics described here in quality assurance programmes and optimisation studies with a degree of confidence that they reflect the clinical image quality in chest CR images acquired without an antiscatter grid. Advances in knowledge: A statistically significant correlation has been found between the clinical and physical image quality in CR chest imaging. The results support the value of using CNR and eDE in the evaluation of quality in clinical thorax radiography. PMID:23568362
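As a hedged sketch of the physical metric used above, the contrast-to-noise ratio can be estimated from phantom regions of interest; one common convention (an assumption here, since definitions vary between protocols) divides the mean difference by the background noise.

```python
import numpy as np

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio: |difference of ROI means| over the
    background standard deviation (one common convention)."""
    s = np.asarray(signal_roi, dtype=float)
    b = np.asarray(background_roi, dtype=float)
    return abs(s.mean() - b.mean()) / b.std(ddof=1)

# Hypothetical pixel values from two phantom ROIs
rng = np.random.default_rng(3)
signal = 110 + 5 * rng.standard_normal(500)
background = 100 + 5 * rng.standard_normal(500)
print(f"CNR = {cnr(signal, background):.2f}")
```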
Radon anomalies: When are they possible to be detected?
NASA Astrophysics Data System (ADS)
Passarelli, Luigi; Woith, Heiko; Seyis, Cemil; Nikkhoo, Mehdi; Donner, Reik
2017-04-01
Records of the radon noble gas in different environments, such as soil, air, groundwater, rock, caves, and tunnels, typically display cyclic variations including diurnal (S1), semidiurnal (S2) and seasonal components, though there are also cases where these cycles are absent. Interestingly, radon emission can also be affected by transient processes, which inhibit or enhance the radon carrying process at the surface. This results in transient changes in the radon emission rate, which are superimposed on the low- and high-frequency cycles. The complexity of the spectral content of radon time-series makes any statistical analysis aimed at understanding the physical driving processes a challenging task. In the past decades there have been several attempts to relate changes in radon emission rate to physical triggering processes such as earthquake occurrence. One of the problems in this type of investigation is to objectively detect anomalies in the radon time-series. In the present work, we propose a simple and objective statistical method for detecting changes in the radon emission rate time-series. The method uses non-parametric statistical tests (e.g., Kolmogorov-Smirnov) to compare empirical distributions of radon emission rate by sequentially applying various time windows to the time-series. The statistical test indicates whether two empirical distributions of data originate from the same distribution at a desired significance level. We test the algorithm on synthetic data in order to explore the sensitivity of the statistical test to the sample size. We then apply the test to six radon emission rate recordings from stations located around the Marmara Sea obtained within the MARsite project (MARsite has received funding from the European Union's Seventh Programme for research, technological development and demonstration under grant agreement No 308417). We conclude that the test performs relatively well at identifying transient changes in the radon emission rate, but the results are strongly dependent on the length of the time window and/or the type of frequency filtering. More importantly, when raw time-series contain cyclic components (e.g. seasonal or diurnal variations), the search for anomalies related to transients becomes meaningless. We conclude that an objective identification of transient changes can be performed only after filtering the raw time-series for the physically meaningful frequency content.
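A minimal sketch of the window-wise non-parametric comparison described here, using the two-sample Kolmogorov-Smirnov test from scipy; the window length, significance level, and the synthetic radon series with a step-like transient are assumptions for illustration.

```python
import numpy as np
from scipy.stats import ks_2samp

def ks_anomalies(series, window=100, alpha=0.01):
    """Slide two adjacent windows along the series and flag positions where
    the KS test rejects that both windows share the same distribution."""
    flags = []
    for i in range(window, len(series) - window):
        stat, p = ks_2samp(series[i - window:i], series[i:i + window])
        if p < alpha:
            flags.append(i)
    return flags

# Synthetic radon emission rate with a transient starting at index 500
rng = np.random.default_rng(4)
radon = rng.gamma(4.0, 2.0, 1000)
radon[500:] += 3.0
print("first flagged index:", ks_anomalies(radon)[:1])
```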
NASA Astrophysics Data System (ADS)
Simoni, Daniele; Lengani, Davide; Guida, Roberto
2016-09-01
The transition process of the boundary layer growing over a flat plate, with a pressure gradient simulating the suction side of a low-pressure turbine blade and an elevated free-stream turbulence intensity level, has been analyzed by means of PIV and hot-wire measurements. A detailed view of the instantaneous flow field in the wall-normal plane highlights the physics characterizing the complex process leading to the formation of large-scale coherent structures during breakdown of the ordered motion of the flow, thus generating randomized oscillations (i.e., turbulent spots). This analysis provides the basis for the development of a new procedure aimed at determining the intermittency function that statistically describes the transition process. To this end, a wavelet-based method has been employed for the identification of the large-scale structures created during the transition process. Subsequently, a probability density function of these events has been defined so that an intermittency function can be deduced. The latter closely corresponds to the intermittency function of the transitional flow computed through a classic procedure based on hot-wire data. The agreement between the two procedures in the intermittency shape and spot production rate proves the capability of the method to provide a statistical representation of the transition process. The main advantages of the proposed procedure are its applicability to PIV data; its freedom from threshold levels on the first- and/or second-order time derivatives of hot-wire traces (which makes the method operator-independent); and the clear evidence it provides of the connection between the flow physics and the statistical representation of transition based on the theory of turbulent spot propagation.
Cafaro, Carlo; Alsing, Paul M
2018-04-01
The relevance of the concept of Fisher information is increasing in both statistical physics and quantum computing. From a statistical mechanical standpoint, the application of Fisher information in the kinetic theory of gases is characterized by its decrease along the solutions of the Boltzmann equation for Maxwellian molecules in the two-dimensional case. From a quantum mechanical standpoint, the output state in Grover's quantum search algorithm follows a geodesic path obtained from the Fubini-Study metric on the manifold of Hilbert-space rays. Additionally, Grover's algorithm is specified by constant Fisher information. In this paper, we present an information geometric characterization of the oscillatory or monotonic behavior of statistically parametrized squared probability amplitudes originating from special functional forms of the Fisher information function: constant, exponential decay, and power-law decay. Furthermore, for each case, we compute both the computational speed and the availability loss of the corresponding physical processes by exploiting a convenient Riemannian geometrization of useful thermodynamical concepts. Finally, we briefly comment on the possibility of using the proposed methods of information geometry to help identify a suitable trade-off between speed and thermodynamic efficiency in quantum search algorithms.
Shen, Tongye; Gnanakaran, S
2009-04-22
A critical roadblock to the production of biofuels from lignocellulosic biomass is the efficient degradation of crystalline microfibrils of cellulose to glucose. A microscopic understanding of how different physical conditions affect the overall stability of the crystalline structure of microfibrils could facilitate the design of more effective protocols for their degradation. One of the essential physical interactions that stabilizes microfibrils is a network of hydrogen (H) bonds: both intrachain H-bonds between neighboring monomers of a single cellulose polymer chain and interchain H-bonds between adjacent chains. We construct a statistical mechanical model of cellulose assembly at the resolution of explicit hydrogen-bond networks. Using the transfer matrix method, the partition function and the subsequent statistical properties are evaluated. With the help of this lattice-based model, we capture the plasticity of the H-bond network in cellulose due to frustration and redundancy in the placement of H-bonds. This plasticity is responsible for the stability of cellulose over a wide range of temperatures. Stable intrachain and interchain H-bonds are identified as a function of temperature that could possibly be manipulated toward rational destruction of crystalline cellulose.
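A toy sketch of the transfer matrix method used here, for a generic two-state bond (H-bond formed/absent) chain with periodic boundaries; the pair energies and temperature scale are placeholder assumptions, not the paper's parametrization of cellulose.

```python
import numpy as np

def partition_function(eps_bond, coupling, n_sites, kT=1.0):
    """Z = Tr(T^N) for a periodic two-state chain: state 1 = H-bond
    formed (pair energy -eps_bond), with a cost `coupling` for
    mismatched adjacent states."""
    e = np.array([[0.0, coupling],
                  [coupling, -eps_bond]])   # toy pair-energy matrix
    T = np.exp(-e / kT)                     # transfer matrix
    lam = np.linalg.eigvalsh(T)             # symmetric -> real eigenvalues
    return np.sum(lam ** n_sites)           # Tr(T^N) = sum of lambda^N

for kT in (0.5, 1.0, 2.0):
    Z = partition_function(eps_bond=1.0, coupling=0.2, n_sites=50, kT=kT)
    print(f"kT={kT}: free energy per site = {-kT * np.log(Z) / 50:.3f}")
```

From Z, statistical properties such as the mean number of formed bonds follow by differentiating the free energy, which is the spirit of the evaluation described in the abstract.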
Quantum Field Theory Approach to Condensed Matter Physics
NASA Astrophysics Data System (ADS)
Marino, Eduardo C.
2017-09-01
Preface; Part I. Condensed Matter Physics: 1. Independent electrons and static crystals; 2. Vibrating crystals; 3. Interacting electrons; 4. Interactions in action; Part II. Quantum Field Theory: 5. Functional formulation of quantum field theory; 6. Quantum fields in action; 7. Symmetries: explicit or secret; 8. Classical topological excitations; 9. Quantum topological excitations; 10. Duality, bosonization and generalized statistics; 11. Statistical transmutation; 12. Pseudo quantum electrodynamics; Part III. Quantum Field Theory Approach to Condensed Matter Systems: 13. Quantum field theory methods in condensed matter; 14. Metals, Fermi liquids, Mott and Anderson insulators; 15. The dynamics of polarons; 16. Polyacetylene; 17. The Kondo effect; 18. Quantum magnets in 1D: Fermionization, bosonization, Coulomb gases and 'all that'; 19. Quantum magnets in 2D: nonlinear sigma model, CP1 and 'all that'; 20. The spin-fermion system: a quantum field theory approach; 21. The spin glass; 22. Quantum field theory approach to superfluidity; 23. Quantum field theory approach to superconductivity; 24. The cuprate high-temperature superconductors; 25. The pnictides: iron based superconductors; 26. The quantum Hall effect; 27. Graphene; 28. Silicene and transition metal dichalcogenides; 29. Topological insulators; 30. Non-abelian statistics and quantum computation; References; Index.
A study of correlations between crude oil spot and futures markets: A rolling sample test
NASA Astrophysics Data System (ADS)
Liu, Li; Wan, Jieqiu
2011-10-01
In this article, we investigate the asymmetries of exceedance correlations and cross-correlations between West Texas Intermediate (WTI) spot and futures markets. First, employing the test statistic proposed by Hong et al. [Asymmetries in stock returns: statistical tests and economic evaluation, Review of Financial Studies 20 (2007) 1547-1581], we find that the exceedance correlations were overall symmetric. However, the results from rolling windows show that some occasional events could induce significant asymmetries in the exceedance correlations. Second, employing the test statistic proposed by Podobnik et al. [Quantifying cross-correlations using local and global detrending approaches, European Physical Journal B 71 (2009) 243-250], we find that the cross-correlations were significant even for large lagged orders. Using the detrended cross-correlation analysis proposed by Podobnik and Stanley [Detrended cross-correlation analysis: a new method for analyzing two nonstationary time series, Physical Review Letters 100 (2008) 084102], we find that the cross-correlations were weakly persistent and were stronger between the spot and futures contracts with larger maturity. Our results from the rolling sample test also show the apparent effects of exogenous events. Additionally, we offer some relevant discussion of the obtained evidence.
Lee, Jong Hwa; Kim, Sang Beom; Lee, Kyeong Woo; Lee, Sook Joung; Park, Hyuntae; Kim, Dong Won
2017-09-01
The use of whole-body vibration (WBV) therapy has recently been applied and investigated as a rehabilitation method for subacute stroke patients. Our aim was to evaluate the effects of WBV therapy on recovery of balance in subacute stroke patients who were unable to gain sitting balance. The conventional rehabilitation group (CG) received conventional physical therapy, including sitting balance training by a physical therapist, for 30 min per session, twice a day, five days a week for two weeks. The whole-body vibration group (VG) received one session of conventional physical therapy and, in place of the second session, WBV therapy for 30 min a day, five days a week for two weeks. There were 15 patients in the CG and 15 patients in the VG who completed the two-week therapy. After the two-week therapy, both groups showed functional improvement. Patients in the VG improved functional ambulation category, Berg balance scale and trunk impairment scale scores; however, no statistically significant correlations between the therapeutic methods and outcomes were observed in either group. Our results suggest that WBV therapy led to improvement in balance recovery for subacute stroke patients. Because WBV therapy was as effective as conventional physical therapy, it can be considered a clinical method to improve the sitting balance of subacute stroke patients.
Best-Practice Physical Activity Programs for Older Adults: Findings From the National Impact Study
Seymour, Rachel B.; Campbell, Richard T.; Whitelaw, Nancy; Bazzarre, Terry
2009-01-01
Objectives. We assessed the impact of existing best-practice physical activity programs for older adults on physical activity participation and health-related outcomes. Methods. We used a multisite, randomized trial with 544 older adults (mean age 66 years) and measures at baseline, 5, and 10 months to test the impact of a multiple-component physical activity program compared with results for a control group that did not participate in such a program. Results. For adults who participated in a multiple-component physical activity program, we found statistically significant benefits at 5 and 10 months with regard to self-efficacy for exercise adherence over time (P < .001), adherence in the face of barriers (P = .01), increased upper- and lower-body strength (P = .02, P = .01), and exercise participation (P = .01). Conclusions. Best-practice community-based physical activity programs can measurably improve aspects of functioning that are risk factors for disability among older adults. US public policy should encourage these inexpensive health promotion programs. PMID:19059858
Bringing Clouds into Our Lab! - The Influence of Turbulence on the Early Stage Rain Droplets
NASA Astrophysics Data System (ADS)
Yavuz, Mehmet Altug; Kunnen, Rudie; Heijst, Gertjan; Clercx, Herman
2015-11-01
We investigate a droplet-laden flow in an air-filled turbulence chamber, forced by speaker-driven air jets. The speakers are driven in a random manner, yet they allow us to control and define the statistics of the turbulence. We study the motion of droplets of tunable size (Stokes numbers ~ 0.13-9) in a turbulent flow, mimicking the early stages of raindrop formation. 3D Particle Tracking Velocimetry (PTV) together with Laser Induced Fluorescence (LIF) is chosen as the experimental method to track the droplets and collect data for statistical analysis. This makes it possible to study the spatial distribution of the droplets in turbulence using the so-called Radial Distribution Function (RDF), a statistical measure that quantifies the clustering of particles. Additionally, the 3D-PTV technique allows us to measure velocity statistics of the droplets and the influence of the turbulence on droplet trajectories, both individually and collectively. In this contribution, we will present the clustering probability quantified by the RDF for different Stokes numbers. We will explain the physics underlying the influence of turbulence on droplet cluster behavior. This study is supported by FOM/NWO Netherlands.
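A minimal sketch of a radial distribution function estimator of the kind used to quantify clustering, assuming 3D positions in a cubic periodic box and the standard ideal-gas shell normalization; the uniform random positions are a stand-in for tracked droplet coordinates.

```python
import numpy as np

def rdf(pos, box, bins=50, r_max=None):
    """g(r) for 3D positions in a cubic periodic box, normalized by the
    ideal-gas expectation so that g(r) ~ 1 for uncorrelated particles."""
    n = len(pos)
    r_max = r_max or box / 2
    d = pos[:, None, :] - pos[None, :, :]
    d -= box * np.round(d / box)                 # minimum-image convention
    r = np.linalg.norm(d, axis=-1)[np.triu_indices(n, k=1)]
    hist, edges = np.histogram(r, bins=bins, range=(0, r_max))
    rho = n / box**3
    shell = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    ideal = rho * shell * n / 2                  # approx. expected pair counts
    return 0.5 * (edges[1:] + edges[:-1]), hist / ideal

rng = np.random.default_rng(5)
r_centers, g = rdf(rng.uniform(0, 10.0, (500, 3)), box=10.0)
print("g(r) ~ 1 for uniform particles:", g[10:20].round(2))
```

Clustering at a given Stokes number would show up as g(r) > 1 at small separations.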
On the statistical properties and tail risk of violent conflicts
NASA Astrophysics Data System (ADS)
Cirillo, Pasquale; Taleb, Nassim Nicholas
2016-06-01
We examine statistical pictures of violent conflicts over the last 2000 years, providing techniques for dealing with the unreliability of historical data. We make use of a novel approach to deal with fat-tailed random variables with a remote but nonetheless finite upper bound, by defining a corresponding unbounded dual distribution (given that potential war casualties are bounded by the world population). This approach can also be applied to other fields of science where power laws play a role in modeling, like geology, hydrology, statistical physics and finance. We apply methods from extreme value theory on the dual distribution and derive its tail properties. The dual method allows us to calculate the real tail mean of war casualties, which proves to be considerably larger than the corresponding sample mean for large thresholds, meaning severe underestimation of the tail risks of conflicts from naive observation. We analyze the robustness of our results to errors in historical reports. We study inter-arrival times between tail events and find that no particular trend can be asserted. All the statistical pictures obtained are at variance with the prevailing claims about the "long peace", namely that violence has been declining over time.
A method for obtaining a statistically stationary turbulent free shear flow
NASA Technical Reports Server (NTRS)
Timson, Stephen F.; Lele, S. K.; Moser, R. D.
1994-01-01
The long-term goal of the current research is the study of Large-Eddy Simulation (LES) as a tool for aeroacoustics. New algorithms and developments in computer hardware are making possible a new generation of tools for aeroacoustic predictions, which rely on the physics of the flow rather than empirical knowledge. LES, in conjunction with an acoustic analogy, holds the promise of predicting the statistics of noise radiated to the far field of a turbulent flow. LES's predictive ability will be tested through extensive comparison of acoustic predictions based on a Direct Numerical Simulation (DNS) and an LES of the same flow, as well as a priori testing of DNS results. The method presented here is aimed at allowing simulation of a turbulent flow field that is both simple and amenable to acoustic predictions. A free shear flow which is homogeneous in both the streamwise and spanwise directions, and which is statistically stationary, will be simulated using equations based on the Navier-Stokes equations with a small number of added terms. Studying a free shear flow eliminates the need to consider flow-surface interactions as an acoustic source. The homogeneous directions and the flow's statistically stationary nature greatly simplify the application of an acoustic analogy.
Statistical issues in searches for new phenomena in High Energy Physics
NASA Astrophysics Data System (ADS)
Lyons, Louis; Wardle, Nicholas
2018-03-01
Many analyses of data in High Energy Physics are concerned with searches for New Physics. We review the statistical issues that arise in such searches, and then illustrate these using the specific example of the recent successful search for the Higgs boson, produced in collisions between high energy protons at CERN’s Large Hadron Collider.
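As a small worked example of one recurring statistical issue in such searches, the conversion between a one-sided p-value and Gaussian significance (the "5 sigma" discovery convention) can be computed as follows; this is standard practice rather than anything specific to this review.

```python
from scipy.stats import norm

def p_to_sigma(p):
    """One-sided p-value -> Gaussian significance Z (in sigma)."""
    return norm.isf(p)          # inverse survival function

def sigma_to_p(z):
    """Gaussian significance Z -> one-sided p-value."""
    return norm.sf(z)

print(f"Z = 5 sigma  <->  p = {sigma_to_p(5):.2e}")   # about 2.9e-7
print(f"p = 0.05     <->  Z = {p_to_sigma(0.05):.2f} sigma")
```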
NASA Astrophysics Data System (ADS)
McLaughlin, P. W.; Kaihatu, J. M.; Irish, J. L.; Taylor, N. R.; Slinn, D.
2013-12-01
Recent hurricane activity in the Gulf of Mexico has led to a need for accurate, computationally efficient prediction of hurricane damage so that communities can better assess the risk of local socio-economic disruption. This study focuses on developing robust, physics-based non-dimensional equations that accurately predict maximum significant wave height at different locations near a given hurricane track. These equations (denoted Wave Response Functions, or WRFs) were developed from presumed physical dependencies between wave heights and hurricane characteristics, and fit with data from numerical models of waves and surge under hurricane conditions. After curve fitting, constraints which correct for fully developed sea state were used to limit the wind-wave growth. When applied to the region near Gulfport, MS, back prediction of maximum significant wave height yielded root mean square errors of 0.22-0.42 m at open coast stations and 0.07-0.30 m at bay stations when compared to the numerical model data. The WRF method was also applied to Corpus Christi, TX and Panama City, FL with similar results. Back prediction errors will be included in uncertainty evaluations connected to risk calculations using joint probability methods. These methods require thousands of simulations to quantify extreme value statistics, thus requiring the use of reduced methods such as the WRF to represent the relevant physical processes.
Application of physical scaling towards downscaling climate model precipitation data
NASA Astrophysics Data System (ADS)
Gaur, Abhishek; Simonovic, Slobodan P.
2018-04-01
The physical scaling (SP) method downscales climate model data to local or regional scales, taking into consideration the physical characteristics of the area under analysis. In this study, multiple SP-based models are tested for their effectiveness in downscaling North American Regional Reanalysis (NARR) daily precipitation data. Model performance is compared with two state-of-the-art downscaling methods: the statistical downscaling model (SDSM) and generalized linear modeling (GLM). The downscaled precipitation is evaluated with reference to recorded precipitation at 57 gauging stations located within the study region. The spatial and temporal robustness of the downscaling methods is evaluated using seven precipitation-based indices. Results indicate that the SP-based models perform best in downscaling precipitation, followed by GLM and then the SDSM model. The best-performing models are thereafter used to downscale future precipitation projections made by three general circulation models (GCMs) following two emission scenarios, representative concentration pathway (RCP) 2.6 and RCP 8.5, over the twenty-first century. The downscaled future precipitation projections indicate an increase in mean and maximum precipitation intensity as well as a decrease in the total number of dry days, along with an increase in the frequency of short (1-day), moderately long (2-4 day), and long (more than 5-day) precipitation events.
The Pilates method and cardiorespiratory adaptation to training.
Tinoco-Fernández, Maria; Jiménez-Martín, Miguel; Sánchez-Caravaca, M Angeles; Fernández-Pérez, Antonio M; Ramírez-Rodrigo, Jesús; Villaverde-Gutiérrez, Carmen
2016-01-01
Although all authors report beneficial health changes following training based on the Pilates method, no explicit analysis has been performed of its cardiorespiratory effects. The objective of this study was to evaluate possible changes in cardiorespiratory parameters with the Pilates method. A total of 45 university students aged 18-35 years (77.8% female and 22.2% male), who did not routinely practice physical exercise or sports, volunteered for the study and signed informed consent. The Pilates training was conducted over 10 weeks, with three 1-hour sessions per week. Physiological cardiorespiratory responses were assessed using a MasterScreen CPX apparatus. After the 10-week training, statistically significant improvements were observed in mean heart rate (from 135.4 to 124.2 beats/min), respiratory exchange ratio (from 1.1 to 0.9) and oxygen equivalent (from 30.7 to 27.6), among other spirometric parameters, in submaximal aerobic testing. These findings indicate that practice of the Pilates method has a positive influence on cardiorespiratory parameters in healthy adults who do not routinely practice physical exercise activities.
NASA Astrophysics Data System (ADS)
Dufoyer, A.; Lecoq, N.; Massei, N.; Marechal, J. C.
2017-12-01
Physics-based modeling of karst systems remains almost impossible without sufficiently accurate information about their inner physical characteristics. Usually, the only available hydrodynamic information is the flow rate at the karst outlet. Numerous works in the past decades have used and proven the usefulness of time-series analysis and spectral techniques applied to spring flow, precipitation or even physico-chemical parameters for interpreting karst hydrological functioning. However, identifying or interpreting the physical features of karst systems that control the statistical or spectral characteristics of spring flow variations is still challenging, not to say sometimes controversial. The main objective of this work is to determine how the statistical and spectral characteristics of the hydrodynamic signal at karst springs can be related to inner physical and hydraulic properties. In order to address this issue, we undertake an empirical approach based on the use of both distributed and physics-based models, and on synthetic system responses. The first step of the research is to conduct a sensitivity analysis of time-series/spectral methods to karst hydraulic and physical properties. For this purpose, forward modeling of flow through several simple, constrained and synthetic cases in response to precipitation is undertaken. It allows us to quantify how the statistical and spectral characteristics of flow at the outlet are sensitive to changes (i) in conduit geometries and (ii) in hydraulic parameters of the system (matrix/conduit exchange rate, matrix hydraulic conductivity and storativity). The flow differential equations are solved by MARTHE, a computer code developed by BRGM that allows the modeling of karst conduits. From signal processing of the simulated spring responses, we hope to determine whether specific frequencies are consistently modified, using Fourier series and multi-resolution analysis. We also hope to quantify which parameters matter most using auto-correlation analysis: first results seem to show greater variability due to conduit conductivity than due to the matrix/conduit exchange rate. Future steps will involve another computer code, based on a double-continuum approach and allowing turbulent conduit flow, and the modeling of a natural system.
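A sketch of the auto-correlation analysis applied to a simulated spring response; the AR(1)-like synthetic discharge below stands in for MARTHE output, which is an assumption for illustration only.

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Sample autocorrelation function of a (mean-removed) time series."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    var = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / (len(x) * var)
                     for k in range(max_lag + 1)])

# Synthetic spring discharge with strong memory plus noise
rng = np.random.default_rng(6)
q = np.zeros(2000)
for i in range(1, 2000):
    q[i] = 0.95 * q[i - 1] + rng.standard_normal()

acf = autocorrelation(q, max_lag=50)
# the lag at which the ACF drops below ~0.2 is a classic karst "memory" index
print("memory lag:", int(np.argmax(acf < 0.2)))
```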
Moore, Michael D; Shi, Zhenqi; Wildfong, Peter L D
2010-12-01
To develop a method for drawing statistical inferences from differences between multiple experimental pair distribution function (PDF) transforms of powder X-ray diffraction (PXRD) data. The appropriate treatment of initial PXRD error estimates using traditional error propagation algorithms was tested using Monte Carlo simulations on amorphous ketoconazole. An amorphous felodipine:polyvinyl pyrrolidone:vinyl acetate (PVPva) physical mixture was prepared to define an error threshold. Co-solidified products of felodipine:PVPva and terfenadine:PVPva were prepared using a melt-quench method and subsequently analyzed using PXRD and PDF. Differential scanning calorimetry (DSC) was used as an additional characterization method. The appropriate manipulation of initial PXRD error estimates through the PDF transform were confirmed using the Monte Carlo simulations for amorphous ketoconazole. The felodipine:PVPva physical mixture PDF analysis determined ±3σ to be an appropriate error threshold. Using the PDF and error propagation principles, the felodipine:PVPva co-solidified product was determined to be completely miscible, and the terfenadine:PVPva co-solidified product, although having appearances of an amorphous molecular solid dispersion by DSC, was determined to be phase-separated. Statistically based inferences were successfully drawn from PDF transforms of PXRD patterns obtained from composite systems. The principles applied herein may be universally adapted to many different systems and provide a fundamentally sound basis for drawing structural conclusions from PDF studies.
Egorov, A D; Stepantsov, V I; Nosovskiĭ, A M; Shipov, A A
2009-01-01
Cluster analysis was applied to evaluate the locomotion training (running, and running intermingled with walking) of 13 cosmonauts on long-term ISS missions by the parameters of duration (min), distance (m) and intensity (km/h). Based on the results of the analyses, the cosmonauts were distributed into three stable groups of 2, 5 and 6 persons. Distance and speed showed a statistically significant rise (p < 0.03) from group 1 to group 3. Duration of physical locomotion training was not statistically different between the groups (p = 0.125). Therefore, cluster analysis is an adequate method of evaluating the fitness of cosmonauts on long-term missions.
Numerically exact full counting statistics of the nonequilibrium Anderson impurity model
NASA Astrophysics Data System (ADS)
Ridley, Michael; Singh, Viveka N.; Gull, Emanuel; Cohen, Guy
2018-03-01
The time-dependent full counting statistics of charge transport through an interacting quantum junction is evaluated from its generating function, controllably computed with the inchworm Monte Carlo method. Exact noninteracting results are reproduced; then, we continue to explore the effect of electron-electron interactions on the time-dependent charge cumulants, first-passage time distributions, and n -electron transfer distributions. We observe a crossover in the noise from Coulomb blockade to Kondo-dominated physics as the temperature is decreased. In addition, we uncover long-tailed spin distributions in the Kondo regime and analyze queuing behavior caused by correlations between single-electron transfer events.
A Ground Flash Fraction Retrieval Algorithm for GLM
NASA Technical Reports Server (NTRS)
Koshak, William J.
2010-01-01
A Bayesian inversion method is introduced for retrieving the fraction of ground flashes in a set of N lightning flashes observed by a satellite lightning imager (such as the Geostationary Lightning Mapper, GLM). An exponential model is applied as a physically reasonable constraint to describe the measured lightning optical parameter distributions. Population statistics (i.e., the mean and variance) are invoked to add additional constraints to the retrieval process. The Maximum A Posteriori (MAP) solution is employed. The approach is tested by performing simulated retrievals, and retrieval error statistics are provided. The approach is feasible for N greater than 2000, and retrieval errors decrease as N is increased.
Haranas, Ioannis; Gkigkitzis, Ioannis; Kotsireas, Ilias; Austerlitz, Carlos
2017-01-01
Understanding how the brain encodes information and performs computation requires statistical and functional analysis. Given the complexity of the human brain, simple methods that facilitate the interpretation of statistical correlations among different brain regions can be very useful. In this report we introduce a numerical correlation measure that may serve the interpretation of correlational neuronal data and may assist in the evaluation of different brain states. The description of the dynamical brain system through a global numerical measure may indicate the presence of an action principle, which may facilitate the application of physics principles to the study of the human brain and cognition.
A Transferrable Belief Model Representation for Physical Security of Nuclear Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Gerts
This work analyzed various probabilistic methods such as classical statistics, Bayesian inference, possibility theory, and the Dempster-Shafer theory of belief functions for the potential insight offered into the physical security of nuclear materials, as well as broader application to automated decision-making theory for nuclear non-proliferation. A review of the fundamental heuristics and basic limitations of each of these methods suggested that the Dempster-Shafer theory of belief functions may offer significant capability. Further examination of the various interpretations of Dempster-Shafer theory, such as random set, generalized Bayesian, and upper/lower probability, demonstrates some limitations. Compared to the other heuristics, the transferrable belief model (TBM), one of the leading interpretations of Dempster-Shafer theory, can improve the automated detection of violations of physical security using sensors and human judgment. The improvement is shown to give a significant heuristic advantage over other probabilistic options by demonstrating significant successes for several classic gedanken experiments.
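A minimal sketch of Dempster's rule of combination, which underlies the kind of evidence fusion discussed here; the frame of discernment and the mass assignments are invented for illustration. Note that the strict TBM (Smets) variant would leave the conflict mass on the empty set rather than renormalizing, as done below.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions whose focal elements are frozensets.
    Conflict mass K is renormalized away (classical Dempster's rule)."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical frame: {intrusion, nuisance}; two evidence sources
I, N = frozenset({"intrusion"}), frozenset({"nuisance"})
sensor = {I: 0.6, N: 0.1, I | N: 0.3}   # sensor leaves 0.3 uncommitted
guard  = {I: 0.4, I | N: 0.6}           # human judgment
print(dempster_combine(sensor, guard))
```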
NASA Astrophysics Data System (ADS)
Trecia Markes, Cecelia
2006-03-01
With a three-year FIPSE grant, it has been possible at the University of Nebraska at Kearney (UNK) to develop and implement activity-based introductory physics at the algebra level. It has generally been recognized that students enter physics classes with misconceptions about motion and force, and many of these misconceptions persist after instruction. Pretest and posttest responses on the "Force and Motion Conceptual Evaluation" (FMCE) are analyzed to determine the effectiveness of the activity-based method of instruction relative to the traditional (lecture/lab) method of instruction. Data were analyzed to determine the following: student understanding at the beginning of the course, student understanding at the end of the course, how student understanding is related to the type of class taken, and student understanding based on gender and type of class. Among the tests used are the t-test, the chi-squared test, and analysis of variance. The results of these tests will be presented, and their implications will be discussed.
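A hedged sketch of the kinds of tests named above, applied to hypothetical pre/post scores for two instruction groups; the score distributions and the pass/fail counts are invented, not UNK data.

```python
import numpy as np
from scipy.stats import ttest_ind, chi2_contingency, f_oneway

rng = np.random.default_rng(7)
# Hypothetical post-test scores for two groups
activity = rng.normal(70, 10, 40)
lecture = rng.normal(62, 10, 40)

t, p = ttest_ind(activity, lecture)
print(f"t-test: t = {t:.2f}, p = {p:.4f}")

# Chi-squared on a 2x2 pass/fail contingency table (counts invented)
table = np.array([[28, 12],     # activity-based: pass, fail
                  [18, 22]])    # lecture: pass, fail
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-squared: chi2 = {chi2:.2f}, p = {p:.4f}")

# One-way ANOVA across three hypothetical sections
f, p = f_oneway(activity, lecture, rng.normal(66, 10, 40))
print(f"ANOVA: F = {f:.2f}, p = {p:.4f}")
```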
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perelson, A.S.; Weisbuch, G.
1997-10-01
The immune system is a complex system of cells and molecules that can provide us with a basic defense against pathogenic organisms. Like the nervous system, the immune system performs pattern recognition tasks, learns, and retains a memory of the antigens that it has fought. The immune system contains more than 10^7 different clones of cells that communicate via cell-cell contact and the secretion of molecules. Performing complex tasks such as learning and memory involves cooperation among large numbers of components of the immune system and hence there is interest in using methods and concepts from statistical physics. Furthermore, the immune response develops in time and the description of its time evolution is an interesting problem in dynamical systems. In this paper, the authors provide a brief introduction to the biology of the immune system and discuss a number of immunological problems in which the use of physical concepts and mathematical methods has increased our understanding. © 1997 The American Physical Society
Waltman, Ludo; van Raan, Anthony F. J.; Smart, Sue
2014-01-01
We investigate the extent to which advances in the health and life sciences (HLS) are dependent on research in the engineering and physical sciences (EPS), particularly physics, chemistry, mathematics, and engineering. The analysis combines two different bibliometric approaches. The first approach to analyze the ‘EPS-HLS interface’ is based on term map visualizations of HLS research fields. We consider 16 clinical fields and five life science fields. On the basis of expert judgment, EPS research in these fields is studied by identifying EPS-related terms in the term maps. In the second approach, a large-scale citation-based network analysis is applied to publications from all fields of science. We work with about 22,000 clusters of publications, each representing a topic in the scientific literature. Citation relations are used to identify topics at the EPS-HLS interface. The two approaches complement each other. The advantages of working with textual data compensate for the limitations of working with citation relations and the other way around. An important advantage of working with textual data is in the in-depth qualitative insights it provides. Working with citation relations, on the other hand, yields many relevant quantitative statistics. We find that EPS research contributes to HLS developments mainly in the following five ways: new materials and their properties; chemical methods for analysis and molecular synthesis; imaging of parts of the body as well as of biomaterial surfaces; medical engineering mainly related to imaging, radiation therapy, signal processing technology, and other medical instrumentation; mathematical and statistical methods for data analysis. In our analysis, about 10% of all EPS and HLS publications are classified as being at the EPS-HLS interface. This percentage has remained more or less constant during the past decade. PMID:25360616
Statistical physics in foreign exchange currency and stock markets
NASA Astrophysics Data System (ADS)
Ausloos, M.
2000-09-01
Problems in economy and finance have attracted the interest of statistical physicists all over the world. Fundamental problems pertain to the existence or not of long-, medium- and/or short-range power-law correlations in various economic systems, to the presence of financial cycles, and to economic considerations, including economic policy. A method like detrended fluctuation analysis is recalled, emphasizing its value in sorting out correlation ranges, thereby leading to predictability at short horizons. The (m, k)-Zipf method is presented for sorting out short-range correlations in the sign and amplitude of the fluctuations. A well-known financial analysis technique, the so-called moving average, is shown to raise questions for physicists about fractional Brownian motion properties. Among spectacular results, the possibility of crash predictions has been demonstrated through the log-periodicity of financial index oscillations.
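For readers who want to experiment with the detrended fluctuation analysis recalled in this abstract, here is a minimal numpy sketch of standard first-order DFA (an illustrative implementation, not Ausloos's own code; the white-noise test series is a placeholder):

```python
import numpy as np

def dfa(x, box_sizes):
    """First-order DFA: RMS fluctuation F(n) of the detrended profile per box size n."""
    y = np.cumsum(x - np.mean(x))                    # integrated profile
    F = []
    for n in box_sizes:
        ms = []
        for i in range(len(y) // n):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
            ms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.asarray(F)

# White noise should give a scaling exponent alpha close to 0.5;
# alpha > 0.5 would signal long-range positive correlations.
rng = np.random.default_rng(0)
sizes = np.array([16, 32, 64, 128, 256, 512])
F = dfa(rng.standard_normal(10_000), sizes)
alpha = np.polyfit(np.log(sizes), np.log(F), 1)[0]
print(f"alpha = {alpha:.2f}")
```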
Volatility behavior of visibility graph EMD financial time series from Ising interacting system
NASA Astrophysics Data System (ADS)
Zhang, Bo; Wang, Jun; Fang, Wen
2015-08-01
A financial market dynamics model is developed and investigated using a stochastic Ising system, the Ising model being the most popular ferromagnetic model in statistical physics. Applying two graph-based analyses and the multiscale entropy method, we investigate and compare the statistical volatility behavior of return time series and of the corresponding IMF series derived from the empirical mode decomposition (EMD) method, and real stock market indices are studied for comparison with the simulation data of the proposed model. Further, we find that the degree distribution of the visibility graph for the simulation series has power-law tails, and that the assortative network exhibits the mixing pattern property. All these features are in agreement with the real market data; the research confirms that the financial model established on the Ising system is reasonable.
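The natural visibility graph used in this kind of analysis maps a time series onto a network whose degree distribution can then be inspected for power-law tails. A minimal sketch follows (illustrative only; the paper's full pipeline also involves EMD, multiscale entropy, and the Ising-based price model, none of which are reproduced here):

```python
import numpy as np

def visibility_degrees(y):
    """Degree sequence of the natural visibility graph: nodes i < j are linked
    if every intermediate point lies strictly below the line joining them."""
    n = len(y)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            ks = np.arange(i + 1, j)
            line = y[j] + (y[i] - y[j]) * (j - ks) / (j - i)
            if np.all(y[ks] < line):        # vacuously true for adjacent points
                deg[i] += 1
                deg[j] += 1
    return deg

rng = np.random.default_rng(1)
series = np.cumsum(rng.standard_normal(500))    # random-walk test series
deg = visibility_degrees(series)
print(f"mean degree = {deg.mean():.1f}, max degree = {deg.max()}")
```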
NASA Astrophysics Data System (ADS)
Du, Xiaosong; Leifsson, Leifur; Grandin, Robert; Meeker, William; Roberts, Ronald; Song, Jiming
2018-04-01
Probability of detection (POD) is widely used for measuring the reliability of nondestructive testing (NDT) systems. Typically, POD is determined experimentally, but it can be enhanced by utilizing physics-based computational models in combination with model-assisted POD (MAPOD) methods. With the development of advanced physics-based methods, such as those for ultrasonic NDT, the empirical information needed for POD methods can be reduced. However, performing accurate numerical simulations can be prohibitively time-consuming, especially as part of a stochastic analysis. In this work, stochastic surrogate models for computational physics-based measurement simulations are developed to reduce the cost of MAPOD methods while ensuring sufficient accuracy. The stochastic surrogate is used to propagate the random input variables through the physics-based simulation model to obtain the joint probability distribution of the output. The POD curves are then generated based on those results. Here, the stochastic surrogates are constructed using non-intrusive polynomial chaos (NIPC) expansions. In particular, the NIPC methods used are the quadrature, ordinary least-squares (OLS), and least-angle regression sparse (LARS) techniques. The proposed approach is demonstrated on the ultrasonic testing simulation of a flat-bottom-hole flaw in an aluminum block. The results show that the stochastic surrogates achieve at least two orders of magnitude faster convergence on the statistics than direct Monte Carlo sampling (MCS). Moreover, evaluation of the stochastic surrogate models is over three orders of magnitude faster than the underlying simulation model for this case (the UTSim2 model).
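The regression (OLS) variant of non-intrusive polynomial chaos mentioned above can be sketched compactly for a single Gaussian input. The stand-in `model` function below replaces the expensive ultrasonic simulation, so this is an assumption-laden toy, not the authors' setup:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from math import factorial

def model(xi):
    """Stand-in for an expensive physics-based measurement simulation."""
    return np.exp(0.3 * xi) + 0.1 * xi ** 2

rng = np.random.default_rng(2)
deg = 6
xi_train = rng.standard_normal(200)            # samples of the Gaussian input
A = hermevander(xi_train, deg)                 # probabilists' Hermite basis He_n
coeffs, *_ = np.linalg.lstsq(A, model(xi_train), rcond=None)

# Orthogonality <He_m He_n> = n! delta_mn gives the output statistics directly
mean_pc = coeffs[0]
var_pc = sum(coeffs[n] ** 2 * factorial(n) for n in range(1, deg + 1))

xi_mc = rng.standard_normal(200_000)           # brute-force Monte Carlo reference
print(f"PC : mean={mean_pc:.4f} var={var_pc:.4f}")
print(f"MCS: mean={model(xi_mc).mean():.4f} var={model(xi_mc).var():.4f}")
```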
Statistical physics of human beings in games: Controlled experiments
NASA Astrophysics Data System (ADS)
Liang, Yuan; Huang, Ji-Ping
2014-07-01
It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems.
Graphene growth process modeling: a physical-statistical approach
NASA Astrophysics Data System (ADS)
Wu, Jian; Huang, Qiang
2014-09-01
As a zero-bandgap semiconductor, graphene is an attractive material for a wide variety of applications such as optoelectronics. Among the various techniques developed for graphene synthesis, chemical vapor deposition on copper foils shows high potential for producing few-layer and large-area graphene. Since fabrication of high-quality graphene sheets requires an understanding of growth mechanisms, as well as methods for characterizing and controlling the grain size of graphene flakes, analytical modeling of the graphene growth process is essential for controlled fabrication. The graphene growth process starts with randomly nucleated islands that gradually develop into complex shapes, grow in size, and eventually connect together to cover the copper foil. To model this complex process, we develop a physical-statistical approach under the assumption of self-similarity during graphene growth. The growth kinetics is uncovered by separating island shapes from the area growth rate. We propose to characterize the area growth velocity using a confined exponential model, which not only has a clear physical explanation but also fits the real data well. For the shape modeling, we develop a parametric shape model which can be well explained by the angular-dependent growth rate. This work can provide useful information for the control and optimization of the graphene growth process on Cu foil.
Bockting, Walter; Rosenblum, Andrew; Hwahng, Sel; Mason, Mona; Macri, Monica; Becker, Jeffrey
2014-01-01
Objectives. We examined the social and interpersonal context of gender abuse and its effects on Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition major depression among transgender women. Methods. We conducted a 3-year prospective study (2004–2007) among 230 transgender women aged 19 to 59 years from the New York City Metropolitan Area. Statistical techniques included generalized estimating equations (logistic regression). Results. We observed significant associations of psychological and physical gender abuse with major depression during follow-up. New or persistent experiences of both types of abuse were associated with 4- to 7-fold increases in the likelihood of incident major depression. Employment, transgender presentation, sex work, and hormone therapy correlated across time with psychological abuse; the latter 2 variables correlated with physical abuse. The association of psychological abuse with depression was stronger among younger than among older transgender women. Conclusions. Psychological and physical gender abuse is endemic in this population and may result from occupational success and attempts to affirm gender identity. Both types of abuse have serious mental health consequences in the form of major depression. Older transgender women have apparently developed some degree of resilience to psychological gender abuse. PMID:24328655
Phase transitions in models of human cooperation
NASA Astrophysics Data System (ADS)
Perc, Matjaž
2016-08-01
If only the fittest survive, why should one cooperate? Why should one sacrifice personal benefits for the common good? Recent research indicates that a comprehensive answer to such questions requires that we look beyond the individual and focus on the collective behavior that emerges as a result of the interactions among individuals, groups, and societies. Although undoubtedly driven also by culture and cognition, human cooperation is just as well an emergent, collective phenomenon in a complex system. Nonequilibrium statistical physics, in particular the collective behavior of interacting particles near phase transitions, has already been recognized as very valuable for understanding counterintuitive evolutionary outcomes. However, unlike pairwise interactions among particles that typically govern solid-state physics systems, interactions among humans often involve group interactions, and they also involve a larger number of possible states even for the most simplified description of reality. Here we briefly review research done in the realm of the public goods game, and we outline future research directions with an emphasis on merging the most recent advances in the social sciences with methods of nonequilibrium statistical physics. By having a firm theoretical grip on human cooperation, we can hope to engineer better social systems and develop more efficient policies for a sustainable and better future.
Statistical deprojection of galaxy pairs
NASA Astrophysics Data System (ADS)
Nottale, Laurent; Chamaraux, Pierre
2018-06-01
Aims: The purpose of the present paper is to provide methods for the statistical analysis of the physical properties of galaxy pairs. We perform this study in order to apply it later to catalogs of isolated galaxy pairs, especially two new catalogs we recently constructed that contain ≈1000 and ≈13 000 pairs, respectively. We are particularly interested in the dynamics of those pairs, including the determination of their masses. Methods: We could not compute the dynamical parameters directly since the necessary data are incomplete. Indeed, we only have at our disposal one component of the intervelocity between the members, namely along the line of sight, and two components of their interdistance, i.e., the projection on the sky-plane. Moreover, we know only one point of each galaxy orbit. Hence we need statistical methods to find the probability distribution of 3D interdistances and 3D intervelocities from their projections; we designate those methods by the term deprojection. Results: We proceed in two steps to determine and use the deprojection methods. First we derive the probability distributions expected for the various relevant projected quantities, namely the intervelocity vz, the interdistance rp, their ratio, and the product rp·vz², which is involved in mass determination. In a second step, we propose various methods for the deprojection of those parameters based on the previous analysis. We start from a histogram of the projected data and apply inversion formulae to obtain the deprojected distributions; lastly, we test the methods by numerical simulations, which also allow us to determine the uncertainties involved.
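The geometric core of such deprojection is the mapping between true and projected separations for isotropically oriented pairs. A small Monte Carlo check of that mapping (my own illustrative sketch, not the authors' inversion formulae): for a pair of true 3D separation r with random orientation, the projected separation rp has density p(rp) = rp / (r² √(1 − (rp/r)²)) and median (√3/2) r.

```python
import numpy as np

rng = np.random.default_rng(3)
r_true = 1.0                              # true 3D interdistance (arbitrary units)
mu = rng.uniform(0.0, 1.0, 500_000)       # cos(theta), isotropic orientations
rp = r_true * np.sqrt(1.0 - mu ** 2)      # projected sky-plane separation

# Analytic CDF is P(rp <= s) = 1 - sqrt(1 - (s/r)^2), so the median is (sqrt(3)/2) r
print(f"empirical median rp/r = {np.median(rp) / r_true:.4f} (analytic sqrt(3)/2 = 0.8660)")
```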
Army Logistician. Volume 39, Issue 1, January-February 2007
2007-02-01
of electronic systems using statistical methods. P&C, however, requires advanced prognostic capabilities not only to detect the early onset of...patterns. Entities operating in a P&C-enabled environment will sense and understand contextual meaning, communicate their state and mission, and act to...accessing of historical and simulation patterns; on-board prognostics capabilities; physics of failure analyses; and predictive modeling. P&C also
Mean-field approximation for spacing distribution functions in classical systems
NASA Astrophysics Data System (ADS)
González, Diego Luis; Pimpinelli, Alberto; Einstein, T. L.
2012-01-01
We propose a mean-field method to calculate approximately the spacing distribution functions p(n)(s) in one-dimensional classical many-particle systems. We compare our method with two other commonly used methods, the independent interval approximation and the extended Wigner surmise. In our mean-field approach, p(n)(s) is calculated from a set of Langevin equations, which are decoupled by using a mean-field approximation. We find that in spite of its simplicity, the mean-field approximation provides good results in several systems. We offer many examples illustrating that the three previously mentioned methods give a reasonable description of the statistical behavior of the system. The physical interpretation of each method is also discussed.
Re-entry survivability and risk
NASA Astrophysics Data System (ADS)
Fudge, Michael L.
1998-11-01
This paper is the culmination of a research effort that was reported on last year while still in progress. As previously reported, statistical methods for expressing the impact risk posed to space systems in general [and the International Space Station (ISS) in particular] by other resident space objects have been examined. One of the findings of this investigation is that there are legitimate physical modeling reasons for the common statistical expression of the collision risk. A combination of statistical methods and physical modeling is also used to express the impact risk posed by reentering space systems to objects of interest (e.g., people and property) on Earth. One of the largest uncertainties in expressing this risk is the estimation of the material which survives reentry to impact Earth's surface. This point was demonstrated in dramatic fashion in January 1997 by the impact of an intact expendable launch vehicle (ELV) upper stage near a private residence in the continental United States. Since approximately half of the missions supporting ISS will utilize ELVs, it is appropriate to examine the methods used to estimate the amount and physical characteristics of ELV debris surviving reentry to impact Earth's surface. This report details reentry survivability estimation methodology, including the specific methodology used by ITT Systems' (formerly Kaman Sciences) 'SURVIVE' model. The major change to the model in the last twelve months has been an increase in the fidelity with which upper-atmospheric aerodynamics is modeled. This has resulted in an adjustment of the factor relating the amount of kinetic energy loss to the amount of heating entering a reentering body, and has also validated and removed the necessity for certain empirically based adjustments previously made to the theoretical heating expressions. Comparisons between empirical results (observations of objects which have been recovered on Earth after surviving reentry) and SURVIVE estimates are presented for selected generic upper stage or spacecraft components, a Soyuz launch vehicle second stage, and a Delta II launch vehicle second stage and its significant components. Significant similarity is demonstrated between the type and dispersion pattern of the recovered debris from the January 1997 Delta II second stage event and the simulation of that reentry and breakup.
Statistical physics of community ecology: a cavity solution to MacArthur’s consumer resource model
NASA Astrophysics Data System (ADS)
Advani, Madhu; Bunin, Guy; Mehta, Pankaj
2018-03-01
A central question in ecology is to understand the ecological processes that shape community structure. Niche-based theories have emphasized the important role played by competition for maintaining species diversity. Many of these insights have been derived using MacArthur's consumer resource model (MCRM) or its generalizations. Most theoretical work on the MCRM has focused on small ecosystems with a few species and resources. However, theoretical insights derived from small ecosystems may not scale up to large ecosystems with many resources and species, because large systems with many interacting components often display new emergent behaviors that cannot be understood or deduced from analyzing smaller systems. To address these shortcomings, we develop a statistical physics inspired cavity method to analyze the MCRM when both the number of species and the number of resources are large. Unlike previous work in this limit, our theory addresses resource dynamics and resource depletion and demonstrates that species generically and consistently perturb their environments and significantly modify available ecological niches. We show how our cavity approach naturally generalizes niche theory to large ecosystems by accounting for the effect of collective phenomena on species invasion and ecological stability. Our theory suggests that such phenomena are a generic feature of large, natural ecosystems and must be taken into account when analyzing and interpreting community structure. It also highlights the important role that statistical-physics inspired approaches can play in furthering our understanding of ecology.
ERIC Educational Resources Information Center
Merner, Laura; Tyler, John
2017-01-01
Using the National Center for Education Statistics' Integrated Postsecondary Education Data System (IPEDS), this report analyzes data on Native American recipients of bachelor's degrees among 16 physical science and engineering fields. Overall, Native Americans are earning physical science and engineering bachelor's degrees at lower rates than the…
Choi, Mi-Ri; Jeon, Sang-Wan; Yi, Eun-Surk
2018-04-01
The purpose of this study is to analyze differences among hospitalized cancer patients in their perception of exercise and in their physical activity constraints, based on their medical history. The study used a questionnaire survey as the measurement tool for 194 cancer patients (male or female, aged 20 or older) living in the Seoul metropolitan area (Seoul, Gyeonggi, Incheon). The collected data were analyzed using frequency analysis, exploratory factor analysis, reliability analysis, t-test, and one-way analysis of variance with the statistical program SPSS 18.0. The following results were obtained. First, there was no statistically significant difference between cancer stage and exercise perception/physical activity constraint. Second, there was a significant difference between cancer stage and sociocultural constraint/facility constraint/program constraint. Third, there was a significant difference between cancer operation history and physical/sociocultural/facility/program constraint. Fourth, there was a significant difference between cancer operation history and negative perception/facility/program constraint. Fifth, there was a significant difference between ancillary cancer treatment method and negative perception/facility/program constraint. Sixth, there was a significant difference between hospitalization period and positive perception/negative perception/physical constraint/cognitive constraint. In conclusion, this study will provide information necessary to create a patient-centered healthcare service system by analyzing the exercise perception of hospitalized cancer patients based on their medical history and by investigating the constraint factors that prevent patients from actually making efforts to exercise.
Regional projection of climate impact indices over the Mediterranean region
NASA Astrophysics Data System (ADS)
Casanueva, Ana; Frías, M. Dolores; Herrera, Sixto; Bedia, Joaquín; San Martín, Daniel; Gutiérrez, José Manuel; Zaninovic, Ksenija
2014-05-01
Climate Impact Indices (CIIs) are increasingly used in different socioeconomic sectors to transfer information about climate change impacts and risks to stakeholders. CIIs are typically based on different weather variables such as temperature, wind speed, precipitation or humidity and comprise, in a single index, the relevant meteorological information for the particular impact sector (in this study, wildfires and tourism). This dependence on several climate variables poses important limitations on the application of statistical downscaling techniques, since physical consistency among variables is required in most cases to obtain reliable local projections. The present study assesses the suitability of the "direct" downscaling approach, in which the downscaling method is applied directly to the CII. In particular, for illustrative purposes, we consider two popular indices used in the wildfire and tourism sectors, the Fire Weather Index (FWI) and the Physiological Equivalent Temperature (PET), respectively. As an example, two case studies are analysed over two representative Mediterranean regions of interest for the EU CLIM-RUN project: continental Spain for the FWI and Croatia for the PET. Results obtained with this "direct" downscaling approach are similar to those found from applying the statistical downscaling to the individual meteorological drivers prior to the index calculation ("component" downscaling); thus, a wider range of statistical downscaling methods could be used. As an illustration, future changes in both indices are projected by applying two direct statistical downscaling methods, analogs and linear regression, to the ECHAM5 model. Larger differences were found between the two direct statistical downscaling approaches than between the direct and the component approaches with a single downscaling method. While these examples focus on particular indices and Mediterranean regions of interest for CLIM-RUN stakeholders, the same study could be extended to other indices and regions.
Usov, Ivan; Nyström, Gustav; Adamcik, Jozef; Handschin, Stephan; Schütz, Christina; Fall, Andreas; Bergström, Lennart; Mezzenga, Raffaele
2015-01-01
Nanocellulose fibrils are ubiquitous in nature and nanotechnologies, but their mesoscopic structural assembly is not yet fully understood. Here we study the structural features of rod-like cellulose nanoparticles at the single-particle level, by applying statistical polymer physics concepts to electron and atomic force microscopy images, and we assess their physical properties via quantitative nanomechanical mapping. We show evidence of right-handed chirality, observed on both bundles and single fibrils. Statistical analysis of contours from microscopy images shows a non-Gaussian kink angle distribution. This is inconsistent with a structure consisting of alternating amorphous and crystalline domains along the contour and supports process-induced kink formation. The intrinsic mechanical properties of nanocellulose are extracted from nanoindentation and the persistence length method for the transversal and longitudinal directions, respectively. The structural analysis is pushed to the level of single cellulose polymer chains and their smallest associated unit, with a proposed 2 × 2 chain-packing arrangement. PMID:26108282
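The persistence length analysis referred to above is commonly done by fitting the tangent-tangent correlation of digitized contours; in 2D (appropriate for surface-adsorbed fibrils), ⟨cos Δθ(s)⟩ = exp(−s/2lp). A sketch on a synthetic worm-like chain follows (a generic textbook estimator, not the authors' exact pipeline):

```python
import numpy as np

def persistence_length(contour, ds):
    """Estimate the 2D persistence length from a digitized contour (N x 2 array)
    sampled at constant arclength step ds, via <cos dtheta(s)> = exp(-s / (2 lp))."""
    t = np.diff(contour, axis=0)
    theta = np.arctan2(t[:, 1], t[:, 0])           # tangent angles along the chain
    max_lag = len(theta) // 4
    s = ds * np.arange(1, max_lag)
    corr = np.array([np.mean(np.cos(theta[k:] - theta[:-k]))
                     for k in range(1, max_lag)])
    good = corr > 0.2                              # fit only the well-resolved decay
    slope = np.polyfit(s[good], np.log(corr[good]), 1)[0]
    return -1.0 / (2.0 * slope)

# Synthetic 2D worm-like chain with known lp for validation
rng = np.random.default_rng(4)
ds, lp_true, n = 1.0, 50.0, 4000
theta = np.cumsum(rng.normal(0.0, np.sqrt(ds / lp_true), n))   # angle diffusion
contour = np.cumsum(np.column_stack([np.cos(theta), np.sin(theta)]) * ds, axis=0)
print(f"estimated lp = {persistence_length(contour, ds):.1f} (true {lp_true})")
```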
Statistical sensitivity analysis of a simple nuclear waste repository model
NASA Astrophysics Data System (ADS)
Ronen, Y.; Lucius, J. L.; Blow, E. M.
1980-06-01
This work is a preliminary step in a comprehensive sensitivity analysis of the modeling of a nuclear waste repository. The purpose of the complete analysis is to determine which modeling parameters and physical data are most important in determining key design performance criteria, and then to obtain the uncertainty in the design for safety considerations. The theory for a statistical screening design methodology is developed for later use in the overall program. The theory was applied to the test case of determining the relative importance of the sensitivity of the near-field temperature distribution in a single-level salt repository to modeling parameters. The exact values of the sensitivities to these physical and modeling parameters were then obtained using direct methods of recalculation. The sensitivity coefficients found to be important for the sample problem were the thermal loading, the distance between the spent fuel canisters, and the canister radius. Other important parameters were those related to salt properties at a point of interest in the repository.
StreamStats: a U.S. geological survey web site for stream information
Ries, Kernell G.; Gray, John R.; Renard, Kenneth G.; McElroy, Stephen A.; Gburek, William J.; Canfield, H. Evan; Scott, Russell L.
2003-01-01
The U.S. Geological Survey has developed a Web application, named StreamStats, for providing streamflow statistics, such as the 100-year flood and the 7-day, 10-year low flow, to the public. Statistics can be obtained for data-collection stations and for ungaged sites. Streamflow statistics are needed for water-resources planning and management; for design of bridges, culverts, and flood-control structures; and for many other purposes. StreamStats users can point and click on data-collection stations shown on a map in their Web browser window to obtain previously determined streamflow statistics and other information for the stations. Users also can point and click on any stream shown on the map to get estimates of streamflow statistics for ungaged sites. StreamStats determines the watershed boundaries and measures physical and climatic characteristics of the watersheds for the ungaged sites by use of a Geographic Information System (GIS), and then it inserts the characteristics into previously determined regression equations to estimate the streamflow statistics. Compared to manual methods, StreamStats reduces the average time needed to estimate streamflow statistics for ungaged sites from several hours to several minutes.
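The regional regression equations StreamStats evaluates typically have a log-linear form in the basin characteristics. A sketch of that form follows; every coefficient here is an invented placeholder, not taken from any actual USGS regional study:

```python
import math

def flood_quantile(area_mi2, precip_in, a=1.2, b=0.75, c=0.9):
    """Hypothetical regional regression of the form
    log10(Q) = a + b*log10(A) + c*log10(P).

    All coefficients are invented placeholders; real StreamStats equations
    are fitted per region and per statistic by the USGS."""
    log_q = a + b * math.log10(area_mi2) + c * math.log10(precip_in)
    return 10.0 ** log_q

# Ungaged-site estimate from GIS-measured basin characteristics (hypothetical)
print(f"Q100 ~ {flood_quantile(area_mi2=52.0, precip_in=44.0):.0f} cfs")
```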
Associations between Physical and Cognitive Doping – A Cross-Sectional Study in 2,997 Triathletes
Dietz, Pavel; Ulrich, Rolf; Dalaker, Robert; Striegel, Heiko; Franke, Andreas G.; Lieb, Klaus; Simon, Perikles
2013-01-01
Purpose: This study assessed, for the first time, prevalence estimates for physical and cognitive doping within a single collective of athletes using the randomized response technique (RRT). Furthermore, associations between the use of legal and freely available substances to improve physical and cognitive performance (enhancement) and illicit or banned substances to improve physical and cognitive performance (doping) were examined. Methods: An anonymous questionnaire using the unrelated-question RRT was used to survey 2,997 recreational triathletes in three sports events (Frankfurt, Regensburg, and Wiesbaden) in Germany. Prior to the survey, statistical power analyses were performed to determine sample size. Logistic regression was used to predict physical and cognitive enhancement and the bootstrap method was used to evaluate differences between the estimated prevalences of physical and cognitive doping. Results: 2,987 questionnaires were returned (99.7%). 12-month prevalences for physical and cognitive doping were 13.0% and 15.1%, respectively. The prevalence estimate for physical doping was significantly higher in athletes who also used physical enhancers, as well as in athletes who took part in the European Championship in Frankfurt compared to those who did not. The prevalence estimate for cognitive doping was significantly higher in athletes who also used physical and cognitive enhancers. Moreover, the use of physical and cognitive enhancers were significantly associated, as was the use of physical and cognitive doping. Discussion: The use of substances to improve physical and cognitive performance was associated on both levels of legality (enhancement vs. doping), suggesting that athletes do not use substances for a specific goal but may have a general propensity to enhance. This finding is important for understanding why people use such substances. Consequently, more effective prevention programs against substance abuse and doping could be developed. PMID:24236038
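The unrelated-question RRT estimator behind such prevalence figures is compact enough to write down. In the sketch below the design parameters p and q and the response counts are illustrative assumptions (the abstract does not state the study's actual randomization device):

```python
import math

def rrt_estimate(n_yes, n_total, p=0.80, q=0.50):
    """Unrelated-question randomized response: with probability p the respondent
    answers the sensitive question, otherwise an unrelated question with known
    'yes' probability q. The values of p and q here are illustrative only."""
    lam = n_yes / n_total                    # observed 'yes' proportion
    pi_hat = (lam - (1.0 - p) * q) / p       # prevalence of the sensitive trait
    se = math.sqrt(lam * (1.0 - lam) / n_total) / p
    return pi_hat, se

pi_hat, se = rrt_estimate(n_yes=612, n_total=2987)   # hypothetical counts
print(f"prevalence ~ {pi_hat:.3f} +/- {1.96 * se:.3f} (95% CI)")
```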
Gorgon, Edward James R; Basco, Mark David S; Manuel, Almira T
2013-11-22
Early education on the foundations of evidence-based practice (EBP) is advocated as a potent intervention toward enhancing EBP uptake among physical therapists. Little is known about the extent to which EBP is integrated in educational curricula in developing countries, where the benefits of EBP are more acutely needed. This study sought to describe EBP education in Philippine physical therapy schools, including the challenges encountered by educators in teaching EBP. A national survey of higher education institutions offering an undergraduate degree program in physical therapy was conducted from August 2011 through January 2012. A 35-item questionnaire was developed to gather data on whether or not EBP was taught, the specific EBP content covered and the courses in which it was covered, teaching and evaluation methods, and challenges in teaching EBP. Data were analyzed descriptively. The study had a response rate of 55.7% (34/61). The majority of the participating educational institutions (82%, 28/34) reported teaching EBP by incorporating EBP content in the professional courses. Among those that did not teach EBP, inadequate educator competence was the leading barrier. Courses commonly used to teach EBP were those on research (78.6%, 22/28), therapy planning (71.4%, 20/28), treatment skills (57.1-64.3%, 16-18/28), and the undergraduate thesis (60.7%, 17/28). Various EBP contents were covered, with statistical concepts taught more frequently than critical EBP content. Lectures and journal reports were the usual teaching methods (96.4%, 27/28 and 89.3%, 25/28, respectively), while written examinations, completion of an undergraduate thesis, and oral reports (82.1%, 23/28; 78.6%, 22/28; and 78.6%, 22/28, respectively) were often used in evaluation. Students' inadequate knowledge of statistics and the lack of a curricular structure for EBP were identified as the leading challenges to teaching (75%, 21/28 and 50%, 14/28, respectively). Many physical therapy faculties across the Philippines are incorporating EBP content in their teaching. However, there is arbitrary and fragmented coverage of EBP content and inadequate emphasis on clinically oriented teaching-learning and assessment methods. These findings suggest the need to design appropriate entry-level educational programs on EBP. Effective 'educating the educators' strategies are urgently needed and can have far-reaching positive repercussions on EBP uptake in physical therapist practice.
Probing the space of toric quiver theories
NASA Astrophysics Data System (ADS)
Hewlett, Joseph; He, Yang-Hui
2010-03-01
We demonstrate a practical and efficient method for generating toric Calabi-Yau quiver theories, applicable to both D3 and M2 brane world-volume physics. A new analytic method is presented for low-order parameters, and an algorithm for the general case is developed which has polynomial complexity in the number of edges in the quiver. Using this algorithm, carefully implemented, we classify the quiver diagrams and assign possible superpotentials for various small values of the numbers of edges and nodes. We examine some preliminary statistics on this space of toric quiver theories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, J.; Hoversten, G.M.
2011-09-15
Joint inversion of seismic AVA and CSEM data requires rock-physics relationships to link seismic attributes to electrical properties. Ideally, we can connect them through reservoir parameters (e.g., porosity and water saturation) by developing physics-based models, such as Gassmann's equations and Archie's law, using nearby borehole logs. This can be difficult in the exploration stage because the information available is typically insufficient for choosing suitable rock-physics models and for subsequently obtaining reliable estimates of the associated parameters. The use of improper rock-physics models and inaccuracy in the estimates of model parameters may cause misleading inversion results. Conversely, it is easy to derive statistical relationships among seismic and electrical attributes and reservoir parameters from distant borehole logs. In this study, we develop a Bayesian model to jointly invert seismic AVA and CSEM data for reservoir parameter estimation using statistical rock-physics models; the spatial dependence of geophysical and reservoir parameters is carried by lithotypes through Markov random fields. We apply the developed model to a synthetic case, which simulates a CO2 monitoring application. We derive statistical rock-physics relations from borehole logs at one location and estimate the seismic P- and S-wave velocity ratio, acoustic impedance, density, electrical resistivity, lithotypes, porosity, and water saturation at three different locations by conditioning to seismic AVA and CSEM data. Comparison of the inversion results with their corresponding true values shows that the correlation-based statistical rock-physics models provide significant information for improving the joint inversion results.
A statistical physics perspective on criticality in financial markets
NASA Astrophysics Data System (ADS)
Bury, Thomas
2013-11-01
Stock markets are complex systems exhibiting collective phenomena and particular features such as synchronization, fluctuations distributed as power-laws, non-random structures and similarity to neural networks. Such specific properties suggest that markets operate at a very special point. Financial markets are believed to be critical by analogy to physical systems, but little statistically founded evidence has been given. Through a data-based methodology and comparison to simulations inspired by the statistical physics of complex systems, we show that the Dow Jones and index sets are not rigorously critical. However, financial systems are closer to criticality in the crash neighborhood.
Healing X-ray scattering images
Liu, Jiliang; Lhermitte, Julien; Tian, Ye; ...
2017-05-24
X-ray scattering images contain numerous gaps and defects arising from detector limitations and experimental configuration. Here, we present a method to heal X-ray scattering images, filling gaps in the data and removing defects in a physically meaningful manner. Unlike generic inpainting methods, this method is closely tuned to the expected structure of reciprocal-space data. In particular, we exploit statistical tests and symmetry analysis to identify the structure of an image; we then copy, average and interpolate measured data into gaps in a way that respects the identified structure and symmetry. Importantly, the underlying analysis methods provide useful characterization of structures present in the image, including the identification of diffuse versus sharp features, anisotropy and symmetry. The presented method leverages known characteristics of reciprocal space, enabling physically reasonable reconstruction even with large image gaps. The method will correspondingly fail for images that violate these underlying assumptions. The method assumes point symmetry and is thus applicable to small-angle X-ray scattering (SAXS) data, but only to a subset of wide-angle data. Our method succeeds in filling gaps and healing defects in experimental images, including extending data beyond the original detector borders.
Multiscale solvers and systematic upscaling in computational physics
NASA Astrophysics Data System (ADS)
Brandt, A.
2005-07-01
Multiscale algorithms can overcome the scale-born bottlenecks that plague most computations in physics. These algorithms employ separate processing at each scale of the physical space, combined with interscale iterative interactions, in ways which use finer scales very sparingly. Having been developed first and well known as multigrid solvers for partial differential equations, highly efficient multiscale techniques have more recently been developed for many other types of computational tasks, including: inverse PDE problems; highly indefinite (e.g., standing wave) equations; Dirac equations in disordered gauge fields; fast computation and updating of large determinants (as needed in QCD); fast integral transforms; integral equations; astrophysics; molecular dynamics of macromolecules and fluids; many-atom electronic structures; global and discrete-state optimization; practical graph problems; image segmentation and recognition; tomography (medical imaging); fast Monte-Carlo sampling in statistical physics; and general, systematic methods of upscaling (accurate numerical derivation of large-scale equations from microscopic laws).
NASA Astrophysics Data System (ADS)
Siegel, Edward
2011-04-01
Numbers: primality/indivisibility/non-factorization versus compositeness/divisibility/factorization, often in tandem but not always, in provocatively close analogy to nuclear physics: (2+1)=(fusion)=3; (3+1)=(fission)=4 [=2 x 2]; (4+1)=(fusion)=5; (5+1)=(fission)=6 [=2 x 3]; (6+1)=(fusion)=7; (7+1)=(fission)=8 [=2 x 4 = 2 x 2 x 2]; (8+1)=(neither fission nor fusion)=9 [=3 x 3]; then ONLY composites' islands of fusion-INstability: 8, 9, 10; then 14, 15, 16,... Could inter-digit Feshbach resonances exist? Applications abound: quantum information and computing (non-Shor factorization); the millennium-problem Riemann-hypothesis physics-proof as numbers/digits Goodkin Bose-Einstein condensation, intersecting with the graph-theory "short-cut" method of Rayleigh(1870)-Polya(1922)-"Anderson"(1958) localization; the Goldbach conjecture; financial auditing/accounting as quantum-statistical-physics; ...
Complex systems: physics beyond physics
NASA Astrophysics Data System (ADS)
Holovatch, Yurij; Kenna, Ralph; Thurner, Stefan
2017-03-01
Complex systems are characterised by specific time-dependent interactions among their many constituents. As a consequence they often manifest rich, non-trivial and unexpected behaviour. Examples arise both in the physical and non-physical worlds. The study of complex systems forms a new interdisciplinary research area that cuts across physics, biology, ecology, economics, sociology, and the humanities. In this paper we review the essence of complex systems from a physicist's point of view, and try to clarify what makes them conceptually different from systems that are traditionally studied in physics. Our goal is to demonstrate how the dynamics of such systems may be conceptualised in quantitative and predictive terms by extending notions from statistical physics, and how they can often be captured in a framework of co-evolving multiplex network structures. We mention three areas of complex-systems science that are currently studied extensively: the science of cities, the dynamics of societies, and the representation of texts as evolutionary objects. We discuss why these areas form complex systems in the above sense. We argue that there exists plenty of new ground for physicists to explore and that methodical and conceptual progress is needed most.
Monte Carlo modeling of spatial coherence: free-space diffraction
Fischer, David G.; Prahl, Scott A.; Duncan, Donald D.
2008-01-01
We present a Monte Carlo method for propagating partially coherent fields through complex deterministic optical systems. A Gaussian copula is used to synthesize a random source with an arbitrary spatial coherence function. Physical optics and Monte Carlo predictions of the first- and second-order statistics of the field are shown for coherent and partially coherent sources for free-space propagation, imaging using a binary Fresnel zone plate, and propagation through a limiting aperture. Excellent agreement between the physical optics and Monte Carlo predictions is demonstrated in all cases. Convergence criteria are presented for judging the quality of the Monte Carlo predictions. PMID:18830335
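The heart of such a synthesis step is drawing correlated Gaussian variates whose correlation matrix matches a prescribed spatial coherence function; the copula construction then maps these to arbitrary marginals. A simplified real-valued sketch of the Gaussian stage follows (a Gaussian Schell-model coherence is assumed for illustration; the paper's full method also handles complex fields and physical-optics propagation):

```python
import numpy as np

# Target degree of coherence: Gaussian Schell-model, mu(x1,x2) = exp(-(x1-x2)^2/(2 lc^2))
n, lc = 128, 0.2
x = np.linspace(-1.0, 1.0, n)
mu = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2.0 * lc ** 2))

# Draw correlated Gaussian fields whose correlation matrix equals mu
# (eigendecomposition instead of Cholesky for numerical robustness)
w, V = np.linalg.eigh(mu)
A = V * np.sqrt(np.clip(w, 0.0, None))
rng = np.random.default_rng(5)
fields = rng.standard_normal((5000, n)) @ A.T        # 5000 source realizations

# The ensemble-averaged correlation should recover the target coherence
est = fields.T @ fields / len(fields)
est /= np.sqrt(np.outer(np.diag(est), np.diag(est)))
print(f"max coherence error: {np.abs(est - mu).max():.3f}")
```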
Strategies for Dealing with Missing Accelerometer Data.
Stephens, Samantha; Beyene, Joseph; Tremblay, Mark S; Faulkner, Guy; Pullnayegum, Eleanor; Feldman, Brian M
2018-05-01
Missing data is a universal research problem that can affect studies examining the relationship between physical activity measured with accelerometers and health outcomes. Statistical techniques are available to deal with missing data; however, available techniques have not been synthesized. A scoping review was conducted to summarize the advantages and disadvantages of identified methods of dealing with missing data from accelerometers. Missing data poses a threat to the validity and interpretation of trials using physical activity data from accelerometry. Imputation using multiple imputation techniques is recommended to deal with missing data and improve the validity and interpretation of studies using accelerometry. Copyright © 2018 Elsevier Inc. All rights reserved.
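Multiple imputation, as recommended here, analyzes each of m completed datasets and then pools with Rubin's rules. A minimal sketch of the pooling step (the imputation model itself is study-specific and omitted; the MVPA numbers are hypothetical):

```python
import numpy as np

def rubins_rules(estimates, variances):
    """Pool a parameter estimate across m multiply-imputed datasets.

    Total variance = within-imputation W + (1 + 1/m) * between-imputation B."""
    q = np.asarray(estimates, dtype=float)
    u = np.asarray(variances, dtype=float)
    m = len(q)
    w = u.mean()                       # within-imputation variance
    b = q.var(ddof=1)                  # between-imputation variance
    total = w + (1.0 + 1.0 / m) * b
    return q.mean(), np.sqrt(total)

# Hypothetical daily MVPA means (min/day) from m=5 imputed accelerometer datasets
est, se = rubins_rules([31.2, 29.8, 30.5, 32.1, 30.9], [4.1, 3.9, 4.3, 4.0, 4.2])
print(f"pooled MVPA = {est:.1f} min/day (SE {se:.2f})")
```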
Baryon interactions from lattice QCD with physical masses — S = -3 sector: ΞΣ and ΞΣ-ΞΛ —
NASA Astrophysics Data System (ADS)
Ishii, Noriyoshi; Aoki, Sinya; Doi, Takumi; Gongyo, Shinya; Hatsuda, Tetsuo; Ikeda, Yoichi; Inoue, Takashi; Iritani, Takumi; Miyamoto, Takaya; Nemura, Hidekatsu; Sasaki, Kenji
2018-03-01
Hyperon-nucleon and hyperon-hyperon interactions are important for studying the properties of hypernuclei in hypernuclear physics. However, unlike nucleons, which are quite stable, hyperons are unstable, so direct scattering experiments are difficult; this leads to large uncertainties in the phenomenological determination of hyperon potentials. In this talk, we use the gauge configurations generated at the (almost) physical point (mπ = 146 MeV) on a huge volume of (8.1 fm)⁴ to present our latest results on the hyperon-hyperon potentials in the S = -3 sector (ΞΣ single channel and ΞΣ-ΞΛ coupled channel) from the Nambu-Bethe-Salpeter wave functions based on the HAL QCD method with improved statistics.
The eating habits of Patients with Type 2 diabetes in Algeria
Laissaoui, Aicha; Allem, Rachida
2016-01-01
Objective: To evaluate the eating habits and the physical activity practice of patients with Type 2 diabetes (DT2). Methods: A total of 1523 DT2 patients with an average age of 58±9.9 years were recruited. A questionnaire about their eating habits and physical activity was conducted. Data were analyzed using SPSS statistical software. Results: Most of the patients were obese (64%), with irregular and weak practice of physical activity. The patients based their consumption on foods rich in nutrients of high glycaemic index. Their diet was mainly characterized by high amounts of fats, with green salads and desserts (fruits) representing only a secondary amount. Statistically, overweight and obese patients with diabetes had a significantly higher level of consumption of bread, whereas normal-weight patients with diabetes had a significantly higher level of consumption of fruit and vegetables (p=0.006 and p=0, respectively). On the other hand, there was no significant difference in the level of consumption of fatty substances, milk and dairy products, or meat-fish-eggs between the two groups (p=0.53, p=0.06 and p>0.05). Conclusion: This study showed the need for an improvement in the nutritional care of DT2 patients in the area of Ain Defla (Algeria), as well as in the practice of physical activity, in order to plan adequate therapeutic care. PMID:27182225
Sitting stability in skeletally mature patients with scoliosis and myelomeningocele.
Bartnicki, Bartłomiej; Synder, Marek; Kujawa, Jolanta; Stańczak, Katarzyna; Sibiński, Marcin
2012-01-01
The purpose of the study was to assess the influence of sitting stability in skeletally mature patients on their quality of life and general physical function. We also aimed to assess the relationship between sitting balance and the severity of scoliosis or other disorders in individuals with myelomeningocele. The prospective study enrolled 19 patients with a mean age of 21.4 years (min. 13 years). Patients treated operatively for spinal deformity were excluded from the study. Different aspects of quality of life were assessed with several questionnaires serving to measure overall quality of life, general physical function, self-perception and self-motivation, as well as dysfunction related to spine deformity. Walking ability was assessed according to the Hoffer classification, and the level of motor neuron injury was evaluated with the International Myelodysplasia Study Protocol. Statistical analysis showed that sitting stability assessed by examiners or parents positively correlated with overall quality of life, general physical function, pelvic obliquity measured by the Osebold method, and the level of motor spine dysfunction. It was not related to the self-perception and self-motivation of patients. There was no statistical correlation between sitting balance and the Cobb angle, walking ability, presence of pressure sores, or age. The value of the Cobb angle is not a good indicator of sitting balance in patients with scoliosis and myelomeningocele. Stable sitting is related to better overall quality of life and physical function.
Improving the Validity and Reliability of a Health Promotion Survey for Physical Therapists
Stephens, Jaca L.; Lowman, John D.; Graham, Cecilia L.; Morris, David M.; Kohler, Connie L.; Waugh, Jonathan B.
2013-01-01
Purpose Physical therapists (PTs) have a unique opportunity to intervene in the area of health promotion. However, no instrument has been validated to measure PTs’ views on health promotion in physical therapy practice. The purpose of this study was to evaluate the content validity and test-retest reliability of a health promotion survey designed for PTs. Methods An expert panel of PTs assessed the content validity of “The Role of Health Promotion in Physical Therapy Survey” and provided suggestions for revision. Item content validity was assessed using the content validity ratio (CVR) as well as the modified kappa statistic. Therapists then participated in the test-retest reliability assessment of the revised health promotion survey, which was assessed using a weighted kappa statistic. Results Based on feedback from the expert panelists, significant revisions were made to the original survey. The expert panel reached at least a majority consensus agreement for all items in the revised survey and the survey-CVR improved from 0.44 to 0.66. Only one item on the revised survey had substantial test-retest agreement, with 55% of the items having moderate agreement and 43% poor agreement. Conclusions All items on the revised health promotion survey demonstrated at least fair validity, but few items had reasonable test-retest reliability. Further modifications should be made to strengthen the validity and improve the reliability of this survey. PMID:23754935
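For reference, the content validity ratio used in this study follows Lawshe's formula, CVR = (nₑ − N/2)/(N/2), where nₑ of N panelists rate an item essential. A tiny sketch with hypothetical panel sizes and ratings:

```python
def content_validity_ratio(n_essential, n_panelists):
    """Lawshe's CVR: -1 if no panelist rates the item 'essential', +1 if all do."""
    half = n_panelists / 2.0
    return (n_essential - half) / half

# Hypothetical per-item 'essential' counts from a 10-member expert panel
items = {"item_1": 9, "item_2": 7, "item_3": 5}
cvrs = {k: content_validity_ratio(v, 10) for k, v in items.items()}
survey_cvr = sum(cvrs.values()) / len(cvrs)    # survey-level CVR = mean of item CVRs
print(cvrs, f"survey CVR = {survey_cvr:.2f}")
```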
The Problem Solving Method in Teaching Physics in Elementary School
NASA Astrophysics Data System (ADS)
Jandrić, Gordana Hajduković; Obadović, Dušanka Ž.; Stojanović, Maja
2010-01-01
Most teachers ask whether there is a "best" way to teach. The most effective teaching method depends on the specific goals of the course and the needs of the students. An investigation has been carried out to compare the effect of teaching selected physics topics using the problem-solving method on the overall achievement of acquired knowledge with teaching the same material by the traditional method. The investigation was performed as a pedagogical experiment with parallel groups and a randomly chosen sample of students attending grade eight. The control and experimental groups were equalized in the relevant pedagogical parameters. The obtained results were treated statistically. The comparison showed a significant difference in the speed of acquiring knowledge, the problem-solving teaching being advantageous over the traditional method.
MyPMFs: a simple tool for creating statistical potentials to assess protein structural models.
Postic, Guillaume; Hamelryck, Thomas; Chomilier, Jacques; Stratmann, Dirk
2018-05-29
Evaluating the model quality of protein structures that evolve in environments with particular physicochemical properties requires scoring functions that are adapted to their specific residue compositions and/or structural characteristics. Thus, computational methods developed for structures from the cytosol cannot work properly on membrane or secreted proteins. Here, we present MyPMFs, an easy-to-use tool that allows users to train statistical potentials of mean force (PMFs) on the protein structures of their choice, with all parameters being adjustable. We demonstrate its use by creating an accurate statistical potential for transmembrane protein domains. We also show its usefulness to study the influence of the physical environment on residue interactions within protein structures. Our open-source software is freely available for download at https://github.com/bibip-impmc/mypmfs. Copyright © 2018. Published by Elsevier B.V.
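Statistical potentials of the kind MyPMFs trains are typically obtained by inverse Boltzmann conversion of observed versus reference distance distributions, E(r) = −kT ln(p_obs(r)/p_ref(r)). A generic sketch of that conversion (counts, binning and pseudocounts are hypothetical; see the MyPMFs repository for the actual implementation):

```python
import numpy as np

def inverse_boltzmann(obs_counts, ref_counts, pseudocount=1.0):
    """Knowledge-based potential in kT units: E = -ln(p_obs / p_ref).
    Pseudocounts keep empty bins from producing log(0)."""
    p_obs = (obs_counts + pseudocount) / (obs_counts + pseudocount).sum()
    p_ref = (ref_counts + pseudocount) / (ref_counts + pseudocount).sum()
    return -np.log(p_obs / p_ref)

# Hypothetical residue-pair distance histogram (e.g., 0.5 A bins from 2 to 10 A)
obs = np.array([0, 2, 30, 80, 120, 90, 60, 50, 45, 40, 38, 36, 35, 34, 33, 33], float)
ref = np.full_like(obs, obs.sum() / len(obs))   # featureless reference state
print(np.round(inverse_boltzmann(obs, ref), 2))
```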
Electromagnetic sinc Schell-model beams and their statistical properties.
Mei, Zhangrong; Mao, Yonghua
2014-09-22
A class of electromagnetic sources with sinc Schell-model correlations is introduced. The conditions on the source parameters guaranteeing that the source generates a physical beam are derived. The evolution of the statistical properties of the electromagnetic stochastic beams generated by this new source when propagating in free space and in atmospheric turbulence is investigated with the help of the weighted superposition method and by numerical simulations. It is demonstrated that the intensity distributions of such beams exhibit unique features on propagation in free space, producing a double-layer flat-top profile that is shape-invariant in the far field. This feature makes the new beam particularly suitable for some special laser processing applications. The influences of atmospheric turbulence with a non-Kolmogorov power spectrum on the statistical properties of the new beams are analyzed in detail.
[The impact of social and hygienic lifestyle factors on health status of students].
Sakharova, O B; Kiku, P F; Gorborukova, T V
2012-01-01
A comprehensive estimation of the impact of socio-hygienic lifestyle factors on the health of students has been performed. The work used data from a sociological questionnaire and methods of multivariate statistics (correlation and regression analysis, and the method of correlation pleiades of P. V. Terentiev). Among the analyzed components, average monthly income was found to make the greatest contribution to the health state and physical capacity of the studied contingent of students. The influence of this factor is most pronounced in the group of students of average wealth. The quality of nutrition and the mode of life depend on the level of material well-being of students. Students with deficient or excess body weight are more susceptible to the effects of lifestyle factors such as nutrition, physical activity, bad habits, and prosperity.
O'Neill, B; McDonough, S M; Wilson, J J; Bradbury, I; Hayes, K; Kirk, A; Kent, L; Cosgrove, D; Bradley, J M; Tully, M A
2017-01-14
There are challenges for researchers and clinicians in selecting the most appropriate physical activity tool, and a balance between precision and feasibility is needed. Currently it is unclear which physical activity tool should be used to assess physical activity in bronchiectasis. The aim of this research is to compare assessment methods (pedometer and IPAQ) with our criterion method (ActiGraph) for the measurement of physical activity dimensions in bronchiectasis (BE), and to assess their feasibility and acceptability. Patients in this analysis were enrolled in a cross-sectional study. The ActiGraph and pedometer were worn for seven consecutive days and the IPAQ was completed for the same period. Statistical analyses were performed using SPSS 20 (IBM). Descriptive statistics were used; the percentage agreement between ActiGraph and the other measures was calculated using limits of agreement. Feedback about the feasibility of the activity monitors and the IPAQ was obtained. There were 55 (22 male) data sets available. For step count there was no significant difference between the ActiGraph and pedometer; however, total physical activity time (mins) as recorded by the ActiGraph was significantly higher than the pedometer (mean ± SD, 232 (75) vs. 63 (32)). The level of agreement between the two devices was very good for step count (97% agreement), and variations in the level of agreement were within accepted limits of ±2 standard deviations from the mean value. IPAQ reported more bouted moderate-to-vigorous physical activity (MVPA) [mean (SD); 167 (170) vs. 6 (9) mins/day], and significantly less sedentary time than ActiGraph [mean (SD); 362 (115) vs. 634 (76) mins/day]. There were low levels of agreement between the two tools (57% sedentary behaviour; 0% MVPA10+), with IPAQ under-reporting sedentary behaviour and over-reporting MVPA10+ compared to ActiGraph. The monitors were found to be feasible and acceptable by participants and researchers; while the IPAQ was acceptable to use, most patients required assistance to complete it. Accurate measurement of physical activity is feasible in BE and will be valuable for future trials of therapeutic interventions. ActiGraph or pedometer could be used to measure simple daily step counts, but ActiGraph was superior as it measured the intensity of physical activity and was a more precise measure of time spent walking. The IPAQ does not appear to represent an accurate measure of physical activity in this population. Clinical Trials Registration Number NCT01569009: Physical Activity in Bronchiectasis.
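The limits-of-agreement analysis used to compare devices here is straightforward to reproduce. A minimal sketch with hypothetical step-count pairs (not the study's data):

```python
import numpy as np

def limits_of_agreement(a, b):
    """Bland-Altman agreement between two methods: bias and bias +/- 2 SD limits."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias, sd = d.mean(), d.std(ddof=1)
    return bias, (bias - 2.0 * sd, bias + 2.0 * sd)

# Hypothetical daily step counts: (ActiGraph, pedometer) for 8 participants
acti = [6520, 8110, 4375, 9980, 7205, 5640, 10480, 6890]
pedo = [6390, 8240, 4310, 9710, 7330, 5505, 10620, 6755]
bias, (lo, hi) = limits_of_agreement(acti, pedo)
print(f"bias = {bias:.0f} steps/day, limits of agreement = ({lo:.0f}, {hi:.0f})")
```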
Simulating Metabolism with Statistical Thermodynamics
Cannon, William R.
2014-01-01
New methods are needed for large scale modeling of metabolism that predict metabolite levels and characterize the thermodynamics of individual reactions and pathways. Current approaches use either kinetic simulations, which are difficult to extend to large networks of reactions because of the need for rate constants, or flux-based methods, which have a large number of feasible solutions because they are unconstrained by the law of mass action. This report presents an alternative modeling approach based on statistical thermodynamics. The principles of this approach are demonstrated using a simple set of coupled reactions, and then the system is characterized with respect to the changes in energy, entropy, free energy, and entropy production. Finally, the physical and biochemical insights that this approach can provide for metabolism are demonstrated by application to the tricarboxylic acid (TCA) cycle of Escherichia coli. The reaction and pathway thermodynamics are evaluated and predictions are made regarding changes in concentration of TCA cycle intermediates due to 10- and 100-fold changes in the ratio of NAD+:NADH concentrations. Finally, the assumptions and caveats regarding the use of statistical thermodynamics to model non-equilibrium reactions are discussed. PMID:25089525
Statistical complexity measure of pseudorandom bit generators
NASA Astrophysics Data System (ADS)
González, C. M.; Larrondo, H. A.; Rosso, O. A.
2005-08-01
Pseudorandom number generators (PRNG) are extensively used in Monte Carlo simulations, gambling machines and cryptography as substitutes for ideal random number generators (RNG). Each application imposes different statistical requirements on PRNGs. As L’Ecuyer clearly states, “the main goal for Monte Carlo methods is to reproduce the statistical properties on which these methods are based, whereas for gambling machines and cryptology, observing the sequence of output values for some time should provide no practical advantage for predicting the forthcoming numbers better than by just guessing at random”. In accordance with these different applications, several statistical test suites have been developed to analyze the sequences generated by PRNGs. In a recent paper a new statistical complexity measure [Phys. Lett. A 311 (2003) 126] has been defined. Here we propose this measure as a randomness quantifier for PRNGs. The test is applied to three very well known and widely tested PRNGs available in the literature, all of them based on mathematical algorithms. Another PRNG, based on the Lorenz 3D chaotic dynamical system, is also analyzed. PRNGs based on chaos may be considered as models for physical noise sources, and important new results have recently been reported. All the design steps of this PRNG are described, and each stage increases the PRNG's randomness using a different strategy. It is shown that the MPR statistical complexity measure is capable of quantifying this randomness improvement. The PRNG based on the chaotic 3D Lorenz dynamical system is also evaluated using traditional digital signal processing tools for comparison.
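To make the entropy-disequilibrium idea concrete, here is a rough sketch of an MPR-style complexity for a bit stream: the normalized Shannon entropy of d-bit word frequencies multiplied by a normalized Jensen-Shannon disequilibrium against the uniform distribution. The word length and the normalization constant follow common conventions in the complexity literature, not the paper's exact construction.

```python
import numpy as np

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mpr_complexity(bits, d=8):
    # Frequencies of non-overlapping d-bit words
    words = [int("".join(map(str, bits[i:i + d])), 2) for i in range(0, len(bits) - d, d)]
    p = np.bincount(words, minlength=2**d) / len(words)
    n = 2**d
    H = shannon(p) / np.log2(n)                       # normalized entropy
    u = np.full(n, 1 / n)                             # uniform reference
    js = shannon((p + u) / 2) - 0.5 * shannon(p) - 0.5 * shannon(u)
    # Maximum JS divergence (delta vs. uniform), used as normalization
    js_max = -0.5 * ((n + 1) / n * np.log2(n + 1) - 2 * np.log2(2 * n) + np.log2(n))
    return H * js / js_max                            # statistical complexity

bits = np.random.randint(0, 2, 100000)                # ideal RNG: C should be near 0
print(mpr_complexity(bits))
```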
Holstila, A; Mänty, M; Rahkonen, O; Lahelma, E; Lahti, J
2017-12-01
Functioning will be an increasingly important issue in Finland over the coming decades as the proportion of the population aged 65 and older is growing significantly. However, the associations between changes in physical activity and subsequent health functioning are poorly understood. The aim of this study was to examine how changes in physical activity relate to concurrent and prospective levels of health functioning. Cohort data from the Helsinki Health Study were used. Phase 1 (n = 8960, response rate 67%, 80% women) was conducted among 40- to 60-year-old employees of the City of Helsinki in 2000-2002, phase 2 in 2007 (n = 7332, response rate 83%), and phase 3 in 2012 (n = 6814, response rate 79%). Linear mixed models were used as the main statistical method. Increasing physical activity was associated with higher concurrent and prospective levels of physical health functioning, whereas decreasing activity was associated with lower levels of physical health functioning. The associations were stronger with physical than with mental health functioning. Promoting physical activity among aging people may help to maintain their level of health functioning. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Continuum radiation from active galactic nuclei: A statistical study
NASA Technical Reports Server (NTRS)
Isobe, T.; Feigelson, E. D.; Singh, K. P.; Kembhavi, A.
1986-01-01
The physics of the continuum spectrum of active galactic nuclei (AGNs) was examined using a large data set and rigorous statistical methods. A database was constructed for 469 objects, including radio-selected quasars, optically selected quasars, X-ray-selected AGNs, BL Lac objects, and optically unidentified compact radio sources. Each object has measurements of its radio, optical, and X-ray core continuum luminosities, though many of these are upper limits. Since many radio sources have extended components, the core component was carefully separated out from the total radio luminosity. With survival analysis statistical methods, which can treat upper limits correctly, these data can yield better statistical results than those previously obtained. A variety of statistical tests are performed, such as comparisons of the luminosity functions in different subsamples and linear regressions of luminosities in different bands. Interpretation of the results leads to the following tentative conclusions: the main emission mechanism of optically selected quasars and X-ray-selected AGNs is thermal, while that of BL Lac objects is synchrotron; radio-selected quasars may have two different emission mechanisms in the X-ray band; BL Lac objects appear to be special cases of the radio-selected quasars; some compact radio sources show the possibility of synchrotron self-Compton (SSC) emission in the optical band; and the spectral index between the optical and X-ray bands depends on the optical luminosity.
Moment-Based Physical Models of Broadband Clutter due to Aggregations of Fish
2013-09-30
statistical models for signal-processing algorithm development. These in turn will help to develop a capability to statistically forecast the impact of … aggregations of fish based on higher-order statistical measures describable in terms of physical and system parameters. Environmentally, these models … processing. In this experiment, we had good ground truth on (1) and (2), and had control over (3) and (4) except for environmentally-imposed restrictions
Procedure for Adapting Direct Simulation Monte Carlo Meshes
NASA Technical Reports Server (NTRS)
Woronowicz, Michael S.; Wilmoth, Richard G.; Carlson, Ann B.; Rault, Didier F. G.
1992-01-01
A technique is presented for adapting computational meshes used in the G2 version of the direct simulation Monte Carlo method. The physical ideas underlying the technique are discussed, and adaptation formulas are developed for use on solutions generated from an initial mesh. The effect of statistical scatter on adaptation is addressed, and results demonstrate the ability of this technique to achieve more accurate results without increasing necessary computational resources.
2013-03-21
2.3 Time Series Response Data … 2.4 Comparison of Response … evaluating the efficiency of the parameter estimates. In the past, the most popular form of response surface design used the D-optimality … as well. A model can refer to almost anything in math, statistics, or computer science. It can be any “physical, mathematical, or logical
The Wang-Landau Sampling Algorithm
NASA Astrophysics Data System (ADS)
Landau, David P.
2003-03-01
Over the past several decades Monte Carlo simulations[1] have evolved into a powerful tool for the study of wide-ranging problems in statistical/condensed matter physics. Standard methods sample the probability distribution for the states of the system, usually in the canonical ensemble, and enormous improvements have been made in performance through the implementation of novel algorithms. Nonetheless, difficulties arise near phase transitions, either due to critical slowing down near 2nd order transitions or to metastability near 1st order transitions, thus limiting the applicability of the method. We shall describe a new and different Monte Carlo approach [2] that uses a random walk in energy space to determine the density of states directly. Once the density of states is estimated, all thermodynamic properties can be calculated at all temperatures. This approach can be extended to multi-dimensional parameter spaces and has already found use in classical models of interacting particles including systems with complex energy landscapes, e.g., spin glasses, protein folding models, etc., as well as for quantum models. 1. A Guide to Monte Carlo Simulations in Statistical Physics, D. P. Landau and K. Binder (Cambridge U. Press, Cambridge, 2000). 2. Fugao Wang and D. P. Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E64, 056101-1 (2001).
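A compact sketch of the random walk in energy space for a small 2D Ising model follows; the lattice size, flatness threshold, and stopping value of ln f are illustrative choices rather than those of the cited papers.

```python
import numpy as np

L = 8
N = L * L
spins = np.random.choice([-1, 1], size=(L, L))

def total_energy(s):
    # Nearest-neighbor Ising energy with periodic boundaries
    return -np.sum(s * (np.roll(s, 1, 0) + np.roll(s, 1, 1)))

log_g = np.zeros(4 * N + 1)        # log density of states over energies -2N..2N
hist = np.zeros(4 * N + 1)
idx = lambda E: E + 2 * N

E = total_energy(spins)
log_f = 1.0                        # ln f, halved whenever the histogram is "flat"
while log_f > 1e-4:
    for _ in range(10000):
        i, j = np.random.randint(L, size=2)
        dE = 2 * spins[i, j] * (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                                + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        # Accept with probability g(E)/g(E'): favors rarely visited energies
        if np.log(np.random.rand()) < log_g[idx(E)] - log_g[idx(E + dE)]:
            spins[i, j] *= -1
            E += dE
        log_g[idx(E)] += log_f     # update estimate at the current energy
        hist[idx(E)] += 1
    visited = hist > 0
    if hist[visited].min() > 0.8 * hist[visited].mean():   # flatness check
        hist[:] = 0
        log_f /= 2.0
```

Once `log_g` has converged, canonical averages at any temperature follow by reweighting with Boltzmann factors, which is the key economy of the method.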
Statistical physics and physiology: monofractal and multifractal approaches
NASA Technical Reports Server (NTRS)
Stanley, H. E.; Amaral, L. A.; Goldberger, A. L.; Havlin, S.; Peng, C. K.
1999-01-01
Even under healthy, basal conditions, physiologic systems show erratic fluctuations resembling those found in dynamical systems driven away from a single equilibrium state. Do such "nonequilibrium" fluctuations simply reflect the fact that physiologic systems are being constantly perturbed by external and intrinsic noise? Or do these fluctuations actually contain useful, "hidden" information about the underlying nonequilibrium control mechanisms? We report some recent attempts to understand the dynamics of complex physiologic fluctuations by adapting and extending concepts and methods developed very recently in statistical physics. Specifically, we focus on interbeat interval variability as an important quantity to help elucidate possibly non-homeostatic physiologic variability because (i) the heart rate is under direct neuroautonomic control, (ii) interbeat interval variability is readily measured by noninvasive means, and (iii) analysis of these heart rate dynamics may provide important practical diagnostic and prognostic information not obtainable with current approaches. The analytic tools we discuss may be used on a wider range of physiologic signals. We first review recent progress using two analysis methods--detrended fluctuation analysis and wavelets--sufficient for quantifying monofractal structures. We then describe recent work that quantifies multifractal features of interbeat interval series, and the discovery that the multifractal structure of healthy subjects is different from that of diseased subjects.
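Of the two methods mentioned, detrended fluctuation analysis is simple enough to sketch; the window sizes and order-1 (linear) detrending below are common defaults, not parameters from the paper.

```python
import numpy as np

def dfa(x, scales):
    y = np.cumsum(x - np.mean(x))            # integrated (profile) series
    F = []
    for n in scales:
        n_win = len(y) // n
        rms = []
        for k in range(n_win):
            seg = y[k * n:(k + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)     # local linear trend
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t))**2)))
        F.append(np.mean(rms))
    # Scaling exponent alpha: slope of log F(n) vs. log n
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

# Sanity check: white noise should give alpha near 0.5
alpha = dfa(np.random.randn(10000), scales=[16, 32, 64, 128, 256])
print(alpha)
```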
A Content Analysis of Physical Activity in TV Shows Popular Among Adolescents
Gietzen, Megan S.; Gollust, Sarah E.; Linde, Jennifer A.; Neumark-Sztainer, Dianne; Eisenberg, Marla E.
2017-01-01
Purpose Previous research demonstrates that television has the potential to influence youth behaviors, but little evidence exists on how television depicts physical activity (PA), an important public health priority for youth. This mixed-methods study investigates depictions of television characters’ participation in PA in the top 25 favorite shows ranked by a diverse sample of 2,793 adolescents. Method Randomly selected episodes from each show were content analyzed for PA incidents, reasons and context, and in relation to the gender and weight status of participating characters. Results A total of 374 incidents of PA were coded across 75 episodes, with an average of 5.0 incidents per episode. Although male and female characters were equally likely to engage in at least one incident of PA, male characters were involved in a statistically significantly larger proportion of PA incidents than female characters and were more likely to engage in PA for competitive sport. There was no statistically significant difference in engagement in PA or the proportion of PA incidents for characters coded as overweight compared to non-overweight characters. Conclusions Although female characters tended to be underrepresented in PA, this study reveals positive messages for how gender and weight are portrayed in relation to PA on TV. PMID:28151062
Braam, Katja I; van der Torre, Patrick; Takken, Tim; Veening, Margreet A; van Dulmen-den Broeder, Eline; Kaspers, Gertjan J L
2013-04-30
A decreased physical fitness and impaired social functioning have been reported in patients and survivors of childhood cancer. This is influenced by the negative effects of disease and treatment of childhood cancer and by behavioural and social elements. Exercise training for adults during or after cancer therapy has frequently been reported to improve physical fitness and social functioning. More recently, literature on this subject became available for children and young adults with cancer, both during and after treatment. This review aimed to evaluate the effect of a physical exercise training intervention (at home, at a physical therapy centre, or hospital based) on the physical fitness of children with cancer, in comparison with the physical fitness in a care-as-usual control group. The intervention needed to be offered within the first five years from diagnosis. The second aim was to assess the effects of a physical exercise training intervention in this population on fatigue, anxiety, depression, self-efficacy, and health-related quality of life, and to assess the adverse effects of the intervention. For this review the electronic databases of CENTRAL, MEDLINE, EMBASE, CINAHL, PEDro, and ongoing trial registries were searched on 6 September 2011. In addition, a handsearch of reference lists and conference proceedings was performed in that same month. The review included randomised controlled trials (RCTs) and clinical controlled trials (CCTs) that compared the effects of physical exercise training with no training, in people who were within the first five years of their diagnosis of childhood cancer. Using standardised forms, two review authors independently identified studies meeting the inclusion criteria, performed the data extraction, and assessed the risk of bias. Quality of the studies was rated by using the Grading of Recommendation Assessment, Development and Evaluation (GRADE) criteria. Five articles were included in this review: four RCTs (14, 14, 28, and 51 participants) and one CCT (24 participants). In total 131 participants (74 boys, 54 girls, three unknown) were included in the analysis, all being treated for childhood acute lymphoblastic leukaemia (ALL). The study interventions were all implemented during chemotherapy treatment. The duration of the training sessions ranged from 15 to 60 minutes per session. Both the type of intervention and the intervention period, which ranged from 10 weeks to two years, varied across the included studies. In all included studies the control group received care as usual. All studies had methodological limitations, such as small numbers of participants, unclear randomisation methods, and single-blind study designs in the case of an RCT. Cardiorespiratory fitness was studied by the use of the nine-minute run-walk test, the timed up-and-down stairs test, and the 20-m shuttle run test. Only the up-and-down stairs test showed significant differences between the intervention and the control group, in favour of the intervention group (P value = 0.05, no further information available). Bone mineral density was assessed in one study, in which a statistically significant difference in favour of the exercise group was identified (standardised mean difference (SMD) 1.07; 95% confidence interval (CI) 0.48 to 1.66; P value < 0.001). Body mass index was assessed in two studies. 
The pooled data on this item did not show a statistically significant difference between the intervention and control study groups. Flexibility was assessed in three studies. In one study the active ankle dorsiflexion method was used to assess flexibility, and in the second study the passive ankle dorsiflexion test was used. No statistically significant difference between the intervention and control group was identified with the active ankle dorsiflexion test, whereas with the passive test method a statistically significant difference in favour of the exercise group was found (SMD 0.69; 95% CI 0.12 to 1.25; P value = 0.02). The third study assessed body flexibility by the use of the sit-and-reach distance test; no statistically significant difference between the intervention and control group was identified. One study assessed the effects of an inspiratory muscle training programme aimed at training the lung muscles and increasing physical fitness. This study reported no significant effect on either inspiratory or expiratory muscle strength. Two other studies, using either knee and ankle strength changes measured by hand-held dynamometry, or the number of completed push-ups (with knees on the ground) and peripheral quantitative computed tomography of the tibia to determine muscle mass, did not identify statistically significant differences in muscle strength/endurance. The level of daily activity, health-related quality of life, fatigue, and adverse events were assessed in one study only; for all these items no statistically significant differences between the intervention and control group were found. None of the included studies evaluated the outcomes activity energy expenditure, time spent exercising, anxiety and depression, or self-efficacy. The effects of physical exercise training interventions for childhood cancer participants are not yet convincing due to small numbers of participants and insufficient study methodology. Despite that, initial results show a trend towards improved physical fitness in the intervention group compared to the control group. Changes in physical fitness were seen as improved body composition, flexibility, and cardiorespiratory fitness. However, the evidence is limited, and these positive effects were not found for the other assessed outcomes, such as muscle strength/endurance, the level of daily activity, health-related quality of life, and fatigue. There is a need for more studies with comparable aims and interventions, using larger numbers of participants, and for studies in childhood cancer populations other than ALL.
Use of Multiscale Entropy to Facilitate Artifact Detection in Electroencephalographic Signals
Mariani, Sara; Borges, Ana F. T.; Henriques, Teresa; Goldberger, Ary L.; Costa, Madalena D.
2016-01-01
Electroencephalographic (EEG) signals present a myriad of challenges to analysis, beginning with the detection of artifacts. Prior approaches to noise detection have utilized multiple techniques, including visual methods, independent component analysis and wavelets. However, no single method is broadly accepted, inviting alternative ways to address this problem. Here, we introduce a novel approach based on a statistical physics method, multiscale entropy (MSE) analysis, which quantifies the complexity of a signal. We postulate that noise-corrupted EEG signals have lower information content, and, therefore, reduced complexity compared with their noise-free counterparts. We test the new method on an open-access database of EEG signals with and without added artifacts due to electrode motion. PMID:26738116
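A bare-bones sketch of the MSE computation: sample entropy evaluated on coarse-grained copies of the signal. The parameters m = 2 and r = 0.15 SD are common defaults in the MSE literature, and rescaling r at each scale is a simplification; the authors' exact settings may differ.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.15):
    # r relative to this series' SD (the original MSE fixes r from the raw signal)
    r *= np.std(x)
    def count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates - templates[i]), axis=1)
            c += np.sum(d <= r) - 1          # exclude the self-match
        return c
    B, A = count(m), count(m + 1)
    return -np.log(A / B)                    # assumes enough matches (A > 0)

def multiscale_entropy(x, max_scale=5):
    mse = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = x[:n * tau].reshape(n, tau).mean(axis=1)   # coarse-graining
        mse.append(sample_entropy(coarse))
    return mse

mse = multiscale_entropy(np.random.randn(1000))   # white noise: entropy falls with scale
```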
Discussion on water resources value accounting and its application
NASA Astrophysics Data System (ADS)
Guo, Biying; Huang, Xiaorong; Ma, Kai; Gao, Linyun; Wang, Yanqiu
2018-06-01
The exploration of the compilation of natural resources balance sheets has been under way since it was proposed in 2013. Several elements of the water resources balance sheet have been discussed actively in China, including its basic concept, framework and accounting methods, which have focused on calculating the amount of water resources with statistical methods but have lacked an analysis of the interrelationship between physical volume and magnitude of value. Based on the study of physical accounting for the water resources balance sheet, the connotation of water resources value is analyzed in combination with research on the value of water resources worldwide. Furthermore, the theoretical framework, form of measurement and research methods of water resources value accounting are explored. Taking Chengdu, China as an example, an index system for the water resources balance sheet in Chengdu, which includes both physical and value volumes, is established to account for the depletion of water resources, environmental damage and ecological water occupation caused by economic and social water use. Moreover, the water resources balance sheet for this region, which reflects the negative impact of the economy on the environment, is established. It provides a reference for advancing water resources management, improving government and social investment, and realizing scientific and rational allocation of water resources.
Guo, Wei; Song, Binbin; Shen, Junfei; Wu, Jiong; Zhang, Chunyan; Wang, Beili; Pan, Baishen
2015-08-25
To establish an indirect reference interval based on alanine aminotransferase test results stored in a laboratory information system. All alanine aminotransferase results for outpatients and physical examinations stored in the laboratory information system of Zhongshan Hospital during 2014 were included. The original data were transformed using a Box-Cox transformation to obtain an approximately normal distribution. Outliers were identified and omitted using the Chauvenet and Tukey methods. The indirect reference intervals were obtained by applying the nonparametric and Hoffmann methods in parallel. The reference change value was used to determine the statistical significance of observed differences between the calculated and published reference intervals. The indirect reference intervals for alanine aminotransferase were 12 to 41 U/L (male, outpatient), 12 to 48 U/L (male, physical examination), 9 to 32 U/L (female, outpatient), and 8 to 35 U/L (female, physical examination). The absolute differences compared with the direct results were all smaller than the reference change value of alanine aminotransferase. The Box-Cox transformation combined with the Hoffmann and Tukey methods is a simple and reliable technique that should be promoted and used by clinical laboratories.
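The transform-trim-percentile pipeline can be sketched as follows, assuming synthetic data, Tukey fences for outlier exclusion, and a nonparametric (2.5th-97.5th percentile) interval; the Chauvenet and Hoffmann steps are omitted for brevity.

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(0)
alt = rng.lognormal(mean=3.0, sigma=0.4, size=5000)   # synthetic ALT values (U/L)

transformed, lam = stats.boxcox(alt)                  # approximate normality

q1, q3 = np.percentile(transformed, [25, 75])
iqr = q3 - q1
keep = (transformed >= q1 - 1.5 * iqr) & (transformed <= q3 + 1.5 * iqr)  # Tukey fences

lo_t, hi_t = np.percentile(transformed[keep], [2.5, 97.5])  # nonparametric interval
lower, upper = inv_boxcox(lo_t, lam), inv_boxcox(hi_t, lam)  # back to U/L
print(f"reference interval: {lower:.0f} to {upper:.0f} U/L")
```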
NASA Astrophysics Data System (ADS)
Goldsworthy, M. J.
2012-10-01
One of the most useful tools for modelling rarefied hypersonic flows is the Direct Simulation Monte Carlo (DSMC) method. Simulator particle movement and collision calculations are combined with statistical procedures to model thermal non-equilibrium flow-fields described by the Boltzmann equation. The Macroscopic Chemistry Method for DSMC simulations was developed to simplify the inclusion of complex thermal non-equilibrium chemistry. The macroscopic approach uses statistical information which is calculated during the DSMC solution process in the modelling procedures. Here it is shown how inclusion of macroscopic information in models of chemical kinetics, electronic excitation, ionization, and radiation can enhance the capabilities of DSMC to model flow-fields where a range of physical processes occur. The approach is applied to the modelling of a 6.4 km/s nitrogen shock wave and results are compared with those from existing shock-tube experiments and continuum calculations. Reasonable agreement between the methods is obtained. The quality of the comparison is highly dependent on the set of vibrational relaxation and chemical kinetic parameters employed.
Improving stochastic estimates with inference methods: calculating matrix diagonals.
Selig, Marco; Oppermann, Niels; Ensslin, Torsten A
2012-02-01
Estimating the diagonal entries of a matrix that is not directly accessible but only available as a linear operator in the form of a computer routine is a common necessity in many computational applications, especially in image reconstruction and statistical inference. Here, methods of statistical inference are used to improve the accuracy or reduce the computational costs of matrix probing methods for estimating matrix diagonals. In particular, the generalized Wiener filter methodology, as developed within information field theory, is shown to significantly improve estimates based on only a few sampling probes in cases in which some form of continuity of the solution can be assumed. The strength, length scale, and precise functional form of the exploited autocorrelation function of the matrix diagonal are determined from the probes themselves. The developed algorithm is successfully applied to mock and real-world problems. These performance tests show that, in situations where a matrix diagonal has to be calculated from only a small number of computationally expensive probes, a speedup by a factor of 2 to 10 is possible with the proposed method. © 2012 American Physical Society
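For context, the baseline probing estimator that the paper improves on can be written in a few lines: for an operator A available only as a routine, diag(A) is estimated from random sign probes. The Wiener-filter smoothing step from information field theory is not reproduced here.

```python
import numpy as np

def probe_diagonal(apply_A, n, n_probes=10, rng=np.random.default_rng(0)):
    num = np.zeros(n)
    den = np.zeros(n)
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe vector
        num += v * apply_A(v)                  # elementwise v * (A v)
        den += v * v
    return num / den                           # diag(A) estimate

# Example: an explicit matrix stands in for an expensive black-box operator
A = np.diag(np.linspace(1, 5, 100)) + 0.01 * np.random.randn(100, 100)
d_est = probe_diagonal(lambda v: A @ v, 100, n_probes=50)
```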
Numerical solutions of the semiclassical Boltzmann ellipsoidal-statistical kinetic model equation
Yang, Jaw-Yen; Yan, Chin-Yuan; Huang, Juan-Chen; Li, Zhihui
2014-01-01
Computations of rarefied gas dynamical flows governed by the semiclassical Boltzmann ellipsoidal-statistical (ES) kinetic model equation using an accurate numerical method are presented. The semiclassical ES model was derived through the maximum entropy principle; it not only conserves mass, momentum and energy but also contains additional higher-order moments that differ from the standard quantum distributions. A different decoding procedure to obtain the necessary parameters for determining the ES distribution is also devised. The numerical method in phase space combines the discrete-ordinate method in momentum space with the high-resolution shock-capturing method in physical space. Numerical solutions of two-dimensional Riemann problems for two configurations covering various degrees of rarefaction are presented, and various contours of the quantities unique to this new model are illustrated. When the relaxation time becomes very small, the main flow features display behavior similar to that of ideal quantum gas dynamics, and the present solutions are found to be consistent with existing calculations for a classical gas. The effect of a parameter that permits an adjustable Prandtl number in the flow is also studied. PMID:25104904
Emaus, Aina; Dieli-Conwright, Christina; Xu, Xinxin; Lacey, James V.; Ingles, Sue A.; Reynolds, Peggy; Bernstein, Leslie; Henderson, Katherine D.
2012-01-01
Objective Although physical activity modulates the hypothalamic-ovarian-pituitary axis, the few studies investigating whether physical activity is associated with age at natural menopause have had mixed results. We set out to determine whether physical activity is associated with the timing of natural menopause in a large cohort of California women, overall, and by smoking history. Methods We investigated the association between long-term physical activity (hours/week/year) and age at natural menopause among 97,945 women in the California Teachers Study. Multivariable Cox proportional hazards regression methods were used to calculate hazard ratios (HRs) and 95% confidence intervals (CIs). The impact of cigarette smoking (never smoker, former-light smoker, former-heavy smoker, current-light smoker, current-heavy smoker) as an effect modifier was evaluated. Results In a multivariable model adjusting for body mass index at age 18, age at menarche, race/ethnicity, and age at first full-term pregnancy, increased physical activity was statistically significantly associated with older age at natural menopause (p-trend = 0.005). Higher body mass index at age 18 (p-trend = 0.0003) and older age at menarche (p-trend = 0.0003) were also associated with older age at natural menopause. Hispanic ethnicity (vs. non-Hispanic whites, HR 1.17, 95% CI 1.09–1.26), current smoking (vs. never smokers, HR 1.68, 95% CI 1.60–1.75 for current-light smokers; HR 1.38, 95% CI 1.33–1.44 for current-heavy smokers) and older age at first full-term pregnancy (age ≥29 with 2+ full-term pregnancies vs. age <29 with 2+ full-term pregnancies, HR 1.10, 95% CI 1.06–1.14) were associated with earlier age at natural menopause. Upon stratification by smoking history, increased physical activity was statistically significantly associated with older age at natural menopause among heavy smokers only (HR, highest vs. lowest quartile, 0.88, 95% CI 0.81–0.97, p-trend = 0.02 for former-heavy smokers; HR, highest vs. lowest quartile, 0.89, 95% CI 0.80–0.99, p-trend = 0.04 for current-heavy smokers). Conclusion Age at natural menopause is a complex trait; the determinants of age at natural menopause, including physical activity, may differ by smoking status. PMID:23435025
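A minimal sketch of fitting such a multivariable Cox model with the lifelines package; the bundled Rossi dataset stands in for the cohort, whose actual covariates (activity, BMI at age 18, age at menarche, and so on) would replace these columns.

```python
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

# Stand-in data shipped with lifelines: duration and event columns plus covariates
df = load_rossi()
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")
print(cph.hazard_ratios_)   # exponentiated coefficients, i.e., the HRs reported above
```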
Ennour-Idrissi, Kaoutar; Maunsell, Elizabeth; Diorio, Caroline
2015-11-05
Exposure to high levels of endogenous estrogens is a main risk factor for breast cancer in women, and in observational studies was found to be inversely associated with physical activity. The objective of the present study is to determine the effect of physical activity interventions on sex hormone levels in healthy women. Electronic databases (MEDLINE, EMBASE, CENTRAL), from inception to December 2014, and reference lists of relevant reviews and clinical trials were searched, with no language restrictions applied. Randomized controlled trials (RCTs) were included if they compared any type of exercise intervention to no intervention or other interventions, and assessed the effects on estrogens, androgens or the sex hormone binding globulin (SHBG) in cancer-free women. Following the method described in the Cochrane Handbook for Systematic Reviews of Interventions, data on populations, interventions, and outcomes were extracted, and combined using the inverse-variance method and a random-effects model. A pre-established protocol was drawn up, in which the primary outcome was the difference in circulating estradiol concentrations between the physical activity (experimental) and the control groups after intervention. Pre-specified subgroup analyses and sensitivity analysis according to the risk of bias were conducted. Data suitable for quantitative synthesis were available from 18 RCTs (1994 participants) for total estradiol and from 5 RCTs (1245 participants) for free estradiol. The overall effect of physical activity was a statistically significant decrease of both total estradiol (standardized mean difference [SMD] -0.12; 95% confidence interval [CI] -0.20 to -0.03; P = 0.01; I² = 0%) and free estradiol (SMD -0.20; 95% CI -0.31 to -0.09; P = 0.0005; I² = 0%). Subgroup analyses suggest that this effect is independent of menopausal status and is more noticeable for non-obese women and for high intensity exercise. Meta-analysis for secondary outcomes found that physical activity induces a statistically significant decline of free testosterone, androstenedione, dehydroepiandrosterone-sulfate and adiposity markers, while a significant increase of SHBG was observed. Although the effect is relatively modest, physical activity induces a decrease in circulating sex hormones and this effect is not entirely explained by weight loss. The findings emphasize the benefits of physical activity for women.
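The inverse-variance, random-effects combination named above is short to sketch; the DerSimonian-Laird estimate of between-study variance is a standard choice, and the study-level SMDs and standard errors below are placeholders, not the review's data.

```python
import numpy as np

# Placeholder study-level standardized mean differences and standard errors
smd = np.array([-0.10, -0.25, 0.05, -0.15])
se = np.array([0.08, 0.12, 0.10, 0.09])

w = 1 / se**2                                    # fixed-effect weights
mu_fe = np.sum(w * smd) / np.sum(w)
Q = np.sum(w * (smd - mu_fe)**2)                 # Cochran's heterogeneity statistic
k = len(smd)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))  # DL estimate

w_re = 1 / (se**2 + tau2)                        # random-effects weights
mu_re = np.sum(w_re * smd) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
ci = (mu_re - 1.96 * se_re, mu_re + 1.96 * se_re)
print(f"pooled SMD = {mu_re:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```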
Moulding techniques in lipstick manufacture: a comparative evaluation.
Dweck, A C; Burnham, C A
1980-06-01
Synopsis This paper examines two methods of lipstick bulk manufacture: one via a direct method and the other via stock concentrates. The paper continues with a comparison of two manufactured bulks moulded in three different ways - first by split moulding, secondly by Rotamoulding, and finally by Ejectoret moulding. Full consideration is paid to time, labour and cost standards of each approach and the resultant moulding examined using some novel physical testing methods. The results of these tests are statistically analysed. Finally, on the basis of the gathered data and photomicrographical work a theoretical lipstick structure is proposed by which the results may be explained.
Stability of knotted vortices in wave chaos
NASA Astrophysics Data System (ADS)
Taylor, Alexander; Dennis, Mark
Large scale tangles of disordered filaments occur in many diverse physical systems, from turbulent superfluids to optical volume speckle to liquid crystal phases. They can exhibit particular large scale random statistics despite very different local physics. We have previously used the topological statistics of knotting and linking to characterise the large scale tangling, using the vortices of three-dimensional wave chaos as a universal model system whose physical lengthscales are set only by the wavelength. Unlike geometrical quantities, the statistics of knotting depend strongly on the physical system and boundary conditions. Although knotting patterns characterise different systems, the topology of vortices is highly unstable to perturbation, under which they may reconnect with one another. In systems of constructed knots, these reconnections generally rapidly destroy the knot, but for vortex tangles the topological statistics must be stable. Using large scale simulations of chaotic eigenfunctions, we numerically investigate the prevalence and impact of reconnection events, and their effect on the topology of the tangle.
Statistical physics of the symmetric group.
Williams, Mobolaji
2017-04-01
Ordered chains (such as chains of amino acids) are ubiquitous in biological cells, and these chains perform specific functions contingent on the sequence of their components. Using the existence and general properties of such sequences as a theoretical motivation, we study the statistical physics of systems whose state space is defined by the possible permutations of an ordered list, i.e., the symmetric group, and whose energy is a function of how certain permutations deviate from some chosen correct ordering. Such a nonfactorizable state space is quite different from the state spaces typically considered in statistical physics systems and consequently has novel behavior in systems with interacting and even noninteracting Hamiltonians. Various parameter choices of a mean-field model reveal the system to contain five different physical regimes defined by two transition temperatures, a triple point, and a quadruple point. Finally, we conclude by discussing how the general analysis can be extended to state spaces with more complex combinatorial properties and to other standard questions of statistical mechanics models.
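A toy illustration of sampling such a state space: a Metropolis walk over permutations with an energy that counts elements out of their "correct" positions. The energy function and parameters are illustrative, not the paper's mean-field Hamiltonian.

```python
import numpy as np

def metropolis_permutations(n=20, beta=1.0, steps=100000, rng=np.random.default_rng(1)):
    sigma = rng.permutation(n)
    E = np.count_nonzero(sigma != np.arange(n))      # number of misplaced elements
    energies = []
    for _ in range(steps):
        i, j = rng.integers(n, size=2)
        sigma[i], sigma[j] = sigma[j], sigma[i]      # propose a transposition
        E_new = np.count_nonzero(sigma != np.arange(n))
        if rng.random() < np.exp(-beta * (E_new - E)):
            E = E_new                                 # accept the move
        else:
            sigma[i], sigma[j] = sigma[j], sigma[i]   # reject: swap back
        energies.append(E)
    return np.mean(energies[steps // 2:])             # crude thermal average

print(metropolis_permutations(beta=0.5))   # higher beta pushes toward the ordered state
```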
Leone, Lucia Andrea; Ward, Dianne S
2013-05-01
Obese women have lower levels of physical activity than nonobese women, but it is unclear what drives these differences. Mixed methods were used to understand why obese women have lower physical activity levels. Findings from focus groups with obese white women age 50 and older (N = 19) were used to develop psychosocial items for an online survey of white women (N = 195). After examining the relationship between weight group (obese vs. nonobese) and exercise attitudes, associated items (P < .05) were tested for potential mediation of the relationship between weight and physical activity. Obese women were less likely than nonobese women to report that they enjoy exercise (OR = 0.4, 95% CI 0.2-0.8) and were more likely to agree their weight makes exercise difficult (OR = 10.6, 95% CI 4.2-27.1), and they only exercise when trying to lose weight (OR = 3.8, 95% CI 1.6-8.9). Enjoyment and exercise for weight loss were statistically significant mediators of the relationship between weight and physical activity. Exercise interventions for obese women may be improved by focusing on exercise enjoyment and the benefits of exercise that are independent of weight loss.
Mapping sea ice leads with a coupled numeric/symbolic system
NASA Technical Reports Server (NTRS)
Key, J.; Schweiger, A. J.; Maslanik, J. A.
1990-01-01
A method is presented which facilitates the detection and delineation of leads with single-channel Landsat data by coupling numeric and symbolic procedures. The procedure consists of three steps: (1) using the dynamic threshold method, an image is mapped to a lead/no-lead binary image; (2) the likelihood that fragments are real leads is examined with a set of numeric rules; and (3) pairs of objects are examined geometrically and merged where possible. The processing ends when all fragments have been merged and statistical characteristics determined, leaving a map of valid lead objects that summarizes useful physical information about the lead complexes. Direct implementation of domain knowledge and rapid prototyping are two benefits of the rule-based system. The approach is found to be more successfully applied to mid- and high-level processing, and the system can retrieve statistics about sea-ice leads as well as detect the leads.
A Method for Retrieving Ground Flash Fraction from Satellite Lightning Imager Data
NASA Technical Reports Server (NTRS)
Koshak, William J.
2009-01-01
A general theory for retrieving the fraction of ground flashes in a set of N lightning flashes observed by a satellite-based lightning imager is provided. An "exponential model" is applied as a physically reasonable constraint to describe the measured optical parameter distributions, and population statistics (i.e., mean, variance) are invoked to add further constraints to the retrieval process. The retrieval itself is expressed in terms of a Bayesian inference, and the Maximum A Posteriori (MAP) solution is obtained. The approach is tested by performing simulated retrievals, and retrieval error statistics are provided. The ability to retrieve ground flash fraction has important benefits for the atmospheric chemistry community. For example, using the method to partition the existing satellite global lightning climatology into separate ground and cloud flash climatologies will improve estimates of lightning nitrogen oxides (NOx) production; this in turn will improve both regional air quality and global chemistry/climate model predictions.
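The flavor of the retrieval can be sketched with a two-component exponential mixture, where the ground flash fraction is estimated by maximizing the posterior (with a flat prior, MAP reduces to maximum likelihood); the scale parameters are hypothetical, not the paper's calibrated optical distributions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

mu_g, mu_c = 1.0, 3.0          # assumed mean optical amplitudes (ground, cloud)
rng = np.random.default_rng(0)
x = np.concatenate([rng.exponential(mu_g, 300),    # synthetic "ground" flashes
                    rng.exponential(mu_c, 700)])   # synthetic "cloud" flashes

def neg_log_post(alpha):
    # Mixture likelihood; flat prior on alpha, so MAP = maximum likelihood
    like = alpha * np.exp(-x / mu_g) / mu_g + (1 - alpha) * np.exp(-x / mu_c) / mu_c
    return -np.sum(np.log(like))

res = minimize_scalar(neg_log_post, bounds=(1e-3, 1 - 1e-3), method="bounded")
print(res.x)   # estimated ground flash fraction, near the true 0.3
```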
Unbiased estimation of oceanic mean rainfall from satellite borne radiometer measurements
NASA Technical Reports Server (NTRS)
Mittal, M. C.
1981-01-01
The statistical properties of the radar-derived rainfall obtained during the GARP Atlantic Tropical Experiment (GATE) are used to derive quantitative estimates of the spatial and temporal sampling errors associated with estimating rainfall from brightness temperature measurements such as would be obtained from a satellite-borne microwave radiometer employing a practical-size antenna aperture. A basis for a method of correcting the so-called beam-filling problem, i.e., the effect of nonuniformity of rainfall over the radiometer beamwidth, is provided. The method presented employs the statistical properties of the observations themselves, without need for physical assumptions beyond those associated with the radiative transfer model. The simulation results presented offer a validation of the estimated accuracy that can be achieved, and the graphs included permit evaluation of the effect of the antenna resolution on both the temporal and spatial sampling errors.
NASA Astrophysics Data System (ADS)
Chao, Zenas C.; Bakkum, Douglas J.; Potter, Steve M.
2007-09-01
Electrically interfaced cortical networks cultured in vitro can be used as a model for studying the network mechanisms of learning and memory. Lasting changes in functional connectivity have been difficult to detect with extracellular multi-electrode arrays using standard firing rate statistics. We used both simulated and living networks to compare the ability of various statistics to quantify functional plasticity at the network level. Using a simulated integrate-and-fire neural network, we compared five established statistical methods to one of our own design, called center of activity trajectory (CAT). CAT, which depicts dynamics of the location-weighted average of spatiotemporal patterns of action potentials across the physical space of the neuronal circuitry, was the most sensitive statistic for detecting tetanus-induced plasticity in both simulated and living networks. By reducing the dimensionality of multi-unit data while still including spatial information, CAT allows efficient real-time computation of spatiotemporal activity patterns. Thus, CAT will be useful for studies in vivo or in vitro in which the locations of recording sites on multi-electrode probes are important.
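A minimal sketch of the CAT statistic: the location-weighted average of spiking across the array, tracked over time bins. The 8x8 electrode geometry and bin width are illustrative assumptions.

```python
import numpy as np

# Hypothetical 8x8 multi-electrode array: (x, y) position of each electrode
positions = np.array([(i, j) for i in range(8) for j in range(8)], dtype=float)

def cat_trajectory(spike_counts):
    """spike_counts: (n_bins, n_electrodes) array of spikes per time bin."""
    totals = spike_counts.sum(axis=1, keepdims=True)
    totals[totals == 0] = 1                       # avoid division by zero in silent bins
    return spike_counts @ positions / totals      # (n_bins, 2) center-of-activity path

counts = np.random.poisson(0.5, size=(100, 64))   # hypothetical binned spike counts
traj = cat_trajectory(counts)                     # 100 (x, y) points over time
```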
NASA Astrophysics Data System (ADS)
Guillen, George; Rainey, Gail; Morin, Michelle
2004-04-01
Currently, the Minerals Management Service uses the Oil Spill Risk Analysis model (OSRAM) to predict the movement of potential oil spills greater than 1000 bbl originating from offshore oil and gas facilities. OSRAM generates oil spill trajectories using meteorological and hydrological data input from either actual physical measurements or estimates generated by other hydrological models. OSRAM and many other models produce output matrices of average, maximum and minimum contact probabilities from oil spills at specific points (rows) to specific landfall or target segments (columns). Analysts and managers are often interested in identifying geographic areas or groups of facilities that pose similar risks to specific targets or groups of targets if a spill occurred. Unfortunately, due to the potentially large matrices generated by many spill models, this question is difficult to answer without the use of data reduction and visualization methods. In our study we utilized a multivariate statistical method called cluster analysis to group areas of similar risk based on the potential distribution of landfall target trajectory probabilities. We also utilized ArcView™ GIS to display spill launch point groupings. The combination of GIS and multivariate statistical techniques in the post-processing of trajectory model output is a powerful tool for identifying and delineating areas of similar risk from multiple spill sources. We strongly encourage modelers and statistical and GIS software programmers to collaborate closely to produce a more seamless integration of these technologies and approaches to analyzing data. They are complementary methods that strengthen the overall assessment of spill risks.
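A minimal sketch of the clustering step using SciPy, assuming the trajectory model's output is a launch-points-by-targets probability matrix; the linkage method and number of groups are illustrative choices.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

P = np.random.rand(50, 20)          # placeholder: 50 launch points x 20 landfall targets
Z = linkage(P, method="ward")       # agglomerative clustering of launch points
groups = fcluster(Z, t=5, criterion="maxclust")   # e.g., 5 similar-risk groups

# 'groups' assigns each launch point a cluster label, ready to join to GIS layers
print(np.bincount(groups))
```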
Statistical Analyses of Raw Material Data for MTM45-1/CF7442A-36% RW: CMH Cure Cycle
NASA Technical Reports Server (NTRS)
Coroneos, Rula; Pai, Shantaram, S.; Murthy, Pappu
2013-01-01
This report describes the statistical characterization of physical properties of the composite material system MTM45-1/CF7442A, which has been tested and is currently being considered for use on spacecraft structures. This composite system is made of 6K plain-weave graphite fibers in a highly toughened resin system. This report summarizes the distribution types and statistical details of the tests and the conditions under which the experimental data were generated. These distributions will be used in multivariate regression analyses to help determine material and design allowables for similar material systems and to establish a procedure for other material systems. Additionally, these distributions will be used in future probabilistic analyses of spacecraft structures. The specific properties characterized are the ultimate strength, modulus, and Poisson's ratio, using a commercially available statistical package. Results are displayed using graphical and semigraphical methods and are included in the accompanying appendixes.
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2001-01-01
The Independent Component Analysis is a recently developed technique for component extraction. This new method requires the statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has been used recently for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e. exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, the Independent Component Analysis performs a rotation of the classical PCA (or EOF) solution. This rotation uses no localization criterion like other Rotation Techniques (RT), only the global generalization of decorrelation by statistical independence is used. This rotation of the PCA solution seems to be able to solve the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.
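The PCA-then-ICA procedure is easy to sketch with scikit-learn: ICA applied to whitened principal components acts as a rotation of the PCA/EOF solution toward statistical independence. The synthetic sources and component counts are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.RandomState(0)
t = np.linspace(0, 8, 2000)
# Three non-Gaussian sources, linearly mixed into six observed "channels"
S = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t)), rng.laplace(size=2000)]
X = S @ rng.rand(3, 6)

pcs = PCA(n_components=3, whiten=True).fit_transform(X)   # PCA/EOF step (decorrelation)
ica = FastICA(n_components=3, random_state=0)
sources = ica.fit_transform(pcs)   # rotation of the PCA solution toward independence
```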
Dong, Skye T; Costa, Daniel S J; Butow, Phyllis N; Lovell, Melanie R; Agar, Meera; Velikova, Galina; Teckle, Paulos; Tong, Allison; Tebbutt, Niall C; Clarke, Stephen J; van der Hoek, Kim; King, Madeleine T; Fayers, Peter M
2016-01-01
Symptom clusters in advanced cancer can influence patient outcomes. There is large heterogeneity in the methods used to identify symptom clusters. To investigate the consistency of symptom cluster composition in advanced cancer patients using different statistical methodologies for all patients across five primary cancer sites, and to examine which clusters predict functional status, a global assessment of health and global quality of life. Principal component analysis and exploratory factor analysis (with different rotation and factor selection methods) and hierarchical cluster analysis (with different linkage and similarity measures) were used on a data set of 1562 advanced cancer patients who completed the European Organization for the Research and Treatment of Cancer Quality of Life Questionnaire-Core 30. Four clusters consistently formed for many of the methods and cancer sites: tense-worry-irritable-depressed (emotional cluster), fatigue-pain, nausea-vomiting, and concentration-memory (cognitive cluster). The emotional cluster was a stronger predictor of overall quality of life than the other clusters. Fatigue-pain was a stronger predictor of overall health than the other clusters. The cognitive cluster and fatigue-pain predicted physical functioning, role functioning, and social functioning. The four identified symptom clusters were consistent across statistical methods and cancer types, although there were some noteworthy differences. Statistical derivation of symptom clusters is in need of greater methodological guidance. A psychosocial pathway in the management of symptom clusters may improve quality of life. Biological mechanisms underpinning symptom clusters need to be delineated by future research. A framework for evidence-based screening, assessment, treatment, and follow-up of symptom clusters in advanced cancer is essential. Copyright © 2016 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
Mukhopadhyay, Nitai D; Sampson, Andrew J; Deniz, Daniel; Alm Carlsson, Gudrun; Williamson, Jeffrey; Malusek, Alexandr
2012-01-01
Correlated sampling Monte Carlo methods can shorten computing times in brachytherapy treatment planning. Monte Carlo efficiency is typically estimated via the efficiency gain, defined as the reduction in computing time achieved by correlated sampling relative to conventional Monte Carlo methods when equal statistical uncertainties have been achieved. The determination of the efficiency gain uncertainty arising from random effects, however, is not a straightforward task, especially when the error distribution is non-normal. The purpose of this study is to evaluate the applicability of the F distribution and standardized uncertainty propagation methods (widely used in metrology to estimate the uncertainty of physical measurements) for predicting confidence intervals about efficiency gain estimates derived from single Monte Carlo runs using fixed-collision correlated sampling in a simplified brachytherapy geometry. A bootstrap-based algorithm was used to simulate the probability distribution of the efficiency gain estimates, and the shortest 95% confidence interval was estimated from this distribution. It was found that the corresponding relative uncertainty was as large as 37% for this particular problem. The uncertainty propagation framework predicted confidence intervals reasonably well; however, its main disadvantage was that uncertainties of input quantities had to be calculated in a separate run via a Monte Carlo method. The F distribution noticeably underestimated the confidence interval. These discrepancies were influenced by several photons with large statistical weights which made extremely large contributions to the scored absorbed dose difference. The mechanism by which high statistical weights are acquired in the fixed-collision correlated sampling method is explained, and a mitigation strategy is proposed. Copyright © 2011 Elsevier Ltd. All rights reserved.
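A minimal sketch of a bootstrap confidence interval for an efficiency-gain estimate, assuming synthetic per-history tallies and an illustrative variance-ratio definition of gain; the paper's fixed-collision tallies and shortest-interval construction are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
scores_corr = rng.normal(1.0, 0.2, 10000)     # correlated-sampling tallies (synthetic)
scores_conv = rng.normal(1.0, 0.6, 10000)     # conventional MC tallies (synthetic)

def gain(a, b):
    # Illustrative definition: variance ratio at equal computational cost
    return np.var(b) / np.var(a)

boot = [gain(rng.choice(scores_corr, 10000), rng.choice(scores_conv, 10000))
        for _ in range(2000)]                 # resample with replacement
lo, hi = np.percentile(boot, [2.5, 97.5])     # 95% percentile interval
print(f"gain = {gain(scores_corr, scores_conv):.1f}, 95% CI = ({lo:.1f}, {hi:.1f})")
```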
Direct access: factors that affect physical therapist practice in the state of Ohio.
McCallum, Christine A; DiAngelis, Tom
2012-05-01
Direct access to physical therapist services is permitted by law in the majority of states and across all practice settings. Ohio enacted such legislation in 2004; however, it was unknown how direct access had affected actual clinical practice. The purpose of this study was to describe physical therapist and physical therapist practice environment factors that affect direct access practice. A 2-phase, mixed-method descriptive study was conducted. In the first phase, focus group interviews with 32 purposively selected physical therapists were completed, which resulted in 8 themes for an electronically distributed questionnaire. In the second phase, survey questionnaires were distributed to physical therapists with an e-mail address on file with the Ohio licensing board. An adjusted return rate of 23% was achieved. Data were analyzed for descriptive statistics. A constant comparative method assessed open-ended questions for common themes and patterns. Thirty-one percent of the respondents reported using direct access in physical therapist practice; however, 80% reported they would practice direct access if provided the opportunity. Physical therapists who practiced direct access were more likely to be in practice 6 years or more and hold advanced degrees beyond the entry level, were American Physical Therapy Association members, and had supportive management and organizational practice policies. The direct access physical therapist practice was generally a locally owned suburban private practice or a school-based clinic that saw approximately 6% to 10% of its patients by direct access. The majority of patients treated were adults with musculoskeletal or neuromuscular impairments. Nonresponse from e-mail may be associated with sample frame bias. Implementation of a direct access physical therapist practice model is evident in Ohio. Factors related to reimbursement and organizational policy appear to impede the process.
Johnson, Clifford L; Dohrmann, Sylvia M; Kerckove, Van de; Diallo, Mamadou S; Clark, Jason; Mohadjer, Leyla K; Burt, Vicki L
2014-11-01
The National Health and Nutrition Examination Survey's (NHANES) National Youth Fitness Survey (NNYFS) was conducted in 2012 by the Centers for Disease Control and Prevention's National Center for Health Statistics (NCHS). NNYFS collected data on physical activity and fitness levels to evaluate the health and fitness of children aged 3-15 in the United States. The survey comprised three levels of data collection: a household screening interview (or screener), an in-home personal interview, and a physical examination. The screener's primary objective was to determine whether any children in the household were eligible for the interview and examination. Eligibility was determined by preset selection probabilities for desired sex-age subdomains. After selection, the in-home personal interview collected demographic, health, physical activity, and nutrition information about the child as well as information about the household. The examination included physical measurements and fitness tests. This report provides background on the NNYFS program and summarizes the survey's sample design specifications. The report presents NNYFS estimation procedures, including the methods used to calculate survey weights for the full sample as well as a combined NHANES/NNYFS sample for 2012 (accessible only through the NCHS Research Data Center). The report also describes appropriate variance estimation methods. Documentation of the sample selection methods, survey content, data collection procedures, and methods to assess nonsampling errors are reported elsewhere. All material appearing in this report is in the public domain and may be reproduced or copied without permission; citation as to source, however, is appreciated.
Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2005-01-01
A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information-theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.
NASA Astrophysics Data System (ADS)
Vinh, T.
1980-08-01
There is a need for better and more effective lightning protection for transmission and switching substations. In the past, a number of empirical methods were utilized to design systems to protect substations and transmission lines from direct lightning strokes. The need exists for convenient analytical lightning models adequate for engineering use. In this study, analytical lightning models were developed, along with a method for improved analysis of the physical properties of lightning through their use. This method of analysis is based upon the most recent statistical field data. The result is an improved method for predicting the occurrence of shielding failure and for designing more effective protection of high and extra-high voltage substations from direct strokes.
The effects of estimation of censoring, truncation, transformation and partial data vectors
NASA Technical Reports Server (NTRS)
Hartley, H. O.; Smith, W. B.
1972-01-01
The purpose of this research was to address statistical problems concerning the estimation of distributions for purposes of predicting and measuring assembly performance as it appears in biological and physical situations. Various statistical procedures were proposed to attack problems of this sort, that is, to produce the statistical distributions of the outcomes of biological and physical situations which employ characteristics measured on constituent parts. The techniques are described.
Statistical Physics for Adaptive Distributed Control
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2005-01-01
A viewgraph presentation on statistical physics for distributed adaptive control is shown. The topics include: 1) The Golden Rule; 2) Advantages; 3) Roadmap; 4) What is Distributed Control? 5) Review of Information Theory; 6) Iterative Distributed Control; 7) Minimizing L(q) Via Gradient Descent; and 8) Adaptive Distributed Control.
NASA Astrophysics Data System (ADS)
Song, Wanjuan; Mu, Xihan; Ruan, Gaiyan; Gao, Zhan; Li, Linyuan; Yan, Guangjian
2017-06-01
Normalized difference vegetation index (NDVI) of highly dense vegetation (NDVIv) and bare soil (NDVIs), identified as the key parameters for Fractional Vegetation Cover (FVC) estimation, are usually obtained with empirical statistical methods. However, it is often difficult to obtain reasonable values of NDVIv and NDVIs at a coarse resolution (e.g., 1 km), or in arid, semiarid, and evergreen areas. The uncertainty of estimated NDVIs and NDVIv can cause substantial errors in FVC estimations when a simple linear mixture model is used. To address this problem, this paper proposes a physically based method. The leaf area index (LAI) and directional NDVI are introduced in a gap fraction model and a linear mixture model for FVC estimation to calculate NDVIv and NDVIs. The model incorporates the Moderate Resolution Imaging Spectroradiometer (MODIS) Bidirectional Reflectance Distribution Function (BRDF) model parameters product (MCD43B1) and LAI product, which are convenient to acquire. Two types of evaluation experiments are designed: 1) with data simulated by a canopy radiative transfer model and 2) with satellite observations. The root-mean-square deviation (RMSD) for simulated data is less than 0.117, depending on the type of noise added to the data. In the real data experiment, the RMSD is 0.127 for cropland, 0.075 for grassland, and 0.107 for forest. The experimental areas respectively lack fully vegetated and non-vegetated pixels at 1 km resolution. Consequently, a relatively large uncertainty is found when the statistical methods are used, with RMSD ranging from 0.110 to 0.363 on the real data. The proposed method is convenient for producing NDVIv and NDVIs maps for FVC estimation on regional and global scales.
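The linear mixture step that the derived endmembers feed into is compact enough to state in code. A minimal sketch, with illustrative endmember values only (the paper's point is precisely that NDVIv and NDVIs should come from the gap-fraction/BRDF modelling rather than from fixed numbers); the function name fvc_linear_mixture is hypothetical.

```python
import numpy as np

def fvc_linear_mixture(ndvi, ndvi_s, ndvi_v):
    """Fractional vegetation cover from the linear NDVI mixture model:
    FVC = (NDVI - NDVI_s) / (NDVI_v - NDVI_s), clipped to [0, 1]."""
    return np.clip((ndvi - ndvi_s) / (ndvi_v - ndvi_s), 0.0, 1.0)

# Illustrative endmembers only; the paper derives NDVI_v and NDVI_s per pixel
# from the MODIS BRDF parameters and LAI products instead of fixing them.
print(fvc_linear_mixture(np.array([0.15, 0.45, 0.80]), ndvi_s=0.10, ndvi_v=0.85))
```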
Application of statistical mechanical methods to the modeling of social networks
NASA Astrophysics Data System (ADS)
Strathman, Anthony Robert
With the recent availability of large-scale social data sets, social networks have become open to quantitative analysis via the methods of statistical physics. We examine the statistical properties of a real large-scale social network, generated from cellular phone call-trace logs. We find this network, like many other social networks, to be assortative (r = 0.31) and clustered (i.e., strongly transitive, C = 0.21). We measure fluctuation scaling to identify the presence of internal structure in the network and find that structural inhomogeneity effectively disappears at the scale of a few hundred nodes, though there is no sharp cutoff. We introduce an agent-based model of social behavior, designed to model the formation and dissolution of social ties. The model is a modified Metropolis algorithm containing agents operating under the basic sociological constraints of reciprocity, communication need, and transitivity, and it introduces the concept of a social temperature. We show that this simple model reproduces the global statistical network features (including assortativity, connected fraction, mean degree, clustering, and mean shortest path length) of the real network data and undergoes two phase transitions as a function of this social temperature: one from a "gas" to a "liquid" state, and a second from a liquid to a glassy state.
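A hedged sketch of the core update such an agent-based model rests on: the standard Metropolis acceptance rule, with the "social temperature" in place of a physical one. The strain function delta_strain and its interpretation are illustrative assumptions, not the dissertation's actual specification.

```python
import math
import random

def accept_tie_change(delta_strain: float, social_temperature: float) -> bool:
    """Metropolis rule: tie changes that lower social 'strain' (e.g., restore
    reciprocity or close triangles) are always accepted; unfavourable changes
    are accepted with Boltzmann probability, so a higher social temperature
    means more volatile tie formation and dissolution."""
    if delta_strain <= 0:
        return True
    return random.random() < math.exp(-delta_strain / social_temperature)
```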
Automatic location of L/H transition times for physical studies with a large statistical basis
NASA Astrophysics Data System (ADS)
González, S.; Vega, J.; Murari, A.; Pereira, A.; Dormido-Canto, S.; Ramírez, J. M.; contributors, JET-EFDA
2012-06-01
Completely automatic techniques to estimate and validate L/H transition times can be essential in L/H transition analyses. The generation of databases with hundreds of transition times and without human intervention is an important step toward (a) L/H transition physics analysis, (b) validation of L/H theoretical models, and (c) creation of L/H scaling laws. An entirely unattended methodology is presented in this paper to build large databases of transition times in JET using time series. The proposed technique has been applied to a dataset of 551 JET discharges between campaigns C21 and C26. For discharges that show a clear signature in the time series, transition times are located through the locating properties of the wavelet transform; the prediction is accurate, with an uncertainty interval of ±3.2 ms. Discharges without a clear pattern in the time series are handled by an L/H mode classifier built from the discharges with a clear signature; in this case, the estimation error shows a distribution with mean 27.9 ms and standard deviation 37.62 ms. Two different regression methods have been applied to the measurements acquired at the transition times identified by the automatic system. The resulting scaling laws for the threshold power are not significantly different from those obtained using the data at the transition times determined manually by experts. The automatic methods allow physical studies with a large number of discharges, showing, for example, that there are statistically different types of transitions characterized by different scaling laws.
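The paper locates transitions via the wavelet transform; as a rough stand-in, the sketch below uses a derivative-of-Gaussian filter (itself a valid wavelet) to flag the sharpest change in a 1-D diagnostic signal. The synthetic signal and the sigma value are assumptions for illustration, not the paper's pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def locate_transition(signal: np.ndarray, sigma: float = 5.0) -> int:
    """Return the sample index where the magnitude of the Gaussian-smoothed
    first derivative peaks, taken as the candidate transition time index."""
    detail = gaussian_filter1d(signal, sigma=sigma, order=1)
    return int(np.argmax(np.abs(detail)))

# Synthetic stand-in for a JET time series: a noisy step near sample 700.
rng = np.random.default_rng(0)
t = np.arange(1000)
x = np.where(t < 700, 1.0, 3.0) + 0.1 * rng.standard_normal(t.size)
print(locate_transition(x))  # ~700
```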
Comparative study on perceived abuse and social neglect among rural and urban geriatric population
Kaur, Jaspreet; Kaur, Jasbir; Sujata, N.
2015-01-01
Context: Elder abuse and social neglect are under-recognized problems. Many forms of elder abuse exist, including physical, psychological, financial, and sexual abuse, as well as social neglect, which is experienced by the elderly through the loss of friends and family members. Aim: To compare perceived abuse and social neglect among the elderly residing in selected rural and urban areas. Settings and Design: The study settings were the rural area of Pohir and the urban area of Jamalpur in Ludhiana district. Subjects and Methods: A sample of 200 subjects (100 each from the rural and urban areas) aged 60 years and above was drawn by cluster sampling, and data were collected by interview using a Likert scale. Statistical Analysis: Descriptive and inferential statistics were carried out with the SPSS package. Results: Perceived physical abuse (25%) was higher among the elderly residing in the rural area, and it was significantly higher among female elderly who were illiterate, widowed, and partially dependent on a caregiver, whereas perceived psychological abuse (71%), financial abuse (37%), and social neglect (74%) were higher among the elderly residing in the urban area. A significant association was found between psychological abuse and educational status, which suggests that as the level of education increases, the perception of psychological abuse also increases. Perceived financial abuse was significantly higher among male elderly who were financially independent. Conclusion: Social neglect was most common, followed by psychological abuse and financial abuse, among the elderly residing in the urban area, whereas physical abuse was more prevalent among the elderly residing in the rural area. PMID:26816425
Higgins, Agnes; Sharek, Danika; Nolan, Maeve; Sheerin, Barbara; Flanagan, Paul; Slaicuinaite, Sniguole; Mc Donnell, Sinead; Walsh, Heather
2012-11-01
To report a study evaluating the effectiveness of a 1-day interdisciplinary sexuality education programme for staff working with people with acquired physical disability. Changes associated with an acquired physical disability can diminish a person's self-esteem, sense of attractiveness, relationships, and sexual functioning. Research suggests that people are dissatisfied with the quality of information and support around sexuality during their rehabilitation. A mixed methods design was used, involving pretest and posttest questionnaires and interviews. Questionnaire data were analysed using descriptive statistics and paired samples t-tests to evaluate the effects of the programme on knowledge, skills, and comfort. Interview data were analysed thematically, with particular emphasis on participants' opinions about the application of the course within practice. Participants were working in the area of acquired disability and rehabilitation, and were drawn from a number of disciplines. Data were collected between 2008-2009. Comparison of the pre- and post-measures, based on paired samples t-tests, showed that the programme produced statistically significant increases in participants' knowledge, skills, and comfort. Participants felt positive and enthusiastic about the programme and reported numerous incidents where they were more willing to raise issues for discussion and create a supportive listening space for patients to talk about their concerns around sexuality. Providing healthcare practitioners with a 1-day programme leads to positive changes in knowledge, skills, and comfort towards sexuality. Sexuality education may be an ideal topic for bringing practitioners together within an interdisciplinary education context. © 2012 Blackwell Publishing Ltd.
Chagpar, Anees B.; Middleton, Lavinia P.; Sahin, Aysegul A.; Dempsey, Peter; Buzdar, Aman U.; Mirza, Attiqa N.; Ames, Fredrick C.; Babiera, Gildy V.; Feig, Barry W.; Hunt, Kelly K.; Kuerer, Henry M.; Meric-Bernstam, Funda; Ross, Merrick I.; Singletary, S Eva
2006-01-01
Objective: To assess the accuracy of physical examination, ultrasonography, and mammography in predicting residual size of breast tumors following neoadjuvant chemotherapy. Background: Neoadjuvant chemotherapy is an accepted part of the management of stage II and III breast cancer. Accurate prediction of residual pathologic tumor size after neoadjuvant chemotherapy is critical in guiding surgical therapy. Although physical examination, ultrasonography, and mammography have all been used to predict residual tumor size, there have been conflicting reports about the accuracy of these methods in the neoadjuvant setting. Methods: We reviewed the records of 189 patients who participated in 1 of 2 protocols using doxorubicin-containing neoadjuvant chemotherapy, and who had assessment by physical examination, ultrasonography, and/or mammography no more than 60 days before their surgical resection. Size correlations were performed using Spearman rho analysis. Clinical and pathologic measurements were also compared categorically using the weighted kappa statistic. Results: Size estimates by physical examination, ultrasonography, and mammography were only moderately correlated with residual pathologic tumor size after neoadjuvant chemotherapy (correlation coefficients: 0.42, 0.42, and 0.41, respectively), with an accuracy of ±1 cm in 66% of patients by physical examination, 75% by ultrasonography, and 70% by mammography. Kappa values (0.24–0.35) indicated poor agreement between clinical and pathologic measurements. Conclusion: Physical examination, ultrasonography, and mammography were only moderately useful for predicting residual pathologic tumor size after neoadjuvant chemotherapy. PMID:16432360
Liu, C K; Leng, X; Hsu, F-C; Kritchevsky, S B; Ding, J; Earnest, C P; Ferrucci, L; Goodpaster, B H; Guralnik, J M; Lenchik, L; Pahor, M; Fielding, R A
2014-01-01
To determine if sarcopenia modulates the response to a physical activity intervention in functionally limited older adults. Secondary analysis of a randomized controlled trial. Three academic centers. Elders aged 70 to 89 years at risk for mobility disability who underwent dual-energy x-ray absorptiometry (DXA) for body composition at enrollment and follow-up at twelve months (N = 177). Subjects participated in a physical activity program (PA) featuring aerobic, strength, balance, and flexibility training, or a successful aging (SA) educational program about healthy aging. Sarcopenia was determined by measuring appendicular lean mass and adjusting for height and total body fat mass (residuals method); outcomes were the Short Physical Performance Battery score (SPPB) and gait speed determined on a 400-meter course. At twelve months, sarcopenic and non-sarcopenic subjects in PA tended to have higher mean SPPB scores (8.7±0.5 and 8.7±0.2 points) compared to sarcopenic and non-sarcopenic subjects in SA (8.3±0.5 and 8.4±0.2 points, p = 0.24 and 0.10), although the differences were not statistically significant. At twelve months, faster mean gait speeds were observed in PA: 0.93±0.4 and 0.95±0.03 meters/second in sarcopenic and non-sarcopenic PA subjects, versus 0.89±0.4 and 0.91±0.03 meters/second in sarcopenic and non-sarcopenic SA subjects (p = 0.98 and 0.26), although again not statistically significant. There was no difference between the sarcopenic and non-sarcopenic groups in intervention adherence or number of adverse events. These data suggest that older adults with sarcopenia, who represent a vulnerable segment of the elder population, are capable of improvements in physical performance after a physical activity intervention.
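The "residuals method" named above has a simple computational core: regress appendicular lean mass on height and total fat mass and examine the residuals. A minimal sketch on synthetic data; the 20th-percentile cut-off is an illustrative assumption, not the study's threshold.

```python
import numpy as np

def alm_residuals(alm, height, fat_mass):
    """Residuals method: regress appendicular lean mass (ALM) on height and
    total body fat mass, and return the residuals; strongly negative
    residuals indicate less lean mass than expected for body size."""
    X = np.column_stack([np.ones_like(height), height, fat_mass])
    beta, *_ = np.linalg.lstsq(X, alm, rcond=None)
    return alm - X @ beta

rng = np.random.default_rng(1)
height = rng.normal(1.70, 0.10, 200)              # meters
fat = rng.normal(25.0, 6.0, 200)                  # kg
alm = 10 * height + 0.1 * fat + rng.normal(0, 1.5, 200)
resid = alm_residuals(alm, height, fat)
sarcopenic = resid < np.percentile(resid, 20)     # illustrative cut-off
```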
Physical therapy treatments for low back pain in children and adolescents: a meta-analysis
2013-01-01
Background Low back pain (LBP) in adolescents is associated with LBP in later years. In recent years treatments have been administered to adolescents for LBP, but it is not known which physical therapy treatment is the most efficacious. By means of a meta-analysis, the current study investigated the effectiveness of physical therapy treatments for LBP in children and adolescents. Methods Studies in English, Spanish, French, Italian and Portuguese, carried out by March 2011, were selected by electronic and manual search. Two independent researchers coded the moderator variables of the studies and performed the effect size calculations. The mean effect size index used was the standardized mean change between the pretest and posttest, and it was applied separately for each combination of outcome measure (pain, disability, flexibility, endurance and mental health) and measurement type (self-reports and clinician assessments). Results Eight articles that met the selection criteria enabled us to define 11 treatment groups and 5 control groups using the group as the unit of analysis. The 16 groups involved a total sample of 334 subjects at the posttest (221 in the treatment groups and 113 in the control groups). For all outcome measures, the average effect size of the treatment groups was statistically and clinically significant, whereas the control groups had negative average effect sizes that were not statistically significant. Conclusions Of all the physical therapy treatments for LBP in children and adolescents, the combination of therapeutic physical conditioning and manual therapy is the most effective. The low number of studies and control groups, and the methodological limitations in this meta-analysis, prevent us from drawing definitive conclusions in relation to the efficacy of physical therapy treatments for LBP. PMID:23374375
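The effect size index used here, the standardized mean change between pretest and posttest, reduces to a one-line computation. A sketch of the simplest (uncorrected) variant; meta-analyses typically also apply a small-sample bias correction, which is omitted here, and the pain scores are invented for illustration.

```python
import numpy as np

def standardized_mean_change(pre: np.ndarray, post: np.ndarray) -> float:
    """Pre-to-post mean difference scaled by the pretest standard deviation
    (one common definition of the standardized mean change)."""
    return (post.mean() - pre.mean()) / pre.std(ddof=1)

pain_pre = np.array([6.0, 5.5, 7.0, 6.5, 5.0])
pain_post = np.array([4.0, 4.5, 5.0, 4.0, 3.5])
print(standardized_mean_change(pain_pre, pain_post))  # negative = improvement
```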
Rotter, Iwona; Kotwas, Artur; Kemicer-Chmielewska, Ewa; Watral, Aleksandra
Violence among adolescents is one of the most serious problems, and has significantly increased in recent years. Studies conducted in 2011 on aggression and violence in schools reported that the most widespread form of offence is verbal aggression: as many as 63% of students had experience of being ridiculed, humiliated or offended, and 33% of students suffered from physical aggression (PA). The aim of the study was to evaluate the relationship between physical activity and the incidence of aggressive behaviour in adolescents of lower-secondary school age. The study was conducted in autumn 2013, in West Pomerania province in Poland, among 807 students of a lower-secondary school. A diagnostic survey method with a standardized questionnaire, the Aggression Questionnaire by H. Buss and M. Perry of 1992 (Amity version), was used. Statistical analysis was performed in Statistica PL version 10 using the Kruskal–Wallis test and the Mann–Whitney U test with a significance level of p ≤ 0.05. The Kruskal–Wallis test showed a statistically significant difference between the groups surveyed only in relation to the frequency of attendance at additional sports activities and the level of PA (p = 0.02). Subjects who engage in team sports show a higher tendency for physical aggression than those engaged in individual sports. Taking into account the higher level of PA in adolescents performing sport 3–4 times a week compared to physically inactive and very active adolescents (more than 5 times a week), it may be presumed that sport attracts people with high physical aggression tendencies, and that only strong involvement in sport reduces the level of aggression. There is a need for research on the personality of young athletes, which would give credence to this thesis.
Li, Weikai; Wang, Zhengxia; Zhang, Limei; Qiao, Lishan; Shen, Dinggang
2017-01-01
Functional brain networks (FBNs) have become an increasingly important way to model the statistical dependence among neural time courses of the brain, and they provide effective imaging biomarkers for the diagnosis of some neurological or psychological disorders. Currently, Pearson's Correlation (PC) is the simplest and most widely used method for constructing FBNs. Despite its advantages in statistical meaning and computational performance, PC tends to result in an FBN with dense connections. Therefore, in practice, the PC-based FBN needs to be sparsified by removing weak (potentially noisy) connections. However, such a scheme depends on a hard threshold without enough flexibility. Departing from this traditional strategy, in this paper we propose a new approach for estimating FBNs by remodeling PC as an optimization problem, which provides a way to incorporate biological/physical priors into the FBNs. In particular, we introduce an L1-norm regularizer into the optimization model to obtain a sparse solution. Compared with the hard-threshold scheme, the proposed framework gives an elegant mathematical formulation for sparsifying PC-based networks. More importantly, it provides a platform to encode other biological/physical priors into the PC-based FBNs. To further illustrate the flexibility of the proposed method, we extend the model to a weighted counterpart for learning both sparse and scale-free networks, and then conduct experiments to identify autism spectrum disorder (ASD) from normal controls (NC) based on the constructed FBNs. Consequently, we achieve an 81.52% classification accuracy, which outperforms the baseline and state-of-the-art methods.
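When only an L1 regularizer is added to the least-squares reformulation of PC, the problem has a closed-form solution: elementwise soft-thresholding of the sample correlation matrix. A minimal sketch of that special case (the paper's weighted, scale-free extension is not reproduced); lam is an assumed regularization weight.

```python
import numpy as np

def sparse_fbn(timeseries: np.ndarray, lam: float) -> np.ndarray:
    """Solve  min_W 0.5 * ||W - S||_F^2 + lam * ||W||_1  for the sample
    correlation matrix S; the solution is elementwise soft-thresholding,
    so weak (potentially noisy) connections shrink exactly to zero."""
    S = np.corrcoef(timeseries)                  # (nodes, time) -> (nodes, nodes)
    W = np.sign(S) * np.maximum(np.abs(S) - lam, 0.0)
    np.fill_diagonal(W, 0.0)                     # drop self-connections
    return W

rng = np.random.default_rng(0)
print(sparse_fbn(rng.standard_normal((10, 200)), lam=0.2))
```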
Maximum entropy models of ecosystem functioning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertram, Jason, E-mail: jason.bertram@anu.edu.au
2014-12-05
Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense, using a savanna plant ecology model as an example.
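As a concrete illustration of the MaxEnt algorithm invoked above: with a finite set of states and a single mean constraint, entropy maximization yields the Gibbs form, with the Lagrange multiplier fixed by the constraint. A minimal sketch; the support and the target mean are illustrative, not taken from the paper's savanna model.

```python
import numpy as np
from scipy.optimize import brentq

def maxent_given_mean(values: np.ndarray, target_mean: float) -> np.ndarray:
    """Maximum-entropy distribution on a finite support subject to a mean
    constraint: p_i proportional to exp(-lam * x_i), with lam chosen so
    the distribution reproduces the target mean."""
    def mean_gap(lam):
        w = np.exp(-lam * (values - values.mean()))   # centred for stability
        return (values * w).sum() / w.sum() - target_mean
    lam = brentq(mean_gap, -50.0, 50.0)
    p = np.exp(-lam * (values - values.mean()))
    return p / p.sum()

# e.g. five abundance classes constrained to a mean of 1.5
print(maxent_given_mean(np.arange(5.0), 1.5))
```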
Statistical physics of vehicular traffic and some related systems
NASA Astrophysics Data System (ADS)
Chowdhury, Debashish; Santen, Ludger; Schadschneider, Andreas
2000-05-01
In the so-called “microscopic” models of vehicular traffic, attention is paid explicitly to each individual vehicle, which is represented by a “particle”; the nature of the “interactions” among these particles is determined by the way the vehicles influence each others’ movement. Vehicular traffic, modeled as a system of interacting “particles” driven far from equilibrium, therefore offers the possibility to study various fundamental aspects of truly nonequilibrium systems which are of current interest in statistical physics. Analytical as well as numerical techniques of statistical physics are being used to study these models to understand the rich variety of physical phenomena exhibited by vehicular traffic. Some of these phenomena, observed in vehicular traffic under different circumstances, include transitions from one dynamical phase to another, criticality and self-organized criticality, metastability and hysteresis, and phase segregation. In this critical review, written from the perspective of statistical physics, we explain the guiding principles behind all the main theoretical approaches. However, we present detailed discussions of the results obtained mainly from the so-called “particle-hopping” models, particularly emphasizing those which have been formulated in recent years using the language of cellular automata.
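The best-known particle-hopping cellular automaton of the kind reviewed here is the Nagel-Schreckenberg model; the sketch below implements one of its update sweeps as a representative example (the review covers many variants, and the parameter values here are illustrative assumptions).

```python
import numpy as np

def nasch_step(pos, vel, road_len, v_max=5, p_slow=0.3, rng=None):
    """One parallel update of the Nagel-Schreckenberg model on a periodic
    road: accelerate, brake to the headway, randomly slow down, move."""
    if rng is None:
        rng = np.random.default_rng(2)
    order = np.argsort(pos)
    pos, vel = pos[order], vel[order]
    headway = (np.roll(pos, -1) - pos - 1) % road_len            # empty cells ahead
    vel = np.minimum(vel + 1, v_max)                             # acceleration
    vel = np.minimum(vel, headway)                               # braking
    vel = np.maximum(vel - (rng.random(vel.size) < p_slow), 0)   # randomization
    return (pos + vel) % road_len, vel

pos, vel = np.arange(0, 100, 4), np.zeros(25, dtype=int)
for _ in range(100):
    pos, vel = nasch_step(pos, vel, road_len=100)
print(vel.mean())   # mean speed at density 0.25
```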
Statistics of Data Fitting: Flaws and Fixes of Polynomial Analysis of Channeled Spectra
NASA Astrophysics Data System (ADS)
Karstens, William; Smith, David
2013-03-01
Starting from general statistical principles, we have critically examined Baumeister's procedure* for determining the refractive index of thin films from channeled spectra. Briefly, the method assumes that the index and interference fringe order may be approximated by polynomials quadratic and cubic in photon energy, respectively. The coefficients of the polynomials are related by differentiation, which is equivalent to comparing energy differences between fringes. However, we find that when the fringe order is calculated from the published IR index for silicon* and then analyzed with Baumeister's procedure, the results do not reproduce the original index. This problem has been traced to 1. Use of unphysical powers in the polynomials (e.g., time-reversal invariance requires that the index is an even function of photon energy), and 2. Use of insufficient terms of the correct parity. Exclusion of unphysical terms and addition of quartic and quintic terms to the index and order polynomials yields significantly better fits with fewer parameters. This represents a specific example of using statistics to determine if the assumed fitting model adequately captures the physics contained in experimental data. The use of analysis of variance (ANOVA) and the Durbin-Watson statistic to test criteria for the validity of least-squares fitting will be discussed. *D.F. Edwards and E. Ochoa, Appl. Opt. 19, 4130 (1980). Supported in part by the US Department of Energy, Office of Nuclear Physics under contract DE-AC02-06CH11357.
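The parity fix described above is easy to express as a least-squares design matrix containing only even powers of photon energy. A minimal sketch on synthetic data; max_power and the test index values are assumptions, not the authors' fit.

```python
import numpy as np

def fit_even_polynomial(energy, index, max_power=4):
    """Least-squares fit of n(E) restricted to even powers of E, the parity
    that time-reversal invariance imposes on the refractive index."""
    powers = list(range(0, max_power + 1, 2))          # 1, E^2, E^4
    A = np.column_stack([energy ** p for p in powers])
    coeffs, *_ = np.linalg.lstsq(A, index, rcond=None)
    return dict(zip(powers, coeffs))

E = np.linspace(0.1, 1.0, 50)                          # photon energy (eV)
n = 3.42 + 0.05 * E**2 + 0.01 * E**4                   # synthetic even-parity index
print(fit_even_polynomial(E, n))                       # recovers the coefficients
```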
Analysis of cancer-related fatigue based on smart bracelet devices.
Shen, Hong; Hou, Honglun; Tian, Wei; Wu, MingHui; Chen, Tianzhou; Zhong, Xian
2016-01-01
Fatigue is the most common symptom associated with cancer and its treatment, and it profoundly affects all aspects of quality of life for cancer patients, so measuring and managing cancer-related fatigue is very important. Usually, cancer-related fatigue scores, which estimate the degree of fatigue, are self-reported by cancer patients using standardized assessment tools, but most of the classical methods used for measuring fatigue are subjective and inconvenient. In this study, we try to establish a new method to assess cancer-related fatigue objectively and accurately by using a smart bracelet. All patients with metastatic pancreatic cancer wore a smart bracelet that recorded physical activity, including step count and sleep time, before and after chemotherapy. Meanwhile, their psychological state was assessed by completing questionnaires yielding cancer-related fatigue scores. The step count recorded by the smart bracelet, reflecting physical performance, dramatically decreased in the initial days of chemotherapy and recovered over the next few days. Statistical analysis showed a strong and significant correlation between self-reported cancer-related fatigue and physical performance (P = 0.000, r = -0.929). Sleep time was also significantly correlated with fatigue (P = 0.000, r = 0.723). Multiple regression analysis showed that physical performance and sleep time are significant predictors of fatigue. Measuring activity using smart bracelet devices may therefore be an appropriate method for quantitative and objective assessment of cancer-related fatigue.
Swaminathan Iyer, K; Gaikwad, R M; Woodworth, C D; Volkov, D O; Sokolov, Igor
2012-06-01
A significant change in the surface features of malignant cervical epithelial cells compared to normal cells has been previously reported. Here, we study at which progressive stage leading to cervical cancer this surface alteration happens. A non-traditional method to identify malignant cervical epithelial cells in vitro, based on physical (in contrast to specific biochemical) labelling of cells with fluorescent silica micron-size beads, is used to examine cells at progressive stages leading to cervical cancer: normal epithelial cells, cells infected with human papillomavirus type-16 (HPV-16), cells immortalized by HPV-16, and carcinoma cells. The study shows a statistically significant (at p < 0.01) difference between both immortal and cancer cells and a group consisting of normal and infected cells. There is no significant difference between normal and infected cells. Immortal cells demonstrate a signal which is closer to cancer cells than to either normal or infected cells. This implies that the cell surface, the surface cellular brush, changes substantially when cells become immortal. Physical labelling of the cell surface represents a substantial departure from traditional biochemical labelling methods. The results presented show the potential significance of physical properties of the cell surface for the development of clinical methods for early detection of cervical cancer, even at the stage of immortalized, premalignant cells.
Iyer, K. Swaminathan; Gaikwad, R. M.; Woodworth, C. D.; Volkov, D. O.
2013-01-01
A significant change in the surface features of malignant cervical epithelial cells compared to normal cells has been previously reported. Here, we study at which progressive stage leading to cervical cancer this surface alteration happens. A non-traditional method to identify malignant cervical epithelial cells in vitro, based on physical (in contrast to specific biochemical) labelling of cells with fluorescent silica micron-size beads, is used to examine cells at progressive stages leading to cervical cancer: normal epithelial cells, cells infected with human papillomavirus type-16 (HPV-16), cells immortalized by HPV-16, and carcinoma cells. The study shows a statistically significant (at p < 0.01) difference between both immortal and cancer cells and a group consisting of normal and infected cells. There is no significant difference between normal and infected cells. Immortal cells demonstrate a signal which is closer to cancer cells than to either normal or infected cells. This implies that the cell surface, the surface cellular brush, changes substantially when cells become immortal. Physical labelling of the cell surface represents a substantial departure from traditional biochemical labelling methods. The results presented show the potential significance of physical properties of the cell surface for the development of clinical methods for early detection of cervical cancer, even at the stage of immortalized, pre-malignant cells. PMID:22351422
Identifying trends in climate: an application to the cenozoic
NASA Astrophysics Data System (ADS)
Richards, Gordon R.
1998-05-01
The recent literature on trends in climate has raised several issues: whether trends should be modeled as deterministic or stochastic, whether trends are nonlinear, and the relative merits of statistical models versus models based on physics. This article models trending since the late Cretaceous. This 68 million-year interval is selected because the reliability of tests for trending depends critically on the length of time spanned by the data. Two main hypotheses are tested: that the trend has been caused primarily by CO2 forcing, and that it reflects a variety of forcing factors which can be approximated by statistical methods. The CO2 data are obtained from model simulations. Several widely used statistical models are found to be inadequate. ARIMA methods parameterize too much of the short-term variation and do not identify low-frequency movements; further, the unit root in the ARIMA process does not predict the long-term path of temperature. Spectral methods also have little ability to predict temperature at long horizons. Instead, the statistical trend is estimated using a nonlinear smoothing filter. Both approaches make it possible to model climate as a cointegrated process, in which temperature can wander quite far from the trend path in the intermediate term but converges back over longer horizons. Comparing the forecasting properties of the two trend models demonstrates that the optimal forecasting model includes CO2 forcing and a parametric representation of the nonlinear variability in climate.
Structures and Statistics of Citation Networks
2011-05-01
the citations among them. The papers are in the field of high-energy physics, and they were added to the online library between 1992-2003. Each paper... energy, physics:astrophysics, mathematics, computer science, statistics and many others. The value of the setSpec field can be any of these. However... the value of the categories field might contain multiple set names listed. For instance, a paper can primarily be considered as a high-energy physics
NASA Astrophysics Data System (ADS)
Kumar, Rakesh; Li, Zheng; Levin, Deborah A.
2011-05-01
In this work, we propose a new heat accommodation model to simulate freely expanding homogeneous condensation flows of gaseous carbon dioxide using a new approach, the statistical Bhatnagar-Gross-Krook method. The motivation for the present work comes from the earlier work of Li et al. [J. Phys. Chem. 114, 5276 (2010)], in which condensation models were proposed and used in the direct simulation Monte Carlo method to simulate the flow of carbon dioxide from supersonic expansions of small nozzles into near-vacuum conditions. Simulations conducted for stagnation pressures of one and three bar were compared with the measurements of gas and cluster number densities, cluster size, and carbon dioxide rotational temperature obtained by Ramos et al. [Phys. Rev. A 72, 3204 (2005)]. Due to the high computational cost of the direct simulation Monte Carlo method, comparison between simulations and data could only be performed for these stagnation pressures, with good agreement obtained beyond the condensation onset point, in the farfield. As the stagnation pressure increases, the degree of condensation also increases; therefore, to improve the modeling of condensation onset, one must be able to simulate higher stagnation pressures. In simulations of an expanding flow of argon through a nozzle, Kumar et al. [AIAA J. 48, 1531 (2010)] found that the statistical Bhatnagar-Gross-Krook method provides the same accuracy as the direct simulation Monte Carlo method, but at one half of the computational cost. In this work, the statistical Bhatnagar-Gross-Krook method was modified to account for internal degrees of freedom in multi-species polyatomic gases. With this computational approach in hand, we developed and tested a new heat accommodation model for a polyatomic system to properly account for the heat release of condensation, and then developed condensation models in the framework of the statistical Bhatnagar-Gross-Krook method. Simulations were found to agree well with the experiment for all stagnation pressure cases (1-5 bar), validating the accuracy of the Bhatnagar-Gross-Krook based condensation model in capturing the physics of condensation.
Integration of data-driven and physically-based methods to assess shallow landslides susceptibility
NASA Astrophysics Data System (ADS)
Lajas, Sara; Oliveira, Sérgio C.; Zêzere, José Luis
2016-04-01
Approaches used to assess shallow landslide susceptibility at the basin scale are conceptually different depending on the use of statistical or deterministic methods. Data-driven methods rest on the assumption that the same causes are likely to produce the same effects, so a present/past landslide inventory and a dataset of assumed predisposing factors are crucial for landslide susceptibility assessment. Physically based methods describe a system controlled by physical laws and soil mechanics, in which the forces that tend to promote movement are compared with the forces that resist it; here, susceptibility evaluation is supported by the calculation of the Factor of Safety (FoS) and depends on the availability of detailed data on slope geometry and on the hydrological and geotechnical properties of the soils and rocks. Within this framework, this work aims to test two hypotheses: (i) although conceptually distinct and based on contrasting procedures, statistical and deterministic methods generate similar shallow landslide susceptibility results regarding predictive capacity and spatial agreement; and (ii) the integration of the shallow landslide susceptibility maps obtained with data-driven and physically based methods, for the same study area, generates a more reliable susceptibility model for shallow landslide occurrence. To evaluate these two hypotheses, we selected the data-driven Information Value method and the physically based Infinite Slope model to evaluate shallow landslides in the Monfalim and Louriceira basins (13.9 km2), located in the north of the Lisbon region (Portugal). The landslide inventory comprises 111 shallow landslides and was divided into two independent groups based on temporal criteria (age ≤ 1983 and age > 1983): (i) the modelling group (51 cases) was used to define the weights of each predisposing factor (lithology, land use, slope, aspect, curvature, topographic position index and the slope over area ratio) for the Information Value method, and also to calibrate the strength parameters (cohesion and friction angle) of the different lithological units considered in the Infinite Slope model; and (ii) the validation group (60 cases) was used to independently validate and define the predictive capacity of the shallow landslide susceptibility maps produced with the Information Value and Infinite Slope methods. The comparison of both landslide susceptibility maps was supported by: (i) the computation of Receiver Operating Characteristic (ROC) curves; (ii) the calculation of the Area Under the Curve (AUC); and (iii) the evaluation of the spatial agreement between the landslide susceptibility classes. Finally, the susceptibility maps produced with the Information Value and Infinite Slope methods are integrated into a single landslide susceptibility map based on a set of integration rules defined by cross-tabulation of the susceptibility classes of both maps and analysis of the corresponding contingency table. This work was supported by the FCT - Portuguese Foundation for Science and Technology within the framework of the FORLAND Project. Sérgio Oliveira was funded by a postdoctoral grant (SFRH/BPD/85827/2012) from the Portuguese Foundation for Science and Technology (FCT).
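For reference, the factor of safety at the core of the Infinite Slope model has the standard closed form below. A minimal sketch with illustrative soil parameters; the strength values calibrated in the study are not reproduced here.

```python
import math

def factor_of_safety(c, phi_deg, gamma, z, beta_deg, m, gamma_w=9.81):
    """Infinite slope model:
    FoS = [c' + (gamma - m * gamma_w) * z * cos^2(beta) * tan(phi')]
          / [gamma * z * sin(beta) * cos(beta)]
    with c' cohesion (kPa), phi' friction angle (deg), gamma soil unit weight
    (kN/m^3), z failure-surface depth (m), beta slope angle (deg), and m the
    saturated fraction of the soil column."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    resisting = c + (gamma - m * gamma_w) * z * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Illustrative parameters: FoS < 1 marks cells prone to shallow sliding.
print(factor_of_safety(c=5.0, phi_deg=30.0, gamma=18.0, z=2.0, beta_deg=25.0, m=0.5))
```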
Rathi, Monika; Ahrenkiel, S P; Carapella, J J; Wanlass, M W
2013-02-01
Given an unknown multicomponent alloy and a set of standard compounds or alloys of known composition, can one improve upon popular standards-based methods for energy dispersive X-ray (EDX) spectrometry to quantify the elemental composition of the unknown specimen? A method is presented here for determining the elemental composition of alloys using transmission electron microscopy-based EDX with appropriate standards. The method begins with a discrete set of related reference standards of known composition, applies multivariate statistical analysis to those spectra, and evaluates the compositions with a linear matrix algebra method that relates the spectra to elemental composition. Because associated standards are used, only limited assumptions about the physical origins of the EDX spectra are needed. Spectral absorption corrections can be performed by providing an estimate of the foil thickness of one or more reference standards. The technique was applied to III-V multicomponent alloy thin films: composition and foil thickness were determined for various III-V alloys. The results were then validated by comparison with X-ray diffraction and photoluminescence analysis, demonstrating an accuracy of approximately 1% in atomic fraction.
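The "linear matrix algebra" step can be pictured as expressing the unknown spectrum in the basis of reference spectra and carrying the mixing weights over to the known compositions. A heavily hedged sketch of that idea only; it omits the multivariate statistical analysis and absorption corrections the paper applies, and all names here are hypothetical.

```python
import numpy as np

def composition_from_spectrum(unknown, ref_spectra, ref_compositions):
    """Least-squares mixing weights of the reference spectra (columns of
    ref_spectra), clipped to be nonnegative and normalized, then mapped
    onto the known reference compositions (rows of ref_compositions)."""
    w, *_ = np.linalg.lstsq(ref_spectra, unknown, rcond=None)
    w = np.clip(w, 0.0, None)
    w /= w.sum()
    return w @ ref_compositions
```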
Gao, Yang; Bian, Zhaoying; Huang, Jing; Zhang, Yunwan; Niu, Shanzhou; Feng, Qianjin; Chen, Wufan; Liang, Zhengrong; Ma, Jianhua
2014-06-16
To realize low-dose imaging in X-ray computed tomography (CT) examinations, lowering the milliampere-seconds (low-mAs) or reducing the required number of projection views (sparse-view) per rotation around the body has been widely studied as an easy and effective approach. In this study, we focus on low-dose CT image reconstruction from sinograms acquired with a combined low-mAs and sparse-view protocol, and propose a two-step image reconstruction strategy. Specifically, to suppress the significant statistical noise in the noisy and insufficient sinograms, an adaptive sinogram restoration (ASR) method is first proposed which takes the statistical properties of the sinogram data into account; then, to further acquire a high-quality image, a total variation based projection onto convex sets (TV-POCS) method is adopted with a slight modification. For simplicity, the present reconstruction strategy is termed "ASR-TV-POCS." To evaluate the present ASR-TV-POCS method, both qualitative and quantitative studies were performed on a physical phantom. Experimental results demonstrate that the method can achieve promising gains over other existing methods in terms of noise reduction, contrast-to-noise ratio, and edge detail preservation.
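A skeleton of the TV-POCS half of such a pipeline, under the usual smoothed-TV formulation: alternate an application-specific data-consistency projection with descent steps on total variation. Here project_data stands in for the projector onto the (restored) sinogram constraints and is an assumption; this is a sketch of the generic technique, not the paper's exact ASR-TV-POCS implementation.

```python
import numpy as np

def tv_pocs(x0, project_data, n_iter=50, tv_step=0.05):
    """Alternate a POCS data-consistency projection with gradient steps on
    smoothed total variation; the TV subgradient is -div(grad x / |grad x|)."""
    x = x0.copy()
    for _ in range(n_iter):
        x = project_data(x)                            # enforce data fidelity
        gx = np.diff(x, axis=0, append=x[-1:, :])      # forward differences
        gy = np.diff(x, axis=1, append=x[:, -1:])
        norm = np.sqrt(gx ** 2 + gy ** 2 + 1e-8)       # smoothed gradient norm
        div = (np.diff(gx / norm, axis=0, prepend=np.zeros((1, x.shape[1])))
               + np.diff(gy / norm, axis=1, prepend=np.zeros((x.shape[0], 1))))
        x += tv_step * div                             # descend on TV
        x = np.clip(x, 0.0, None)                      # nonnegativity constraint
    return x

# Trivial demo: the "projection" just blends toward a noisy reference image.
ref = np.random.default_rng(0).random((64, 64))
out = tv_pocs(ref.copy(), lambda x: 0.9 * x + 0.1 * ref)
```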
Risk Factors for Physical Impairment after Acute Lung Injury in a National, Multicenter Study
Wozniak, Amy W.; Hough, Catherine L.; Morris, Peter E.; Dinglas, Victor D.; Jackson, James C.; Mendez-Tellez, Pedro A.; Shanholtz, Carl; Ely, E. Wesley; Colantuoni, Elizabeth
2014-01-01
Rationale: Existing studies of risk factors for physical impairments in acute lung injury (ALI) survivors were potentially limited by single-center designs or relatively small sample sizes. Objectives: To evaluate risk factors for three measures of physical impairment commonly experienced by survivors of ALI in the first year after hospitalization. Methods: A prospective, longitudinal study of 6- and 12-month physical outcomes (muscle strength, 6-minute-walk distance, and Short Form [SF]-36 Physical Function score) for 203 survivors of ALI enrolled from 12 hospitals participating in the ARDS Network randomized trials. Multivariable regression analyses evaluated the independent association of critical illness-related variables and intensive care interventions with impairments in each physical outcome measure, after adjusting for patient demographics, comorbidities, and baseline functional status. Measurements and Main Results: At 6 and 12 months, respectively, mean (±SD) values for strength (presented as a proportion of the maximum strength score evaluated using manual muscle testing) were 92% (±8%) and 93% (±9%), 6-minute-walk distance (as percent-predicted) was 64% (±22%) and 67% (±26%), and SF-36 Physical Function score (as percent-predicted) was 61% (±36%) and 67% (±37%). After accounting for patient baseline status, there was a significant association and statistical interaction of mean daily corticosteroid dose and intensive care unit length of stay with impairments in physical outcomes. Conclusions: Patients had substantial impairments, relative to predicted values, in 6-minute-walk distance and SF-36 Physical Function outcome measures. Minimizing corticosteroid dose and implementing existing evidence-based methods to reduce the duration of intensive care unit stay and the associated patient immobilization may be important interventions for improving ALI survivors' physical outcomes. PMID:24716641
NASA Technical Reports Server (NTRS)
Forbes, G. S.; Pielke, R. A.
1985-01-01
Various empirical and statistical weather-forecasting studies which utilize stratification by weather regime are described. Objective classification was used to determine the weather regime in some studies. In other cases the weather pattern was determined on the basis of a parameter representing the physical and dynamical processes relevant to the anticipated mesoscale phenomena, such as low-level moisture convergence and convective precipitation, or the Froude number and the occurrence of cold-air damming. For mesoscale phenomena already in existence, new forecasting techniques were developed. The use of cloud models in operational forecasting is discussed. Models to calculate the spatial scales of forcings and the resultant response for mesoscale systems are presented. The use of these models to represent the climatologically most prevalent systems, and to perform case-by-case simulations, is reviewed. Operational implementation of mesoscale data in weather forecasts, using both actual simulation output and model-output statistics, is discussed.
Sorensen, Glorian; Barbeau, Elizabeth; Stoddard, Anne M.; Hunt, Mary Kay; Kaphingst, Kimberly; Wallace, Lorraine
2005-01-01
Objectives. We examined the efficacy of a cancer prevention intervention designed to improve health behaviors among working-class, multiethnic populations employed in small manufacturing businesses. Methods. Worksites were randomly assigned to an intervention or minimal-intervention control condition. The intervention targeted fruit and vegetable consumption, red meat consumption, multivitamin use, and physical activity. Results. Employees in the intervention group showed greater improvements for every outcome compared with employees in the control group. Differences in improvement were statistically significant for multivitamin use and physical activity. Intervention effects were larger among workers than among managers for fruit and vegetable consumption and for physical activity. Conclusions. The social-context model holds promise for reducing disparities in health behaviors. Further research is needed to improve the effectiveness of the intervention. PMID:16006422
The Correlation between Physical Environment and Student Engagement
NASA Astrophysics Data System (ADS)
Carmona-Reyes, Jorge; Wang, Li; Matthews, Lorin; Cook, Mike; Hyde, Truell
2016-10-01
In the second year of an educational research collaboration on the convergence between physical environment, pedagogical methods, student attainment, and academic performance, CASPER, along with the Region 12 Education Service Center and Huckabee Inc., has completed its initial quantitative study. This project examined the impact of the physical environment on student engagement, employing a flexibility construct and an examination of teacher mobility and places of centeredness. Data analysis showed a positive correlation between student engagement and classroom flexibility for two locations having statistically significant differences in flexibility scores. The research is now being extended to a laboratory setting (in this case, a complex plasma lab), where the results will be used to enhance student work efficiency while also increasing safety within the lab. Details will be discussed in this presentation. Region 12 and Huckabee funding is gratefully acknowledged.
Generation of physical random numbers by using homodyne detection
NASA Astrophysics Data System (ADS)
Hirakawa, Kodai; Oya, Shota; Oguri, Yusuke; Ichikawa, Tsubasa; Eto, Yujiro; Hirano, Takuya; Tsurumaru, Toyohiro
2016-10-01
Physical random numbers generated by quantum measurements are, in principle, impossible to predict. We have demonstrated the generation of physical random numbers by using a high-speed balanced photodetector to measure the quadrature amplitudes of vacuum states. Using this method, random numbers were generated at 500 Mbps, which is more than one order of magnitude faster than previously reported [Gabriel et al., Nature Photonics 4, 711-715 (2010)]. The Crush test battery of the TestU01 suite consists of 31 tests in 144 variations, and we used it to statistically analyze these numbers. The generated random numbers passed 14 of the 31 tests. To improve the randomness, we performed a hash operation in which the random numbers were multiplied by a random Toeplitz matrix; the resulting numbers passed all of the tests in the TestU01 Crush battery.
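The hash step mentioned at the end is standard Toeplitz-matrix randomness extraction over GF(2). A minimal sketch; the output length and seed handling are illustrative (in practice the Toeplitz seed must itself be random, and the compression ratio is set by the entropy estimate of the raw data).

```python
import numpy as np

def toeplitz_hash(bits: np.ndarray, out_len: int, seed: int = 0) -> np.ndarray:
    """Multiply the raw bit vector by a random binary Toeplitz matrix, mod 2.
    T is defined by one random value per diagonal, so the entry is
    T[i, j] = diagonals[i - j + n - 1] for an out_len x n matrix."""
    rng = np.random.default_rng(seed)
    n = bits.size
    diagonals = rng.integers(0, 2, out_len + n - 1)
    i, j = np.indices((out_len, n))
    return (diagonals[i - j + n - 1] @ bits) % 2

raw = np.random.default_rng(1).integers(0, 2, 1024)   # stand-in for raw ADC bits
print(toeplitz_hash(raw, out_len=512)[:16])
```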
Deterministic annealing for density estimation by multivariate normal mixtures
NASA Astrophysics Data System (ADS)
Kloppenburg, Martin; Tavan, Paul
1997-03-01
An approach to maximum-likelihood density estimation by mixtures of multivariate normal distributions for large high-dimensional data sets is presented. Conventionally that problem is tackled by notoriously unstable expectation-maximization (EM) algorithms. We remove these instabilities by the introduction of soft constraints, enabling deterministic annealing. Our developments are motivated by the proof that algorithmically stable fuzzy clustering methods that are derived from statistical physics analogs are special cases of EM procedures.
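A sketch of the annealed E-step in the spirit of that approach: one common deterministic-annealing variant tempers the component likelihoods by 1/T, flattening the responsibilities at high temperature and recovering standard EM as T approaches 1. The 1-D Gaussian mixture and the temperature value are assumptions for illustration, not the paper's formulation.

```python
import numpy as np
from scipy.stats import norm

def annealed_responsibilities(x, means, sigmas, weights, T):
    """E-step with log-likelihoods tempered by 1/T: at large T the
    assignments are nearly uniform (numerically stable), sharpening to
    ordinary EM posteriors as the temperature is annealed toward T = 1."""
    log_r = np.log(weights) + norm.logpdf(x[:, None], means, sigmas) / T
    log_r -= log_r.max(axis=1, keepdims=True)          # avoid underflow
    r = np.exp(log_r)
    return r / r.sum(axis=1, keepdims=True)

x = np.concatenate([np.random.normal(-2, 1, 100), np.random.normal(2, 1, 100)])
print(annealed_responsibilities(x, np.array([-1.0, 1.0]),
                                np.array([1.0, 1.0]),
                                np.array([0.5, 0.5]), T=4.0)[:3])
```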
NASA Astrophysics Data System (ADS)
Ferrara, Emilio
2015-03-01
Containing the spread of crime in modern society is an ongoing battle: our understanding of the dynamics underlying criminal events, and of the motives of the individuals involved, is crucial to designing cost-effective prevention policies and intervention strategies. During recent years we have witnessed various research fields joining forces, sharing models and methods, toward modeling and quantitatively characterizing crime and criminal behavior.
MUSiC - Model-independent search for deviations from Standard Model predictions in CMS
NASA Astrophysics Data System (ADS)
Pieta, Holger
2010-02-01
We present an approach for a model-independent search in CMS. By systematically scanning the data for deviations from the Standard Model Monte Carlo expectations, such an analysis can help to understand the detector and tune event generators. By minimizing the theoretical bias, the analysis is furthermore sensitive to a wide range of models for new physics, including the many models not yet thought of. After sorting the events into classes defined by their particle content (leptons, photons, jets, and missing transverse energy), a minimally prejudiced scan is performed on a number of distributions. Advanced statistical methods are used to determine the significance of the deviating regions, rigorously taking systematic uncertainties into account. A number of benchmark scenarios, including common models of new physics and possible detector effects, have been used to gauge the power of such a method.
Chang, Wei-Lung; Liu, Hsiang-Te; Lin, Tai-An; Wen, Yung-Sung
2008-01-01
The purpose of this research was to study the relationship between family communication structure, vanity traits, and related consumption behavior. The study used an empirical method with adolescent students from the northern part of Taiwan as the subjects. Multiple statistical methods and an SEM model were used to test the hypotheses. The major findings were: (1) Socio-orientation has a significant effect on the view of physical appearance, and concept-orientation has a significant positive effect on achievement vanity. (2) The view of physical appearance has a significant positive effect on all dimensions of materialism, concerns about clothing, and use of cosmetics. (3) Achievement vanity has a significant positive relationship with price-based prestige sensitivity and concerns regarding clothing. The findings have implications for marketing theory as well as for practical applications in marketing.
Objective determination of image end-members in spectral mixture analysis of AVIRIS data
NASA Technical Reports Server (NTRS)
Tompkins, Stefanie; Mustard, John F.; Pieters, Carle M.; Forsyth, Donald W.
1993-01-01
Spectral mixture analysis has been shown to be a powerful, multifaceted tool for the analysis of multi- and hyper-spectral data. Applications of AVIRIS data have ranged from mapping soils and bedrock to ecosystem studies. During the first phase of the approach, a set of end-members is selected from an image cube (image end-members) that best accounts for its spectral variance within a constrained, linear least-squares mixing model. These image end-members are usually selected using a priori knowledge and successive trial-and-error solutions to refine the total number and physical location of the end-members. However, in many situations a more objective method of determining these essential components is desired. We approach the problem of image end-member determination objectively by using the inherent variance of the data. Unlike purely statistical methods such as factor analysis, this approach derives solutions that conform to a physically realistic model.
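For context, the constrained linear mixing step that any selected end-member set feeds into can be sketched in a few lines: solve for nonnegative fractions and renormalize toward sum-to-one. The end-member spectra below are illustrative numbers, and this sketch is not the paper's end-member determination procedure itself.

```python
import numpy as np
from scipy.optimize import nnls

def unmix(pixel: np.ndarray, endmembers: np.ndarray) -> np.ndarray:
    """Nonnegative least-squares fractions for pixel ~ endmembers @ f,
    renormalized so the fractions sum to one (a common approximation of
    the full sum-to-one constraint)."""
    f, _ = nnls(endmembers, pixel)
    return f / f.sum()

E = np.array([[0.10, 0.60],      # rows: spectral bands, columns: end-members
              [0.30, 0.50],
              [0.80, 0.20]])
mixed = 0.7 * E[:, 0] + 0.3 * E[:, 1]
print(unmix(mixed, E))           # ~[0.7, 0.3]
```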
Physics-based, Bayesian sequential detection method and system for radioactive contraband
Candy, James V; Axelrod, Michael C; Breitfeller, Eric F; Chambers, David H; Guidry, Brian L; Manatt, Douglas R; Meyer, Alan W; Sale, Kenneth E
2014-03-18
A distributed sequential method and system for detecting and identifying radioactive contraband from highly uncertain (noisy), low-count radionuclide measurements, i.e. an event mode sequence (EMS), using a statistical approach based on Bayesian inference and physics-model-based signal processing, with each radionuclide represented as a decomposition into monoenergetic sources. For a given photon event of the EMS, the appropriate monoenergy processing channel is determined using a confidence-interval discriminator on the energy amplitude and interarrival time, and parameter estimates are used to update a measured probability density function estimate for a target radionuclide. A sequential likelihood ratio test is then used to determine which of two threshold conditions is met, signifying that the EMS either is identified as the target radionuclide or is not; if neither, the process repeats for the next sequential photon event of the EMS until one of the two threshold conditions is satisfied.
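The decision stage described above is Wald's sequential probability ratio test; the skeleton below shows its threshold logic under assumed per-event log-likelihood ratios (the physics-model-based channel processing that would produce those values is not reproduced).

```python
import math

def sprt(log_lr_stream, alpha=0.01, beta=0.01):
    """Wald SPRT: accumulate per-photon-event log-likelihood ratios and
    stop at the first crossing of either decision threshold."""
    upper = math.log((1 - beta) / alpha)    # declare target radionuclide
    lower = math.log(beta / (1 - alpha))    # declare non-target
    total, n = 0.0, 0
    for n, increment in enumerate(log_lr_stream, start=1):
        total += increment
        if total >= upper:
            return "target identified", n
        if total <= lower:
            return "target rejected", n
    return "undecided", n

print(sprt([0.4, 0.6, 0.5, 0.8, 0.7, 0.9, 1.1, 0.8]))   # illustrative increments
```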
Stanić, R
1993-01-01
Our long clinical experience, together with the observations of other authors, indicates that the prevalence of ischaemic heart disease (IHD) is significantly reduced in some physically handicapped people (the blind and the deaf-mute) compared with similar people without such disabilities. Without regard to the patho-physiological mechanism of this condition, 233 examinees of both sexes, chosen by random selection, were examined by clinical, ECG, and laboratory (non-invasive) methods and divided into three groups: the blind, 81 (34.76%); the deaf-mute, 76 (32.61%); and industrial workers, 76 (32.61%), who were taken as a control group. The obtained results show an incidence of IHD of 4.56% among the handicapped groups, compared with 11 cases (8.36%) in the control group, which is statistically significant information.
A random matrix approach to language acquisition
NASA Astrophysics Data System (ADS)
Nicolaidis, A.; Kosmidis, Kosmas; Argyrakis, Panos
2009-12-01
Since language is tied to cognition, we expect linguistic structures to reflect patterns that we encounter in nature and that are analyzed by physics. Within this realm we investigate the process of lexicon acquisition, using analytical and tractable methods developed within physics. A lexicon is a mapping between the sounds and the referents of the perceived world. This mapping is represented by a matrix, and the linguistic interaction among individuals is described by a random matrix model. There are two essential parameters in our approach: the strength of the linguistic interaction β, which is considered a genetically determined ability, and the number N of sounds employed (the lexicon size). Our model of linguistic interaction is analytically studied using methods of statistical physics and simulated by Monte Carlo techniques. The analysis reveals an intricate relationship between the innate propensity for language acquisition β and the lexicon size N, namely N ~ exp(β). Thus a small increase in the genetically determined β may lead to an incredible lexical explosion. Our approximate scheme offers an explanation for the biological affinity of different species and their simultaneous linguistic disparity.
Grotjahn, Richard; Black, Robert; Leung, Ruby; ...
2015-05-22
This paper reviews research approaches and open questions regarding data, statistical analyses, dynamics, modeling efforts, and trends in relation to temperature extremes. Our specific focus is upon extreme events of short duration (roughly less than 5 days) that affect parts of North America. These events are associated with large scale meteorological patterns (LSMPs). Methods used to define extreme event statistics and to identify and connect LSMPs to extreme temperatures are presented. Recent advances in statistical techniques can connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. A wide array of LSMPs, ranging from synoptic to planetary scale phenomena, have been implicated as contributors to extreme temperature events. Current knowledge about the physical nature of these contributions and the dynamical mechanisms leading to the implicated LSMPs is incomplete. There is a pressing need for (a) systematic study of the physics of LSMP life cycles and (b) comprehensive model assessment of LSMP-extreme temperature event linkages and LSMP behavior. Generally, climate models capture the observed heat waves and cold air outbreaks with some fidelity; however, they overestimate warm wave frequency, underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Climate models have been used to investigate past changes and project future trends in extreme temperatures. Overall, modeling studies have identified important mechanisms, such as the effects of large-scale circulation anomalies and land-atmosphere interactions, on changes in extreme temperatures. However, few studies have examined changes in LSMPs more specifically to understand the role of LSMPs in past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated, so more research is needed to understand the limitations of climate models and improve model skill in simulating extreme temperatures and their associated LSMPs. The paper concludes with unresolved issues and research questions.
Empirical Laws in Economics Uncovered Using Methods in Statistical Mechanics
NASA Astrophysics Data System (ADS)
Stanley, H. Eugene
2001-06-01
In recent years, statistical physicists and computational physicists have determined that physical systems which consist of a large number of interacting particles obey universal "scaling laws" that serve to demonstrate an intrinsic self-similarity operating in such systems. Further, the parameters appearing in these scaling laws appear to be largely independent of the microscopic details. Since economic systems also consist of a large number of interacting units, it is plausible that scaling theory can be usefully applied to economics. To test this possibility using realistic data sets, a number of scientists have begun analyzing economic data using methods of statistical physics [1]. We have found evidence for scaling (and data collapse), as well as universality, in various quantities, and these recent results will be reviewed in this talk, starting with the most recent study [2]. We also propose models that may lead to some insight into these phenomena. These results will be discussed, as well as the overall rationale for why one might expect scaling principles to hold for complex economic systems. The work on which this talk is based is supported by BP, and was carried out in collaboration with L. A. N. Amaral, S. V. Buldyrev, D. Canning, P. Cizeau, X. Gabaix, P. Gopikrishnan, S. Havlin, Y. Lee, Y. Liu, R. N. Mantegna, K. Matia, M. Meyer, C.-K. Peng, V. Plerou, M. A. Salinger, and M. H. R. Stanley. [1.] See, e.g., R. N. Mantegna and H. E. Stanley, Introduction to Econophysics: Correlations & Complexity in Finance (Cambridge University Press, Cambridge, 1999). [2.] P. Gopikrishnan, B. Rosenow, V. Plerou, and H. E. Stanley, "Identifying Business Sectors from Stock Price Fluctuations," e-print cond-mat/0011145; V. Plerou, P. Gopikrishnan, L. A. N. Amaral, X. Gabaix, and H. E. Stanley, "Diffusion and Economic Fluctuations," Phys. Rev. E (Rapid Communications) 62, 3023-3026 (2000); P. Gopikrishnan, V. Plerou, X. Gabaix, and H. E. Stanley, "Statistical Properties of Share Volume Traded in Financial Markets," Phys. Rev. E (Rapid Communications) 62, 4493-4496 (2000).
Rock physics model-based prediction of shear wave velocity in the Barnett Shale formation
NASA Astrophysics Data System (ADS)
Guo, Zhiqi; Li, Xiang-Yang
2015-06-01
Predicting S-wave velocity is important for reservoir characterization and fluid identification in unconventional resources. A rock physics model-based method is developed for estimating pore aspect ratio and predicting shear wave velocity Vs from information on P-wave velocity, porosity and mineralogy in a borehole. A statistical distribution of pore geometry is considered in the rock physics models. In the application to the Barnett formation, we compare the high frequency self-consistent approximation (SCA) method, which corresponds to isolated pore spaces, and the low frequency SCA-Gassmann method, which describes well-connected pore spaces. Inversion results indicate that, compared to the surroundings, the Barnett Shale shows less fluctuation in pore aspect ratio in spite of its complex constituents. The high frequency method provides a more robust and accurate prediction of Vs for all three intervals in the Barnett formation, while the low frequency method collapses for the Barnett Shale interval. A possible cause for this discrepancy is that poor in situ pore connectivity and low permeability make well-log sonic frequencies act as high frequencies and thus invalidate the low frequency assumption of the Gassmann theory. In comparison, for the overlying Marble Falls and underlying Ellenburger carbonates, both the high and low frequency methods predict Vs with reasonable accuracy, which may indicate that sonic frequencies fall within the transition frequency zone due to higher pore connectivity in the surroundings.
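The low-frequency step of the workflow can be illustrated with Gassmann's relation itself; a minimal sketch follows, in which the moduli, porosity and density are placeholder values, not Barnett log data, and the SCA pore-shape modeling is omitted.

```python
import numpy as np

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Saturated bulk modulus from Gassmann's low-frequency relation."""
    b = (1.0 - k_dry / k_min) ** 2
    denom = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min**2
    return k_dry + b / denom

# Illustrative inputs (GPa and g/cc) -- placeholders, not values from the paper.
k_dry, mu, k_min, k_fl = 15.0, 12.0, 39.0, 2.5
phi, rho = 0.08, 2.5

k_sat = gassmann_ksat(k_dry, k_min, k_fl, phi)
vp = np.sqrt((k_sat + 4.0 * mu / 3.0) / rho)  # km/s, since GPa/(g/cc) = (km/s)^2
vs = np.sqrt(mu / rho)                        # shear modulus is fluid-independent in Gassmann
print(f"Vp = {vp:.2f} km/s, Vs = {vs:.2f} km/s")
```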
Galton's legacy to research on intelligence.
Jensen, Arthur R
2002-04-01
In the 1999 Galton Lecture for the annual conference of The Galton Institute, the author summarizes the main elements of Galton's ideas about human mental ability and the research paradigm they generated, including the concept of 'general' mental ability, its hereditary component, its physical basis, racial differences, and methods for measuring individual differences in general ability. Although the conclusions Galton drew from his empirical studies were seldom compelling for lack of the needed technology and methods of statistical inference in his day, contemporary research has generally borne out most of Galton's original and largely intuitive ideas, which still inspire mainstream scientific research on intelligence.
Thermodynamic Modeling of Donor Splice Site Recognition in pre-mRNA
NASA Astrophysics Data System (ADS)
Aalberts, Daniel P.; Garland, Jeffrey A.
2004-03-01
When eukaryotic genes are edited by the spliceosome, the first step in intron recognition is the binding of a U1 snRNA with the donor (5') splice site. We model this interaction thermodynamically to identify splice sites. Applied to a set of 65 annotated genes, our Finding with Binding method achieves a significant separation between real and false sites. Analyzing binding patterns allows us to discard a large number of decoy sites. Our results improve statistics-based methods for donor site recognition, demonstrating the promise of physical modeling to find functional elements in the genome.
Thermodynamic modeling of donor splice site recognition in pre-mRNA
NASA Astrophysics Data System (ADS)
Garland, Jeffrey A.; Aalberts, Daniel P.
2004-04-01
When eukaryotic genes are edited by the spliceosome, the first step in intron recognition is the binding of a U1 small nuclear RNA with the donor (5') splice site. We model this interaction thermodynamically to identify splice sites. Applied to a set of 65 annotated genes, our “finding with binding” method achieves a significant separation between real and false sites. Analyzing binding patterns allows us to discard a large number of decoy sites. Our results improve statistics-based methods for donor site recognition, demonstrating the promise of physical modeling to find functional elements in the genome.
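A heavily simplified sketch of the thermodynamic scoring idea described in these two records: score a candidate donor site by summing toy per-pair free energies wherever it matches the register complementary to the U1 5' end, then convert to a Boltzmann weight. The PAIR_DG values, the consensus register, and both example sequences are illustrative assumptions, not the paper's parameters (a real implementation would use measured nearest-neighbor parameters and a proper duplex partition function).

```python
import math

# Toy per-pair free energies (kcal/mol); stand-ins, not measured Turner parameters.
PAIR_DG = {"A": -1.1, "U": -1.2, "G": -3.0, "C": -2.8}

# The U1 snRNA 5' end is complementary to the donor-site consensus, so we score
# a candidate by how strongly it matches that consensus register (assumption).
CONSENSUS = "CAGGUAAGUA"   # illustrative -3..+6 window around the exon|intron boundary

def binding_dg(site):
    """Crude hybridization free energy: sum pair bonuses at matching positions."""
    return sum(PAIR_DG[s] for s, c in zip(site, CONSENSUS) if s == c)

def boltzmann_weight(dg, rt=0.616):    # RT ~ 0.616 kcal/mol near 37 C
    return math.exp(-dg / rt)

real, decoy = "CAGGUAAGUA", "CAGCUCAGAA"
for s in (real, decoy):
    dg = binding_dg(s)
    print(s, f"dG = {dg:.1f} kcal/mol, weight = {boltzmann_weight(dg):.3g}")
```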
14 CFR 1275.101 - Definitions.
Code of Federal Regulations, 2013 CFR
2013-01-01
..., biology, engineering and physical sciences (physics and chemistry). (h) Inquiry means the assessment of..., social sciences, statistics, and biological and physical research (ground based and microgravity...
Information and material flows in complex networks
NASA Astrophysics Data System (ADS)
Helbing, Dirk; Armbruster, Dieter; Mikhailov, Alexander S.; Lefeber, Erjen
2006-04-01
In this special issue, an overview of the Thematic Institute (TI) on Information and Material Flows in Complex Systems is given. The TI was carried out within EXYSTENCE, the first EU Network of Excellence in the area of complex systems. Its motivation, research approach and subjects are presented here. Among the various methods used are many-particle and statistical physics, nonlinear dynamics, as well as complex systems, network and control theory. The contributions are relevant for complex systems as diverse as vehicle and data traffic in networks, logistics, production, and material flows in biological systems. The key disciplines involved are socio-, econo-, traffic- and bio-physics, and a new research area that could be called “biologistics”.
Spatio-temporal Eigenvector Filtering: Application on Bioenergy Crop Impacts
NASA Astrophysics Data System (ADS)
Wang, M.; Kamarianakis, Y.; Georgescu, M.
2017-12-01
A suite of 10-year ensemble-based simulations was conducted to investigate the hydroclimatic impacts of large-scale deployment of perennial bioenergy crops across the continental United States. Given the large size of the simulated dataset (about 60 TB), traditional hierarchical spatio-temporal statistical modelling cannot be implemented for the evaluation of physics parameterizations and biofuel impacts. In this work, we propose a filtering algorithm that takes into account the spatio-temporal autocorrelation structure of the data while avoiding spatial confounding. This method is used to quantify the robustness of simulated hydroclimatic impacts associated with bioenergy crops to alternative physics parameterizations and observational datasets. Results are evaluated against those obtained from three alternative Bayesian spatio-temporal specifications.
JOURNAL SCOPE GUIDELINES: Paper classification scheme
NASA Astrophysics Data System (ADS)
2005-06-01
This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org).
1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory.
2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models.
3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods.
4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics.
5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials.
6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.
National transportation statistics 2011
DOT National Transportation Integrated Search
2011-04-01
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics : (BTS), National Transportation Statistics presents information on the U.S. transportation system, including : its physical components, safety reco...
National transportation statistics 2005
DOT National Transportation Integrated Search
2005-12-01
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics 2004 presents information on the U.S. transportation system, including its physical components, sa...
National transportation statistics 2006
DOT National Transportation Integrated Search
2006-12-01
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics 2006 presents information on the U.S. transportation system, including its physical components, sa...
National Transportation Statistics 2009
DOT National Transportation Integrated Search
2010-01-21
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...
National transportation statistics 2004
DOT National Transportation Integrated Search
2005-01-01
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics 2004 presents information on the U.S. transportation system, including its physical components, sa...
Transportation statistics annual report 1999
DOT National Transportation Integrated Search
1999-01-01
The Bureau of Transportation Statistics (BTS) presents the sixth Transportation Statistics Annual Report. Mandated by Congress, the report discusses the U.S. transportation system, including its physical components, economic performance, safety...
National Transportation Statistics 2007
DOT National Transportation Integrated Search
2007-04-12
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...
National Transportation Statistics 2008
DOT National Transportation Integrated Search
2009-01-08
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...
Clinical Validation of the "Sedentary Lifestyle" Nursing Diagnosis in Secondary School Students.
de Oliveira, Marcos Renato; da Silva, Viviane Martins; Guedes, Nirla Gomes; de Oliveira Lopes, Marcos Venícios
2016-06-01
This study clinically validated the nursing diagnosis of "sedentary lifestyle" (SL) among 564 Brazilian adolescents. Measures of diagnostic accuracy were calculated for defining characteristics, and Mantel-Haenszel analysis was used to identify related factors. The measures of diagnostic accuracy showed that the following defining characteristics were statistically significant: "average daily physical activity less than recommended for gender and age," "preference for activity low in physical activity," "nonengagement in leisure time physical activities," and "diminished respiratory capacity." An SL showed statistically significant associations with the following related factors: insufficient motivation for physical activity; insufficient interest in physical activity; insufficient resources for physical activity; insufficient social support for physical activity; attitudes, beliefs, and health habits that hinder physical activity; and insufficient confidence for practicing physical exercises. The study highlighted the four defining characteristics and six related factors for making decisions related to SL among adolescents. © The Author(s) 2015.
Ea, Vuthy; Sexton, Tom; Gostan, Thierry; Herviou, Laurie; Baudement, Marie-Odile; Zhang, Yunzhe; Berlivet, Soizik; Le Lay-Taha, Marie-Noëlle; Cathala, Guy; Lesne, Annick; Victor, Jean-Marc; Fan, Yuhong; Cavalli, Giacomo; Forné, Thierry
2015-08-15
In higher eukaryotes, the genome is partitioned into large "Topologically Associating Domains" (TADs) in which the chromatin displays favoured long-range contacts. While a crumpled/fractal globule organization has received experimental support at higher-order levels, the organization principles that govern chromatin dynamics within these TADs remain unclear. Using simple polymer models, we previously showed that, in mouse liver cells, gene-rich domains tend to adopt a statistical helix shape when no significant locus-specific interaction takes place. Here, we use data from diverse 3C-derived methods to explore chromatin dynamics within mouse and Drosophila TADs. In mouse Embryonic Stem Cells (mESC), which possess large TADs (median size of 840 kb), we show that the statistical helix model, but not globule models, is relevant not only in gene-rich TADs, but also in gene-poor and gene-desert TADs. Interestingly, this statistical helix organization is considerably relaxed in mESC compared to liver cells, indicating that the impact of the constraints responsible for this organization is weaker in pluripotent cells. Finally, depletion of histone H1 in mESC alters local chromatin flexibility but not the statistical helix organization. In Drosophila, which possesses TADs of smaller sizes (median size of 70 kb), we show that, while chromatin compaction and flexibility are finely tuned according to the epigenetic landscape, chromatin dynamics within TADs is generally compatible with an unconstrained polymer configuration. Models drawn from polymer physics can accurately describe the organization principles governing chromatin dynamics in both mouse and Drosophila TADs. However, constraints applied on this dynamics within mammalian TADs have a peculiar impact resulting in a statistical helix organization.
Choi, Se Y; Ahn, Seung H; Choi, Jae D; Kim, Jung H; Lee, Byoung-Il; Kim, Jeong-In
2016-01-01
Objective: The purpose of this study was to compare CT image quality for evaluating urolithiasis using filtered back projection (FBP), statistical iterative reconstruction (IR) and knowledge-based iterative model reconstruction (IMR) according to various scan parameters and radiation doses. Methods: A 5 × 5 × 5 mm3 uric acid stone was placed in a physical human phantom at the level of the pelvis. 3 tube voltages (120, 100 and 80 kV) and 4 current–time products (100, 70, 30 and 15 mAs) were implemented in 12 scans. Each scan was reconstructed with FBP, statistical IR (Levels 5–7) and knowledge-based IMR (soft-tissue Levels 1–3). The radiation dose, objective image quality and signal-to-noise ratio (SNR) were evaluated, and subjective assessments were performed. Results: The effective doses ranged from 0.095 to 2.621 mSv. Knowledge-based IMR showed better objective image noise and SNR than did FBP and statistical IR. The subjective image noise of FBP was worse than that of statistical IR and knowledge-based IMR. The subjective assessment scores deteriorated after a break point of 100 kV and 30 mAs. Conclusion: At the setting of 100 kV and 30 mAs, the radiation dose can be decreased by approximately 84% while keeping the subjective image assessment. Advances in knowledge: Patients with urolithiasis can be evaluated with ultralow-dose non-enhanced CT using a knowledge-based IMR algorithm at a substantially reduced radiation dose with the imaging quality preserved, thereby minimizing the risks of radiation exposure while providing clinically relevant diagnostic benefits for patients. PMID:26577542
NASA Astrophysics Data System (ADS)
Kokkinaki, A.; Sleep, B. E.; Chambers, J. E.; Cirpka, O. A.; Nowak, W.
2010-12-01
Electrical Resistance Tomography (ERT) is a popular method for investigating subsurface heterogeneity. The method relies on measuring electrical potential differences and obtaining, through inverse modeling, the underlying electrical conductivity field, which can be related to hydraulic conductivities. The quality of site characterization strongly depends on the utilized inversion technique. Standard ERT inversion methods, though highly computationally efficient, do not consider spatial correlation of soil properties; as a result, they often underestimate the spatial variability observed in earth materials, thereby producing unrealistic subsurface models. Also, these methods do not quantify the uncertainty of the estimated properties, thus limiting their use in subsequent investigations. Geostatistical inverse methods can be used to overcome both these limitations; however, they are computationally expensive, which has hindered their wide use in practice. In this work, we compare a standard Gauss-Newton smoothness constrained least squares inversion method against the quasi-linear geostatistical approach using the three-dimensional ERT dataset of the SABRe (Source Area Bioremediation) project. The two methods are evaluated for their ability to: a) produce physically realistic electrical conductivity fields that agree with the wide range of data available for the SABRe site while being computationally efficient, and b) provide information on the spatial statistics of other parameters of interest, such as hydraulic conductivity. To explore the trade-off between inversion quality and computational efficiency, we also employ a 2.5-D forward model with corrections for boundary conditions and source singularities. The 2.5-D model accelerates the 3-D geostatistical inversion method. New adjoint equations are developed for the 2.5-D forward model for the efficient calculation of sensitivities. Our work shows that spatial statistics can be incorporated in large-scale ERT inversions to improve the inversion results without making them computationally prohibitive.
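The two inversion styles compared above differ mainly in the regularizing term of the linearized normal equations. The sketch below (a generic Gauss-Newton update, not the SABRe codes) contrasts an Occam-style smoothness constraint with a prior-covariance, MAP-style term in the spirit of the geostatistical approach; all symbols and the toy linear forward problem are our own assumptions.

```python
import numpy as np

def gn_step_smoothness(J, W, d_obs, d_sim, m, L, lam):
    """One Gauss-Newton step with Occam-style smoothness regularization."""
    A = J.T @ W @ J + lam * (L.T @ L)
    b = J.T @ W @ (d_obs - d_sim) - lam * (L.T @ L) @ m
    return m + np.linalg.solve(A, b)

def gn_step_geostatistical(J, R, Q, d_obs, d_sim, m, m_prior):
    """One MAP-style step with prior covariance Q replacing the smoother
    (a schematic Bayesian variant, not Kitanidis' exact quasi-linear algorithm)."""
    A = J.T @ np.linalg.solve(R, J) + np.linalg.inv(Q)
    b = J.T @ np.linalg.solve(R, d_obs - d_sim) + np.linalg.solve(Q, m_prior - m)
    return m + np.linalg.solve(A, b)

# Toy usage on a linear forward problem d = G m + noise.
rng = np.random.default_rng(1)
n, p = 40, 20
G = rng.normal(size=(n, p))
m_true = np.sin(np.linspace(0, 3, p))
d_obs = G @ m_true + 0.05 * rng.normal(size=n)
L = np.eye(p) - np.eye(p, k=1)        # first-difference roughness operator
m0 = np.zeros(p)
m1 = gn_step_smoothness(G, np.eye(n), d_obs, G @ m0, m0, L, lam=1.0)
m2 = gn_step_geostatistical(G, 0.05**2 * np.eye(n), 0.5 * np.eye(p),
                            d_obs, G @ m0, m0, np.zeros(p))
print(np.linalg.norm(m1 - m_true), np.linalg.norm(m2 - m_true))
```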
National transportation statistics 2000
DOT National Transportation Integrated Search
2000-04-13
Compiled and published by the Bureau of Transportation Statistics (BTS), U.S. Department of Transportation, National Transportation Statistics 2000 presents information on the U.S. transportation system, including its physical components, safety reco...
National transportation statistics 2003
DOT National Transportation Integrated Search
2004-03-01
Compiled and published by the Bureau of Transportation Statistics (BTS), U.S. Department of Transportation, National Transportation Statistics 2002 presents information on the U.S. transportation system, including its physical components, safe...
National transportation statistics 2010
DOT National Transportation Integrated Search
2010-01-01
National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...
National transportation statistics 2002
DOT National Transportation Integrated Search
2002-12-01
Compiled and published by the Bureau of Transportation Statistics (BTS), U.S. Department of Transportation, National Transportation Statistics 2002 presents information on the U.S. transportation system, including its physical components, safe...
National Transportation Statistics 2000
DOT National Transportation Integrated Search
2001-04-01
Compiled and published by the Bureau of Transportation Statistics (BTS), U.S. Department of Transportation, National Transportation Statistics 2000 presents information on the U.S. transportation system, including its physical components, safety reco...
NASA Technical Reports Server (NTRS)
Grotjahn, Richard; Black, Robert; Leung, Ruby; Wehner, Michael F.; Barlow, Mathew; Bosilovich, Michael G.; Gershunov, Alexander; Gutowski, William J., Jr.; Gyakum, John R.; Katz, Richard W.;
2015-01-01
The objective of this paper is to review statistical methods, dynamics, modeling efforts, and trends related to temperature extremes, with a focus upon extreme events of short duration that affect parts of North America. These events are associated with large scale meteorological patterns (LSMPs). The statistics, dynamics, and modeling sections of this paper are written to be autonomous and so can be read separately. Methods to define extreme event statistics and to identify and connect LSMPs to extreme temperature events are presented. Recent advances in statistical techniques connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. Various LSMPs, ranging from synoptic to planetary scale structures, are associated with extreme temperature events. Current knowledge about the synoptics and the dynamical mechanisms leading to the associated LSMPs is incomplete. Systematic studies of the physics of LSMP life cycles, comprehensive model assessment of LSMP-extreme temperature event linkages, and LSMP properties are needed. Generally, climate models capture observed properties of heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency, underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Modeling studies have identified the impact of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs to more specifically understand the role of LSMPs in past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated. The paper concludes with unresolved issues and research questions.
Direct Statistical Simulation of Astrophysical and Geophysical Flows
NASA Astrophysics Data System (ADS)
Marston, B.; Tobias, S.
2011-12-01
Astrophysical and geophysical flows are amenable to direct statistical simulation (DSS), the calculation of statistical properties that does not rely upon accumulation by direct numerical simulation (DNS) (Tobias and Marston, 2011). Anisotropic and inhomogeneous flows, such as those found in the atmospheres of planets, in rotating stars, and in disks, provide the starting point for an expansion in fluctuations about the mean flow, leading to a hierarchy of equations of motion for the equal-time cumulants. The method is described for a general set of evolution equations, and then illustrated for two specific cases: (i) a barotropic jet on a rotating sphere (Marston, Conover, and Schneider, 2008); and (ii) a model of a stellar tachocline driven by relaxation to an underlying flow with shear (Cally 2001), for which a joint instability arises from the combination of shearing forces and magnetic stress. The reliability of DSS is assessed by comparing statistics so obtained against those accumulated from DNS, the traditional approach. The simplest non-trivial closure, CE2, sets the third and higher cumulants to zero yet yields qualitatively accurate low-order statistics for both systems. Physically, CE2 retains only the eddy-mean flow interaction and drops the eddy-eddy interaction. Quantitatively accurate zonal means are found for the barotropic jet for long and short (but not intermediate) relaxation times, and for the Cally problem in the case of strong shearing and large magnetic fields. Deficiencies in CE2 can be repaired at the CE3 level, that is, by retaining the third cumulant (Marston 2011). We conclude by discussing possible extensions of the method both in terms of computational methods and the range of astrophysical and geophysical problems that are of interest.
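The CE2 truncation can be illustrated on a toy scalar problem. The sketch below (our construction, not the barotropic-jet or tachocline systems of the abstract) integrates the first two cumulant equations of a quadratically nonlinear SDE with the third cumulant set to zero, and compares against brute-force Monte Carlo in place of DNS; sigma, dt and the drift are illustrative choices.

```python
import numpy as np

# CE2-style closure for the toy SDE  dx = (-x - x^2) dt + sigma dW.
# Writing x = m + e, the exact hierarchy couples m, c2 = <e^2>, c3 = <e^3>, ...;
# CE2 closes it by setting the third cumulant to zero (eddy-eddy term dropped).
sigma, dt, nsteps = 0.2, 1e-3, 10_000

m, c2 = 0.0, 0.0
for _ in range(nsteps):
    dm = -m - (m**2 + c2)                        # d<x>/dt (exact)
    dc2 = -2.0 * c2 - 4.0 * m * c2 + sigma**2    # dc2/dt with c3 = 0 (CE2)
    m, c2 = m + dt * dm, c2 + dt * dc2

# Monte Carlo "DNS" of the same SDE for comparison.
rng = np.random.default_rng(2)
x = np.zeros(20_000)
for _ in range(nsteps):
    x += dt * (-x - x**2) + sigma * np.sqrt(dt) * rng.normal(size=x.size)
print(f"CE2: mean={m:.4f}, var={c2:.4f} | DNS: mean={x.mean():.4f}, var={x.var():.4f}")
```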
Tang, Jie; Nett, Brian E; Chen, Guang-Hong
2009-10-07
Of all available reconstruction methods, statistical iterative reconstruction algorithms appear particularly promising since they enable accurate physical noise modeling. The newly developed compressive sampling/compressed sensing (CS) algorithm has shown the potential to accurately reconstruct images from highly undersampled data. The CS algorithm can be implemented in the statistical reconstruction framework as well. In this study, we compared the performance of two standard statistical reconstruction algorithms (penalized weighted least squares and q-GGMRF) to the CS algorithm. In assessing the image quality using these iterative reconstructions, it is critical to utilize realistic background anatomy as the reconstruction results are object dependent. A cadaver head was scanned on a Varian Trilogy system at different dose levels. Several figures of merit including the relative root mean square error and a quality factor which accounts for the noise performance and the spatial resolution were introduced to objectively evaluate reconstruction performance. A comparison is presented between the three algorithms for a constant undersampling factor comparing different algorithms at several dose levels. To facilitate this comparison, the original CS method was formulated in the framework of the statistical image reconstruction algorithms. Important conclusions of the measurements from our studies are that (1) for realistic neuro-anatomy, over 100 projections are required to avoid streak artifacts in the reconstructed images even with CS reconstruction, (2) regardless of the algorithm employed, it is beneficial to distribute the total dose to more views as long as each view remains quantum noise limited and (3) the total variation-based CS method is not appropriate for very low dose levels because while it can mitigate streaking artifacts, the images exhibit patchy behavior, which is potentially harmful for medical diagnosis.
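A minimal sketch of the penalized weighted least squares idea with a smoothed total-variation penalty, solved by plain gradient descent on a toy undersampled 1-D problem; neither the geometry, the statistical weights, nor the solver matches the paper's CT setting, and all parameter values are assumptions.

```python
import numpy as np

def pwls_tv(A, y, w, beta, n_iter=2000, step=5e-3, eps=1e-6):
    """Minimize ||A x - y||^2_w + beta * TV_smooth(x) by gradient descent
    (illustrative solver, not the paper's q-GGMRF or CS implementation)."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad_fid = A.T @ (w * (A @ x - y))    # statistically weighted data term
        d = np.diff(x)
        t = d / np.sqrt(d**2 + eps)           # derivative of smoothed |.|
        tv_grad = np.zeros_like(x)
        tv_grad[:-1] -= t                     # d TV / d x_k = t[k-1] - t[k]
        tv_grad[1:] += t
        x -= step * (grad_fid + beta * tv_grad)
    return x

# Toy undersampled problem: piecewise-constant signal, random projections.
rng = np.random.default_rng(3)
x_true = np.repeat([0.0, 1.0, 0.3], 20)
A = rng.normal(size=(30, x_true.size))        # 30 measurements, 60 unknowns
y = A @ x_true + 0.01 * rng.normal(size=30)
x_hat = pwls_tv(A, y, w=np.ones(30), beta=1.0)
print(f"relative RMSE: {np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true):.3f}")
```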
ERIC Educational Resources Information Center
Leaf, Donald C., Comp.; Neely, Linda, Comp.
This edition focuses on statistical data supplied by Michigan public libraries, public library cooperatives, and those public libraries which serve as regional or subregional outlets for blind and physically handicapped services. Since statistics in Michigan academic libraries are typically collected in odd-numbered years, they are not included…
Biocultural approach of the association between maturity and physical activity in youth.
Werneck, André O; Silva, Danilo R; Collings, Paul J; Fernandes, Rômulo A; Ronque, Enio R V; Coelho-E-Silva, Manuel J; Sardinha, Luís B; Cyrino, Edilson S
2017-11-13
To test the biocultural model through direct and indirect associations between biological maturation, adiposity, cardiorespiratory fitness, feelings of sadness, social relationships, and physical activity in adolescents. This was a cross-sectional study conducted with 1,152 Brazilian adolescents aged between 10 and 17 years. Somatic maturation was estimated through Mirwald's method (peak height velocity). Physical activity was assessed through the Baecke questionnaire (occupational, leisure, and sport contexts). Body mass index, body fat (sum of skinfolds), cardiorespiratory fitness (20-m shuttle run test), self-perceptions of social relationships, and frequency of feelings of sadness were obtained for statistical modeling. Somatic maturation is directly related to sport practice and leisure-time physical activity only among girls (β=0.12 and β=0.09, respectively, both p<0.05). Moreover, biological (adiposity and cardiorespiratory fitness), psychological (sadness), and social (satisfaction with social relationships) variables mediated the association between maturity and physical activity in boys, and for occupational physical activity in girls. In general, the models presented good fit coefficients. The biocultural model presents good fit, and emotional/biological factors mediate part of the relationship between somatic maturation and physical activity. Copyright © 2017 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
Physical characteristics related to bra fit.
Chen, Chin-Man; LaBat, Karen; Bye, Elizabeth
2010-04-01
Producing well-fitting garments has been a challenge for retailers and manufacturers since mass production began. Poorly fitted bras can cause discomfort or pain and result in lost sales for retailers. Because body contours are important factors affecting bra fit, this study analyses the relationship of physical characteristics to bra-fit problems. This study has used 3-D body-scanning technology to extract upper body angles from a sample of 103 college women; these data were used to categorise physical characteristics into shoulder slope, bust prominence, back curvature and acromion placement. Relationships between these physical categories and bra-fit problems were then analysed. Results show that significant main effects and two-way interactions of the physical categories exist in the fit problems of poor bra support and bra-motion restriction. The findings are valuable in helping the apparel industry create better-fitting bras. STATEMENT OF RELEVANCE: Poorly fitted bras can cause discomfort or pain and result in lost sales for retailers. The findings regarding body-shape classification provide researchers with a statistical method to quantify physical characteristics, and the findings regarding the relationship between physical characteristics and bra fit offer bra companies valuable information about bra-fit perceptions attributable to women with figure variations.
Statistical Physics in the Era of Big Data
ERIC Educational Resources Information Center
Wang, Dashun
2013-01-01
With the wealth of data provided by a wide range of high-throughout measurement tools and technologies, statistical physics of complex systems is entering a new phase, impacting in a meaningful fashion a wide range of fields, from cell biology to computer science to economics. In this dissertation, by applying tools and techniques developed in…
Factors of physical activity among Chinese children and adolescents: a systematic review.
Lu, Congchao; Stolk, Ronald P; Sauer, Pieter J J; Sijtsma, Anna; Wiersma, Rikstje; Huang, Guowei; Corpeleijn, Eva
2017-03-21
Lack of physical activity is a growing problem in China, due to the fast economic development and changing living environment over the past two decades. The aim of this review is to summarize the factors related to physical activity in Chinese children and adolescents during this distinct period of development. A systematic search was completed on Jan 10th, 2017, and identified 2200 hits through PubMed and Web of Science. English-language published studies were included if they reported statistical associations between factors and physical activity. Adapted criteria from the Strengthening The Reporting of OBservational studies in Epidemiology (STROBE) statement and the evaluation of the quality of prognosis studies in systematic reviews (QUIPS) were used to assess the risk of bias of the included studies. Related factors that were reported in at least three studies were summarized separately for children and adolescents using a semi-quantitative method. Forty-two papers (published 2002-2016) were included. Most designs were cross-sectional (79%), and most studies used questionnaires to assess physical activity. Sample size was above 1000 in 18 papers (43%). Thirty-seven studies (88%) showed acceptable quality by methodological quality assessment. Most studies reported a low level of physical activity. Boys were consistently more active than girls, parental physical activity was positively associated with children's and adolescents' physical activity, children in suburban/rural regions were less active than those in urban regions, and, specifically in adolescents, self-efficacy was positively associated with physical activity. Family socioeconomic status and parental education were not associated with physical activity in children and adolescents. The studies included in this review were large but mostly of low quality in terms of study design (cross-sectional) and methods (questionnaires). Parental physical activity and self-efficacy are promising targets for future physical activity promotion programmes. The low level of physical activity raises concern, especially in suburban/rural regions. Future research is required to enhance our understanding of other influences, such as the physical environment, especially in early childhood.
Krutulyte, Grazina; Kimtys, Algimantas; Krisciūnas, Aleksandras
2003-01-01
The purpose of this study was to examine whether two different physiotherapy regimes caused any differences in outcome in rehabilitation after stroke. We examined 240 patients with stroke. The examination was carried out at the Rehabilitation Center of Kaunas Second Clinical Hospital. Patients were divided into 2 groups: the Bobath method was applied to the first (I) group (n=147), and the motor relearning program (MRP) method was applied to the second (II) group (n=93). In each group of patients we established samples according to sex, age, time from occurrence of CVA to hospitalization in the rehabilitation unit, and degree of disorder (hemiplegia, hemiparesis). The mobility of patients was evaluated according to the European Federation for Research in Rehabilitation (EFRR) scale. Activities of daily living were evaluated by the Barthel index. The groups were evaluated before physical therapy; this preliminary analysis showed no statistically reliable differences between the groups (reliability 95%). The same statistical analysis was carried out after physical therapy. Differences between the patient groups were compared using the chi-squared method. The Bobath method was applied when working with the first group of patients. The aim of the method is to improve the quality of the affected body side's movements in order to keep both sides working as harmoniously as possible. While applying this method, the physical therapist guides the patient's body at key points, stimulating normal postural reactions and training normal movement patterns. The MRP method was used while working with the second group of patients. This method is based on movement science, biomechanics and training of functional movement. The program is based on the idea that movement patterns should not be trained; they must be relearned. CONCLUSION. This study indicates that physiotherapy with task-oriented strategies, represented by MRP, is preferable to physiotherapy with facilitation/inhibition strategies, such as the Bobath programme, in the rehabilitation of stroke patients (p<0.05).
NASA Astrophysics Data System (ADS)
Qi, Di
Turbulent dynamical systems are ubiquitous in science and engineering. Uncertainty quantification (UQ) in turbulent dynamical systems is a grand challenge where the goal is to obtain statistical estimates for key physical quantities. In the development of a proper UQ scheme for systems characterized by both a high-dimensional phase space and a large number of instabilities, significant model errors compared with the true natural signal are always unavoidable due to both the imperfect understanding of the underlying physical processes and the limited computational resources available. One central issue in contemporary research is the development of a systematic methodology for reduced order models that can recover the crucial features both with model fidelity in statistical equilibrium and with model sensitivity in response to perturbations. In the first part, we discuss a general mathematical framework to construct statistically accurate reduced-order models that have skill in capturing the statistical variability in the principal directions of a general class of complex systems with quadratic nonlinearity. A systematic hierarchy of simple statistical closure schemes, which are built through new global statistical energy conservation principles combined with statistical equilibrium fidelity, are designed and tested for UQ of these problems. Second, the capacity of imperfect low-order stochastic approximations to model extreme events in a passive scalar field advected by turbulent flows is investigated. The effects in complicated flow systems are considered including strong nonlinear and non-Gaussian interactions, and much simpler and cheaper imperfect models with model error are constructed to capture the crucial statistical features in the stationary tracer field. Several mathematical ideas are introduced to improve the prediction skill of the imperfect reduced-order models. Most importantly, empirical information theory and statistical linear response theory are applied in the training phase for calibrating model errors to achieve optimal imperfect model parameters; and total statistical energy dynamics are introduced to improve the model sensitivity in the prediction phase especially when strong external perturbations are exerted. The validity of reduced-order models for predicting statistical responses and intermittency is demonstrated on a series of instructive models with increasing complexity, including the stochastic triad model, the Lorenz '96 model, and models for barotropic and baroclinic turbulence. The skillful low-order modeling methods developed here should also be useful for other applications such as efficient algorithms for data assimilation.
Abbaspour, Seddigheh; Farmanbar, Rabiollah; Njafi, Fateme; Ghiasvand, Arezoo Mohamadkhani; Dehghankar, Leila
2017-01-01
Background: Regular physical activity has been considered a form of health promotion, and identifying the psycho-social variables that affect physical activity has proven to be essential. Objective: To identify the relationship between decisional balance and self-efficacy in physical activity using the transtheoretical model among the members of a retirement center in Rasht, Guilan. Methods: A descriptive cross-sectional study was conducted in 2013 using convenience sampling on 262 elderly people who were members of retirement centers in Rasht. Data were collected using the Stages of Change, Decisional Balance, Self-efficacy and Physical Activity Scale for the Elderly (PASE) instruments. Data were analyzed using SPSS-16 software and descriptive and analytic statistics (Pearson correlation, Spearman, ANOVA, Tukey HSD, linear and ordinal regression). Results: The majority of participants were in the maintenance stage. The mean and standard deviation of physical activity for the elderly was 119.35±51.50. Stages of change and physical activity were significantly associated with decisional balance and self-efficacy (p<0.0001); however, cons had a significant and inverse association. According to linear and ordinal regression, the only predictor variable of physical activity behavior was self-efficacy. Conclusion: Increases in pros and self-efficacy regarding physical activity can be drawn upon in designing appropriate intervention programs. PMID:28713520
14 CFR § 1275.101 - Definitions.
Code of Federal Regulations, 2014 CFR
2014-01-01
..., biology, engineering and physical sciences (physics and chemistry). (h) Inquiry means the assessment of..., psychology, social sciences, statistics, and biological and physical research (ground based and microgravity...
NASA Astrophysics Data System (ADS)
Vrac, Mathieu
2018-06-01
Climate simulations often suffer from statistical biases with respect to observations or reanalyses. It is therefore common to correct (or adjust) those simulations before using them as inputs into impact models. However, most bias correction (BC) methods are univariate and so do not account for the statistical dependences linking the different locations and/or physical variables of interest. In addition, they are often deterministic, and stochasticity is frequently needed to investigate climate uncertainty and to add constrained randomness to climate simulations that do not possess a realistic variability. This study presents a multivariate method of rank resampling for distributions and dependences (R2D2) bias correction allowing one to adjust not only the univariate distributions but also their inter-variable and inter-site dependence structures. Moreover, the proposed R2D2 method provides some stochasticity since it can generate as many multivariate corrected outputs as the number of statistical dimensions (i.e., number of grid cells × number of climate variables) of the simulations to be corrected. It is based on an assumption of stability in time of the dependence structure - making it possible to deal with a high number of statistical dimensions - that lets the climate model drive the temporal properties and their changes in time. R2D2 is applied on temperature and precipitation reanalysis time series with respect to high-resolution reference data over the southeast of France (1506 grid cells). Bivariate, 1506-dimensional and 3012-dimensional versions of R2D2 are tested over a historical period and compared to a univariate BC. How the different BC methods behave in a climate change context is also illustrated with an application to regional climate simulations over the 2071-2100 period. The results indicate that the 1d-BC basically reproduces the climate model multivariate properties, 2d-R2D2 is only satisfying in the inter-variable context, 1506d-R2D2 strongly improves inter-site properties and 3012d-R2D2 is able to account for both. Applications of the proposed R2D2 method to various climate datasets are relevant for many impact studies. The perspectives of improvements are numerous, such as introducing stochasticity in the dependence itself, questioning its stability assumption, and accounting for temporal properties adjustment while including more physics in the adjustment procedures.
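A drastically reduced illustration of the rank-resampling idea follows; the function name, the pivot construction, and the equal-length calibration assumption are ours, not the published R2D2 algorithm (in this simplification the marginal correction reduces to substituting reference quantiles).

```python
import numpy as np

def r2d2_like(model, ref, pivot=0):
    """Reduced R2D2-style correction: quantile-map each column to the reference
    marginal, then reorder so the joint rank structure copies the reference,
    with the pivot column keeping the model's chronology (our simplification)."""
    n, d = ref.shape
    assert model.shape == ref.shape, "sketch assumes equal-length series"
    ref_ranks = np.argsort(np.argsort(ref, axis=0), axis=0)
    by_pivot = ref_ranks[np.argsort(ref_ranks[:, pivot])]  # rows ordered by pivot rank
    pivot_rank = np.argsort(np.argsort(model[:, pivot]))   # model chronology of pivot
    out = np.empty((n, d))
    for j in range(d):
        ref_quantiles = np.sort(ref[:, j])                 # corrected marginal values
        out[:, j] = ref_quantiles[by_pivot[pivot_rank, j]]
    return out

# Toy check: biased, dependence-free "model" vs a correlated "reference".
rng = np.random.default_rng(4)
n = 2000
ref = rng.multivariate_normal([0, 0], [[1, 0.8], [0.8, 1]], size=n)
model = rng.normal(size=(n, 2)) * 2 + 1
corrected = r2d2_like(model, ref)
print(f"corrected inter-variable correlation: {np.corrcoef(corrected.T)[0, 1]:.2f}")
```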
Detecting anomalies in CMB maps: a new method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neelakanta, Jayanth T., E-mail: jayanthtn@gmail.com
2015-10-01
Ever since WMAP announced its first results, different analyses have shown that there is weak evidence for several large-scale anomalies in the CMB data. While the evidence for each anomaly appears to be weak, the fact that there are multiple seemingly unrelated anomalies makes it difficult to account for them via a single statistical fluke. So, one is led to considering a combination of these anomalies. But, if we "hand-pick" the anomalies (test statistics) to consider, we are making an a posteriori choice. In this article, we propose two statistics that do not suffer from this problem. The statistics are linear and quadratic combinations of the a_ℓm's with random coefficients, and they test the null hypothesis that the a_ℓm's are independent, normally-distributed, zero-mean random variables with an m-independent variance. The motivation for considering multiple modes is this: because most physical models that lead to large-scale anomalies result in coupling multiple ℓ and m modes, the "coherence" of this coupling should get enhanced if a combination of different modes is considered. In this sense, the statistics are much more generic than those that have been hitherto considered in the literature. Using fiducial data, we demonstrate that the method works and discuss how it can be used with actual CMB data to make quite general statements about the incompatibility of the data with the null hypothesis.
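A schematic of the proposed test under stated assumptions: coefficients drawn once and then held fixed, a Gaussian null for the a_ℓm's, and a Monte Carlo null distribution; the "observed" vector here is itself simulated, standing in for real map coefficients.

```python
import numpy as np

def combo_stat(alm, coeffs):
    """Linear combination of the a_lm's with fixed random coefficients."""
    return np.sum(coeffs * alm)

rng = np.random.default_rng(5)
lmax = 32
n_modes = sum(2 * l + 1 for l in range(2, lmax + 1))   # real-valued stand-ins for a_lm
coeffs = rng.normal(size=n_modes)                      # drawn once, then held fixed

# Null hypothesis: independent zero-mean Gaussian a_lm, m-independent variance.
null = np.array([combo_stat(rng.normal(size=n_modes), coeffs)
                 for _ in range(10_000)])

observed = combo_stat(rng.normal(size=n_modes), coeffs)   # stand-in for data
p_value = np.mean(np.abs(null) >= np.abs(observed))       # two-sided Monte Carlo p-value
print(f"p = {p_value:.3f}")
```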
Combining Statistics and Physics to Improve Climate Downscaling
NASA Astrophysics Data System (ADS)
Gutmann, E. D.; Eidhammer, T.; Arnold, J.; Nowak, K.; Clark, M. P.
2017-12-01
Getting useful information from climate models is an ongoing problem that has plagued climate science and hydrologic prediction for decades. While it is possible to develop statistical corrections for climate models that mimic current climate almost perfectly, this does not necessarily guarantee that future changes are portrayed correctly. In contrast, convection permitting regional climate models (RCMs) have begun to provide an excellent representation of the regional climate system purely from first principles, providing greater confidence in their change signal. However, the computational cost of such RCMs prohibits the generation of ensembles of simulations or long time periods, thus limiting their applicability for hydrologic applications. Here we discuss a new approach combining statistical corrections with physical relationships for a modest computational cost. We have developed the Intermediate Complexity Atmospheric Research model (ICAR) to provide a climate and weather downscaling option that is based primarily on physics for a fraction of the computational requirements of a traditional regional climate model. ICAR also enables the incorporation of statistical adjustments directly within the model. We demonstrate that applying even simple corrections to precipitation while the model is running can improve the simulation of land atmosphere feedbacks in ICAR. For example, by incorporating statistical corrections earlier in the modeling chain, we permit the model physics to better represent the effect of mountain snowpack on air temperature changes.
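The idea of applying a statistical correction while the model is running, rather than post-processing its output, can be sketched in a few lines; the bias ratio and the toy land-surface update below are assumptions for illustration, not ICAR's actual correction scheme.

```python
import numpy as np

# Minimal sketch: correct precipitation inside the time loop so that the land
# surface (and hence land-atmosphere feedbacks) sees the adjusted forcing.
rng = np.random.default_rng(8)
bias_ratio = 1.3                  # assumed obs/model climatological precip ratio

soil_moisture = 0.3               # toy land-surface state
for step in range(100):
    precip_model = max(0.0, rng.normal(2.0, 1.5))    # raw modeled precip (mm)
    precip_corrected = bias_ratio * precip_model     # correct before coupling
    # toy bucket model: recharge by corrected precip, linear drainage
    soil_moisture += 0.01 * precip_corrected - 0.02 * soil_moisture
print(f"final soil moisture index: {soil_moisture:.3f}")
```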
Generalized statistical mechanics approaches to earthquakes and tectonics.
Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios
2016-12-01
Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.
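The NESM description rests on the Tsallis q-exponential; a minimal sketch follows, with q and the scale M0 chosen for illustration (values of q near 1.5 are of the order reported for seismicity, but these are not fitted parameters).

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)

# NESM-style cumulative magnitude distribution: P(>M) = q_exp(-M / M0, q).
M = np.linspace(0.0, 6.0, 7)
print(q_exp(-M / 1.0, q=1.5))     # illustrative q and M0, not fitted values
```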
Modelling the hydraulic conductivity of porous media using physical-statistical model
NASA Astrophysics Data System (ADS)
Usowicz, B.; Usowicz, L. B.; Lipiec, J.
2009-04-01
Soils and other porous media can be represented by a pattern (net) of more or less cylindrically interconnected channels. The capillary radius r can represent an elementary capillary formed between soil particles in one case, and in another case it can represent a mean hydrodynamic radius. When we view a porous medium as a net of interconnected capillaries, we can apply a statistical approach to the description of liquid or gas flow. The solid phase is included in the porous medium, and its configuration determines the pore distribution in the medium and hence conditions the course of its water retention curve. In this work, a method of estimating the hydraulic conductivity of porous media based on the physical-statistical model proposed by B. Usowicz is presented. The physical-statistical model considers the pore space as a capillary net. The net of capillary connections is represented by parallel and serial connections of hydraulic resistors within a layer and between layers, respectively. A polynomial distribution is used in this model to determine the probability of the occurrence of a given capillary configuration. The model was calibrated using a measured water retention curve and two values of hydraulic conductivity (saturated and unsaturated), and the model parameters were determined. The model was used for predicting hydraulic conductivity as a function of soil water content, K(theta). The model was validated by comparing measured and predicted K data for various soils and other porous media (e.g. sandstone). Agreement between measured and predicted data was good, as indicated by R2 values (>0.9). It was also confirmed that the random variables used for the calculations and the model parameters were chosen and selected correctly. The study was funded in part by the Polish Ministry of Science and Higher Education by Grant No. N305 046 31/1707.
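A toy rendering of the capillary-network picture: within a layer, capillaries act as parallel hydraulic resistors (Poiseuille conductances add), successive layers act in series (resistances add), and a multinomial draw stands in for the model's polynomial configuration distribution. The radii, class probabilities, layer counts, and their dependence on water content are all assumptions, not the calibrated model.

```python
import numpy as np

rng = np.random.default_rng(6)
radii = np.array([1e-6, 5e-6, 2e-5])        # m; three illustrative capillary classes

def layer_conductance(theta, n_cap=1000):
    """Parallel sum of Poiseuille-like conductances (g ~ r^4) in one layer."""
    # assumption: wetter soil makes larger water-filled capillaries more probable
    p = np.array([1.0 - theta, 0.7 * theta, 0.3 * theta])
    p /= p.sum()
    counts = rng.multinomial(n_cap, p)       # multinomial capillary configuration
    return counts @ radii**4                 # arbitrary units

def network_conductivity(theta, n_layers=50):
    """Layers in series: resistances add, i.e. a harmonic-type combination."""
    g = np.array([layer_conductance(theta) for _ in range(n_layers)])
    return n_layers / np.sum(1.0 / g)

for theta in (0.1, 0.25, 0.4):
    print(f"theta = {theta}: K ~ {network_conductivity(theta):.3e} (arbitrary units)")
```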
Statistical physics approach to earthquake occurrence and forecasting
NASA Astrophysics Data System (ADS)
de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio
2016-04-01
There is striking evidence that the dynamics of the Earth's crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth's crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in the size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimension and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow one to discriminate among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different levels of prediction. In this review we also briefly discuss how the statistical mechanics approach can be applied to non-tectonic earthquakes and to other natural stochastic processes, such as volcanic eruptions and solar flares.
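The branching description can be made concrete with the standard ETAS conditional intensity (background rate plus Omori-law aftershock contributions); the parameterization below is one common form, and the parameter values and event history are illustrative, not fitted.

```python
import numpy as np

def etas_intensity(t, history, mu=0.2, K=0.05, alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """ETAS conditional intensity at time t (days):
    lambda(t) = mu + sum_i K * exp(alpha*(m_i - m0)) * (t - t_i + c)^(-p)
    over all past events (t_i, m_i). Parameter values are illustrative."""
    lam = mu
    for t_i, m_i in history:
        if t_i < t:
            lam += K * np.exp(alpha * (m_i - m0)) * (t - t_i + c) ** (-p)
    return lam

history = [(0.0, 5.0), (1.5, 4.2), (1.6, 3.5)]   # (time in days, magnitude)
print(f"intensity at t=2.0: {etas_intensity(2.0, history):.3f} events/day")
```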
2014-01-01
Background: Low levels of physical activity, musculoskeletal morbidity and weight gain are commonly reported problems in children with cancer. Intensive medical treatment and a decline in physical activity may also result in reduced motor performance. Therefore, simple and inexpensive ways to promote physical activity and exercise are becoming an increasingly important part of children's cancer treatment. Methods: The aim of this study is to evaluate the effect of active video games in promotion of physical activity in children with cancer. The research is conducted as a parallel randomized clinical trial with follow-up. Patients between 3 and 16 years old, diagnosed with cancer and treated with vincristine in two specialized medical centers are asked to participate. Based on statistical estimates, the target enrollment is 40 patients. The intervention includes playing elective active video games and, in addition, education and consultations for the family. The control group will receive a general recommendation for physical activity for 30 minutes per day. The main outcomes are the amount of physical activity and sedentary behavior. Other outcomes include motor performance, fatigue and metabolic risk factors. The outcomes are examined with questionnaires, diaries, physical examinations and blood tests at baseline and at 2, 6, 12 and 30 months after the baseline. Additionally, the children's perceptions of the most enjoyable activation methods are explored through an interview at 2 months. Discussion: This trial will help to answer the question of whether playing active video games is beneficial for children with cancer. It will also provide further reasoning for physical activity promotion and training of motor skills during treatment. Trial registration: ClinicalTrials.gov identifier: NCT01748058 (October 15, 2012). PMID:24708773
Saber, Fatemeh; Shanazi, Hossein; Sharifirad, Gholamreza; Hasanzadeh, Akbar
2014-01-01
Background: Sedentary life has been recognized as a serious problem in today's Iranian society. Promoting a lifestyle with increased physical activity and prevention of cardiovascular disease (CVD) is imperative. The purpose of this study was to identify the determinants of physical activity in the housewives of Nain city in 2012 based on the theory of planned behavior. Materials and Methods: In this cross-sectional study, 120 housewives were selected by a simple random sampling method. The data collection tool was a questionnaire designed on the basis of a standardized, purpose-built questionnaire and consisting of four parts, covering awareness, the constructs of the theory of planned behavior, and physical activity. Data analysis was performed using the SPSS software version 18 and associated statistical tests. Findings: The 120 housewives under study had a mean age of 34.58 ± 6.86 years. The mean scores of the awareness, attitude, motivation to perform, subjective norms, and perceived behavioral control variables were 74.1 ± 18.5, 82.6 ± 12.1, 59.4 ± 21.7, 63.2 ± 21.2, and 48.1 ± 12.9, respectively. There was a significant relationship between the motivation for physical activity among women and knowledge (P = 0.02), attitude (P = 0.04), subjective norms (P = 0.002), perceived behavioral control (P = 0.001), and physical activity (P = 0.04). Conclusions: It seems that the housewives, despite being aware of and having a positive attitude toward the benefits of physical activity, had a poor lifestyle. Further studies may help in finding the causes of this issue and the barriers to physical activity, and inform plans for improving physical activity, in order to promote women's health, which has a significant role in family and community health. PMID:25250360
Physics Trends flyers & high school flyers
NASA Astrophysics Data System (ADS)
White, Susan C.
2016-03-01
Since 2000, we have published a series of flyers highlighting various data of interest to physics faculty members and students. For example, our Fall 2015 Physics Trends flyers display the employment sectors where physics bachelor's degree recipients work, the knowledge used frequently by mid-career PhD physicists working primarily in private-sector jobs, and the proportion of women among physics faculty members. We have recently added a new resource for high school physics teachers: flyers focusing on high school physics. PDFs of both the Physics Trends and high school flyers are available for download at www.aip.org/statistics/physics-trends and www.aip.org/statistics/highschool. We also have a limited number of printed copies of the Physics Trends flyers, which we are happy to send to you upon request. We appreciate the responses from each of you who has helped us collect these data. Next month we will look at Hispanic representation among bachelor's degree recipients in the physical sciences and engineering.
NASA Astrophysics Data System (ADS)
Yu, Fu-Yun; Liu, Yu-Hsin
2005-09-01
The potential value of a multiple-choice question-construction instructional strategy for supporting students’ learning of physics experiments was examined in this study. Forty-two university freshmen participated for a whole semester. A constant comparison method used to categorize students’ qualitative data indicated that the influences of multiple-choice question construction were evident in several significant ways (promoting constructive and productive study habits; reflecting on and previewing course-related materials; increasing in-group communication and interaction; breaking passive learning styles and habits, etc.), which, taken together, not only enhanced students’ comprehension and retention of the acquired knowledge but also helped instill a sense of empowerment and learning community among the participants. Analysis of the quantitative data with one-group t-tests, using 3 as the expected mean, further found that students’ satisfaction with their past learning experience and their perceptions of this strategy’s potential for promoting learning were statistically significant at the 0.0005 level, whereas learning anxiety was not statistically significant. Suggestions for incorporating question-generation activities in the classroom and topics for future studies are provided.
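The reported analysis, a one-group t-test against an expected mean of 3 (presumably the midpoint of a 5-point scale), is straightforward to reproduce. A minimal sketch with invented ratings, not the study's data:

```python
# One-sample t-test against an expected mean of 3, as described above.
# The ratings below are invented placeholders, not the study's responses.
import numpy as np
from scipy import stats

ratings = np.array([4, 5, 4, 3, 4, 5, 4, 4, 3, 5])  # hypothetical 5-point scale items
t_stat, p_value = stats.ttest_1samp(ratings, popmean=3)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A p-value below 0.0005 would match the significance level reported above.
```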
NASA Astrophysics Data System (ADS)
Baez, J.; Lapidaryus, M.; Siegel, Edward Carl-Ludwig
2011-03-01
Riemann-hypothesis physics-proof combines: Siegel-Antonoff-Smith [AMS Joint Mtg. (2002), Abs. 973-03-126] digits on-average statistics; Hill [Am. J. Math. 123, 3, 887 (1996)]; the logarithm function's (1,0) fixed-point base = units = scale invariance; the proven Newcomb [Am. J. Math. 4, 39 (1881)]-Weyl [Goett. Nachr. (1914); Math. Ann. 7, 313 (1916)]-Benford [Proc. Am. Phil. Soc. 78, 4, 51 (1938)] law [Kac, Math. of Stat.-Reasoning (1955); Raimi, Sci. Am. 221, 109 (1969)]; and its algebraic inversion to ONLY Bose-Einstein quantum statistics (BEQS), with the digit d = 0 gap-full Bose-Einstein condensation (BEC) insight that digits are quanta are bosons, via Siegel-Baez category-semantics tabular list-format matrix truth-table analytics in the Plato-Aristotle classic "square of opposition": FUZZYICS = CATEGORYICS/Category-Semantics, with Goodkind Bose-Einstein condensation (BEC) ABOVE the ground state with/and Rayleigh (cut-limit of "short-cut method"; 1870)-Polya (1922)-"Anderson" (1958) localization [Doyle and Snell, Random Walks and Electrical Networks, MAA (1981), pp. 99-100].
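For reference, the Newcomb-Benford leading-digit law invoked above is standard: the leading digit d occurs with probability log10(1 + 1/d). A short check of this law against a scale-invariant (log-uniform) random sample:

```python
# The Newcomb-Benford leading-digit law: P(d) = log10(1 + 1/d).
# Checked empirically against a log-uniform (scale-invariant) sample.
import numpy as np

rng = np.random.default_rng(0)
sample = 10 ** rng.uniform(0, 6, size=100_000)   # log-uniform over six decades
leading = (sample / 10 ** np.floor(np.log10(sample))).astype(int)  # first digit

for d in range(1, 10):
    benford = np.log10(1 + 1 / d)
    empirical = np.mean(leading == d)
    print(f"d={d}: Benford {benford:.3f}, empirical {empirical:.3f}")
```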
Kamalinasab, Z; Mahdavi, A; Ebrahimi, M; Vahidi Nekoo, M; Aghaei, M; Ebrahimi, F
2015-01-01
Objective: Psychological interventions for enhancing mental health in people with somatomotor-physical disabilities are vital. The present study aimed to examine the effect of teaching stress management skills on self-esteem and behavioral adjustment in individuals with somatomotor-physical disabilities. Methodology: The study used a quasi-experimental pre-test/post-test design with a control group. In Tehran, 40 girls with somatomotor-physical disabilities were selected by convenience sampling and divided into two groups, control and experimental. Both groups were assessed with a demographic questionnaire, Rosenberg’s Self-Esteem Scale, and a behavioral adjustment questionnaire. The experimental group then received ten sessions of stress management training, while the control group received no intervention. Both groups were post-tested, and the collected data were analyzed with descriptive and inferential statistical methods using SPSS software. Findings: Teaching stress management skills significantly increased self-esteem and behavioral adjustment in girls with somatomotor-physical disabilities (p < 0.001). Conclusion: Teaching stress management skills appears to be an effective way to help at-risk individuals such as girls with somatomotor-physical disabilities: it is highly efficient, especially when performed in groups, inexpensive, and acceptable to different people. PMID:28316725
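The pre-test/post-test comparison described above can be illustrated with an independent-samples t-test on change scores; everything in the sketch below is a synthetic placeholder, not the study's data.

```python
# Sketch of a pre/post control-group comparison via change scores.
# Group sizes, means and SDs are invented placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pre_exp, post_exp = rng.normal(20, 4, 20), rng.normal(26, 4, 20)    # experimental group
pre_ctrl, post_ctrl = rng.normal(20, 4, 20), rng.normal(20, 4, 20)  # control group

t_stat, p_value = stats.ttest_ind(post_exp - pre_exp, post_ctrl - pre_ctrl)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A significant difference in change scores would mirror the reported finding.
```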
Study protocol title: a prospective cohort study of low back pain
2013-01-01
Background: Few prospective cohort studies of workplace low back pain (LBP) with quantified job physical exposure have been performed. There are few prospective epidemiological studies of occupational risk factors for LBP, and reported data generally include few adjustments for personal and psychosocial factors. Methods/design: A multi-center prospective cohort study has been initiated to quantify risk factors for LBP and potentially to develop improved methods for designing and analyzing jobs. Because of the subjectivity of LBP, six measures of LBP are captured: 1) any LBP, 2) LBP ≥ 5/10 pain rating, 3) LBP with medication use, 4) LBP with healthcare provider visits, 5) LBP necessitating modified work duties and 6) LBP with lost work time. Workers have thus far been enrolled from 30 different employment settings in 4 diverse US states and perform widely varying work. At baseline, workers undergo laptop-administered questionnaires, structured interviews, and two standardized physical examinations to ascertain demographics, medical history, psychosocial factors, hobbies and physical activities, and current musculoskeletal disorders. All workers’ jobs are individually measured for physical factors and are videotaped. Workers are followed monthly for the development of low back pain. Job changes necessitate re-measurement and re-videotaping of job physical factors. The lifetime cumulative incidence of low back pain will also include those with a past history of low back pain; incident cases will exclude prevalent cases at baseline. Planned statistical methods include survival analyses and logistic regression. Discussion: Data analysis of this prospective cohort study of low back pain is underway; over 800 workers have been enrolled to date. PMID:23497211
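As a sketch of the planned survival analysis, the snippet below fits a Cox proportional-hazards model on synthetic follow-up data using the lifelines package; the column names and the exposure measure are assumptions for illustration, not the study's variables.

```python
# Sketch of a survival analysis (Cox proportional hazards) for incident LBP
# on synthetic data; all variables below are hypothetical placeholders.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 800  # roughly the enrollment reported above
df = pd.DataFrame({
    "months_followed": rng.exponential(18, n).clip(1, 36),  # time at risk
    "lbp_event": rng.integers(0, 2, n),        # 1 = incident LBP case
    "job_lift_index": rng.uniform(0, 1, n),    # hypothetical physical-exposure measure
    "age": rng.normal(40, 10, n),              # personal covariate
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_followed", event_col="lbp_event")
cph.print_summary()  # hazard ratios for exposure and covariates
```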
Parameter Uncertainty on AGCM-simulated Tropical Cyclones
NASA Astrophysics Data System (ADS)
He, F.
2015-12-01
This work studies parameter uncertainty in tropical cyclone (TC) simulations in Atmospheric General Circulation Models (AGCMs) using the Reed-Jablonowski TC test case, implemented in the Community Atmosphere Model (CAM). It examines the impact of 24 parameters across the physical parameterization schemes that represent convection, turbulence, precipitation and cloud processes in AGCMs. A one-at-a-time (OAT) sensitivity analysis first quantifies their relative importance for TC simulations and identifies the key parameters for six TC characteristics: intensity, precipitation, longwave cloud radiative forcing (LWCF), shortwave cloud radiative forcing (SWCF), cloud liquid water path (LWP) and ice water path (IWP). Then, 8 physical parameters are chosen and perturbed using the Latin hypercube sampling (LHS) method. The comparison between the OAT and LHS ensemble runs shows that the simulated TC intensity is mainly affected by the parcel fractional mass entrainment rate in the Zhang-McFarlane (ZM) deep convection scheme, and that the nonlinear interactive effect among different physical parameters is negligible for simulated TC intensity. In contrast, this nonlinear interactive effect plays a significant role in the other simulated TC characteristics (precipitation, LWCF, SWCF, LWP and IWP) and greatly enlarges their simulated uncertainties. The statistical emulator Extended Multivariate Adaptive Regression Splines (EMARS) is applied to characterize the response functions for the nonlinear effect. Finally, we find that the intensity uncertainty caused by physical parameters is comparable in magnitude to the uncertainty caused by model structure (e.g., grid) and initial conditions (e.g., sea surface temperature, atmospheric moisture). These findings suggest the importance of using the perturbed-physics ensemble (PPE) method to revisit tropical cyclone prediction under climate change scenarios.
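A minimal sketch of the Latin hypercube sampling step described above, using SciPy's quasi-Monte Carlo module; the ensemble size and parameter ranges are illustrative assumptions, not the actual CAM configuration.

```python
# Latin hypercube perturbation of 8 physics parameters, as described above.
# Bounds and ensemble size are illustrative, not the actual CAM setup.
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=8, seed=0)
unit_samples = sampler.random(n=64)          # 64 ensemble members in [0, 1)^8

# Hypothetical lower/upper bounds for the 8 perturbed parameters
# (e.g., the ZM entrainment rate would occupy one of these dimensions).
l_bounds = [0.5e-3, 0.1, 1e-4, 0.05, 0.5, 1.0, 0.1, 0.01]
u_bounds = [2.0e-3, 0.9, 1e-3, 0.50, 2.0, 5.0, 0.9, 0.10]
param_sets = qmc.scale(unit_samples, l_bounds, u_bounds)
print(param_sets.shape)  # (64, 8): one row of parameter values per model run
```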
The Quantum and Fluid Mechanics of Global Warming
NASA Astrophysics Data System (ADS)
Marston, Brad
2008-03-01
Quantum physics and fluid mechanics are the foundation of any understanding of the Earth's climate. In this talk I invoke three well-known aspects of quantum mechanics to explore what will happen as the concentrations of greenhouse gases such as carbon dioxide continue to increase. Fluid dynamical models of the Earth's atmosphere, demonstrated here in live simulations, yield further insight into past, present, and future climates. Statistics of geophysical flows can, however, be ascertained directly without recourse to numerical simulation, using concepts borrowed from nonequilibrium statistical mechanics [J. B. Marston, E. Conover, and Tapio Schneider, "Statistics of an Unstable Barotropic Jet from a Cumulant Expansion," arXiv:0705.0011, J. Atmos. Sci. (in press)]. I discuss several other ways that theoretical physics may be able to contribute to a deeper understanding of climate change [J. Carlson, J. Harte, G. Falkovich, J. B. Marston, and R. Pierrehumbert, "Physics of Climate Change," 2008 Program of the Kavli Institute for Theoretical Physics].
A Conway-Maxwell-Poisson (CMP) model to address data dispersion on positron emission tomography.
Santarelli, Maria Filomena; Della Latta, Daniele; Scipioni, Michele; Positano, Vincenzo; Landini, Luigi
2016-10-01
Positron emission tomography (PET) in medicine exploits the properties of positron-emitting unstable nuclei. The pairs of γ-rays emitted after annihilation are detected by coincidence detectors and stored as projections in a sinogram. It is well known that radioactive decay follows a Poisson distribution; however, deviations from Poisson statistics occur in PET projection data prior to reconstruction, due to physical effects, measurement errors, and corrections for dead time, scatter, and random coincidences. A model that describes the statistical behavior of measured and corrected PET data can aid in understanding the statistical nature of the data: it is a prerequisite for developing efficient reconstruction and processing methods and for reducing noise. The deviation from Poisson statistics in PET data can be described by the Conway-Maxwell-Poisson (CMP) distribution model, which is characterized by the centring parameter λ and the dispersion parameter ν, the latter quantifying the deviation from a Poisson distribution model. In particular, the parameter ν quantifies over-dispersion (ν < 1) or under-dispersion (ν > 1) of the data. A simple and efficient method for estimating the λ and ν parameters is introduced and assessed using Monte Carlo simulation for a wide range of activity values. The application of the method to simulated and experimental PET phantom data demonstrated that the CMP distribution parameters can detect deviations from the Poisson distribution in both raw and corrected PET data. The model may be usefully implemented in image reconstruction algorithms and quantitative PET data analysis, especially for low-count emission data, as in dynamic PET, where the method demonstrated the best accuracy. Copyright © 2016 Elsevier Ltd. All rights reserved.
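The CMP model described above has probability mass P(X = j) proportional to λ^j/(j!)^ν. The sketch below evaluates the distribution numerically and confirms that ν < 1 gives over-dispersion and ν > 1 under-dispersion; the λ and ν values are arbitrary, and this is not the authors' estimation method.

```python
# Conway-Maxwell-Poisson pmf: P(X=j) = lam**j / ((j!)**nu * Z(lam, nu)),
# with Z the normalizing sum; nu = 1 recovers the Poisson distribution.
import numpy as np
from scipy.special import gammaln

def cmp_pmf(lam, nu, j_max=200):
    """CMP pmf over support 0..j_max, normalized in log space for stability."""
    js = np.arange(j_max + 1)
    log_terms = js * np.log(lam) - nu * gammaln(js + 1)
    return js, np.exp(log_terms - np.logaddexp.reduce(log_terms))

for nu in (0.7, 1.0, 1.3):            # over-dispersed, Poisson, under-dispersed
    js, p = cmp_pmf(lam=8.0, nu=nu)
    mean = np.sum(js * p)
    var = np.sum((js - mean) ** 2 * p)
    print(f"nu={nu}: mean={mean:.2f}, var/mean={var/mean:.2f}")  # var/mean = 1 at nu = 1
```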
A methodology for the rigorous verification of plasma simulation codes
NASA Astrophysics Data System (ADS)
Riva, Fabio
2016-10-01
The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V comprises two separate tasks: verification, a mathematical exercise meant to assess that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus on verification, which in turn is composed of code verification, meant to assess that the physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced, for the first time in our domain, a rigorous methodology for code verification based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
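Solution verification via Richardson extrapolation, as mentioned above, can be illustrated in a few lines: given solutions on three grids refined by a factor of 2, one estimates the observed order of accuracy and the numerical error. The numbers below are synthetic, constructed for a second-order scheme, and are not GBS output.

```python
# Richardson extrapolation for solution verification: estimate the observed
# order of accuracy and the numerical error from three grid refinements.
# The solution values are synthetic, assuming a second-order scheme.
import math

f_h, f_h2, f_h4 = 1.0400, 1.0100, 1.0025   # coarse, medium, fine solutions

p = math.log2(abs(f_h - f_h2) / abs(f_h2 - f_h4))   # observed order of accuracy
f_exact = f_h4 + (f_h4 - f_h2) / (2 ** p - 1)       # extrapolated solution
err_h4 = abs(f_exact - f_h4)                        # numerical-error estimate

print(f"observed order p = {p:.2f}")                # ~2 for a second-order scheme
print(f"extrapolated solution = {f_exact:.5f}, fine-grid error ~ {err_h4:.1e}")
```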