2015-07-15
Comparison of Neural Network and Linear Regression Models in Statistically Predicting Mental and Physical Health Status of Breast Cancer Survivors
Physics-based statistical model and simulation method of RF propagation in urban environments
Pao, Hsueh-Yuan; Dvorak, Steven L.
2010-09-14
A physics-based statistical model and simulation method and system for electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient closed-form parametric model of RF propagation in an urban environment, extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagating signal, using a database of statistical impedance boundary conditions that incorporates the complexity of building walls into the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model, based on those statistical parameters, from which predictions of communications capability may be made.
NASA Astrophysics Data System (ADS)
Nearing, G. S.
2014-12-01
Statistical models consistently out-perform conceptual models in the short term; however, to account for a nonstationary future (or an unobserved past), scientists prefer to base predictions on unchanging and commutable properties of the universe - i.e., physics. The problem with physically-based hydrology models is, of course, that they aren't really based on physics - they are based on statistical approximations of physical interactions, and we almost uniformly lack an understanding of the entropy associated with these approximations. Thermodynamics is successful precisely because entropy statistics are computable for homogeneous (well-mixed) systems, and ergodic arguments explain the success of Newton's laws in describing systems that are fundamentally quantum in nature. Unfortunately, similar arguments do not hold for systems like watersheds that are heterogeneous at a wide range of scales. Ray Solomonoff formalized the situation in 1968 by showing that, given infinite evidence, simultaneously minimizing model complexity and entropy in predictions always leads to the best possible model. The open question in hydrology is what happens when we don't have infinite evidence - for example, when the future will not look like the past, or when one watershed does not behave like another. How do we isolate stationary and commutable components of watershed behavior? I propose that one possible answer to this dilemma lies in a formal combination of physics and statistics. In this talk I outline my recent analogue (Solomonoff's theorem was digital) of Solomonoff's idea that allows us to quantify the complexity/entropy tradeoff in a way that is intuitive to physical scientists. I show how to formally combine "physical" and statistical methods for model development in a way that allows us to derive the theoretically best possible model given any physics approximation(s) and available observations.
Finally, I apply an analogue of Solomonoff's theorem to evaluate the tradeoff between model complexity and prediction power.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, J.; Hoversten, G.M.
2011-09-15
Joint inversion of seismic AVA and CSEM data requires rock-physics relationships to link seismic attributes to electrical properties. Ideally, we can connect them through reservoir parameters (e.g., porosity and water saturation) by developing physics-based models, such as Gassmann’s equations and Archie’s law, using nearby borehole logs. This could be difficult in the exploration stage because the information available is typically insufficient for choosing suitable rock-physics models and for subsequently obtaining reliable estimates of the associated parameters. The use of improper rock-physics models and inaccuracy in the estimates of model parameters may cause misleading inversion results. Conversely, it is easy to derive statistical relationships among seismic and electrical attributes and reservoir parameters from distant borehole logs. In this study, we develop a Bayesian model to jointly invert seismic AVA and CSEM data for reservoir parameter estimation using statistical rock-physics models; the spatial dependence of geophysical and reservoir parameters is carried by lithotypes through Markov random fields. We apply the developed model to a synthetic case, which simulates a CO2 monitoring application. We derive statistical rock-physics relations from borehole logs at one location and estimate seismic P- and S-wave velocity ratio, acoustic impedance, density, electrical resistivity, lithotypes, porosity, and water saturation at three different locations by conditioning to seismic AVA and CSEM data. Comparison of the inversion results with their corresponding true values shows that the correlation-based statistical rock-physics models provide significant information for improving the joint inversion results.
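The rock-physics links named above can be made concrete; for instance, Archie's law ties formation resistivity to porosity and water saturation. A minimal Python sketch with generic textbook coefficients (the defaults below are illustrative placeholders, not values calibrated from any log):

```python
def archie_resistivity(porosity, water_sat, rw=0.05, a=1.0, m=2.0, n=2.0):
    """Archie's law: R_t = a * Rw * phi**(-m) * Sw**(-n).

    rw is brine resistivity (ohm-m); a, m, n are the tortuosity,
    cementation, and saturation exponents. All defaults are generic
    placeholders, not calibrated values.
    """
    return a * rw * porosity ** (-m) * water_sat ** (-n)

# Resistivity rises sharply as CO2 displaces brine (water saturation drops):
r_brine = archie_resistivity(porosity=0.25, water_sat=1.0)
r_partial = archie_resistivity(porosity=0.25, water_sat=0.5)
```

This sensitivity of resistivity to saturation is what makes CSEM data complementary to seismic AVA in the joint inversion.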
Forecasting runout of rock and debris avalanches
Iverson, Richard M.; Evans, S.G.; Mugnozza, G.S.; Strom, A.; Hermanns, R.L.
2006-01-01
Physically based mathematical models and statistically based empirical equations each may provide useful means of forecasting runout of rock and debris avalanches. This paper compares the foundations, strengths, and limitations of a physically based model and a statistically based forecasting method, both of which were developed to predict runout across three-dimensional topography. The chief advantage of the physically based model results from its ties to physical conservation laws and well-tested axioms of soil and rock mechanics, such as the Coulomb friction rule and effective-stress principle. The output of this model provides detailed information about the dynamics of avalanche runout, at the expense of high demands for accurate input data, numerical computation, and experimental testing. In comparison, the statistical method requires relatively modest computation and no input data except identification of prospective avalanche source areas and a range of postulated avalanche volumes. Like the physically based model, the statistical method yields maps of predicted runout, but it provides no information on runout dynamics. Although the two methods differ significantly in their structure and objectives, insights gained from one method can aid refinement of the other.
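The statistical method's modest input requirements can be illustrated with the kind of volume-scaling relation such forecasts rely on: inundated area grows as V^(2/3) for geometric reasons, with an empirically calibrated prefactor. A sketch with an illustrative (not calibrated) coefficient:

```python
def planimetric_area(volume_m3, k=20.0):
    """Empirical runout estimate: inundated planimetric area scaling as
    A = k * V**(2/3). The 2/3 exponent is geometric; the coefficient k
    here is an illustrative placeholder, not a calibrated value."""
    return k * volume_m3 ** (2.0 / 3.0)

# The statistical method needs only a range of postulated volumes:
areas = {v: planimetric_area(v) for v in (1e5, 1e6, 1e7)}
```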
Combining Statistics and Physics to Improve Climate Downscaling
NASA Astrophysics Data System (ADS)
Gutmann, E. D.; Eidhammer, T.; Arnold, J.; Nowak, K.; Clark, M. P.
2017-12-01
Getting useful information from climate models is an ongoing problem that has plagued climate science and hydrologic prediction for decades. While it is possible to develop statistical corrections for climate models that mimic current climate almost perfectly, this does not necessarily guarantee that future changes are portrayed correctly. In contrast, convection permitting regional climate models (RCMs) have begun to provide an excellent representation of the regional climate system purely from first principles, providing greater confidence in their change signal. However, the computational cost of such RCMs prohibits the generation of ensembles of simulations or long time periods, thus limiting their applicability for hydrologic applications. Here we discuss a new approach combining statistical corrections with physical relationships for a modest computational cost. We have developed the Intermediate Complexity Atmospheric Research model (ICAR) to provide a climate and weather downscaling option that is based primarily on physics for a fraction of the computational requirements of a traditional regional climate model. ICAR also enables the incorporation of statistical adjustments directly within the model. We demonstrate that applying even simple corrections to precipitation while the model is running can improve the simulation of land atmosphere feedbacks in ICAR. For example, by incorporating statistical corrections earlier in the modeling chain, we permit the model physics to better represent the effect of mountain snowpack on air temperature changes.
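The kind of simple run-time correction described can be sketched as a climatological rescaling; the function below is a hypothetical illustration of the idea, not the scheme actually implemented inside ICAR:

```python
import numpy as np

def scale_correct(precip, obs_clim, model_clim):
    """Simplest possible in-line precipitation adjustment: multiply each
    modeled value by the ratio of observed to modeled climatology.
    (A linear scaling; real corrections may be more sophisticated.)"""
    factor = np.where(model_clim > 0,
                      obs_clim / np.maximum(model_clim, 1e-12), 1.0)
    return precip * factor

# A model that is 50% too dry in cell 0 and 2x too wet in cell 1:
corrected = scale_correct(np.array([2.0, 0.5]),
                          obs_clim=np.array([3.0, 1.0]),
                          model_clim=np.array([1.5, 2.0]))
```

Applying such a factor during the run, rather than to the output, is what lets downstream physics (e.g., snowpack) respond to the corrected field.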
Moment-Based Physical Models of Broadband Clutter due to Aggregations of Fish
2013-09-30
statistical models for signal-processing algorithm development. These in turn will help to develop a capability to statistically forecast the impact of...aggregations of fish based on higher-order statistical measures describable in terms of physical and system parameters. Environmentally, these models...processing. In this experiment, we had good ground truth on (1) and (2), and had control over (3) and (4) except for environmentally-imposed restrictions
Maximum entropy models of ecosystem functioning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertram, Jason, E-mail: jason.bertram@anu.edu.au
2014-12-05
Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense using a savanna plant ecology model as an example.
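The MaxEnt machinery itself can be illustrated in miniature: choose the distribution of maximum entropy subject to a mean constraint, which yields Boltzmann-like weights with a Lagrange multiplier. A self-contained Python sketch (the six-state loaded-die example, in the spirit of Jaynes, is arbitrary and not from the paper):

```python
import math

def maxent_probs(values, target_mean, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy distribution over discrete states with a fixed mean:
    p_i is proportional to exp(-lam * x_i), lam found by bisection."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z
    while hi - lo > tol:            # mean_for is decreasing in lam
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in values]
    z = sum(w)                      # the partition function
    return [wi / z for wi in w]

# Loaded-die example: faces 1..6 constrained to average 4.5
p = maxent_probs([1, 2, 3, 4, 5, 6], target_mean=4.5)
```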
Effects of preprocessing Landsat MSS data on derived features
NASA Technical Reports Server (NTRS)
Parris, T. M.; Cicone, R. C.
1983-01-01
Important to the use of multitemporal Landsat MSS data for earth resources monitoring, such as agricultural inventories, is the ability to minimize the effects of varying atmospheric and satellite viewing conditions while extracting physically meaningful features from the data. In general, approaches to the preprocessing problem have been derived from either physical or statistical models. This paper compares three proposed algorithms: XSTAR haze correction, Color Normalization, and Multiple Acquisition Mean Level Adjustment. These techniques represent physical, statistical, and hybrid physical-statistical models, respectively. The comparisons are made in the context of three feature extraction techniques: the Tasseled Cap, the Cate Color Cube, and Normalized Difference.
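Of the features named, the Normalized Difference is the simplest to state, and it shows why ratio features partially resist illumination effects: a multiplicative gain common to both bands cancels. A short sketch:

```python
import numpy as np

def normalized_difference(nir, red):
    """Normalized Difference feature, (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# A uniform gain applied to both bands leaves the feature unchanged:
nd = normalized_difference([80, 120], [40, 30])
nd_scaled = normalized_difference([160, 240], [80, 60])   # 2x both bands
```

Additive effects such as haze are not cancelled by the ratio, which is why corrections like XSTAR remain necessary.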
Hart, Carl R; Reznicek, Nathan J; Wilson, D Keith; Pettit, Chris L; Nykaza, Edward T
2016-05-01
Many outdoor sound propagation models exist, ranging from highly complex physics-based simulations to simplified engineering calculations and, more recently, highly flexible statistical learning methods. Several engineering and statistical learning models are evaluated by using a particular physics-based model, namely a Crank-Nicolson parabolic equation (CNPE), as a benchmark. Narrowband transmission loss values predicted with the CNPE, based upon a simulated data set of meteorological, boundary, and source conditions, act as simulated observations. In the simulated data set, sound propagation conditions span from downward refracting to upward refracting, for acoustically hard and soft boundaries, and low frequencies. Engineering models used in the comparisons include the ISO 9613-2 method and the Harmonoise and Nord2000 propagation models. Statistical learning methods used in the comparisons include bagged decision tree regression, random forest regression, boosting regression, and artificial neural network models. Computed skill scores are relative to sound propagation in a homogeneous atmosphere over a rigid ground. Overall skill scores for the engineering noise models are 0.6%, -7.1%, and 83.8% for the ISO 9613-2, Harmonoise, and Nord2000 models, respectively. Overall skill scores for the statistical learning models are 99.5%, 99.5%, 99.6%, and 99.6% for bagged decision tree, random forest, boosting, and artificial neural network regression models, respectively.
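The skill scores quoted are relative to a reference propagation condition; a standard MSE-based definition (assumed here for illustration - the paper's exact metric may differ) looks like:

```python
import numpy as np

def skill_score(pred, obs, ref):
    """Skill relative to a reference prediction, in percent:
    100 * (1 - MSE(pred) / MSE(ref)). 100% is perfect agreement;
    negative values are worse than the reference."""
    pred, obs, ref = (np.asarray(a, dtype=float) for a in (pred, obs, ref))
    return 100.0 * (1.0 - np.mean((pred - obs) ** 2) / np.mean((ref - obs) ** 2))

perfect = skill_score([1.0, 2.0, 3.0], [1.0, 2.0, 3.0], [0.0, 0.0, 0.0])
```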
Statistical physics of vehicular traffic and some related systems
NASA Astrophysics Data System (ADS)
Chowdhury, Debashish; Santen, Ludger; Schadschneider, Andreas
2000-05-01
In the so-called “microscopic” models of vehicular traffic, attention is paid explicitly to each individual vehicle, each of which is represented by a “particle”; the nature of the “interactions” among these particles is determined by the way the vehicles influence each others’ movement. Vehicular traffic, modeled as a system of interacting “particles” driven far from equilibrium, therefore offers the possibility of studying various fundamental aspects of truly nonequilibrium systems which are of current interest in statistical physics. Analytical as well as numerical techniques of statistical physics are being used to study these models in order to understand the rich variety of physical phenomena exhibited by vehicular traffic. Some of these phenomena, observed in vehicular traffic under different circumstances, include transitions from one dynamical phase to another, criticality and self-organized criticality, metastability and hysteresis, phase segregation, etc. In this critical review, written from the perspective of statistical physics, we explain the guiding principles behind all the main theoretical approaches, but we present detailed discussions of the results obtained mainly from the so-called “particle-hopping” models, particularly emphasizing those which have been formulated in recent years using the language of cellular automata.
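The canonical particle-hopping cellular automaton behind much of this literature is the Nagel-Schreckenberg model: accelerate, brake to the gap ahead, randomize, move, all updated in parallel on a ring road. A compact Python sketch:

```python
import random

def nasch_step(pos, vel, length, vmax=5, p_brake=0.3, rng=random):
    """One parallel update of the Nagel-Schreckenberg traffic CA on a ring."""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])    # cars by position
    new_pos, new_vel = list(pos), list(vel)
    for k, i in enumerate(order):
        ahead = order[(k + 1) % n]
        gap = (pos[ahead] - pos[i] - 1) % length      # empty cells in front
        v = min(vel[i] + 1, vmax, gap)                # accelerate, respect gap
        if v > 0 and rng.random() < p_brake:          # random braking
            v -= 1
        new_vel[i] = v
        new_pos[i] = (pos[i] + v) % length
    return new_pos, new_vel

# 10 cars on a 50-cell ring road, initially stopped and evenly spaced:
pos, vel = list(range(0, 50, 5)), [0] * 10
for _ in range(100):
    pos, vel = nasch_step(pos, vel, 50)
```

The random-braking rule is the source of spontaneous jam formation, one of the nonequilibrium phenomena the review discusses.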
Synthetic Earthquake Statistics From Physical Fault Models for the Lower Rhine Embayment
NASA Astrophysics Data System (ADS)
Brietzke, G. B.; Hainzl, S.; Zöller, G.
2012-04-01
As of today, seismic risk and hazard estimates mostly use purely empirical, stochastic models of earthquake fault systems, tuned specifically to the vulnerable areas of interest. Although such models allow for reasonable risk estimates, they fail to provide a link between the observed seismicity and the underlying physical processes. Solving a state-of-the-art, fully dynamic set of equations describing all relevant physical processes related to earthquake fault systems is likely not useful, since it comes with a large number of degrees of freedom, poor constraints on its model parameters, and a huge computational effort. Here, quasi-static and quasi-dynamic physical fault simulators provide a compromise between physical completeness and computational affordability, and aim at providing a link between basic physical concepts and the statistics of seismicity. Within the framework of quasi-static and quasi-dynamic earthquake simulators, we investigate a model of the Lower Rhine Embayment (LRE) that is based upon seismological and geological data. We present and discuss statistics of the spatio-temporal behavior of generated synthetic earthquake catalogs with respect to simplification (e.g. simple two-fault cases) as well as complication (e.g. hidden faults, geometric complexity, heterogeneities of constitutive parameters).
NASA Astrophysics Data System (ADS)
Preibus-Norquist, R. N. C.-Grover; Bush-Romney, G. W.-Willard-Mitt; Dimon, J. P.; Adelson-Koch, Sheldon-Charles-David-Sheldon; Krugman-Axelrod, Paul-David; Siegel, Edward Carl-Ludwig; D. N. C./O. F. P./''47''%/50% Collaboration; R. N. C./G. O. P./''53''%/49% Collaboration; Nyt/Wp/Cnn/Msnbc/Pbs/Npr/Ft Collaboration; Ftn/Fnc/Fox/Wsj/Fbn Collaboration; Lb/Jpmc/Bs/Boa/Ml/Wamu/S&P/Fitch/Moodys/Nmis Collaboration
2013-03-01
``Models''? CAVEAT EMPTOR!!!: ``Toy Models Too-Often Yield Toy-Results''!!!: Goldenfeld[``The Role of Models in Physics'', in Lects.on Phase-Transitions & R.-G.(92)-p.32-33!!!]: statistics(Silver{[NYTimes; Bensinger, ``Math-Geerks Clearly-Defeated Pundits'', LATimes, (11/9/12)])}, polls, politics, economics, elections!!!: GRAPH/network/net/...-PHYSICS Barabasi-Albert[RMP (02)] (r,t)-space VERSUS(???) [Where's the Inverse/ Dual/Integral-Transform???] (Benjamin)Franklin(1795)-Fourier(1795; 1897;1822)-Laplace(1850)-Mellin (1902) Brillouin(1922)-...(k,)-space, {Hubbard [The World According to Wavelets,Peters (96)-p.14!!!/p.246: refs.-F2!!!]},and then (2) Albert-Barabasi[]Bose-Einstein quantum-statistics(BEQS) Bose-Einstein CONDENSATION (BEC) versus Bianconi[pvt.-comm.; arXiv:cond-mat/0204506; ...] -Barabasi [???] Fermi-Dirac
Statistically Modeling I-V Characteristics of CNT-FET with LASSO
NASA Astrophysics Data System (ADS)
Ma, Dongsheng; Ye, Zuochang; Wang, Yan
2017-08-01
With the advent of the Internet of Things (IoT), the need to study new materials and devices for various applications is increasing. Traditionally, we build compact models for transistors on the basis of physics, but physical models are expensive and need a very long time to adjust for non-ideal effects. As the vision for the application of many novel devices is not certain, or the manufacturing process is not mature, deriving generalized, accurate physical models for such devices is very strenuous, whereas statistical modeling is becoming a potential alternative because of its data-oriented nature and fast implementation. In this paper, one classical statistical regression method, LASSO, is used to model the I-V characteristics of a CNT-FET, and a pseudo-PMOS inverter simulation based on the trained model is implemented in Cadence. The normalized relative mean square prediction error of the trained model versus experimental sample data, together with the simulation results, shows that the model is acceptable for digital circuit static simulation. Such a modeling methodology can be extended to general devices.
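As a hedged illustration of the approach (not the authors' code), LASSO can be written in a few lines of NumPy as cyclic coordinate descent with soft-thresholding, fitted here to a hypothetical sparse polynomial surrogate for an I-V surface:

```python
import numpy as np

def lasso_cd(X, y, lam=0.05, n_iter=500):
    """LASSO via cyclic coordinate descent: minimizes
    (1/2n)||y - Xw||^2 + lam * ||w||_1, assuming standardized columns."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        for j in range(d):
            r = y - X @ w + X[:, j] * w[j]            # residual excluding j
            rho = X[:, j] @ r / n
            z = X[:, j] @ X[:, j] / n
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z
    return w

# Hypothetical surrogate: drain current as a sparse polynomial in (Vgs, Vds)
rng = np.random.default_rng(0)
V = rng.uniform(0.0, 1.0, size=(200, 2))
basis = np.column_stack([V[:, 0], V[:, 1], V[:, 0] * V[:, 1],
                         V[:, 0] ** 2, V[:, 1] ** 2])
basis = (basis - basis.mean(0)) / basis.std(0)        # standardize features
i_d = 2.0 * basis[:, 2] + 0.5 * basis[:, 0]           # two truly active terms
w = lasso_cd(basis, i_d)
```

The L1 penalty drives the coefficients of inactive basis terms toward zero, which is what makes the fitted surrogate compact enough for circuit simulation.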
NASA Astrophysics Data System (ADS)
Oliveira, Sérgio C.; Zêzere, José L.; Lajas, Sara; Melo, Raquel
2017-07-01
Approaches used to assess shallow slide susceptibility at the basin scale are conceptually different depending on the use of statistical or physically based methods. The former are based on the assumption that the same causes are more likely to produce the same effects, whereas the latter are based on the comparison between forces which tend to promote movement along the slope and the counteracting forces that are resistant to motion. Within this general framework, this work tests two hypotheses: (i) although conceptually and methodologically distinct, the statistical and deterministic methods generate similar shallow slide susceptibility results regarding the model's predictive capacity and spatial agreement; and (ii) the combination of shallow slide susceptibility maps obtained with statistical and physically based methods, for the same study area, generates a more reliable susceptibility model for shallow slide occurrence. These hypotheses were tested at a small test site (13.9 km²) located north of Lisbon (Portugal), using a statistical method (the information value method, IV) and a physically based method (the infinite slope method, IS). The landslide susceptibility maps produced with the statistical and deterministic methods were combined into a new landslide susceptibility map. The latter was based on a set of integration rules defined by the cross tabulation of the susceptibility classes of both maps and analysis of the corresponding contingency tables. The results demonstrate a higher predictive capacity of the new shallow slide susceptibility map, which combines the independent results obtained with statistical and physically based models. Moreover, the combination of the two models allowed the identification of areas where the results of the information value and the infinite slope methods are contradictory. Thus, these areas were classified as uncertain and deserve additional investigation at a more detailed scale.
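The information value method itself is compact: each class of a predisposing factor is scored by the log-ratio of its landslide density to the overall density. A sketch in Python (the numbers are a toy example, not data from the Lisbon site):

```python
import math

def information_value(slides_in_class, cells_in_class,
                      total_slides, total_cells):
    """IV = ln( (Si / Ni) / (S / N) ): positive where shallow slides are
    over-represented relative to the study area as a whole."""
    class_density = slides_in_class / cells_in_class
    overall_density = total_slides / total_cells
    return math.log(class_density / overall_density)

# A slope class holding 40 of 100 slides on 10% of the terrain cells:
iv = information_value(40, 10_000, 100, 100_000)
```

Summing the IV scores of a terrain unit's classes across all factors gives its susceptibility score.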
An order statistics approach to the halo model for galaxies
NASA Astrophysics Data System (ADS)
Paul, Niladri; Paranjape, Aseem; Sheth, Ravi K.
2017-04-01
We use the halo model to explore the implications of assuming that galaxy luminosities in groups are randomly drawn from an underlying luminosity function. We show that even the simplest of such order statistics models - one in which this luminosity function p(L) is universal - naturally produces a number of features associated with previous analyses based on the 'central plus Poisson satellites' hypothesis. These include the monotonic relation of mean central luminosity with halo mass, the lognormal distribution around this mean and the tight relation between the central and satellite mass scales. In stark contrast to observations of galaxy clustering, however, this model predicts no luminosity dependence of large-scale clustering. We then show that an extended version of this model, based on the order statistics of a halo mass dependent luminosity function p(L|m), is in much better agreement with the clustering data as well as satellite luminosities, but systematically underpredicts central luminosities. This brings into focus the idea that central galaxies constitute a distinct population that is affected by different physical processes than are the satellites. We model this physical difference as a statistical brightening of the central luminosities, over and above the order statistics prediction. The magnitude gap between the brightest and second brightest group galaxy is predicted as a by-product, and is also in good agreement with observations. We propose that this order statistics framework provides a useful language in which to compare the halo model for galaxies with more physically motivated galaxy formation models.
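The core order-statistics idea is easy to simulate: draw N luminosities i.i.d. from p(L), call the largest the central, and read off the magnitude gap. The Pareto-shaped p(L) below is an illustrative stand-in for a real luminosity function, not the form used in the paper:

```python
import numpy as np

def brightest_and_gap(n_draws, alpha=1.5, l_min=1.0, rng=None):
    """Draw n_draws luminosities from a Pareto p(L); return the brightest
    ('central') and its magnitude gap to the second brightest."""
    rng = rng if rng is not None else np.random.default_rng(0)
    u = rng.uniform(size=n_draws)
    L = np.sort(l_min * (1.0 - u) ** (-1.0 / alpha))  # inverse-CDF sampling
    gap_mag = 2.5 * np.log10(L[-1] / L[-2])           # central vs. 2nd brightest
    return L[-1], gap_mag

# More draws (a richer group, i.e. a more massive halo) -> brighter central:
c_small, gap_small = brightest_and_gap(5)
c_large, gap_large = brightest_and_gap(100)
```

The monotonic growth of the maximum with the number of draws is the order-statistics origin of the central luminosity vs. halo mass relation.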
NASA Astrophysics Data System (ADS)
Lafontaine, J.; Hay, L.; Archfield, S. A.; Farmer, W. H.; Kiang, J. E.
2014-12-01
The U.S. Geological Survey (USGS) has developed a National Hydrologic Model (NHM) to support coordinated, comprehensive and consistent hydrologic model development, and facilitate the application of hydrologic simulations within the continental US. The portion of the NHM located within the Gulf Coastal Plains and Ozarks Landscape Conservation Cooperative (GCPO LCC) is being used to test the feasibility of improving streamflow simulations in gaged and ungaged watersheds by linking statistically- and physically-based hydrologic models. The GCPO LCC covers part or all of 12 states and 5 sub-geographies, totaling approximately 726,000 km2, and is centered on the lower Mississippi Alluvial Valley. A total of 346 USGS streamgages in the GCPO LCC region were selected to evaluate the performance of this new calibration methodology for the period 1980 to 2013. Initially, the physically-based models are calibrated to measured streamflow data to provide a baseline for comparison. An enhanced calibration procedure then is used to calibrate the physically-based models in the gaged and ungaged areas of the GCPO LCC using statistically-based estimates of streamflow. For this application, the calibration procedure is adjusted to address the limitations of the statistically generated time series to reproduce measured streamflow in gaged basins, primarily by incorporating error and bias estimates. As part of this effort, estimates of uncertainty in the model simulations are also computed for the gaged and ungaged watersheds.
A new model of physical evolution of Jupiter-family comets
NASA Astrophysics Data System (ADS)
Rickman, H.; Szutowicz, S.; Wójcikowski, K.
2014-07-01
We aim to find the statistical physical lifetimes of Jupiter-family comets. For this purpose, we model the processes that govern the dynamical and physical evolution of comets. We pay special attention to physical evolution; attempts at such modelling have been made before, but we propose a more accurate model that includes more physical effects. The model is tested on a sample of fictitious comets based on real Jupiter-family comets, with some orbital elements changed to a state before capture by Jupiter. We model four different physical effects: erosion by sublimation, dust mantling, rejuvenation (mantle blow-off), and splitting. While for sublimation and splitting there already are models, such as that of di Sisto et al. (2009), which we only wish to make more accurate, dust mantling and rejuvenation have not been included in previous statistical physical evolution models. Each of these effects depends on one or more tunable parameters, which we establish by choosing the model that best fits the observed comet sample, in a way similar to di Sisto et al. (2009). In contrast to di Sisto et al., our comparison also involves the observed active fractions vs. nuclear radii.
NASA Astrophysics Data System (ADS)
Moslehi, M.; de Barros, F.; Rajagopal, R.
2014-12-01
Hydrogeological models that represent flow and transport in subsurface domains are usually large-scale, with excessive computational complexity and uncertain characteristics. Uncertainty quantification for predicting flow and transport in heterogeneous formations often entails a numerical Monte Carlo framework, which repeatedly simulates the model according to a random field representing the hydrogeological characteristics of the field. The physical resolution (e.g. grid resolution associated with the physical space) for the simulation is customarily chosen based on recommendations in the literature, independent of the number of Monte Carlo realizations. This practice may lead to either excessive computational burden or inaccurate solutions. We propose an optimization-based methodology that considers the trade-off between the following conflicting objectives: time associated with computational costs, statistical convergence of the model predictions, and physical errors corresponding to numerical grid resolution. In this research, we optimally allocate computational resources by developing a modeling framework for the overall error, based on a joint statistical and numerical analysis, and optimizing the error model subject to a given computational constraint. The derived expression for the overall error explicitly takes into account the joint dependence between the discretization error of the physical space and the statistical error associated with Monte Carlo realizations. The accuracy of the proposed framework is verified by applying it to several computationally extensive examples. This framework helps hydrogeologists achieve the optimal physical and statistical resolutions that minimize the error within a given computational budget. Moreover, the influence of the available computational resources and the geometric properties of the contaminant source zone on the optimum resolutions is investigated. We conclude that the computational cost associated with optimal allocation can be substantially reduced compared with prevalent recommendations in the literature.
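The trade-off described admits a compact numerical sketch: posit error² = (discretization)² + (sampling)², with per-realization cost growing as h⁻ᵈ, then scan grid spacings under a fixed budget. All constants and exponents below are illustrative assumptions, not the paper's derived error model:

```python
import math

def optimal_resolution(budget, c1=1.0, c2=1.0, p=2, d=3, cost0=1.0):
    """Split a compute budget between grid spacing h and Monte Carlo
    realizations N, with error^2 = (c1*h**p)**2 + c2**2 / N and a
    per-realization cost of cost0 * h**(-d)."""
    best = None
    h = 1.0
    while h > 1e-3:
        n = max(1, int(budget / (cost0 * h ** (-d))))  # affordable realizations
        err = math.sqrt((c1 * h ** p) ** 2 + c2 ** 2 / n)
        if best is None or err < best[0]:
            best = (err, h, n)
        h *= 0.95
    return best

err, h, n = optimal_resolution(budget=1e6)
# Neither extreme wins: a very fine grid leaves too few realizations,
# while a coarse grid leaves a large discretization error.
```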
Equilibrium statistical-thermal models in high-energy physics
NASA Astrophysics Data System (ADS)
Tawfik, Abdel Nasser
2014-05-01
We review some recent highlights from the applications of statistical-thermal models to different experimental measurements and lattice QCD thermodynamics that have been made during the last decade. We start with a short review of the historical milestones on the path to constructing statistical-thermal models for heavy-ion physics. We found that Heinz Koppe formulated, in 1948, an almost complete recipe for the statistical-thermal models. In 1950, Enrico Fermi generalized this statistical approach: he started with a general cross-section formula and inserted into it simplifying assumptions about the matrix element of the interaction process, assumptions that likely reflect many features of high-energy reactions dominated by the density in the phase space of final states. In 1964, Hagedorn systematically analyzed high-energy phenomena using all the tools of statistical physics and introduced the concept of a limiting temperature based on the statistical bootstrap model. It turns out that, quite often, many-particle systems can be studied with the help of statistical-thermal methods. The analysis of yield multiplicities in high-energy collisions gives overwhelming evidence for chemical equilibrium in the final state. The strange particles might be an exception, as they are suppressed at lower beam energies; however, their relative yields fulfill statistical equilibrium as well. We review the equilibrium statistical-thermal models for particle production, fluctuations and collective flow in heavy-ion experiments. We also review their reproduction of the lattice QCD thermodynamics at vanishing and finite chemical potential. During the last decade, five conditions have been suggested to describe the universal behavior of the chemical freeze-out parameters. The higher-order moments of multiplicity have been discussed; they offer deep insights into particle production and critical fluctuations. Therefore, we use them to describe the freeze-out parameters and suggest the location of the QCD critical endpoint. Various extensions have been proposed in order to take into consideration possible deviations from the ideal hadron gas. We highlight various types of interactions, dissipative properties and location-dependences (spatial rapidity). Furthermore, we review three models combining hadronic with partonic phases: the quasi-particle model, the linear sigma model with Polyakov potentials, and the compressible bag model.
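The workhorse of such models is the thermal number density of each hadron species; in the Boltzmann approximation (quantum statistics, interactions, and excluded-volume effects omitted for brevity) it reduces to a one-dimensional integral, sketched here in natural units:

```python
import math

def hadron_density(mass, temp, mu=0.0, g=1.0, p_max=5.0, steps=2000):
    """Boltzmann-approximation density of one species (GeV units):
    n = g / (2*pi^2) * integral of p^2 * exp(-(E - mu)/T) dp,
    with E = sqrt(p^2 + m^2). Midpoint rule on [0, p_max]."""
    dp = p_max / steps
    total = 0.0
    for k in range(steps):
        p = (k + 0.5) * dp
        e = math.sqrt(p * p + mass * mass)
        total += p * p * math.exp(-(e - mu) / temp) * dp
    return g / (2.0 * math.pi ** 2) * total

# Heavier species are exponentially suppressed at a ~155 MeV freeze-out:
n_light = hadron_density(0.138, 0.155)   # pion-like mass
n_heavy = hadron_density(0.938, 0.155)   # nucleon-like mass
```

Ratios of such densities, fitted to measured yield multiplicities, are what determine the chemical freeze-out temperature and chemical potential.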
Statistical physics of the symmetric group.
Williams, Mobolaji
2017-04-01
Ordered chains (such as chains of amino acids) are ubiquitous in biological cells, and these chains perform specific functions contingent on the sequence of their components. Using the existence and general properties of such sequences as a theoretical motivation, we study the statistical physics of systems whose state space is defined by the possible permutations of an ordered list, i.e., the symmetric group, and whose energy is a function of how certain permutations deviate from some chosen correct ordering. Such a nonfactorizable state space is quite different from the state spaces typically considered in statistical physics systems and consequently has novel behavior in systems with interacting and even noninteracting Hamiltonians. Various parameter choices of a mean-field model reveal the system to contain five different physical regimes defined by two transition temperatures, a triple point, and a quadruple point. Finally, we conclude by discussing how the general analysis can be extended to state spaces with more complex combinatorial properties and to other standard questions of statistical mechanics models.
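For small n the ensemble can be enumerated exactly, which makes the construction concrete. Below, the energy of a permutation is taken as its total displacement from the correct ordering, one simple choice of "deviation" energy rather than the mean-field form analyzed in the paper:

```python
import math
from itertools import permutations

def permutation_ensemble(n, beta):
    """Exact Boltzmann statistics on the symmetric group S_n with
    E(perm) = sum_i |i - perm(i)|. Returns (partition function, mean energy)."""
    energies = [sum(abs(i - s) for i, s in enumerate(perm))
                for perm in permutations(range(n))]
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)
    mean_e = sum(e * w for e, w in zip(energies, weights)) / z
    return z, mean_e

z_hot, e_hot = permutation_ensemble(4, beta=0.0)    # uniform over all 4! states
z_cold, e_cold = permutation_ensemble(4, beta=5.0)  # correct ordering dominates
```

The factorial growth of the state space (n! rather than a product of single-site spaces) is exactly the nonfactorizability the abstract refers to.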
A Statistical-Physics Approach to Language Acquisition and Language Change
NASA Astrophysics Data System (ADS)
Cassandro, Marzio; Collet, Pierre; Galves, Antonio; Galves, Charlotte
1999-02-01
The aim of this paper is to explain how Statistical Physics can help in understanding two related linguistic questions. The first question is how to model first language acquisition by a child. The second question is how language change proceeds in time. Our approach is based on a Gibbsian model for the interface between syntax and prosody. We also present a simulated annealing model of language acquisition, which extends the Triggering Learning Algorithm recently introduced in the linguistic literature.
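The Gibbsian energy and the Triggering Learning Algorithm are not reproduced in the abstract, so the sketch below shows only generic simulated annealing over a binary vector of "grammar parameters" with a toy mismatch energy; the energy function and cooling schedule are assumptions.

```python
import math
import random

def simulated_annealing(energy, state, steps=5000, t0=2.0, seed=0):
    """Generic simulated annealing: propose single-bit flips of a binary
    parameter vector and accept with the Metropolis rule under a 1/k cooling
    schedule. The energy is a placeholder for the paper's Gibbsian
    syntax/prosody interface energy."""
    rng = random.Random(seed)
    best = list(state)
    for k in range(steps):
        t = t0 / (1 + k)              # cooling schedule (assumed form)
        cand = list(state)
        cand[rng.randrange(len(state))] ^= 1   # flip one grammar parameter
        dE = energy(cand) - energy(state)
        if dE <= 0 or rng.random() < math.exp(-dE / t):
            state = cand
        if energy(state) < energy(best):
            best = list(state)
    return best

# Toy target: the "correct grammar" is all ones; energy counts mismatches.
target_energy = lambda s: sum(1 - b for b in s)
result = simulated_annealing(target_energy, [0] * 8)
assert target_energy(result) == 0
```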
Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework.
Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana
2014-06-01
Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd. PMID:25505370
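A toy, one-dimensional version of the GMRF computation the authors describe might look as follows; the random-walk precision matrix and all parameter values are illustrative stand-ins for the SPDE-derived Antarctic model, not the actual analysis.

```python
import numpy as np

def gmrf_posterior_mean(y, obs_idx, n, tau_prior=1.0, tau_obs=10.0):
    """Posterior mean of a latent field under a first-order random-walk
    GMRF prior, observed with Gaussian noise at a subset of sites. The
    sparse second-difference precision makes the field Markov; this is a
    toy stand-in for an SPDE-derived precision matrix."""
    Q = np.zeros((n, n))
    for i in range(n - 1):          # tridiagonal RW1 precision structure
        Q[i, i] += tau_prior
        Q[i + 1, i + 1] += tau_prior
        Q[i, i + 1] -= tau_prior
        Q[i + 1, i] -= tau_prior
    Q += 1e-6 * np.eye(n)           # tiny jitter: the RW1 prior is intrinsic
    A = np.zeros((len(obs_idx), n)) # observation incidence matrix
    for r, j in enumerate(obs_idx):
        A[r, j] = 1.0
    Q_post = Q + tau_obs * A.T @ A  # posterior precision (stays sparse)
    b = tau_obs * A.T @ np.asarray(y)
    return np.linalg.solve(Q_post, b)

mu = gmrf_posterior_mean(y=[1.0, 3.0], obs_idx=[0, 4], n=5)
# The smoothing prior interpolates between the two observed sites.
assert mu[0] < mu[2] < mu[4]
```

In the real method the precision matrix is large but sparse, which is what makes the hierarchical computation tractable.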
A Bayesian approach for parameter estimation and prediction using a computationally intensive model
Higdon, Dave; McDonnell, Jordan D.; Schunck, Nicolas; ...
2015-02-05
Bayesian methods have been successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model $\eta(\theta)$, where $\theta$ denotes the uncertain, best input setting. Hence the statistical model is of the form $y = \eta(\theta) + \epsilon$, where $\epsilon$ accounts for measurement, and possibly other, error sources. When nonlinearity is present in $\eta(\cdot)$, the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and nonstandard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. Although generally applicable, MCMC requires thousands (or even millions) of evaluations of the physics model $\eta(\cdot)$. This requirement is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we present an approach adapted from Bayesian model calibration. This approach combines output from an ensemble of computational model runs with physical measurements, within a statistical formulation, to carry out inference. A key component of this approach is a statistical response surface, or emulator, estimated from the ensemble of model runs. We demonstrate this approach with a case study in estimating parameters for a density functional theory model, using experimental mass/binding energy measurements from a collection of atomic nuclei. Lastly, we also demonstrate how this approach produces uncertainties in predictions for recent mass measurements obtained at Argonne National Laboratory.
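A drastically simplified sketch of the emulator idea: fit a cheap response surface to an ensemble of expensive-model runs, then evaluate the posterior over $\theta$ without ever calling the expensive model again. The quadratic emulator and grid posterior below stand in for the Gaussian-process emulator and MCMC used in the paper; all numbers are synthetic.

```python
import numpy as np

def calibrate_with_emulator(theta_design, runs, y_obs, noise_sd, grid):
    """Toy emulator-based calibration: fit a quadratic response surface
    (standing in for a Gaussian-process emulator) to an ensemble of model
    runs, then evaluate an unnormalized posterior for theta on a grid
    (standing in for MCMC), under a flat prior."""
    coef = np.polyfit(theta_design, runs, deg=2)   # emulator from ensemble
    emulator = np.poly1d(coef)
    loglik = -0.5 * ((y_obs - emulator(grid)) / noise_sd) ** 2
    post = np.exp(loglik - loglik.max())
    return grid[np.argmax(post)], post / post.sum()

# "Expensive" model eta(theta) = theta^2, sampled at 5 design points.
design = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
runs = design ** 2
grid = np.linspace(0.0, 2.0, 201)
theta_map, post = calibrate_with_emulator(design, runs, y_obs=1.0,
                                          noise_sd=0.1, grid=grid)
assert abs(theta_map - 1.0) < 0.02        # true best setting is theta = 1
```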
WE-A-201-02: Modern Statistical Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niemierko, A.
Chris Marshall: Memorial Introduction Donald Edmonds Herbert Jr., or Don to his colleagues and friends, exemplified the “big tent” vision of medical physics, specializing in Applied Statistics and Dynamical Systems theory. He saw, more clearly than most, that “Making models is the difference between doing science and just fooling around [ref Woodworth, 2004]”. Don developed an interest in chemistry at school by “reading a book” - a recurring theme in his story. He was awarded a Westinghouse Science scholarship and attended the Carnegie Institute of Technology (later Carnegie Mellon University) where his interest turned to physics and led to a BS in Physics after transfer to Northwestern University. After (voluntary) service in the Navy he earned his MS in Physics from the University of Oklahoma, which led him to Johns Hopkins University in Baltimore to pursue a PhD. The early death of his wife led him to take a salaried position in the Physics Department of Colorado College in Colorado Springs so as to better care for their young daughter. There, a chance invitation from Dr. Juan del Regato to teach physics to residents at the Penrose Cancer Hospital introduced him to Medical Physics, and he decided to enter the field. He received his PhD from the University of London (UK) under Prof. Joseph Rotblat, where I first met him, and where he taught himself statistics. He returned to Penrose as a clinical medical physicist, also largely self-taught. In 1975 he formalized an evolving interest in statistical analysis as Professor of Radiology and Head of the Division of Physics and Statistics at the College of Medicine of the University of South Alabama in Mobile, AL where he remained for the rest of his career. He also served as the first Director of their Bio-Statistics and Epidemiology Core Unit working in part on sickle-cell disease. After retirement he remained active as Professor Emeritus. 
Don served for several years as a consultant to the Nuclear Regulatory Commission and may be remembered for his critique of the National Academy of Sciences BEIR III report (stating that their methodology “imposes a Delphic quality to the … risk estimates”). This led to his appointment as a member of the BEIR V committee. Don presented refresher courses at the AAPM, ASTRO and RSNA meetings and was active in the AAPM as a member or chair of several committees. He was the principal author of AAPM Report 43, which is essentially a critique of established clinical studies prior to 1992. He was co-editor of the Proceedings of many symposia on Time, Dose and Fractionation held in Madison, Wisconsin. He received the AAPM Lifetime Achievement Award in 2004. Don’s second wife of 46 years, Ann, predeceased him and he is survived by daughters Hillary and Emily, son John and two grandsons. Don was a true gentleman with a unique and erudite writing style illuminated by pithy quotations. If he had a fault it was, perhaps, that he did not realize how much smarter he was than the rest of us. This presentation draws heavily on a biography and video interview in the History and Heritage section of the AAPM website. The quote is his own. Andrzej Niemierko: Statistical modeling plays an essential role in modern medicine for quantitative evaluation of the effect of treatment. This session will feature an overview of statistical modeling techniques used for analyzing the many types of research data and an exploration of recent advances in new statistical modeling methodologies. Learning Objectives: To learn basics of statistical modeling methodology. To discuss statistical models that are frequently used in radiation oncology. To discuss advanced modern statistical modeling methods and applications.
WE-A-201-00: Anne and Donald Herbert Distinguished Lectureship On Modern Statistical Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
NASA Astrophysics Data System (ADS)
Aouaini, F.; Knani, S.; Ben Yahia, M.; Ben Lamine, A.
2015-08-01
Water sorption isotherms of foodstuffs are very important in several areas of food science and engineering, such as the design, modeling and optimization of many processes. The equilibrium moisture content is an important parameter in models used to predict changes in the moisture content of a product during storage. A multilayer model with two energy levels was formulated on the basis of statistical physics and theoretical considerations, using the grand canonical ensemble. Some physicochemical parameters related to the adsorption process were introduced into the analytical model expression. Data tabulated in the literature for water adsorption at different temperatures on chickpea seeds, lentil seeds, potato and green peppers were described using the most popular models applied in food science, and the study was then extended to the newly proposed model. It is concluded that, among the studied models, the proposed model seems to be the best for describing the data over the whole range of relative humidity. Using this model, we were able to determine the thermodynamic functions. The measurement of desorption isotherms, in particular of a gas over a porous solid, also gives access to the pore-size distribution (PSD).
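The authors' two-energy-level expression is not given in the abstract. As a hedged illustration of how such isotherm models are fitted in practice, the classical GAB (Guggenheim-Anderson-de Boer) isotherm, a standard food-science model, can be fitted to sorption data in a few lines; all data and parameter values below are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def gab(aw, m0, c, k):
    """GAB sorption isotherm: equilibrium moisture content as a function of
    water activity aw. This classical model is a stand-in for the authors'
    two-energy-level statistical-physics model, whose closed form is not
    reproduced in the abstract."""
    return m0 * c * k * aw / ((1 - k * aw) * (1 - k * aw + c * k * aw))

# Synthetic isotherm "measurements" generated from known parameters.
aw = np.linspace(0.05, 0.9, 18)
m_obs = gab(aw, 0.08, 10.0, 0.75)

# Least-squares fit recovers the monolayer capacity m0 and the two
# energy-related constants c and k.
params, _ = curve_fit(gab, aw, m_obs, p0=[0.05, 5.0, 0.5])
assert np.allclose(params, [0.08, 10.0, 0.75], rtol=1e-2)
```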
The effects of modeling instruction on high school physics academic achievement
NASA Astrophysics Data System (ADS)
Wright, Tiffanie L.
The purpose of this study was to explore whether Modeling Instruction, compared to traditional lecturing, is an effective instructional method to promote academic achievement in selected high school physics classes at a rural middle Tennessee high school. This study used an ex post facto, quasi-experimental research methodology. The independent variables in this study were the instructional methods of teaching. The treatment variable was Modeling Instruction and the control variable was traditional lecture instruction. The Treatment Group consisted of participants in Physical World Concepts who received Modeling Instruction. The Control Group consisted of participants in Physical Science who received traditional lecture instruction. The dependent variable was gain scores on the Force Concept Inventory (FCI). The participants for this study were 133 students each in both the Treatment and Control Groups (n = 266), who attended a public high school in rural middle Tennessee. The participants were administered the Force Concept Inventory (FCI) prior to being taught the mechanics of physics. The FCI data were entered into the computer-based Statistical Package for the Social Sciences (SPSS). Two independent samples t-tests were conducted to answer the research questions. There was a statistically significant difference between the treatment and control groups concerning the instructional method. Modeling Instructional methods were found to be effective in increasing the academic achievement of students in high school physics. There was no statistically significant difference between FCI gain scores for gender. Gender was found to have no effect on the academic achievement of students in high school physics classes. However, even though there was not a statistically significant difference, female students' gain scores were higher than male students' gain scores when Modeling Instructional methods of teaching were used. 
Based on these findings, it is recommended that high school science teachers use Modeling Instructional methods of teaching daily in their classrooms. A recommendation for further research is to expand Modeling Instructional methods of teaching into different content areas (e.g., reading and language arts) to explore academic achievement gains.
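The study's core analysis, an independent-samples t-test on FCI gain scores, can be sketched as follows with synthetic scores; the group means, spreads, and sample draws below are invented for illustration, not the study's data.

```python
import math
import random
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's independent-samples t statistic and approximate degrees of
    freedom, as would be used to compare FCI gain scores between the
    Modeling Instruction and traditional-lecture groups."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2 / na, stdev(b) ** 2 / nb
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (na - 1) + vb ** 2 / (nb - 1))
    return t, df

rng = random.Random(1)
# Synthetic gain scores, n = 133 per group as in the study.
treatment = [rng.gauss(6.0, 2.0) for _ in range(133)]  # Modeling Instruction
control = [rng.gauss(3.0, 2.0) for _ in range(133)]    # traditional lecture
t, df = welch_t(treatment, control)
assert t > 2.0   # a mean difference this large is significant at n = 133
```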
On entropy, financial markets and minority games
NASA Astrophysics Data System (ADS)
Zapart, Christopher A.
2009-04-01
The paper builds upon an earlier statistical analysis of financial time series with Shannon information entropy, published in [L. Molgedey, W. Ebeling, Local order, entropy and predictability of financial time series, European Physical Journal B-Condensed Matter and Complex Systems 15/4 (2000) 733-737]. A novel generic procedure is proposed for making multistep-ahead predictions of time series by building a statistical model of entropy. The approach is first demonstrated on the chaotic Mackey-Glass time series and later applied to Japanese Yen/US dollar intraday currency data. The paper also reinterprets Minority Games [E. Moro, The minority game: An introductory guide, Advances in Condensed Matter and Statistical Physics (2004)] within the context of physical entropy, and uses models derived from minority game theory as a tool for measuring the entropy of a model in response to time series. This entropy conditional upon a model is subsequently used in place of information-theoretic entropy in the proposed multistep prediction algorithm.
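A minimal form of the entropy statistic underlying this approach is the block (Shannon) entropy of a symbolized time series, the kind of local-order measure the cited Molgedey-Ebeling analysis builds on; the symbolization and block length below are illustrative choices.

```python
import math
import random
from collections import Counter

def block_entropy(symbols, n):
    """Shannon entropy (in bits) of length-n blocks of a symbolized time
    series: low for ordered/predictable series, high for random ones."""
    blocks = [tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A perfectly periodic series carries 1 bit per 2-block (two block types)...
periodic = [0, 1] * 500
# ...while a random binary series approaches 2 bits (four block types).
rng = random.Random(0)
noisy = [rng.randint(0, 1) for _ in range(1000)]
assert block_entropy(periodic, 2) < block_entropy(noisy, 2)
```

In the paper this kind of entropy, conditioned on a minority-game model, drives the multistep prediction algorithm.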
Meta-analysis inside and outside particle physics: two traditions that should converge?
Baker, Rose D; Jackson, Dan
2013-06-01
The use of meta-analysis in medicine and epidemiology really took off in the 1970s. However, in high-energy physics, the Particle Data Group has been carrying out meta-analyses of measurements of particle masses and other properties since 1957. Curiously, there has been virtually no interaction between those working inside and outside particle physics. In this paper, we use statistical models to study two major differences in practice. The first is the usefulness of systematic errors, which physicists are now beginning to quote in addition to statistical errors. The second is whether it is better to treat heterogeneity by scaling up errors as do the Particle Data Group or by adding a random effect as does the rest of the community. Besides fitting models, we derive and use an exact test of the error-scaling hypothesis. We also discuss the other methodological differences between the two streams of meta-analysis. Our conclusion is that systematic errors are not currently very useful and that the conventional random effects model, as routinely used in meta-analysis, has a useful role to play in particle physics. The moral we draw for statisticians is that we should be more willing to explore 'grassroots' areas of statistical application, so that good statistical practice can flow both from and back to the statistical mainstream. Copyright © 2012 John Wiley & Sons, Ltd.
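The two practices compared in the paper can be sketched side by side: the Particle Data Group's error-scaling factor and the DerSimonian-Laird random-effects variance from mainstream meta-analysis. These are the standard textbook forms; the paper's exact test of the error-scaling hypothesis is not reproduced here.

```python
import math

def pdg_scale_factor(values, errors):
    """PDG practice: weighted mean with inverse-variance weights; if the
    chi^2 of the fit exceeds its expectation, inflate the error on the
    mean by S = sqrt(chi^2 / (n - 1))."""
    w = [1.0 / e ** 2 for e in errors]
    mean = sum(wi * v for wi, v in zip(w, values)) / sum(w)
    err = math.sqrt(1.0 / sum(w))
    chi2 = sum(wi * (v - mean) ** 2 for wi, v in zip(w, values))
    s = math.sqrt(chi2 / (len(values) - 1))
    return mean, err * max(1.0, s), s

def dersimonian_laird_tau2(values, errors):
    """Random-effects alternative: DerSimonian-Laird moment estimate of the
    between-study variance tau^2, as used in medical meta-analysis."""
    w = [1.0 / e ** 2 for e in errors]
    mean = sum(wi * v for wi, v in zip(w, values)) / sum(w)
    q = sum(wi * (v - mean) ** 2 for wi, v in zip(w, values))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    return max(0.0, (q - (len(values) - 1)) / c)

# Heterogeneous measurements: both conventions detect the excess scatter.
vals, errs = [1.0, 2.0, 3.0], [0.1, 0.1, 0.1]
_, inflated_err, s = pdg_scale_factor(vals, errs)
assert s > 1.0 and dersimonian_laird_tau2(vals, errs) > 0.0
```

The paper's question is precisely which of these two ways of absorbing heterogeneity behaves better on particle-physics data.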
Micro-foundations for macroeconomics: New set-up based on statistical physics
NASA Astrophysics Data System (ADS)
Yoshikawa, Hiroshi
2016-12-01
Modern macroeconomics is built on "micro foundations": the optimization problem of a micro agent, such as a consumer or a firm, is explicitly analyzed in the model. Toward this goal, the standard model presumes a "representative" consumer/firm and analyzes its behavior in detail. However, the macroeconomy consists of roughly 10^7 consumers and 10^6 firms. For the purpose of analyzing such a macro system, it is meaningless to pursue the micro behavior in detail. In this respect, there is no essential difference between economics and physics. The methods of statistical physics can be usefully applied to the macroeconomy, and provide Keynesian economics with correct micro-foundations.
Dark energy models through nonextensive Tsallis' statistics
NASA Astrophysics Data System (ADS)
Barboza, Edésio M.; Nunes, Rafael da C.; Abreu, Everton M. C.; Ananias Neto, Jorge
2015-10-01
The accelerated expansion of the Universe is one of the greatest challenges of modern physics. One candidate to explain this phenomenon is a new field called dark energy. In this work we have used the Tsallis nonextensive statistical formulation of the Friedmann equation to explore the Barboza-Alcaniz and Chevallier-Polarski-Linder parametric dark energy models and the Wang-Meng and Dalal vacuum decay models. After that, we have discussed the observational tests and the constraints concerning the Tsallis nonextensive parameter. Finally, we have described the dark energy physics through the role of the q-parameter.
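The Tsallis-modified Friedmann equation is not reproduced in the abstract, so the sketch below shows only the standard Chevallier-Polarski-Linder (CPL) parametrization that the paper constrains, for a flat universe with assumed fiducial parameters.

```python
import math

def hubble_cpl(z, h0=70.0, om=0.3, w0=-1.0, wa=0.0):
    """H(z) for a flat universe with the CPL dark-energy equation of state
    w(z) = w0 + wa * z / (1 + z). The dark-energy density factor is the
    analytic integral of w(z); the Tsallis modification of the Friedmann
    equation is not included in this sketch."""
    a = 1.0 / (1.0 + z)
    rho_de = a ** (-3 * (1 + w0 + wa)) * math.exp(-3 * wa * (1 - a))
    return h0 * math.sqrt(om * (1 + z) ** 3 + (1 - om) * rho_de)

# With w0 = -1, wa = 0 the model reduces to a cosmological constant:
assert abs(hubble_cpl(0.0) - 70.0) < 1e-9
assert abs(hubble_cpl(1.0) - 70.0 * math.sqrt(0.3 * 8 + 0.7)) < 1e-9
```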
The Practicality of Statistical Physics Handout Based on KKNI and the Constructivist Approach
NASA Astrophysics Data System (ADS)
Sari, S. Y.; Afrizon, R.
2018-04-01
Statistical physics lectures show that: 1) the performance of lecturers, the social climate, students' competence and the soft skills needed at work are all in the "adequate" category; 2) students find statistical physics lectures difficult to follow because the material is abstract; 3) 40.72% of students need further reinforcement in the form of repetition, practice questions and structured tasks; and 4) the depth of the statistical physics material needs to be improved gradually and in a structured way. This indicates that learning materials aligned with the Indonesian National Qualification Framework, or Kerangka Kualifikasi Nasional Indonesia (KKNI), together with an appropriate learning approach, are needed to help lecturers and students in lectures. The authors have designed statistical physics handouts that meet the "very valid" criterion (90.89%) according to expert judgment. In addition, the practicality of the handouts must also be considered so that they are easy to use, interesting and efficient in lectures. The purpose of this research is to determine the practicality of a statistical physics handout based on KKNI and a constructivist approach. This research is part of a research-and-development study using the 4-D model developed by Thiagarajan, and has reached the development-testing portion of the Development stage. Data were collected with a questionnaire distributed to lecturers and students, and analyzed using descriptive techniques in the form of percentages. The questionnaire analysis shows that the statistical physics handout meets the "very practical" criterion. The conclusion of this study is that statistical physics handouts based on KKNI and a constructivist approach are practical for use in lectures.
WE-A-201-01: Memorial Introduction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, C.
Focus on Statistical Physics Modeling in Economics and Finance
NASA Astrophysics Data System (ADS)
Mantegna, Rosario N.; Kertész, János
2011-02-01
This focus issue presents a collection of papers on recent results in statistical physics modeling in economics and finance, commonly known as econophysics. We touch briefly on the history of this relatively new multi-disciplinary field, summarize the motivations behind its emergence and try to characterize its specific features. We point out some research aspects that must be improved and briefly discuss the topics the research field is moving toward. Finally, we give a short account of the papers collected in this issue.
The distribution of density in supersonic turbulence
NASA Astrophysics Data System (ADS)
Squire, Jonathan; Hopkins, Philip F.
2017-11-01
We propose a model for the statistics of the mass density in supersonic turbulence, which plays a crucial role in star formation and the physics of the interstellar medium (ISM). The model is derived by considering the density to be arranged as a collection of strong shocks of width ∼M^{-2}, where M is the turbulent Mach number. With two physically motivated parameters, the model predicts all density statistics for M > 1 turbulence: the density probability distribution and its intermittency (deviation from lognormality), the density variance-Mach number relation, power spectra and structure functions. For the proposed model parameters, reasonable agreement is seen between model predictions and numerical simulations, albeit within the large uncertainties associated with current simulation results. More generally, the model could provide a useful framework for more detailed analysis of future simulations and observational data. Due to the simple physical motivations for the model in terms of shocks, it is straightforward to generalize to more complex physical processes, which will be helpful in future more detailed applications to the ISM. We see good qualitative agreement between such extensions and recent simulations of non-isothermal turbulence.
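For context, the baseline (non-intermittent) lognormal density PDF with the standard density variance-Mach number relation can be written in a few lines; the shock-model corrections of the paper are not included, and the forcing parameter b = 0.4 is an assumed value.

```python
import math

def density_pdf_lognormal(s, mach, b=0.4):
    """Lognormal PDF of s = ln(rho/rho0) for isothermal supersonic
    turbulence, with variance from sigma_s^2 = ln(1 + b^2 M^2) and mean
    fixed by mass conservation. The Squire-Hopkins shock model adds
    intermittent (non-lognormal) corrections not included here."""
    var = math.log(1.0 + b * b * mach * mach)
    mean = -0.5 * var
    return math.exp(-(s - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# The PDF integrates to ~1 over a wide range of s (simple Riemann check).
xs = [-10 + 20 * i / 4000 for i in range(4001)]
area = sum(density_pdf_lognormal(x, mach=5.0) * 0.005 for x in xs)
assert abs(area - 1.0) < 1e-3
```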
Incorporating signal-dependent noise for hyperspectral target detection
NASA Astrophysics Data System (ADS)
Morman, Christopher J.; Meola, Joseph
2015-05-01
The majority of hyperspectral target detection algorithms are developed from statistical data models employing stationary background statistics or white Gaussian noise models. Stationary background models are inaccurate as a result of two separate physical processes. First, varying background classes often exist in the imagery that possess different clutter statistics. Many algorithms can account for this variability through the use of subspaces or clustering techniques. The second physical process, which is often ignored, is a signal-dependent sensor noise term. For photon counting sensors that are often used in hyperspectral imaging systems, sensor noise increases as the measured signal level increases as a result of Poisson random processes. This work investigates the impact of this sensor noise on target detection performance. A linear noise model is developed describing sensor noise variance as a linear function of signal level. The linear noise model is then incorporated for detection of targets using data collected at Wright Patterson Air Force Base.
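The linear noise model described above can be sketched directly: noise variance grows linearly with signal level, approximating Poisson-dominated photon-counting statistics. The coefficients a and b below are illustrative, and Gaussian noise is used as the usual large-count approximation to the Poisson process.

```python
import random
from statistics import variance

def linear_noise_variance(signal, a=4.0, b=0.5):
    """Signal-dependent sensor-noise model: var = a + b * signal. The
    constant a collects signal-independent terms (e.g. read noise); the
    linear term approximates Poisson photon statistics (for a pure Poisson
    process, a = 0 and b = 1). Values of a and b are illustrative."""
    return a + b * signal

rng = random.Random(42)

def measure(signal):
    """Simulate one sensor reading with signal-dependent Gaussian noise."""
    sd = linear_noise_variance(signal) ** 0.5
    return signal + rng.gauss(0.0, sd)

# Empirical noise variance at a high signal level exceeds that at a low
# level -- the effect the stationary-background detectors ignore.
low = [measure(10.0) - 10.0 for _ in range(4000)]
high = [measure(1000.0) - 1000.0 for _ in range(4000)]
assert variance(high) > variance(low)
```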
Generalized two-dimensional chiral QED: Anomaly and exotic statistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saradzhev, F.M.
1997-07-01
We study the influence of the anomaly on the physical quantum picture of the generalized chiral Schwinger model defined on S^1. We show that the anomaly (i) results in the background linearly rising electric field and (ii) makes the spectrum of the physical Hamiltonian nonrelativistic without a massive boson. The physical matter fields acquire exotic statistics. We construct explicitly the algebra of the Poincaré generators and show that it differs from the Poincaré one. We exhibit the role of the vacuum Berry phase in the failure of the Poincaré algebra to close. We prove that, in spite of the background electric field, such a phenomenon as the total screening of external charges, characteristic of the standard Schwinger model, takes place in the generalized chiral Schwinger model, too. © 1997 The American Physical Society
NASA Astrophysics Data System (ADS)
Suhir, E.
2014-05-01
The well known and widely used experimental reliability "passport" of a mass-manufactured electronic or photonic product — the bathtub curve — reflects the combined contribution of the statistics-related and reliability-physics (physics-of-failure)-related processes. As time progresses, the first process results in a decreasing failure rate, while the second process, associated with material aging and degradation, leads to an increasing failure rate. An attempt has been made in this analysis to assess the level of the reliability-physics-related aging process from the available bathtub curve (diagram). It is assumed that the products of interest underwent burn-in testing and therefore the obtained bathtub curve does not contain the infant-mortality portion. It has also been assumed that the two random processes in question are statistically independent, and that the failure rate of the physical process can be obtained by deducting the theoretically assessed statistical failure rate from the bathtub curve ordinates. In the carried-out numerical example, the Rayleigh distribution for the statistical failure rate was used, for the sake of a relatively simple illustration. The developed methodology can be used in reliability physics evaluations, when there is a need to better understand the roles of the statistics-related and reliability-physics-related irreversible random processes in reliability evaluations. The future work should include investigations on how powerful and flexible methods and approaches of statistical mechanics can be effectively employed, in addition to reliability physics techniques, to model the operational reliability of electronic and photonic products.
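The proposed decomposition can be sketched directly: under the independence assumption, subtract the assumed statistical (Rayleigh) failure rate from the bathtub-curve ordinate to isolate the physics-of-failure aging rate. The Rayleigh scale and the synthetic bathtub ordinate below are illustrative numbers, not the paper's example.

```python
def rayleigh_failure_rate(t, sigma=50.0):
    """Rayleigh hazard rate h(t) = t / sigma^2, used as the assumed
    statistics-related part of the failure rate (sigma is illustrative)."""
    return t / sigma ** 2

def physics_failure_rate(bathtub_rate, t, sigma=50.0):
    """The paper's decomposition assumption: with the statistical and
    physics-of-failure processes independent, the physics-related aging
    rate is the bathtub ordinate minus the statistical (Rayleigh) rate."""
    return bathtub_rate - rayleigh_failure_rate(t, sigma)

# Synthetic post-burn-in bathtub ordinate: constant floor plus a wear-out
# term (the infant-mortality portion is assumed already removed).
bathtub = lambda t: 0.02 + 0.001 * t ** 1.5
t = 20.0
aging = physics_failure_rate(bathtub(t), t)
assert aging > 0.0   # a wear-out contribution remains after subtraction
```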
NASA Astrophysics Data System (ADS)
Qi, Di
Turbulent dynamical systems are ubiquitous in science and engineering. Uncertainty quantification (UQ) in turbulent dynamical systems is a grand challenge where the goal is to obtain statistical estimates for key physical quantities. In the development of a proper UQ scheme for systems characterized by both a high-dimensional phase space and a large number of instabilities, significant model errors compared with the true natural signal are always unavoidable due to both the imperfect understanding of the underlying physical processes and the limited computational resources available. One central issue in contemporary research is the development of a systematic methodology for reduced order models that can recover the crucial features both with model fidelity in statistical equilibrium and with model sensitivity in response to perturbations. In the first part, we discuss a general mathematical framework to construct statistically accurate reduced-order models that have skill in capturing the statistical variability in the principal directions of a general class of complex systems with quadratic nonlinearity. A systematic hierarchy of simple statistical closure schemes, built through new global statistical energy conservation principles combined with statistical equilibrium fidelity, is designed and tested for UQ of these problems. Second, the capacity of imperfect low-order stochastic approximations to model extreme events in a passive scalar field advected by turbulent flows is investigated. The effects considered include strong nonlinear and non-Gaussian interactions in complicated flow systems, and much simpler and cheaper imperfect models with model error are constructed to capture the crucial statistical features in the stationary tracer field. Several mathematical ideas are introduced to improve the prediction skill of the imperfect reduced-order models.
Most importantly, empirical information theory and statistical linear response theory are applied in the training phase for calibrating model errors to achieve optimal imperfect model parameters; and total statistical energy dynamics are introduced to improve the model sensitivity in the prediction phase especially when strong external perturbations are exerted. The validity of reduced-order models for predicting statistical responses and intermittency is demonstrated on a series of instructive models with increasing complexity, including the stochastic triad model, the Lorenz '96 model, and models for barotropic and baroclinic turbulence. The skillful low-order modeling methods developed here should also be useful for other applications such as efficient algorithms for data assimilation.
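One of the testbeds named above, the Lorenz '96 model, can be integrated in a few lines; this is only a minimal forward simulation of the model, not the thesis's closure or calibration machinery, and the forcing F = 8 and step size are conventional choices, not values taken from the work:

```python
import numpy as np

def l96_rhs(x, F):
    """Lorenz '96 tendencies: advection-like quadratic nonlinearity,
    linear damping, and constant forcing F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def integrate(x, F, dt, steps):
    """Fourth-order Runge-Kutta time stepping."""
    for _ in range(steps):
        k1 = l96_rhs(x, F)
        k2 = l96_rhs(x + 0.5 * dt * k1, F)
        k3 = l96_rhs(x + 0.5 * dt * k2, F)
        k4 = l96_rhs(x + dt * k3, F)
        x = x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return x

x0 = np.full(40, 8.0)
x0[0] += 0.01                 # a small perturbation triggers turbulence
x = integrate(x0, F=8.0, dt=0.05, steps=4000)
```

Long trajectories of this system supply the statistical-equilibrium moments against which reduced-order closures are tested.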
Precision Cosmology: The First Half Million Years
NASA Astrophysics Data System (ADS)
Jones, Bernard J. T.
2017-06-01
Cosmology seeks to characterise our Universe in terms of models based on well-understood and tested physics. Today we know our Universe with a precision that once would have been unthinkable. This book develops the entire mathematical, physical and statistical framework within which this has been achieved. It tells the story of how we arrive at our profound conclusions, starting from the early twentieth century and following developments up to the latest data analysis of big astronomical datasets. It provides an enlightening description of the mathematical, physical and statistical basis for understanding and interpreting the results of key space- and ground-based data. Subjects covered include general relativity, cosmological models, the inhomogeneous Universe, physics of the cosmic background radiation, and methods and results of data analysis. Extensive online supplementary notes, exercises in Python, and teaching materials make this the perfect companion for researchers, teachers and students in physics, mathematics, and astrophysics.
Statistical mechanics of complex neural systems and high dimensional data
NASA Astrophysics Data System (ADS)
Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya
2013-03-01
Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.
Christiana, Richard W; Davis, Marsha; Wilson, Mark G; McCarty, Frances A; Green, Gary T
2014-12-01
Young adolescents who have little interest in participating in competitive team sports are at an increased risk for physical inactivity. Noncompetitive outdoor physical activity can provide young adolescents with increased opportunities to participate in physical activities that appeal to them and have positive health effects. The purpose of this study was to examine factors related to rural young adolescents' participation in noncompetitive outdoor physical activity to inform intervention design. Young adolescents aged 10 to 14 years old (N = 1,032) from 1 rural county completed a self-administered questionnaire assessing constructs from self-determination theory (SDT) and the theory of planned behavior (TPB) related to noncompetitive outdoor physical activity. Structural equation modeling was used to examine an integrated conceptual model of hypothesized relationships among constructs. The hypothesized conceptual model provided a good fit to the data with greater perceptions of autonomy support and self-determined motivation having statistically significant positive indirect effects on participation in noncompetitive outdoor physical activity mediated by the constructs of the TPB. All direct paths in the model were statistically significant; however, the direct effect of attitudes on intention was weak (.08) and self-determined motivation had no indirect effect on intention through attitudes (.03). Constructs of SDT and TPB should be accounted for by interventions targeting noncompetitive outdoor physical activity among young adolescents. More research is needed to determine young adolescents' preferences for noncompetitive and competitive physical activity and the potential influence that noncompetitive outdoor physical activity may have on total daily physical activity.
Statistical Modeling for Radiation Hardness Assurance
NASA Technical Reports Server (NTRS)
Ladbury, Raymond L.
2014-01-01
We cover the models and statistics associated with single event effects (and total ionizing dose), why we need them, and how to use them: which models are used, what errors exist in real test data, and what the models allow us to say about the device under test (DUT). In addition, we cover how to use other sources of data, such as historical, heritage, and similar-part data, and how to apply experience, physics, and expert opinion to the analysis. Concepts of Bayesian statistics, data fitting, and bounding rates are also included.
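The "bounding rates" idea can be illustrated with a standard one-sided Poisson confidence bound on an event rate, a common ingredient of single-event-effect analysis. This is a generic statistical sketch, not code from the presentation:

```python
import math

def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k + 1))

def rate_upper_bound(n_events, cl=0.95):
    """One-sided upper confidence bound on a Poisson mean given
    n_events observed: the mu at which P(X <= n_events | mu) = 1 - cl,
    found by bisection. For zero observed events at 95% confidence
    this recovers the familiar 'rule of three' (mu ~ 3)."""
    lo, hi = 0.0, 10.0 * (n_events + 5)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n_events, mid) > 1.0 - cl:
            lo = mid          # mu still too small to be excluded
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Dividing such a bound by the test fluence converts it into a bounded cross section for the DUT.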
Modelling 1-minute directional observations of the global irradiance.
NASA Astrophysics Data System (ADS)
Thejll, Peter; Pagh Nielsen, Kristian; Andersen, Elsa; Furbo, Simon
2016-04-01
Direct and diffuse irradiances from the sky have been collected at 1-minute intervals for about a year at the experimental station of the Technical University of Denmark for the IEA project "Solar Resource Assessment and Forecasting". These data were gathered by pyrheliometers tracking the Sun, as well as by apertured pyranometers collecting 1/8th and 1/16th of the light from the sky in 45-degree azimuthal ranges pointed around the compass. The data are gathered in order to develop detailed models of the potentially available solar energy and its variations at high temporal resolution, and to gain a more detailed understanding of the solar resource. This is important for a better understanding of the sub-grid-scale cloud variation that cannot be resolved with climate and weather models. It is also important for optimizing the operation of active solar energy systems such as photovoltaic plants and thermal solar collector arrays, and for passive solar energy and lighting in buildings. We present regression-based modelling of the observed data and focus here on the statistical properties of the model fits. Using models based on the one hand on what is found in the literature and on physical expectations, and on the other hand on purely statistical models, we find solutions that can explain up to 90% of the variance in global radiation. The models leaning on physical insights include terms for the direct solar radiation, the circumsolar radiation, a diffuse term, and a term for horizon brightening/darkening. The purely statistical model is found using data- and formula-validation approaches, picking model expressions from a general catalogue of possible formulae. The method allows nesting of expressions, and the results found are dependent on and heavily constrained by the cross-validation carried out on statistically independent testing and training data-sets.
Slightly better fits, in terms of variance explained, are found using the purely statistical fitting/searching approach. We describe the methods applied and the results found, and discuss the different potentials of the physics-based and statistics-only model searches.
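The basic workflow described here, fitting a linear regression on a training set and scoring variance explained on statistically independent test data, can be sketched as follows; the regressors and coefficients are invented stand-ins for the physical terms named above (direct, circumsolar, diffuse), not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Hypothetical regressors standing in for direct, circumsolar and
# diffuse contributions to the global irradiance.
X = rng.uniform(0.0, 1.0, size=(n, 3))
y = 900 * X[:, 0] + 120 * X[:, 1] + 60 * X[:, 2] + rng.normal(0, 20, n)

train, test = slice(0, 400), slice(400, None)

# Ordinary least squares on the training set (with an intercept).
A = np.column_stack([X[train], np.ones(400)])
coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)

# Variance explained on the held-out test set.
A_test = np.column_stack([X[test], np.ones(100)])
resid = y[test] - A_test @ coef
r2 = 1.0 - resid.var() / y[test].var()
```

Scoring only on held-out data is what keeps the formula search "heavily constrained", as the abstract puts it, rather than overfitting the training year.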
Statistical physics of vaccination
NASA Astrophysics Data System (ADS)
Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei
2016-12-01
Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination, one of the most important preventive measures of modern times, is of great interest both theoretically and empirically. In contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.
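The classical starting point this review describes, a homogeneously mixing (mean-field) population with pre-emptive vaccination, can be sketched with a minimal SIR integration exhibiting the herd-immunity threshold 1 - 1/R0; all parameter values below are illustrative:

```python
def final_size(r0, vaccinated, dt=0.01, steps=50_000):
    """Mean-field SIR epidemic with a fraction of the population
    vaccinated before the outbreak; returns the attack rate
    (total fraction ultimately infected). Herd immunity corresponds
    to vaccinated > 1 - 1/r0."""
    gamma = 1.0                  # recovery rate sets the time unit
    beta = r0 * gamma
    i0 = 1e-4                    # small initial infected seed
    s, i = 1.0 - vaccinated - i0, i0
    for _ in range(steps):       # forward Euler integration
        new_inf = beta * s * i * dt
        s -= new_inf
        i += new_inf - gamma * i * dt
    return (1.0 - vaccinated - i0) - s
```

For R0 = 2 the threshold is 50% coverage: below it a large outbreak occurs, above it the epidemic fizzles. The behavioral-feedback models reviewed here make the `vaccinated` fraction itself a dynamical variable coupled to perceived risk.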
[Factors associated with physical activity among Chinese immigrant women].
Cho, Sung-Hye; Lee, Hyeonkyeong
2013-12-01
This study was done to assess the level of physical activity among Chinese immigrant women and to determine the relationships of physical activity with individual characteristics and behavior-specific cognition. A cross-sectional descriptive study was conducted with 161 Chinese immigrant women living in Busan. A health promotion model of physical activity adapted from Pender's Health Promotion Model was used. Self-administered questionnaires were used to collect data during the period from September 25 to November 20, 2012. Using the SPSS 18.0 program, descriptive statistics, t-test, analysis of variance, correlation analysis, and multiple regression analysis were done. The average level of physical activity of the Chinese immigrant women was 1,050.06 ± 686.47 MET-min/week, and minimum activity was the most dominant type of physical activity (59.6%). As a result of multiple regression analysis, it was confirmed that self-efficacy and acculturation were statistically significant variables in the model (p<.001), with an explanatory power of 23.7%. The results indicate that the development and application of intervention strategies to increase acculturation and self-efficacy will aid in increasing physical activity among Chinese immigrant women.
Kinetic exchange models: From molecular physics to social science
NASA Astrophysics Data System (ADS)
Patriarca, Marco; Chakraborti, Anirban
2013-08-01
We discuss several multi-agent models that have their origin in the kinetic exchange theory of statistical mechanics and have been recently applied to a variety of problems in the social sciences. This class of models can be easily adapted for simulations in areas other than physics, such as the modeling of income and wealth distributions in economics and opinion dynamics in sociology.
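A minimal sketch of the basic kinetic exchange model this class builds on: random pairwise exchanges of "wealth" that conserve the total, whose stationary distribution is the exponential (Boltzmann-Gibbs) law, in direct analogy with molecular energy exchanges. Agent count and step count are illustrative:

```python
import random

def kinetic_exchange(wealth, steps, rng):
    """Random pairwise exchanges conserving total wealth: a randomly
    chosen pair pools its wealth and splits it with a uniform random
    fraction. The stationary distribution is exponential."""
    n = len(wealth)
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        eps = rng.random()
        total = wealth[i] + wealth[j]
        wealth[i], wealth[j] = eps * total, (1 - eps) * total
    return wealth

rng = random.Random(42)
w = kinetic_exchange([1.0] * 1000, 200_000, rng)
```

In equilibrium roughly a fraction 1 - e^(-1) of agents (about 63%) hold less than the mean wealth, the signature of the exponential law; adding a saving propensity, as in the models the authors review, deforms this toward a Gamma-like distribution.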
A statistical physics viewpoint on the dynamics of the bouncing ball
NASA Astrophysics Data System (ADS)
Chastaing, Jean-Yonnel; Géminard, Jean-Christophe; Bertin, Eric
2016-06-01
We study, from a statistical physics perspective, the dynamics of a bouncing ball maintained in a chaotic regime by collisions with a plate undergoing aperiodic vibration. We analyze in detail the energy exchanges between the bead and the vibrating plate, and show that the coupling between the bead and the plate can be modeled in terms of both a dissipative process and an injection mechanism by an energy reservoir. An analysis of the injection statistics in terms of a fluctuation relation is also provided.
Determinants of physical activity in middle-aged woman in Isfahan using the health belief model.
Hosseini, Habibollah; Moradi, Razieh; Kazemi, Ashraf; Shahshahani, Maryam Sadat
2017-01-01
Nowadays, with increasingly automated and sedentary lifestyles, physical inactivity among middle-aged women has increased, and they are at risk for the complications of immobility. One of the models used to identify factors associated with physical activity is the Health Belief Model, which has been applied across different age groups and cultural backgrounds with varying results. The purpose of this study was to investigate the factors affecting physical activity in middle-aged women using the Health Belief Model. This descriptive-correlational study was conducted on 224 middle-aged women referring to health centers in Isfahan. Health Belief Model constructs including perceived susceptibility and severity, perceived barriers and benefits, and self-efficacy were measured by questionnaire, and physical activity was assessed using the International Physical Activity Questionnaire. Collected data were analyzed using descriptive statistics, the Pearson correlation coefficient test, and regression analysis. There was no statistically significant correlation of physical activity duration with perceived susceptibility (P = 0.263, r = 0.075) or perceived severity (P = 0.127, r = 0.058), but physical activity duration had weak positive correlations with perceived benefits (P = 0.001, r = 0.26) and perceived self-efficacy (P = 0.001, r = 0.54), and a weak inverse correlation with perceived barriers (P = 0.001, r = -0.25). Regression analysis also showed that, among all the Health Belief Model constructs, only self-efficacy influenced behavior independently; the other constructs acted through it. The results imply a correlation of perceived benefits, barriers, and self-efficacy with moderate physical activity. It is therefore necessary to develop appropriate educational programs emphasizing the Health Belief Model constructs that have the maximum impact on physical activity in middle-aged women.
NASA Astrophysics Data System (ADS)
Ben Torkia, Yosra; Ben Yahia, Manel; Khalfaoui, Mohamed; Al-Muhtaseb, Shaheen A.; Ben Lamine, Abdelmottaleb
2014-01-01
The adsorption energy distribution (AED) function of a commercial activated carbon (BDH-activated carbon) was investigated. For this purpose, the integral equation is derived using a purely analytical statistical physics treatment. The description of the heterogeneity of the adsorbent is significantly clarified by defining the parameter N(E), which represents the energetic density of the spatial density of the effectively occupied sites. To solve the integral equation, a numerical method based on an adequate algorithm was used. The Langmuir model was adopted as the local adsorption isotherm. This model is developed using the grand canonical ensemble, which allows defining the physico-chemical parameters involved in the adsorption process. The AED function is estimated by a normal Gaussian function. This method is applied to the adsorption isotherms of nitrogen, methane and ethane at different temperatures. The development of the AED using a statistical physics treatment provides an explanation of the gas molecules' behaviour during the adsorption process and gives new physical interpretations at the microscopic level.
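The forward version of the integral equation described above, overall coverage as the average of a local Langmuir isotherm over a Gaussian AED, can be sketched numerically. The affinity prefactor `K0`, the energy scale, and all numerical values are illustrative assumptions, not the paper's fitted parameters:

```python
import numpy as np

def overall_isotherm(p, E_mean, E_sigma, T, K0=1e-6):
    """Overall coverage theta(p) = integral of N(E) * theta_L(p, E) dE,
    with a Gaussian AED N(E) and a local Langmuir isotherm whose
    affinity grows exponentially with adsorption energy E."""
    R = 8.314                                   # J/(mol K)
    E = np.linspace(E_mean - 4 * E_sigma, E_mean + 4 * E_sigma, 801)
    dE = E[1] - E[0]
    aed = np.exp(-0.5 * ((E - E_mean) / E_sigma) ** 2)
    aed /= aed.sum() * dE                       # normalize the AED
    K = K0 * np.exp(E / (R * T))                # energy-dependent affinity
    local = K * p / (1.0 + K * p)               # local Langmuir coverage
    return float(np.sum(aed * local) * dE)      # trapezoid-style sum

theta_low = overall_isotherm(1e2, 20_000, 3_000, 300.0)
theta_high = overall_isotherm(1e4, 20_000, 3_000, 300.0)
```

The inverse problem solved in the paper, recovering N(E) from measured isotherms, is the ill-posed counterpart of this forward calculation.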
Association Between Perceived Physical Activity and Cognitive Function in Older Adults.
Loprinzi, Paul D; Frith, Emily
2018-01-01
There is irrefutable evidence that regular participation in physical activity is favorably associated with numerous positive health outcomes, including cognitive function. Emerging work suggests that perceived physical activity, independent of actual physical activity behavior, is inversely associated with mortality risk. In this study, we evaluate whether perceived physical activity, independent of actual physical activity, is associated with cognitive function, a robust indicator of mortality risk. Data from the cross-sectional 1999-2002 National Health and Nutrition Examination Survey were employed ( N = 2352; 60+ years of age). Actual physical activity was assessed via a validated survey. Perceived physical activity was assessed using the following question: "Compared with others of the same age, would you say that you are: more active, less active, or about the same?" Cognitive function was assessed from the Digit Symbol Substitution Test. When examined in separate models, both actual and perceived physical activity were positively and statistically significantly associated with cognitive function. However, when considered in the same model, actual physical activity was no longer statistically significantly associated with cognitive function, but perceived physical activity was. Perceived physical activity, independent of actual physical activity, is independently associated with cognitive function. If these findings are replicated, future work should consider evaluating perceived physical activity when examining the effects of actual physical activity behavior on cognitive function.
A Statistical Theory of the Epilepsies.
NASA Astrophysics Data System (ADS)
Thomas, Kuryan
1988-12-01
A new physical and mathematical model for the epilepsies is proposed, based on the theory of bond percolation on finite lattices. Within this model, the onset of seizures in the brain is identified with the appearance of spanning clusters of neurons engaged in the spurious and uncontrollable electrical activity characteristic of seizures. It is proposed that the ratio of excitatory to inhibitory synapses can be identified with a bond probability, and that the bond probability is a randomly varying quantity displaying Gaussian statistics. The consequences of the proposed model for the treatment of the epilepsies are explored. The nature of the data on the epilepsies which can be acquired in a clinical setting is described. It is shown that such data can be analyzed to provide preliminary support for the bond percolation hypothesis, and to quantify the efficacy of anti-epileptic drugs in a treatment program. The results of a battery of statistical tests on seizure distributions are discussed. The physical theory of the electroencephalogram (EEG) is described, and extant models of the electrical activity measured by the EEG are discussed, with an emphasis on their physical behavior. A proposal is made to explain the difference between the power spectra of electrical activity measured with cranial probes and with the EEG. Statistical tests on the characteristic EEG manifestations of epileptic activity are conducted, and their results described. Computer simulations of a correlated bond percolating system are constructed. It is shown that the statistical properties of the results of such a simulation are strongly suggestive of the statistical properties of clinical data. The study finds no contradictions between the predictions of the bond percolation model and the observed properties of the available data. Suggestions are made for further research and for techniques based on the proposed model which may be used for tuning the effects of anti-epileptic drugs.
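The central construct here, a spanning cluster in bond percolation on a finite lattice, can be sketched with a union-find check for a top-to-bottom connection; the lattice size and probabilities are illustrative (the square-lattice bond threshold is p_c = 1/2), and this is a generic percolation demo, not the dissertation's correlated simulation:

```python
import random

def spans(L, p, rng):
    """Bond percolation on an L x L square lattice: open each bond with
    probability p, then test for a top-to-bottom spanning cluster
    (the model's analogue of seizure onset) via union-find."""
    parent = list(range(L * L + 2))
    TOP, BOT = L * L, L * L + 1      # virtual nodes for the two edges

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for r in range(L):
        for c in range(L):
            i = r * L + c
            if r == 0:
                union(i, TOP)
            if r == L - 1:
                union(i, BOT)
            if c + 1 < L and rng.random() < p:   # horizontal bond
                union(i, i + 1)
            if r + 1 < L and rng.random() < p:   # vertical bond
                union(i, i + L)
    return find(TOP) == find(BOT)

rng = random.Random(1)
high = sum(spans(20, 0.8, rng) for _ in range(50)) / 50   # above p_c
low = sum(spans(20, 0.2, rng) for _ in range(50)) / 50    # below p_c
```

The sharp change in spanning probability across p_c is what lets a slowly fluctuating bond probability intermittently push the system into the "seizure" regime.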
RooStatsCms: A tool for analysis modelling, combination and statistical studies
NASA Astrophysics Data System (ADS)
Piparo, D.; Schott, G.; Quast, G.
2010-04-01
RooStatsCms is an object-oriented statistical framework based on the RooFit technology. Its scope is to allow the modelling, statistical analysis and combination of multiple search channels for new phenomena in High Energy Physics. It provides a variety of methods described in the literature, implemented as classes whose design is oriented to the execution of multiple CPU-intensive jobs on batch systems or on the Grid.
Stochastic Spatial Models in Ecology: A Statistical Physics Approach
NASA Astrophysics Data System (ADS)
Pigolotti, Simone; Cencini, Massimo; Molina, Daniel; Muñoz, Miguel A.
2018-07-01
Ecosystems display a complex spatial organization. Ecologists have long tried to characterize them by looking at how different measures of biodiversity change across spatial scales. Ecological neutral theory has provided simple predictions accounting for general empirical patterns in communities of competing species. However, while neutral theory in well-mixed ecosystems is mathematically well understood, spatial models still present several open problems, limiting the quantitative understanding of spatial biodiversity. In this review, we discuss the state of the art in spatial neutral theory. We emphasize the connection between spatial ecological models and the physics of non-equilibrium phase transitions and how concepts developed in statistical physics translate in population dynamics, and vice versa. We focus on non-trivial scaling laws arising at the critical dimension D = 2 of spatial neutral models, and their relevance for biological populations inhabiting two-dimensional environments. We conclude by discussing models incorporating non-neutral effects in the form of spatial and temporal disorder, and analyze how their predictions deviate from those of purely neutral theories.
Applications of statistical physics to technology price evolution
NASA Astrophysics Data System (ADS)
McNerney, James
Understanding how changing technology affects the prices of goods is a problem with both rich phenomenology and important policy consequences. Using methods from statistical physics, I model technology-driven price evolution. First, I examine a model for the price evolution of individual technologies. The price of a good often follows a power law equation when plotted against its cumulative production. This observation turns out to have significant consequences for technology policy aimed at mitigating climate change, where technologies are needed that achieve low carbon emissions at low cost. However, no theory adequately explains why technology prices follow power laws. To understand this behavior, I simplify an existing model that treats technologies as machines composed of interacting components. I find that the power law exponent of the price trajectory is inversely related to the number of interactions per component. I extend the model to allow for more realistic component interactions and make a testable prediction. Next, I conduct a case-study on the cost evolution of coal-fired electricity. I derive the cost in terms of various physical and economic components. The results suggest that commodities and technologies fall into distinct classes of price models, with commodities following martingales, and technologies following exponentials in time or power laws in cumulative production. I then examine the network of money flows between industries. This work is a precursor to studying the simultaneous evolution of multiple technologies. Economies resemble large machines, with different industries acting as interacting components with specialized functions. To begin studying the structure of these machines, I examine 20 economies with an emphasis on finding common features to serve as targets for statistical physics models. I find they share the same money flow and industry size distributions. 
I apply methods from statistical physics to show that industries cluster the same way according to industry type. Finally, I use these industry money flows to model the price evolution of many goods simultaneously, where network effects become important. I derive a prediction for which goods tend to improve most rapidly. The fastest-improving goods are those with the highest mean path lengths in the money flow network.
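The power law in cumulative production that anchors the price analysis above ("Wright's law", price = A * Q^(-b)) is conventionally fitted by least squares in log-log space; the data below are synthetic with a known exponent, purely for illustration:

```python
import numpy as np

def fit_wrights_law(cum_production, price):
    """Fit price = A * Q**(-b), a power law in cumulative production Q,
    by ordinary least squares on log(price) vs log(Q); returns (A, b)."""
    logq, logp = np.log(cum_production), np.log(price)
    slope, intercept = np.polyfit(logq, logp, 1)
    return float(np.exp(intercept)), float(-slope)

# Synthetic noiseless data with exponent b = 0.3 (illustrative).
q = np.arange(1.0, 101.0)
price = 100.0 * q ** -0.3
A, b = fit_wrights_law(q, price)
```

In the model sketched in the abstract, the fitted exponent b is inversely related to the number of interactions per component of the technology.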
Orbit-Attitude Changes of Objects in Near Earth Space Induced by Natural Charging
2017-05-02
depends upon Earth’s magnetosphere. Typically, magnetosphere models can be grouped under two classes: statistical and physics-based. The Physics ...models were primarily physics-based due to unavailability of sufficient space data, but over the last three decades, with the availability of huge...Attitude Determination and Control,” Astrophysics and Space Science Library, Vol. 73, D. Reidel Publishing Company, London, 1978 [17] Fairfield
2016-05-11
new physically-based prediction models for all-weather path attenuation estimation at Ka, V and W band from multi-channel microwave radiometric data...the medium behavior at these frequency bands from both a physical and a statistical point of view (e.g., [5]-[7]). However, these campaigns are
NASA Astrophysics Data System (ADS)
Karpushin, P. A.; Popov, Yu B.; Popova, A. I.; Popova, K. Yu; Krasnenko, N. P.; Lavrinenko, A. V.
2017-11-01
In this paper, the probabilities of faultless operation of aerologic stations are analyzed, the hypothesis of normality of the empirical data required for using the Kalman filter algorithms is tested, and the spatial correlation functions of distributions of meteorological parameters are determined. The results of a statistical analysis of two-term (0, 12 GMT) radiosonde observations of the temperature and wind velocity components at some preset altitude ranges in the troposphere in 2001-2016 are presented. These data can be used in mathematical modeling of physical processes in the atmosphere.
Stability of knotted vortices in wave chaos
NASA Astrophysics Data System (ADS)
Taylor, Alexander; Dennis, Mark
Large scale tangles of disordered filaments occur in many diverse physical systems, from turbulent superfluids to optical volume speckle to liquid crystal phases. They can exhibit particular large scale random statistics despite very different local physics. We have previously used the topological statistics of knotting and linking to characterise the large scale tangling, using the vortices of three-dimensional wave chaos as a universal model system whose physical lengthscales are set only by the wavelength. Unlike geometrical quantities, the statistics of knotting depend strongly on the physical system and boundary conditions. Although knotting patterns characterise different systems, the topology of vortices is highly unstable to perturbation, under which they may reconnect with one another. In systems of constructed knots, these reconnections generally rapidly destroy the knot, but for vortex tangles the topological statistics must be stable. Using large scale simulations of chaotic eigenfunctions, we numerically investigate the prevalence and impact of reconnection events, and their effect on the topology of the tangle.
NASA Astrophysics Data System (ADS)
Albert, Carlo; Ulzega, Simone; Stoop, Ruedi
2016-04-01
Measured time-series of both precipitation and runoff are known to exhibit highly non-trivial statistical properties. For making reliable probabilistic predictions in hydrology, it is therefore desirable to have stochastic models with output distributions that share these properties. When parameters of such models have to be inferred from data, we also need to quantify the associated parametric uncertainty. For non-trivial stochastic models, however, this latter step is typically very demanding, both conceptually and numerically, and almost never done in hydrology. Here, we demonstrate that methods developed in statistical physics make a large class of stochastic differential equation (SDE) models amenable to a full-fledged Bayesian parameter inference. For concreteness we demonstrate these methods by means of a simple yet non-trivial toy SDE model. We consider a natural catchment that can be described by a linear reservoir, at the scale of observation. All the neglected processes are assumed to happen at much shorter time-scales and are therefore modeled with a Gaussian white noise term, the standard deviation of which is assumed to scale linearly with the system state (water volume in the catchment). Even for constant input, the outputs of this simple non-linear SDE model show a wealth of desirable statistical properties, such as fat-tailed distributions and long-range correlations. Standard algorithms for Bayesian inference fail for models of this kind, because their likelihood functions are extremely high-dimensional intractable integrals over all possible model realizations. The use of Kalman filters is illegitimate due to the non-linearity of the model. Particle filters could be used but become increasingly inefficient with a growing number of data points.
Hamiltonian Monte Carlo algorithms allow us to translate this inference problem into the problem of simulating the dynamics of a statistical mechanics system, and give us access to the most sophisticated methods developed in the statistical physics community over the last few decades. We demonstrate that such methods, along with automatic differentiation algorithms, allow us to perform a full-fledged Bayesian inference, for a large class of SDE models, in a highly efficient and largely automated manner. Furthermore, our algorithm is highly parallelizable. For our toy model, discretized with a few hundred points, a full Bayesian inference can be performed in a matter of seconds on a standard PC.
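The toy model described in this abstract can be sketched in a few lines. This is an illustrative simulation only — the reservoir constant, input rate, and noise scale below are invented for the demo, not taken from the paper: a linear reservoir dV = (r − V/k) dt + β V dW, integrated with the Euler-Maruyama scheme.

```python
import random

random.seed(42)

# Toy stochastic linear reservoir (an illustrative sketch, not the authors' code):
#   dV = (r - V/k) dt + beta * V * dW
# r: constant input, k: reservoir time constant, beta: noise scale.
# The noise standard deviation scales linearly with the state V, as in the paper.
r, k, beta = 1.0, 1.0, 0.3
dt, n_steps = 0.01, 100_000

V = 1.0            # initial water volume (arbitrary units)
samples = []
for _ in range(n_steps):
    dW = random.gauss(0.0, dt ** 0.5)   # Wiener increment over dt
    V += (r - V / k) * dt + beta * V * dW
    samples.append(V)

mean_V = sum(samples) / len(samples)
# The deterministic stationary mean is r * k = 1; the time average stays close,
# while the multiplicative noise skews the distribution around it.
```

With constant input r the stationary mean is r·k; it is the state-dependent (multiplicative) noise term that produces the skewed, heavy-tailed fluctuations the abstract refers to.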
Statistical analysis of target acquisition sensor modeling experiments
NASA Astrophysics Data System (ADS)
Deaver, Dawne M.; Moyer, Steve
2015-05-01
The U.S. Army RDECOM CERDEC NVESD Modeling and Simulation Division is charged with the development and advancement of military target acquisition models to estimate expected soldier performance when using all types of imaging sensors. Two elements of sensor modeling are (1) laboratory-based psychophysical experiments used to measure task performance and calibrate the various models and (2) field-based experiments used to verify the model estimates for specific sensors. In both types of experiments, it is common practice to control or measure environmental, sensor, and target physical parameters in order to minimize the uncertainty of the physics-based modeling. Predicting the minimum number of test subjects required to calibrate or validate the model should be, but is not always, done during test planning. The objective of this analysis is to develop guidelines for test planners that recommend the number and types of test samples required to yield a statistically significant result.
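A hedged sketch of the kind of sample-size planning the abstract calls for, using the standard two-proportion normal approximation; the 50% → 70% task-performance effect size below is a made-up illustration, not an NVESD figure.

```python
from math import ceil

# Rough two-proportion sample-size estimate (normal approximation).
# z_alpha = 1.96 and z_beta = 0.84 are the standard-normal quantiles for a
# two-sided alpha = 0.05 test at 80% power (hardcoded defaults).
def subjects_per_group(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Subjects per group needed to distinguish success rates p1 vs p2."""
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2
    return ceil(n)

# Illustrative: detecting a jump from 50% to 70% probability of task success.
n = subjects_per_group(0.50, 0.70)
```

Working through the numbers: (1.96 + 0.84)² = 7.84, the summed variance is 0.46, and the squared difference is 0.04, giving 7.84 × 0.46 / 0.04 ≈ 90.2, so 91 subjects per group.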
SEPEM: A tool for statistical modeling the solar energetic particle environment
NASA Astrophysics Data System (ADS)
Crosby, Norma; Heynderickx, Daniel; Jiggens, Piers; Aran, Angels; Sanahuja, Blai; Truscott, Pete; Lei, Fan; Jacobs, Carla; Poedts, Stefaan; Gabriel, Stephen; Sandberg, Ingmar; Glover, Alexi; Hilgers, Alain
2015-07-01
Solar energetic particle (SEP) events are a serious radiation hazard for spacecraft as well as a severe health risk to humans traveling in space. Indeed, accurate modeling of the SEP environment constitutes a priority requirement for astrophysics and solar system missions and for human exploration in space. The European Space Agency's Solar Energetic Particle Environment Modelling (SEPEM) application server is a World Wide Web interface to a complete set of cross-calibrated data ranging from 1973 to 2013 as well as new SEP engineering models and tools. Both statistical and physical modeling techniques have been included, in order to cover the environment not only at 1 AU but also in the inner heliosphere ranging from 0.2 AU to 1.6 AU using a newly developed physics-based shock-and-particle model to simulate particle flux profiles of gradual SEP events. With SEPEM, SEP peak flux and integrated fluence statistics can be studied, as well as durations of high SEP flux periods. Furthermore, effects tools are also included to allow calculation of single event upset rate and radiation doses for a variety of engineering scenarios.
Yue Xu, Selene; Nelson, Sandahl; Kerr, Jacqueline; Godbole, Suneeta; Patterson, Ruth; Merchant, Gina; Abramson, Ian; Staudenmayer, John; Natarajan, Loki
2018-04-01
Physical inactivity is a recognized risk factor for many chronic diseases. Accelerometers are increasingly used as an objective means to measure daily physical activity. One challenge in using these devices is missing data due to device nonwear. We used a well-characterized cohort of 333 overweight postmenopausal breast cancer survivors to examine missing data patterns of accelerometer outputs over the day. Based on these observed missingness patterns, we created pseudo-simulated datasets with realistic missing data patterns. We then designed imputation and variance-weighting algorithms to account for missing data effects when fitting regression models. Bias and precision of each method were evaluated and compared. Our results indicated that not accounting for missing data in the analysis yielded unstable estimates in the regression analysis. Incorporating variance weights and/or subject-level imputation improved precision by >50%, compared to ignoring missing data. We recommend that these simple, easy-to-implement statistical tools be used to improve the analysis of accelerometer data.
A Statistical Approach For Modeling Tropical Cyclones. Synthetic Hurricanes Generator Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pasqualini, Donatella
This manuscript briefly describes a statistical approach to generating synthetic tropical cyclone tracks to be used in risk evaluations. The Synthetic Hurricane Generator (SynHurG) model allows modeling hurricane risk in the United States, supporting decision makers and implementations of adaptation strategies to extreme weather. In the literature there are mainly two approaches to modeling hurricane hazard for risk prediction: deterministic-statistical approaches, where the storm's key physical parameters are calculated using complex physical climate models and the tracks are usually determined statistically from historical data; and statistical approaches, where both variables and tracks are estimated stochastically using historical records. SynHurG falls in the second category, adopting a purely stochastic approach.
Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization
NASA Astrophysics Data System (ADS)
Eroglu, Sertac
2014-10-01
The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed the statistical mechanical Menzerath-Altmann model. The derived model allows the model parameters to be interpreted in terms of physical concepts. We also propose that many organizations exhibiting Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through a properly defined structure-dependent parameter and the energy-associated states.
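The classical Menzerath-Altmann form y(x) = a·x^b·e^(−cx) becomes linear after taking logarithms, ln y = ln a + b ln x − c x, so its parameters can be recovered by ordinary least squares on the design [1, ln x, x]. A self-contained sketch — the synthetic data and parameter values are invented purely to show the mechanics:

```python
from math import log, exp

# Fit the Menzerath-Altmann form  y(x) = a * x**b * exp(-c * x)
# via its log-linear form  ln y = ln a + b ln x - c x.
def fit_ma(xs, ys):
    # Normal equations A^T A w = A^T z with design columns [1, ln x, x].
    rows = [[1.0, log(x), x] for x in xs]
    z = [log(y) for y in ys]
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    atz = [sum(r[i] * v for r, v in zip(rows, z)) for i in range(3)]
    # Gaussian elimination with partial pivoting on the augmented 3x4 system.
    m = [ata[i] + [atz[i]] for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r_: abs(m[r_][col]))
        m[col], m[piv] = m[piv], m[col]
        for r_ in range(col + 1, 3):
            f = m[r_][col] / m[col][col]
            for c_ in range(col, 4):
                m[r_][c_] -= f * m[col][c_]
    w = [0.0] * 3
    for i in (2, 1, 0):   # back substitution
        w[i] = (m[i][3] - sum(m[i][j] * w[j] for j in range(i + 1, 3))) / m[i][i]
    ln_a, b, neg_c = w
    return exp(ln_a), b, -neg_c

# Exact synthetic data (a=2, b=-0.5, c=0.1) is recovered to float accuracy.
xs = list(range(1, 11))
ys = [2.0 * x ** -0.5 * exp(-0.1 * x) for x in xs]
a_hat, b_hat, c_hat = fit_ma(xs, ys)
```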
NASA Technical Reports Server (NTRS)
Xiang, Xuwu; Smith, Eric A.; Tripoli, Gregory J.
1992-01-01
A hybrid statistical-physical retrieval scheme is explored which combines a statistical approach with an approach based on the development of cloud-radiation models designed to simulate precipitating atmospheres. The algorithm employs the detailed microphysical information from a cloud model as input to a radiative transfer model which generates a cloud-radiation model database. Statistical procedures are then invoked to objectively generate an initial guess composite profile data set from the database. The retrieval algorithm has been tested for a tropical typhoon case using Special Sensor Microwave/Imager (SSM/I) data and has shown satisfactory results.
Maxwell's color statistics: from reduction of visible errors to reduction to invisible molecules.
Cat, Jordi
2014-12-01
This paper presents a cross-disciplinary and multi-disciplinary account of Maxwell's introduction of statistical models of molecules for the composition of gases. The account focuses on Maxwell's deployment of statistical models of data in his contemporaneous color researches as established in Cambridge mathematical physics, especially by Maxwell's seniors and mentors. The paper also argues that the cross-disciplinary, or cross-domain, transfer of resources from the natural and social sciences took place in both directions and relied on the complex intra-disciplinary, or intra-domain, dynamics of Maxwell's researches in natural sciences, in color theory, physical astronomy, electromagnetism and dynamical theory of gases, as well as involving a variety of types of communicating and mediating media, from material objects to concepts, techniques and institutions.
NASA Astrophysics Data System (ADS)
Batchelor, Murray T.; Wille, Luc T.
The Table of Contents for the book is as follows: * Preface * Modelling the Immune System - An Example of the Simulation of Complex Biological Systems * Brief Overview of Quantum Computation * Quantal Information in Statistical Physics * Modeling Economic Randomness: Statistical Mechanics of Market Phenomena * Essentially Singular Solutions of Feigenbaum-Type Functional Equations * Spatiotemporal Chaotic Dynamics in Coupled Map Lattices * Approach to Equilibrium of Chaotic Systems * From Level to Level in Brain and Behavior * Linear and Entropic Transformations of the Hydrophobic Free Energy Sequence Help Characterize a Novel Brain Polyprotein: CART's Protein * Dynamical Systems Response to Pulsed High-Frequency Fields * Bose-Einstein Condensates in the Light of Nonlinear Physics * Markov Superposition Expansion for the Entropy and Correlation Functions in Two and Three Dimensions * Calculation of Wave Center Deflection and Multifractal Analysis of Directed Waves Through the Study of su(1,1) Ferromagnets * Spectral Properties and Phases in Hierarchical Master Equations * Universality of the Distribution Functions of Random Matrix Theory * The Universal Chiral Partition Function for Exclusion Statistics * Continuous Space-Time Symmetries in a Lattice Field Theory * Quelques Cas Limites du Problème à N Corps Unidimensionnel * Integrable Models of Correlated Electrons * On the Riemann Surface of the Three-State Chiral Potts Model * Two Exactly Soluble Lattice Models in Three Dimensions * Competition of Ferromagnetic and Antiferromagnetic Order in the Spin-1/2 XXZ Chain at Finite Temperature * Extended Vertex Operator Algebras and Monomial Bases * Parity and Charge Conjugation Symmetries and S Matrix of the XXZ Chain * An Exactly Solvable Constrained XXZ Chain * Integrable Mixed Vertex Models From the Braid-Monoid Algebra * From Yang-Baxter Equations to Dynamical Zeta Functions for Birational Transformations * Hexagonal Lattice Directed Site Animals * Direction in the Star-Triangle Relations * A Self-Avoiding Walk Through Exactly Solved Lattice Models in Statistical Mechanics
DOT National Transportation Integrated Search
1976-03-01
This introductory portion of a system science for transportation planning, which is based on the statistical physics of ensembles, lays a foundation for how statistical mechanics, equilibrium thermodynamics, and near-equilibrium thermodynamics can be u...
Vibroacoustic optimization using a statistical energy analysis model
NASA Astrophysics Data System (ADS)
Culla, Antonio; D`Ambrogio, Walter; Fregolent, Annalisa; Milana, Silvia
2016-08-01
In this paper, an optimization technique for medium-high frequency dynamic problems based on the Statistical Energy Analysis (SEA) method is presented. In an SEA model, the subsystem energies are controlled by internal loss factors (ILFs) and coupling loss factors (CLFs), which in turn depend on the physical parameters of the subsystems. A preliminary sensitivity analysis of the subsystem energies to the CLFs is performed to select the CLFs that are most effective on the subsystem energies. Since the injected power depends not only on the external loads but also on the physical parameters of the subsystems, it must be taken into account under certain conditions. This is accomplished in the optimization procedure, where approximate relationships between CLFs, injected power, and physical parameters are derived. The approach is applied to a typical aeronautical structure: the cabin of a helicopter.
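The SEA power balance described above reduces, for two coupled subsystems, to a small linear system in the subsystem energies. A schematic sketch follows — the loss factors and injected powers are illustrative values, not taken from the paper's helicopter-cabin model:

```python
# Two-subsystem SEA power balance at steady state:
#   P1/omega = (eta1 + eta12) E1 - eta21 E2
#   P2/omega = (eta2 + eta21) E2 - eta12 E1
# eta1, eta2: internal loss factors (ILFs); eta12, eta21: coupling loss
# factors (CLFs); all values below are made up for illustration.
omega = 2 * 3.141592653589793 * 1000.0   # angular frequency, rad/s (1 kHz band)
eta1, eta2 = 0.01, 0.02
eta12, eta21 = 0.005, 0.003
P1, P2 = 1.0, 0.0                        # watts injected into each subsystem

# Solve the 2x2 system by Cramer's rule.
a11, a12 = eta1 + eta12, -eta21
a21, a22 = -eta12, eta2 + eta21
b1, b2 = P1 / omega, P2 / omega
det = a11 * a22 - a12 * a21
E1 = (b1 * a22 - a12 * b2) / det   # energy of the driven subsystem
E2 = (a11 * b2 - a21 * b1) / det   # energy reaching the undriven subsystem
```

As expected, the directly driven subsystem carries the larger energy, and the undriven one receives only what flows through the coupling loss factor.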
Metamodelling Messages Conveyed in Five Statistical Mechanical Textbooks from 1936 to 2001
ERIC Educational Resources Information Center
Niss, Martin
2009-01-01
Modelling is a significant aspect of doing physics and it is important how this activity is taught. This paper focuses on the explicit or implicit messages about modelling conveyed to the student in the treatments of phase transitions in statistical mechanics textbooks at beginning graduate level. Five textbooks from the 1930s to the present are…
Statistics and classification of the microwave zebra patterns associated with solar flares
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Baolin; Tan, Chengming; Zhang, Yin
2014-01-10
The microwave zebra pattern (ZP) is the most interesting, intriguing, and complex spectral structure frequently observed in solar flares. A comprehensive statistical study will certainly help us to understand its formation mechanism, which remains unclear. This work presents a comprehensive statistical analysis of a large sample of 202 ZP events collected from observations at the Chinese Solar Broadband Radio Spectrometer at Huairou and the Ondřejov Radiospectrograph in the Czech Republic at frequencies of 1.00-7.60 GHz from 2000 to 2013. After investigating the parameter properties of ZPs, such as the occurrence in flare phase, frequency range, polarization degree, duration, etc., we find that the variation of zebra stripe frequency separation with respect to frequency is the best indicator for a physical classification of ZPs. Microwave ZPs can be classified into three types: equidistant ZPs, variable-distant ZPs, and growing-distant ZPs, possibly corresponding to mechanisms of the Bernstein wave model, whistler wave model, and double plasma resonance model, respectively. This statistical classification may help to clarify the controversies among the various existing theoretical models and to understand the physical processes in the source regions.
Biological evolution and statistical physics
NASA Astrophysics Data System (ADS)
Drossel, Barbara
2001-03-01
This review is an introduction to theoretical models and mathematical calculations for biological evolution, aimed at physicists. The methods in the field are naturally very similar to those used in statistical physics, although the majority of publications have appeared in biology journals. The review has three parts, which can be read independently. The first part deals with evolution in fitness landscapes and includes Fisher's theorem, adaptive walks, quasispecies models, effects of finite population sizes, and neutral evolution. The second part studies models of coevolution, including evolutionary game theory, kin selection, group selection, sexual selection, speciation, and coevolution of hosts and parasites. The third part discusses models for networks of interacting species and their extinction avalanches. Throughout the review, attention is paid to giving the necessary biological information, and to pointing out the assumptions underlying the models, and their limits of validity.
New approach in the quantum statistical parton distribution
NASA Astrophysics Data System (ADS)
Sohaily, Sozha; Vaziri (Khamedi), Mohammad
2017-12-01
An attempt to find simple parton distribution functions (PDFs) based on a quantum statistical approach is presented. The PDFs described by the statistical model have very interesting physical properties which help in understanding the structure of partons. The longitudinal portion of the distribution functions is obtained by applying the maximum entropy principle. An interesting and simple approach to determining the statistical variables exactly, without fitting or fixing parameters, is surveyed. Analytic expressions of the x-dependent PDFs are obtained in the whole x region [0, 1], and the computed distributions are consistent with the experimental observations. The agreement with experimental data provides robust confirmation of the presented statistical model.
Active lifestyles in older adults: an integrated predictive model of physical activity and exercise
Galli, Federica; Chirico, Andrea; Mallia, Luca; Girelli, Laura; De Laurentiis, Michelino; Lucidi, Fabio; Giordano, Antonio; Botti, Gerardo
2018-01-01
Physical activity and exercise have been identified as behaviors to preserve physical and mental health in older adults. The aim of the present study was to test the Integrated Behavior Change model in exercise and physical activity behaviors. The study evaluated two different samples of older adults: the first engaged in an exercise class, the second doing spontaneous physical activity. The key analyses relied on Variance-Based Structural Modeling, which was performed by means of the WARP PLS 6.0 statistical software. The analyses estimated the Integrated Behavior Change model in predicting exercise and physical activity, in a longitudinal design across two months of assessment. The tested models exhibited a good fit with the observed data derived from the model focusing on exercise, as well as with those derived from the model focusing on physical activity. Results also showed some effects and relations specific to each behavioral context. Results may form a starting point for future experimental and intervention research. PMID:29875997
The Quantum and Fluid Mechanics of Global Warming
NASA Astrophysics Data System (ADS)
Marston, Brad
2008-03-01
Quantum physics and fluid mechanics are the foundation of any understanding of the Earth's climate. In this talk I invoke three well-known aspects of quantum mechanics to explore what will happen as the concentrations of greenhouse gases such as carbon dioxide continue to increase. Fluid dynamical models of the Earth's atmosphere, demonstrated here in live simulations, yield further insight into past, present, and future climates. Statistics of geophysical flows can, however, be ascertained directly without recourse to numerical simulation, using concepts borrowed from nonequilibrium statistical mechanics [J. B. Marston, E. Conover, and Tapio Schneider, "Statistics of an Unstable Barotropic Jet from a Cumulant Expansion," arXiv:0705.0011, J. Atmos. Sci. (in press)]. I discuss several other ways that theoretical physics may be able to contribute to a deeper understanding of climate change [J. Carlson, J. Harte, G. Falkovich, J. B. Marston, and R. Pierrehumbert, "Physics of Climate Change," 2008 Program of the Kavli Institute for Theoretical Physics].
Reentry survivability modeling
NASA Astrophysics Data System (ADS)
Fudge, Michael L.; Maher, Robert L.
1997-10-01
Statistical methods for expressing the impact risk posed to space systems in general [and the International Space Station (ISS) in particular] by other resident space objects have been examined. One of the findings of this investigation is that there are legitimate physical modeling reasons for the common statistical expression of the collision risk. A combination of statistical methods and physical modeling is also used to express the impact risk posed by re-entering space systems to objects of interest (e.g., people and property) on Earth. One of the largest uncertainties in expressing this risk is the estimation of the material that survives reentry to impact Earth's surface. This point was recently demonstrated in dramatic fashion by the impact of an intact expendable launch vehicle (ELV) upper stage near a private residence in the continental United States. Since approximately half of the missions supporting ISS will utilize ELVs, it is appropriate to examine the methods used to estimate the amount and physical characteristics of ELV debris surviving reentry to impact Earth's surface. This paper examines reentry survivability estimation methodology, including the specific methodology used by Caiman Sciences' 'Survive' model. Comparisons between empirical results (observations of objects which have been recovered on Earth after surviving reentry) and Survive estimates are presented for selected upper stage or spacecraft components and a Delta launch vehicle second stage.
Physical Uncertainty Bounds (PUB)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vaughan, Diane Elizabeth; Preston, Dean L.
2015-03-19
This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.
Loprinzi, Paul D; Walker, Jerome F; Kane, Christy; Cardinal, Bradley J
2014-01-01
Research demonstrates that nicotine dependence and depression are associated and that physical activity is effective in reducing depression symptoms. However, our understanding of the potential beneficial effects of physical activity on depression in current smokers is more limited. The purpose of this study was to examine whether physical activity moderates the association between nicotine dependence and depression in U.S. smokers. Cross-sectional. National Health and Nutrition Examination Survey 2005-2006. Four hundred forty-one current adult smokers. Participants wore an accelerometer for at least 4 days and completed questionnaires to assess nicotine dependence and depression. Effect modification and statistical interaction models were used. Both models were significant. With regard to the statistical interaction model, and after controlling for age, gender, race/ethnicity, education, comorbidity index, homocysteine, cotinine, total cholesterol, sedentary behavior, and vitamins C, D, and E, objectively measured physical activity moderated the association between nicotine dependence and depression (interaction variable: odds ratio = 3.43; 95% confidence interval: 1.02-11.51; p = .04). In this national sample of current smokers, physical activity moderated the association between nicotine dependence and depression. These results suggest that those individuals with nicotine dependence and who are less physically active are more likely to be depressed than what would be expected on the basis of the individual effects of nicotine and physical inactivity separately.
NASA Technical Reports Server (NTRS)
Rastaetter, L.; Kuznetsova, M.; Hesse, M.; Pulkkinen, A.; Glocer, A.; Yu, Y.; Meng, X.; Raeder, J.; Wiltberger, M.; Welling, D.;
2011-01-01
In this paper the metrics-based results of the Dst part of the 2008-2009 GEM Metrics Challenge are reported. The Metrics Challenge asked modelers to submit results for 4 geomagnetic storm events and 5 different types of observations that can be modeled by statistical, climatological, or physics-based (e.g., MHD) models of the magnetosphere-ionosphere system. We present the results of over 25 model settings that were run at the Community Coordinated Modeling Center (CCMC) and at the institutions of various modelers for these events. To measure the performance of each of the models against the observations we use comparisons of one-hour averaged model data with the Dst index issued by the World Data Center for Geomagnetism, Kyoto, Japan, and direct comparison of one-minute model data with the one-minute Dst index calculated by the United States Geological Survey (USGS).
Dorfman, Kevin D
2018-02-01
The development of bright bisintercalating dyes for deoxyribonucleic acid (DNA) in the 1990s, most notably YOYO-1, revolutionized the field of polymer physics in the ensuing years. These dyes, in conjunction with modern molecular biology techniques, permit the facile observation of polymer dynamics via fluorescence microscopy and thus direct tests of different theories of polymer dynamics. At the same time, they have played a key role in advancing an emerging next-generation method known as genome mapping in nanochannels. The effect of intercalation on the bending energy of DNA as embodied by a change in its statistical segment length (or, alternatively, its persistence length) has been the subject of significant controversy. The precise value of the statistical segment length is critical for the proper interpretation of polymer physics experiments and controls the phenomena underlying the aforementioned genomics technology. In this perspective, we briefly review the model of DNA as a wormlike chain and a trio of methods (light scattering, optical or magnetic tweezers, and atomic force microscopy (AFM)) that have been used to determine the statistical segment length of DNA. We then outline the disagreement in the literature over the role of bisintercalation on the bending energy of DNA, and how a multiscale biomechanical approach could provide an important model for this scientifically and technologically relevant problem.
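The wormlike-chain model mentioned above gives a closed form for the mean-square end-to-end distance in terms of the persistence length P and contour length L. A small sketch — the ~50 nm persistence length is the textbook value for bare B-DNA, and the contour lengths are illustrative:

```python
from math import exp

# Kratky-Porod (wormlike chain) mean-square end-to-end distance:
#   <R^2> = 2 P L - 2 P^2 (1 - exp(-L / P))
# P: persistence length, L: contour length (same units).
def wlc_r2(L, P):
    return 2.0 * P * L - 2.0 * P ** 2 * (1.0 - exp(-L / P))

P = 50.0                         # nm, textbook value for bare B-DNA
r2_long = wlc_r2(16_000.0, P)    # a long chain, contour length 16 um
r2_short = wlc_r2(1.0, P)        # a segment much shorter than P

# Limits: for L >> P the chain is Gaussian, <R^2> -> 2 P L;
# for L << P it is nearly rigid, <R^2> -> L^2.
```

This is why the precise value of the statistical segment length (2P for a wormlike chain) matters: every length extracted from an extension or scattering measurement passes through an expression like this.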
Modelling the hydraulic conductivity of porous media using physical-statistical model
NASA Astrophysics Data System (ADS)
Usowicz, B.; Usowicz, L. B.; Lipiec, J.
2009-04-01
Soils and other porous media can be represented by a pattern (net) of more or less cylindrically interconnected channels. The capillary radius r can represent an elementary capillary formed between soil particles in one case, and a mean hydrodynamic radius in another. When we view a porous medium as a net of interconnected capillaries, we can apply a statistical approach to the description of liquid or gas flow. The configuration of the soil phase in the porous medium determines the pore distribution and hence conditions the course of the water retention curve of the medium. In this work, a method of estimating the hydraulic conductivity of porous media based on the physical-statistical model proposed by B. Usowicz is presented. The physical-statistical model considers the pore space as a capillary net. The net of capillary connections is represented by parallel connections of hydraulic resistors within a layer and serial connections between layers. A polynomial distribution was used in this model to determine the probability of the occurrence of a given capillary configuration. The model was calibrated using a measured water retention curve and two values of hydraulic conductivity (saturated and unsaturated), from which the model parameters were determined. The model was used for predicting hydraulic conductivity as a function of soil water content, K(theta). The model was validated by comparing measured and predicted K data for various soils and other porous media (e.g., sandstone). Good agreement between measured and predicted data was indicated by R2 values (>0.9). It was also confirmed that the random variables used for the calculations and the model parameters were chosen correctly. The study was funded in part by the Polish Ministry of Science and Higher Education by Grant No. N305 046 31/1707.
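The parallel/serial resistor picture described above can be sketched directly: capillary conductances (Hagen-Poiseuille, g ∝ r⁴) add in parallel within a layer, while layer resistances add in series. The radii and geometry below are invented for illustration, not the model's calibrated parameters:

```python
from math import pi

# Schematic capillary-network sketch of the parallel/serial resistor idea.
MU = 1.0e-3    # dynamic viscosity of water, Pa*s
ELL = 1.0e-3   # length of one capillary segment, m

def poiseuille_g(r):
    """Hydraulic conductance of a single capillary (Hagen-Poiseuille, ~r^4)."""
    return pi * r ** 4 / (8.0 * MU * ELL)

# Each inner list is one layer; radii (m) are illustrative only.
layers = [
    [5e-6, 8e-6, 5e-6],   # layer 1
    [3e-6, 3e-6],         # layer 2: narrower capillaries dominate resistance
    [6e-6, 4e-6, 7e-6],   # layer 3
]

# Parallel within a layer: conductances add.
layer_g = [sum(poiseuille_g(r) for r in caps) for caps in layers]
# Series across layers: resistances (1/g) add.
G_eff = 1.0 / sum(1.0 / g for g in layer_g)
```

The series combination is always below the weakest layer's conductance, which is why the narrow-pore layers control K(theta) as the medium drains.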
NASA Astrophysics Data System (ADS)
Adams, T.; Batra, P.; Bugel, L.; Camilleri, L.; Conrad, J. M.; de Gouvêa, A.; Fisher, P. H.; Formaggio, J. A.; Jenkins, J.; Karagiorgi, G.; Kobilarcik, T. R.; Kopp, S.; Kyle, G.; Loinaz, W. A.; Mason, D. A.; Milner, R.; Moore, R.; Morfín, J. G.; Nakamura, M.; Naples, D.; Nienaber, P.; Olness, F. I.; Owens, J. F.; Pate, S. F.; Pronin, A.; Seligman, W. G.; Shaevitz, M. H.; Schellman, H.; Schienbein, I.; Syphers, M. J.; Tait, T. M. P.; Takeuchi, T.; Tan, C. Y.; van de Water, R. G.; Yamamoto, R. K.; Yu, J. Y.
We extend the physics case for a new high-energy, ultra-high statistics neutrino scattering experiment, NuSOnG (Neutrino Scattering On Glass), to address a variety of issues including precision QCD measurements, extraction of structure functions, and the derived Parton Distribution Functions (PDFs). This experiment uses a Tevatron-based neutrino beam to obtain a sample of Deep Inelastic Scattering (DIS) events which is over two orders of magnitude larger than past samples. We outline an innovative method for fitting the structure functions using a parametrized energy shift which yields reduced systematic uncertainties. High statistics measurements, in combination with improved systematics, will enable NuSOnG to perform discerning tests of fundamental Standard Model parameters as we search for deviations which may hint at "Beyond the Standard Model" physics.
Statistical analysis and modeling of intermittent transport events in the tokamak scrape-off layer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Johan, E-mail: anderson.johan@gmail.com; Halpern, Federico D.; Ricci, Paolo
The turbulence observed in the scrape-off-layer of a tokamak is often characterized by intermittent events of bursty nature, a feature which raises concerns about the prediction of heat loads on the physical boundaries of the device. It appears thus necessary to delve into the statistical properties of turbulent physical fields such as density, electrostatic potential, and temperature, focusing on the mathematical expression of tails of the probability distribution functions. The method followed here is to generate statistical information from time-traces of the plasma density stemming from Braginskii-type fluid simulations and check this against a first-principles theoretical model. The analysis of the numerical simulations indicates that the probability distribution function of the intermittent process contains strong exponential tails, as predicted by the analytical theory.
A Framework to Learn Physics from Atomically Resolved Images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vlcek, L.; Maksov, A.; Pan, M.
Here, we present a generalized framework for extracting physics, i.e., knowledge, from atomically resolved images, and show its utility by applying it to a model system of segregation of chalcogen atoms in an FeSe0.45Te0.55 superconductor system. We emphasize that the framework can be used for any imaging data for which a generative physical model exists. Consider that a generative physical model can produce a very large number of configurations, not all of which are observable. By applying a microscope function to a subset of this generated data, we form a simulated dataset on which statistics can be computed.
Classification without labels: learning from mixed samples in high energy physics
NASA Astrophysics Data System (ADS)
Metodiev, Eric M.; Nachman, Benjamin; Thaler, Jesse
2017-10-01
Modern machine learning techniques can be used to construct powerful models for difficult collider physics problems. In many applications, however, these models are trained on imperfect simulations due to a lack of truth-level information in the data, which risks the model learning artifacts of the simulation. In this paper, we introduce the paradigm of classification without labels (CWoLa) in which a classifier is trained to distinguish statistical mixtures of classes, which are common in collider physics. Crucially, neither individual labels nor class proportions are required, yet we prove that the optimal classifier in the CWoLa paradigm is also the optimal classifier in the traditional fully-supervised case where all label information is available. After demonstrating the power of this method in an analytical toy example, we consider a realistic benchmark for collider physics: distinguishing quark- versus gluon-initiated jets using mixed quark/gluon training samples. More generally, CWoLa can be applied to any classification problem where labels or class proportions are unknown or simulations are unreliable, but statistical mixtures of the classes are available.
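A minimal numerical illustration of the CWoLa idea, as a toy with 1-D Gaussian "signal" and "background" classes and a hand-rolled logistic classifier — not the paper's quark/gluon benchmark: training on mixture labels alone still yields a classifier that separates the pure classes.

```python
import random
from math import exp

random.seed(7)

# Two 1-D Gaussian classes: "signal" (mean +1) and "background" (mean -1).
def draw(n, frac_sig):
    return [random.gauss(1.0, 1.0) if random.random() < frac_sig
            else random.gauss(-1.0, 1.0) for _ in range(n)]

# Two MIXED samples with different (unknown to the classifier) proportions.
mix1 = draw(2000, 0.8)   # mostly signal
mix2 = draw(2000, 0.2)   # mostly background
xs = mix1 + mix2
ys = [1.0] * len(mix1) + [0.0] * len(mix2)   # labels say WHICH MIXTURE only

# Plain full-batch gradient descent on the logistic loss.
w, b = 0.0, 0.0
for _ in range(300):
    gw = gb = 0.0
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + exp(-(w * x + b)))
        gw += (p - y) * x
        gb += (p - y)
    w -= 0.1 * gw / len(xs)
    b -= 0.1 * gb / len(xs)

# Evaluate on PURE samples the classifier never saw labels for.
sig = [random.gauss(1.0, 1.0) for _ in range(1000)]
bkg = [random.gauss(-1.0, 1.0) for _ in range(1000)]
acc = (sum(w * x + b > 0 for x in sig)
       + sum(w * x + b <= 0 for x in bkg)) / 2000.0
```

Because the mixture likelihood ratio is monotone in the class likelihood ratio, the mixture-trained classifier inherits the pure-class decision boundary, which is the core of the CWoLa proof.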
Trends and associated uncertainty in the global mean temperature record
NASA Astrophysics Data System (ADS)
Poppick, A. N.; Moyer, E. J.; Stein, M.
2016-12-01
Physical models suggest that the Earth's mean temperature warms in response to changing CO2 concentrations (and hence increased radiative forcing); given physical uncertainties in this relationship, the historical temperature record is a source of empirical information about global warming. A persistent thread in many analyses of the historical temperature record, however, is the reliance on methods that appear to deemphasize both physical and statistical assumptions. Examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for natural variability in nonparametric rather than parametric ways. We show here that methods that deemphasize assumptions can limit the scope of analysis and can lead to misleading inferences, particularly in the setting considered where the data record is relatively short and the scale of temporal correlation is relatively long. A proposed model that is simple but physically informed provides a more reliable estimate of trends and allows a broader array of questions to be addressed. In accounting for uncertainty, we also illustrate how parametric statistical models that are attuned to the important characteristics of natural variability can be more reliable than ostensibly more flexible approaches.
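A hedged toy of the physically informed regression the abstract advocates, with temperature regressed on radiative forcing rather than on time and natural variability treated parametrically as AR(1), might look like this; every coefficient and noise level below is invented.

```python
import numpy as np

# Synthetic illustration: regress temperature on forcing (the physically
# meaningful covariate), with AR(1) natural variability. The AR(1) structure
# also shrinks the effective sample size used for uncertainty estimates.
rng = np.random.default_rng(0)

n = 150                                       # years
forcing = np.log(np.linspace(1.0, 1.5, n))    # ~ log CO2 ratio (illustrative)
beta_true = 3.0                               # K per unit log-forcing (made up)

# AR(1) noise: e_t = phi * e_{t-1} + w_t
phi, e = 0.6, np.zeros(n)
w = rng.normal(0.0, 0.1, n)
for t in range(1, n):
    e[t] = phi * e[t - 1] + w[t]

temp = beta_true * forcing + e

# OLS slope of temperature on forcing
X = np.column_stack([np.ones(n), forcing])
beta_hat = np.linalg.lstsq(X, temp, rcond=None)[0][1]

# Under AR(1) noise the effective sample size shrinks by (1-phi)/(1+phi),
# inflating the naive OLS variance of the trend estimate.
neff = n * (1 - phi) / (1 + phi)
print(beta_hat, neff)
```

The recovered slope scatters around the true value, while the effective sample size makes explicit how temporal correlation widens the uncertainty that a purely nonparametric treatment would leave implicit.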
NASA Astrophysics Data System (ADS)
Usowicz, Boguslaw; Marczewski, Wojciech; Usowicz, Jerzy B.; Łukowski, Mateusz; Lipiec, Jerzy; Stankiewicz, Krystyna
2013-04-01
Radiometric observations with SMOS rely on the Radiation Transfer Equations (RTE), which determine the Brightness Temperature (BT) in the two linear polarization components (H, V) satisfying the Fresnel principle of propagation in horizontally layered target media on the ground. The RTE involve variables that tie the equations, expressed in electromagnetic (EM) terms of the intensity BT, to the physical reality expressed by non-EM variables (Soil Moisture (SM), vegetation indexes, fractional coverage with many different properties, and boundary conditions such as optical thickness, layer definitions, and roughness), bridging the EM domain to other physical aspects by means of the so-called tau-omega method. This method makes it possible to join a variety of valuable models, including specific empirical estimates of physical properties in relation to the volumetric water content. The RTE are in fact expressed through the propagation, reflection, and losses or attenuation along a considered propagation path. Electromagnetic propagation is captured in the propagation constant, and for target media on the ground the dielectric constant is the decisive factor for propagation effects. Therefore, despite the many physical parameters involved, one must rely predominantly on the dielectric constant treated as a complex quantity: its real part represents the apparent shortening of the propagation path and the refraction, while its imaginary part accounts for the attenuation or losses. This work applies statistical-physical modeling of soil properties, treating the medium as a mixture of solid grains and of gas or liquid filling the pores and the contact bridges between components, handled statistically.
This modeling method provides an opportunity to characterize porosity by general statistical means and is applicable to various physical properties (thermal and electrical conductivity, dielectric properties) that depend on the composition of the components. The method was developed independently of the SMOS method; the two meet precisely in the RTE, at the dielectric constant. The dielectric constant is observed or retrieved by SMOS regardless of other properties such as soil porosity and without a direct relation to the thermal properties of soils. The relations between the thermal properties of soil and the water content, however, are very consistent. We therefore introduce the effects of soil porosity and of soil thermal properties into the complex representation of the dielectric constant, gaining new abilities for capturing porosity effects with the SMOS observation method. At present we can show a few relations between thermal properties and soil moisture content, using examples from the Biebrza and Polesie wetlands in Poland, and we search for correlations between SM from SMOS and the moisture content known from the ground. The correlations are poor for SMOS L2 data processed with the retrieval version using the Dobson model (501), but we expect better correlation for the version using the Mironov model (551). If this supposition is confirmed, it will encourage the use of statistical-physical modeling of the dielectric constant and thermal properties in the RTE and the tau-omega method. Treating soil porosity as a research target in itself is not as strongly motivated as using its effects on SM observable with SMOS.
NASA Astrophysics Data System (ADS)
Beach, Shaun E.; Semkow, Thomas M.; Remling, David J.; Bradt, Clayton J.
2017-07-01
We have developed accessible methods to demonstrate fundamental statistics in several phenomena, in the context of teaching electronic signal processing in a physics-based college-level curriculum. A relationship between the exponential time-interval distribution and Poisson counting distribution for a Markov process with constant rate is derived in a novel way and demonstrated using nuclear counting. Negative binomial statistics is demonstrated as a model for overdispersion and justified by the effect of electronic noise in nuclear counting. The statistics of digital packets on a computer network are shown to be compatible with the fractal-point stochastic process leading to a power-law as well as generalized inverse Gaussian density distributions of time intervals between packets.
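The exponential-interval/Poisson-count relationship demonstrated in the article can be reproduced in a few lines: drawing exponential waiting times at a constant rate and counting events in fixed windows yields counts whose mean and variance both equal the rate. The rate and observation time below are arbitrary.

```python
import numpy as np

# For a constant-rate Markov (Poisson) process, inter-event times are
# exponential and counts in a fixed window are Poisson with matching mean.
rng = np.random.default_rng(1)

rate = 5.0                 # events per second (arbitrary)
T = 2000.0                 # total observation time in seconds

# Draw exponential waiting times and accumulate event times.
dts = rng.exponential(1.0 / rate, size=int(rate * T * 1.2))
times = np.cumsum(dts)
times = times[times < T]

# Count events in 1-second windows.
counts = np.histogram(times, bins=int(T), range=(0.0, T))[0]

mean_c, var_c = counts.mean(), counts.var()
print(mean_c, var_c)   # both should be close to `rate` (Poisson: mean == variance)
```

A negative-binomial count distribution, as used in the article to model overdispersion from electronic noise, would instead show a variance exceeding the mean.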
NASA Astrophysics Data System (ADS)
Haven, Emmanuel; Khrennikov, Andrei
2013-01-01
Preface; Part I. Physics Concepts in Social Science? A Discussion: 1. Classical, statistical and quantum mechanics: all in one; 2. Econophysics: statistical physics and social science; 3. Quantum social science: a non-mathematical motivation; Part II. Mathematics and Physics Preliminaries: 4. Vector calculus and other mathematical preliminaries; 5. Basic elements of quantum mechanics; 6. Basic elements of Bohmian mechanics; Part III. Quantum Probabilistic Effects in Psychology: Basic Questions and Answers: 7. A brief overview; 8. Interference effects in psychology - an introduction; 9. A quantum-like model of decision making; Part IV. Other Quantum Probabilistic Effects in Economics, Finance and Brain Sciences: 10. Financial/economic theory in crisis; 11. Bohmian mechanics in finance and economics; 12. The Bohm-Vigier Model and path simulation; 13. Other applications to economic/financial theory; 14. The neurophysiological sources of quantum-like processing in the brain; Conclusion; Glossary; Index.
Physical concepts in the development of constitutive equations
NASA Technical Reports Server (NTRS)
Cassenti, B. N.
1985-01-01
Proposed viscoplastic material models include observed material response in their formulation but do not generally incorporate principles from thermodynamics, statistical mechanics, and quantum mechanics. Numerous hypotheses about material response have been made based on first principles, and many of these hypotheses were tested experimentally. The proposed viscoplastic theories must be checked against these hypotheses and their experimental basis. The physics of thermodynamics, statistical mechanics and quantum mechanics, and the effects of defects, are reviewed for their application to the development of constitutive laws.
The epistemological status of general circulation models
NASA Astrophysics Data System (ADS)
Loehle, Craig
2018-03-01
Forecasts of both likely anthropogenic effects on climate and consequent effects on nature and society are based on large, complex software tools called general circulation models (GCMs). Forecasts generated by GCMs have been used extensively in policy decisions related to climate change. However, the relation between underlying physical theories and results produced by GCMs is unclear. In the case of GCMs, many discretizations and approximations are made, and simulating Earth system processes is far from simple and currently leads to some results with unknown energy balance implications. Statistical testing of GCM forecasts for degree of agreement with data would facilitate assessment of fitness for use. If model results need to be put on an anomaly basis due to model bias, then both visual and quantitative measures of model fit depend strongly on the reference period used for normalization, making testing problematic. Epistemology is here applied to problems of statistical inference during testing, the relationship between the underlying physics and the models, the epistemic meaning of ensemble statistics, problems of spatial and temporal scale, the existence or not of an unforced null for climate fluctuations, the meaning of existing uncertainty estimates, and other issues. Rigorous reasoning entails carefully quantifying levels of uncertainty.
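The anomaly-baseline problem noted above can be made concrete with a small numerical experiment (all series invented): when a model with a state-dependent bias is re-baselined to anomalies, the apparent model-observation agreement depends on which reference period is used.

```python
import numpy as np

# Invented observation and model series: the "model" runs warm only after
# 1960. Re-baselining to anomalies over different reference periods changes
# the measured misfit even though the underlying data are fixed.
rng = np.random.default_rng(7)

years = np.arange(1900, 2001)
obs = 0.01 * (years - 1900) + rng.normal(0.0, 0.05, years.size)
model = (obs + np.where(years >= 1960, 0.4, 0.0)
         + rng.normal(0.0, 0.05, years.size))

def rmse_anom(ref_lo, ref_hi):
    """RMSE after re-baselining both series to a reference period."""
    ref = (years >= ref_lo) & (years <= ref_hi)
    a_obs = obs - obs[ref].mean()
    a_mod = model - model[ref].mean()
    return float(np.sqrt(np.mean((a_obs - a_mod) ** 2)))

early, late = rmse_anom(1900, 1930), rmse_anom(1970, 2000)
print(early, late)   # same data, different apparent fit
```

An early reference period hides the post-1960 bias in the early years; a late one pushes the mismatch into the earlier record, so any statistical test of fit inherits the baseline choice.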
Central Limit Theorem for Exponentially Quasi-local Statistics of Spin Models on Cayley Graphs
NASA Astrophysics Data System (ADS)
Reddy, Tulasi Ram; Vadlamani, Sreekar; Yogeshwaran, D.
2018-04-01
Central limit theorems for linear statistics of lattice random fields (including spin models) are usually proven under suitable mixing conditions or quasi-associativity. Many interesting examples of spin models do not satisfy mixing conditions, and on the other hand, it does not seem easy to show a central limit theorem for local statistics via quasi-associativity. In this work, we prove general central limit theorems for local statistics and exponentially quasi-local statistics of spin models on discrete Cayley graphs with polynomial growth. Further, we supplement these results by proving similar central limit theorems for random fields on discrete Cayley graphs taking values in a countable space, but under the stronger assumptions of α-mixing (for local statistics) and exponential α-mixing (for exponentially quasi-local statistics). All our central limit theorems assume a suitable variance lower bound, as do many others in the literature. We illustrate our general central limit theorem with specific examples of lattice spin models and statistics arising in computational topology, statistical physics and random networks. Examples of clustering spin models include quasi-associated spin models with fast decaying covariances like the off-critical Ising model, level sets of Gaussian random fields with fast decaying covariances like the massive Gaussian free field, and determinantal point processes with fast decaying kernels. Examples of local statistics include intrinsic volumes, face counts, and component counts of random cubical complexes, while exponentially quasi-local statistics include nearest neighbour distances in spin models and Betti numbers of sub-critical random cubical complexes.
NASA Astrophysics Data System (ADS)
Dufoyer, A.; Lecoq, N.; Massei, N.; Marechal, J. C.
2017-12-01
Physics-based modeling of karst systems remains almost impossible without sufficiently accurate information about their inner physical characteristics. Usually, the only available hydrodynamic information is the flow rate at the karst outlet. Numerous works over the past decades have used, and proven the usefulness of, time-series analysis and spectral techniques applied to spring flow, precipitation, or even physico-chemical parameters for interpreting karst hydrological functioning. However, identifying or interpreting the physical features of karst systems that control the statistical or spectral characteristics of spring flow variations remains challenging, not to say sometimes controversial. The main objective of this work is to determine how the statistical and spectral characteristics of the hydrodynamic signal at karst springs relate to inner physical and hydraulic properties. To address this issue, we adopt an empirical approach based on both distributed and physics-based models and on synthetic system responses. The first step of the research is a sensitivity analysis of time-series/spectral methods with respect to karst hydraulic and physical properties. For this purpose, we carry out forward modeling of flow through several simple, constrained, synthetic cases in response to precipitation. This allows us to quantify how sensitive the statistical and spectral characteristics of flow at the outlet are to changes (i) in conduit geometry and (ii) in the hydraulic parameters of the system (matrix/conduit exchange rate, matrix hydraulic conductivity, and storativity). The flow differential equations are solved with MARTHE, a computer code developed by the BRGM that supports the modeling of karst conduits. From signal processing of the simulated spring responses, using Fourier series and multi-resolution analysis, we hope to determine whether specific frequencies are systematically modified.
We also hope to quantify, through auto-correlation analysis, which parameters matter most: first results seem to show larger variations due to conduit conductivity than due to the matrix/conduit exchange rate. Future steps will use another computer code, based on a double-continuum approach that allows turbulent conduit flow, and will model a natural system.
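The signal-processing step described above can be sketched on synthetic data: two "spring discharge" series are produced by convolving the same rainfall input with fast- and slow-draining linear reservoirs (a crude stand-in for different conduit conductivities), and their autocorrelation decay is compared. All parameters are illustrative, not from the MARTHE simulations.

```python
import numpy as np

# Two synthetic discharge series from the same intermittent rainfall, with
# different recession constants k; autocorrelation distinguishes them.
rng = np.random.default_rng(2)

n = 5000
rain = rng.exponential(1.0, n) * (rng.random(n) < 0.1)   # intermittent rainfall

def discharge(rain, k):
    """Linear-reservoir response: unit hydrograph h[t] ~ exp(-k t)."""
    h = np.exp(-k * np.arange(200))
    return np.convolve(rain, h)[: len(rain)]

def acf(x, lag):
    """Sample autocorrelation at a single lag."""
    x = x - x.mean()
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

q_fast = discharge(rain, k=0.5)    # high conduit conductivity: quick recession
q_slow = discharge(rain, k=0.05)   # low conductivity: long memory

print(acf(q_fast, 10), acf(q_slow, 10))  # slow system decorrelates more slowly
```

For white rainfall input the lag-L autocorrelation of such a reservoir is approximately exp(-kL), so the recession constant is directly readable from the correlogram, which is the kind of inverse reasoning the study aims to calibrate.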
ERIC Educational Resources Information Center
Harris, Ronald M.
1978-01-01
Presents material dealing with an application of statistical thermodynamics to the diatomic solid I2(s). The objective is to enhance the student's appreciation of the power of the statistical formulation of thermodynamics. The Simple Einstein Model is used. (Author/MA)
NASA Technical Reports Server (NTRS)
Bremner, Paul G.; Vazquez, Gabriel; Christiano, Daniel J.; Trout, Dawn H.
2016-01-01
Prediction of the maximum expected electromagnetic pick-up of conductors inside a realistic shielding enclosure is an important canonical problem for system-level EMC design of spacecraft, launch vehicles, aircraft and automobiles. This paper introduces a simple statistical power balance model for prediction of the maximum expected current in a wire conductor inside an aperture enclosure. It calculates both the statistical mean and variance of the immission from the physical design parameters of the problem. Familiar probability density functions can then be used to predict the maximum expected immission for design purposes. The statistical power balance model requires minimal EMC design information and solves orders of magnitude faster than existing numerical models, making it ultimately viable for scaled-up, full system-level modeling. Both experimental test results and full-wave simulation results are used to validate the foundational model.
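The last step described above, turning a power-balance mean square into a design maximum via a familiar probability density, can be sketched as follows. The Rayleigh law is one familiar choice for field or current magnitudes in overmoded enclosures; the mean-square value below is an invented placeholder, not from the paper.

```python
import numpy as np

# From a power-balance mean-square immission to a percentile-based
# "maximum expected" value, assuming a Rayleigh magnitude distribution.
mean_sq = 4.0e-6                      # <|I|^2> in A^2 (invented placeholder)
sigma = np.sqrt(mean_sq / 2.0)        # Rayleigh scale: E[X^2] = 2 sigma^2

# 99th-percentile design maximum: F(x) = 1 - exp(-x^2 / (2 sigma^2))
p = 0.99
i_max = sigma * np.sqrt(-2.0 * np.log(1.0 - p))
print(i_max)
```

The percentile level p is a design choice; higher assurance levels push the predicted maximum further into the tail of the assumed density.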
NASA Astrophysics Data System (ADS)
Geil, Paul M.; Mutch, Simon J.; Poole, Gregory B.; Angel, Paul W.; Duffy, Alan R.; Mesinger, Andrei; Wyithe, J. Stuart B.
2016-10-01
We use the Dark-ages, Reionization And Galaxy formation Observables from Numerical Simulations (DRAGONS) framework to investigate the effect of galaxy formation physics on the morphology and statistics of ionized hydrogen (H II) regions during the Epoch of Reionization (EoR). DRAGONS self-consistently couples a semi-analytic galaxy formation model with the inhomogeneous ionizing UV background, and can therefore be used to study the dependence of morphology and statistics of reionization on feedback phenomena of the ionizing source galaxy population. Changes in galaxy formation physics modify the sizes of H II regions and the amplitude and shape of 21-cm power spectra. Of the galaxy physics investigated, we find that supernova feedback plays the most important role in reionization, with H II regions up to ≈20 per cent smaller and a fractional difference in the amplitude of power spectra of up to ≈17 per cent at fixed ionized fraction in the absence of this feedback. We compare our galaxy formation-based reionization models with past calculations that assume constant stellar-to-halo mass ratios and find that with the correct choice of minimum halo mass, such models can mimic the predicted reionization morphology. Reionization morphology at fixed neutral fraction is therefore not uniquely determined by the details of galaxy formation, but is sensitive to the mass of the haloes hosting the bulk of the ionizing sources. Simple EoR parametrizations are therefore accurate predictors of reionization statistics. However, a complete understanding of reionization using future 21-cm observations will require interpretation with realistic galaxy formation models, in combination with other observations.
A Physics-Inspired Mechanistic Model of Migratory Movement Patterns in Birds.
Revell, Christopher; Somveille, Marius
2017-08-29
In this paper, we introduce a mechanistic model of migratory movement patterns in birds, inspired by ideas and methods from physics. Previous studies have shed light on the factors influencing bird migration but have mainly relied on statistical correlative analysis of tracking data. Our novel method offers a bottom up explanation of population-level migratory movement patterns. It differs from previous mechanistic models of animal migration and enables predictions of pathways and destinations from a given starting location. We define an environmental potential landscape from environmental data and simulate bird movement within this landscape based on simple decision rules drawn from statistical mechanics. We explore the capacity of the model by qualitatively comparing simulation results to the non-breeding migration patterns of a seabird species, the Black-browed Albatross (Thalassarche melanophris). This minimal, two-parameter model was able to capture remarkably well the previously documented migration patterns of the Black-browed Albatross, with the best combination of parameter values conserved across multiple geographically separate populations. Our physics-inspired mechanistic model could be applied to other bird and highly-mobile species, improving our understanding of the relative importance of various factors driving migration and making predictions that could be useful for conservation.
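The two ingredients described above, an environmental "potential" landscape and statistical-mechanics decision rules, can be illustrated with a minimal sketch (not the authors' code): an agent on a 1D lattice proposes random steps and accepts them with a Metropolis probability, settling near the potential minimum. The landscape and temperature are invented.

```python
import numpy as np

# Metropolis-style movement on an invented 1D potential: downhill moves are
# always accepted, uphill moves with probability exp(-dV / T).
rng = np.random.default_rng(3)

cells = np.arange(100)
V = (cells - 70) ** 2 / 200.0      # potential with a minimum at cell 70
T = 1.0                            # "temperature": exploration vs exploitation

pos = 5
for _ in range(20000):
    step = rng.choice([-1, 1])
    new = min(max(pos + step, 0), len(cells) - 1)
    dV = V[new] - V[pos]
    if dV <= 0 or rng.random() < np.exp(-dV / T):
        pos = new

print(pos)   # near the minimum at cell 70
```

At stationarity the position distribution is proportional to exp(-V/T), so the temperature parameter tunes how tightly simulated individuals track environmental suitability, which is the role the two model parameters play in the paper's framework.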
Graphene growth process modeling: a physical-statistical approach
NASA Astrophysics Data System (ADS)
Wu, Jian; Huang, Qiang
2014-09-01
As a zero-bandgap semiconductor, graphene is an attractive material for a wide variety of applications such as optoelectronics. Among the various techniques developed for graphene synthesis, chemical vapor deposition on copper foils shows high potential for producing few-layer and large-area graphene. Since fabrication of high-quality graphene sheets requires an understanding of growth mechanisms, together with methods to characterize and control the grain size of graphene flakes, analytical modeling of the graphene growth process is essential for controlled fabrication. The growth process starts with randomly nucleated islands that gradually develop into complex shapes, grow in size, and eventually connect together to cover the copper foil. To model this complex process, we develop a physical-statistical approach under the assumption of self-similarity during graphene growth. The growth kinetics is uncovered by separating island shapes from the area growth rate. We propose to characterize the area growth velocity using a confined exponential model, which not only has a clear physical explanation but also fits the real data well. For the shape modeling, we develop a parametric shape model which can be well explained by the angular-dependent growth rate. This work can provide useful information for the control and optimization of the graphene growth process on Cu foil.
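A common reading of the "confined exponential" area-growth idea above is island area approaching a saturation value exponentially, A(t) = A_max (1 - exp(-t/tau)). The sketch below fits that form to noisy synthetic data; A_max, tau, and the noise level are invented for illustration, and the exact functional form used by the authors may differ.

```python
import numpy as np

# Fit a confined-exponential growth curve to synthetic island-area data
# using a crude grid search (avoids external optimizers).
rng = np.random.default_rng(4)

def area(t, A_max, tau):
    return A_max * (1.0 - np.exp(-t / tau))

t = np.linspace(0.0, 50.0, 100)
A_obs = area(t, A_max=400.0, tau=10.0) + rng.normal(0.0, 2.0, t.size)

best = min(
    ((Am, tau) for Am in np.linspace(300, 500, 201)
               for tau in np.linspace(5, 15, 101)),
    key=lambda p: np.sum((A_obs - area(t, *p)) ** 2),
)
print(best)   # close to the generating values (400, 10)
```

Separating this area-growth law from the island shape model, as the abstract describes, keeps the kinetics identifiable even when island outlines are complex.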
Statistical physics of human cooperation
NASA Astrophysics Data System (ADS)
Perc, Matjaž; Jordan, Jillian J.; Rand, David G.; Wang, Zhen; Boccaletti, Stefano; Szolnoki, Attila
2017-05-01
Extensive cooperation among unrelated individuals is unique to humans, who often sacrifice personal benefits for the common good and work together to achieve what they are unable to execute alone. The evolutionary success of our species is indeed due, to a large degree, to our unparalleled other-regarding abilities. Yet, a comprehensive understanding of human cooperation remains a formidable challenge. Recent research in the social sciences indicates that it is important to focus on the collective behavior that emerges as the result of the interactions among individuals, groups, and even societies. Non-equilibrium statistical physics, in particular Monte Carlo methods and the theory of collective behavior of interacting particles near phase transition points, has proven to be very valuable for understanding counterintuitive evolutionary outcomes. By treating models of human cooperation as classical spin models, a physicist can draw on familiar settings from statistical physics. However, unlike pairwise interactions among particles that typically govern solid-state physics systems, interactions among humans often involve group interactions, and they also involve a larger number of possible states even for the most simplified description of reality. The complexity of solutions therefore often surpasses that observed in physical systems. Here we review experimental and theoretical research that advances our understanding of human cooperation, focusing on spatial pattern formation, on the spatiotemporal dynamics of observed solutions, and on self-organization that may either promote or hinder socially favorable states.
GPU-computing in econophysics and statistical physics
NASA Astrophysics Data System (ADS)
Preis, T.
2011-03-01
A recent trend in computer science and related fields is general purpose computing on graphics processing units (GPUs), which can yield impressive performance. With multiple cores connected by high memory bandwidth, today's GPUs offer resources for non-graphics parallel processing. This article provides a brief introduction into the field of GPU computing and includes examples. In particular computationally expensive analyses employed in financial market context are coded on a graphics card architecture which leads to a significant reduction of computing time. In order to demonstrate the wide range of possible applications, a standard model in statistical physics - the Ising model - is ported to a graphics card architecture as well, resulting in large speedup values.
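The standard model ported to the GPU in the article, the Ising model, has a compact CPU reference implementation; the NumPy sketch below only defines the physics (checkerboard Metropolis updates), not the GPU port. Lattice size, temperature, and sweep count are illustrative.

```python
import numpy as np

# 2D Ising model with checkerboard Metropolis updates. At T = 1.5, well
# below T_c ~ 2.269, the lattice orders and the energy per spin approaches
# the ground-state value of -2.
rng = np.random.default_rng(5)

L = 32
T = 1.5
spins = rng.choice([-1, 1], size=(L, L))

mask = (np.indices((L, L)).sum(axis=0) % 2).astype(bool)   # checkerboard

for sweep in range(400):
    for sub in (mask, ~mask):
        nb = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0)
              + np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
        dE = 2.0 * spins * nb                     # energy cost of flipping
        flip = sub & (rng.random((L, L)) < np.exp(-dE / T))
        spins = np.where(flip, -spins, spins)

# Energy per spin, each bond counted once (ground state: -2).
e = -(spins * (np.roll(spins, 1, 0) + np.roll(spins, 1, 1))).mean()
print(e)
```

The checkerboard split is exactly what makes the model GPU-friendly: all sites of one sublattice can be updated in parallel because their neighbors all lie on the other sublattice.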
Airborne Wireless Communication Modeling and Analysis with MATLAB
2014-03-27
… research develops a physical layer model that combines antenna modeling using computational electromagnetics and the two-ray propagation model to predict the received signal strength. The antenna is modeled with triangular patches and analyzed by extending the antenna modeling algorithm by Sergey …
NASA Astrophysics Data System (ADS)
Gimenez, M. Cecilia; Paz García, Ana Pamela; Burgos Paci, Maxi A.; Reinaudi, Luis
2016-04-01
The evolution of public opinion using tools and concepts borrowed from statistical physics is an emerging area within the field of sociophysics. In the present paper, a statistical physics model was developed to study the evolution of the ideological self-positioning of an ensemble of agents. The model consists of an array of L components, each of which represents the ideology of an agent. The proposed mechanism is based on the "voter model", in which one agent can adopt the opinion of another if the difference between their opinions lies within a certain range. The existence of "undecided" agents (i.e., agents with no definite opinion) was implemented in the model, as was the possibility of radicalization of an agent's opinion upon interaction with another. The results of our simulations are compared to statistical data taken from the Latinobarómetro databank for the cases of Argentina, Chile, Brazil and Uruguay in the last decade. Among other results, the effect of taking the undecided agents into account is the formation of a single peak at the middle of the ideological spectrum (corresponding to a centrist ideological position), in agreement with the real cases studied.
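The core bounded-confidence adoption rule described above can be re-implemented in a few lines (this is an illustrative sketch, not the authors' code, and it omits the undecided agents and the radicalization mechanism): an agent copies a randomly chosen partner's position only when their opinions differ by at most a confidence bound. The scale, bound, and ensemble size are invented.

```python
import numpy as np

# Bounded-confidence voter dynamics on a 1-10 ideological scale. Opinions
# can only be copied, so the set of surviving opinions can only shrink.
rng = np.random.default_rng(6)

N, bound, steps = 500, 3, 200_000
opinion = rng.integers(1, 11, size=N)        # 1 = left ... 10 = right
init = opinion.copy()

pairs = rng.integers(0, N, size=(steps, 2))
for i, j in pairs:
    if abs(int(opinion[i]) - int(opinion[j])) <= bound:
        opinion[i] = opinion[j]              # voter-model adoption rule

print(np.unique(init).size, np.unique(opinion).size)  # opinions can only die out
```

Adding undecided agents who can adopt any opinion is what produces the centrist peak reported in the abstract; this minimal version only exhibits the clustering/consensus tendency of the bare rule.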
Advanced Machine Learning Emulators of Radiative Transfer Models
NASA Astrophysics Data System (ADS)
Camps-Valls, G.; Verrelst, J.; Martino, L.; Vicent, J.
2017-12-01
Physically-based model inversion methodologies are based on physical laws and established cause-effect relationships. A plethora of remote sensing applications rely on the physical inversion of a Radiative Transfer Model (RTM), which leads to physically meaningful bio-geo-physical parameter estimates. The process is, however, computationally expensive and needs expert knowledge for the selection of the RTM, its parametrization, the look-up table generation, and its inversion. Mimicking complex codes with statistical nonlinear machine learning algorithms has recently become the natural alternative. Emulators are statistical constructs able to approximate the RTM at a fraction of the computational cost, while providing an estimation of uncertainty and estimations of the gradient or finite integral forms. We review the field and recent advances in the emulation of RTMs with machine learning models. We posit Gaussian processes (GPs) as the proper framework to tackle the problem. Furthermore, we introduce an automatic methodology to construct emulators for costly RTMs. The Automatic Gaussian Process Emulator (AGAPE) methodology combines the interpolation capabilities of GPs with the accurate design of an acquisition function that favours sampling in low-density regions and flatness of the interpolation function. We illustrate the capabilities of our emulators in toy examples, the leaf- and canopy-level PROSPECT and PROSAIL RTMs, and the construction of an optimal look-up table for atmospheric correction based on MODTRAN5.
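The emulation idea can be illustrated with a toy in the spirit described above: a Gaussian-process interpolator is trained on a handful of runs of an "expensive" model (here a cheap stand-in function, since a real RTM such as MODTRAN is not available in a snippet) and then predicts unseen inputs. The RBF kernel and its length-scale are fixed by hand rather than optimized as in AGAPE.

```python
import numpy as np

# Minimal GP-mean emulator: train on 15 "expensive" runs, predict 200
# new inputs at negligible cost.

def rtm(x):
    """Stand-in for an expensive radiative transfer model run."""
    return np.sin(3.0 * x) + 0.5 * x

def rbf(a, b, ell=0.3):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

X = np.linspace(0.0, 2.0, 15)          # 15 "expensive" model runs
y = rtm(X)

K = rbf(X, X) + 1e-8 * np.eye(X.size)  # jitter for numerical stability
alpha = np.linalg.solve(K, y)

Xs = np.linspace(0.0, 2.0, 200)        # cheap emulator predictions
y_hat = rbf(Xs, X) @ alpha

err = np.max(np.abs(y_hat - rtm(Xs)))
print(err)   # small interpolation error at a tiny fraction of the cost
```

A full GP treatment would also return a predictive variance from the same kernel matrices, which is exactly the uncertainty estimate and acquisition-function input that AGAPE exploits.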
Linhart, S. Mike; Nania, Jon F.; Christiansen, Daniel E.; Hutchinson, Kasey J.; Sanders, Curtis L.; Archfield, Stacey A.
2013-01-01
A variety of individuals, from water resource managers to recreational users, need streamflow information for planning and decisionmaking at locations where there are no streamgages. To address this problem, two statistically based methods, the Flow Duration Curve Transfer method and the Flow Anywhere method, were developed for statewide application, whereas two physically based models, the Precipitation-Runoff Modeling System and the Soil and Water Assessment Tool, were developed only for application in the Cedar River Basin. Observed and estimated streamflows from the two methods and two models were compared for goodness of fit at 13 streamgages modeled in the Cedar River Basin by using the Nash-Sutcliffe and percent-bias efficiency values. Based on median and mean Nash-Sutcliffe values for the 13 streamgages, the Precipitation-Runoff Modeling System and Soil and Water Assessment Tool models appear to have performed similarly and better than the Flow Duration Curve Transfer and Flow Anywhere methods. Based on median and mean percent-bias values, the Soil and Water Assessment Tool model appears to have generally overestimated daily mean streamflows, whereas the Precipitation-Runoff Modeling System model and the statistical methods appear to have underestimated them. The Flow Duration Curve Transfer method produced the lowest median and mean percent-bias values and appears to perform better than the other models.
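The two goodness-of-fit measures named above can be computed directly; the streamflow values below are invented, not from the Cedar River study.

```python
import numpy as np

# Nash-Sutcliffe efficiency (1 = perfect, 0 = no better than the mean)
# and percent bias (positive = simulated flows overestimate observed).

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def percent_bias(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(sim - obs) / np.sum(obs)

obs = [10.0, 12.0, 8.0, 15.0, 9.0]   # invented daily mean streamflows
sim = [11.0, 11.0, 9.0, 14.0, 10.0]

nse = nash_sutcliffe(obs, sim)
pb = percent_bias(obs, sim)
print(nse, pb)
```

Sign conventions for percent bias vary between studies; the form above makes overestimation positive, consistent with the abstract's phrasing.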
Dunlay, Shannon M.; Gheorghiade, Mihai; Reid, Kimberly J.; Allen, Larry A.; Chan, Paul S.; Hauptman, Paul J.; Zannad, Faiez; Maggioni, Aldo P.; Swedberg, Karl; Konstam, Marvin A.; Spertus, John A.
2010-01-01
Aims Hospitalized heart failure (HF) patients are at high risk for death and readmission. We examined the incremental value of data obtained 1 week after HF hospital discharge in predicting mortality and readmission. Methods and results In the Efficacy of Vasopressin Antagonism in Heart Failure Outcome Study with tolvaptan, 1528 hospitalized patients (ejection fraction ≤40%) with a physical examination, laboratories, and health status [Kansas City Cardiomyopathy Questionnaire (KCCQ)] assessments 1 week after discharge were included. The ability to predict 1 year cardiovascular rehospitalization and mortality was assessed with Cox models, c-statistics, and the integrated discrimination improvement (IDI). Not using a beta-blocker, rales, pedal oedema, hyponatraemia, lower creatinine clearance, higher brain natriuretic peptide, and worse health status were independent risk factors for rehospitalization and death. The c-statistic for the base model (history and medications) was 0.657. The model improved with physical examination, laboratory, and KCCQ results, with IDI increases of 4.9, 7.0, and 3.2%, respectively (P < 0.001 each). The combination of all three offered the greatest incremental gain (c-statistic 0.749; IDI increase 10.8%). Conclusion Physical examination, laboratories, and KCCQ assessed 1 week after discharge offer important prognostic information, suggesting that all are critical components of outpatient evaluation after HF hospitalization. PMID:20197265
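The c-statistic reported above is, for a binary outcome, the probability that a randomly chosen patient who had the event received a higher predicted risk than a randomly chosen patient who did not. A small from-scratch computation on invented risk scores (not study data):

```python
import numpy as np

# Concordance (c-statistic) for binary outcomes: fraction of case/control
# pairs ranked correctly by the risk score, with ties counted as 1/2.

def c_statistic(risk, event):
    risk, event = np.asarray(risk, float), np.asarray(event, bool)
    cases, controls = risk[event], risk[~event]
    diff = cases[:, None] - controls[None, :]
    return float((np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size)

risk  = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]   # invented predicted risks
event = [1,   1,   0,   1,   0,   0,   1,   0]      # invented outcomes

print(c_statistic(risk, event))   # 0.75
```

The study's reported gain from 0.657 to 0.749 corresponds to the extra case/control pairs ranked correctly once examination, laboratory, and KCCQ data are added to the model.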
NASA Astrophysics Data System (ADS)
Sellaoui, Lotfi; Mechi, Nesrine; Lima, Éder Cláudio; Dotto, Guilherme Luiz; Ben Lamine, Abdelmottaleb
2017-10-01
Based on statistical physics elements, the equilibrium adsorption of diclofenac (DFC) and nimesulide (NM) on activated carbon was analyzed by a multilayer model with saturation. The paper aimed to describe the adsorption process experimentally and theoretically and to study the effect of adsorbate size using the model parameters. From numerical simulation, the number of molecules per site showed that the adsorbate molecules (DFC and NM) were mostly anchored on both sides of the pore walls. The increase in receptor-site density suggested that additional sites appeared during the process to participate in DFC and NM adsorption. The behavior of the adsorption energy indicated that the process was physisorption. Finally, correlation of the model parameters showed that the adsorbate size, i.e., the molecule dimension, has a negligible effect on DFC and NM adsorption.
May, Philip A; Tabachnick, Barbara G; Gossage, J Phillip; Kalberg, Wendy O; Marais, Anna-Susan; Robinson, Luther K; Manning, Melanie A; Blankenship, Jason; Buckley, David; Hoyme, H Eugene; Adnams, Colleen M
2013-06-01
To provide an analysis of multiple predictors of cognitive and behavioral traits for children with fetal alcohol spectrum disorders (FASDs), multivariate correlation techniques were used with maternal and child data from epidemiologic studies in a community in South Africa. Data on 561 first-grade children with fetal alcohol syndrome (FAS), partial FAS (PFAS), or without FASD, and their mothers, were analyzed by grouping 19 maternal variables into categories (physical, demographic, childbearing, and drinking) and used in structural equation models (SEMs) to assess correlates of child intelligence (verbal and nonverbal) and behavior. A first SEM using only 7 maternal alcohol use variables to predict cognitive/behavioral traits was statistically significant (B = 3.10, p < .05) but explained only 17.3% of the variance. The second model incorporated multiple maternal variables and was statistically significant, explaining 55.3% of the variance. Significantly correlated with low intelligence and problem behavior were demographic composites (B = 3.83, p < .05; low maternal education, low socioeconomic status [SES], and rural residence) and maternal physical characteristics (B = 2.70, p < .05; short stature, small head circumference, and low weight). The childbearing history and alcohol use composites were not statistically significant in the final complex model, being overpowered by SES and maternal physical traits. Although other analytic techniques have amply demonstrated the negative effects of maternal drinking on intelligence and behavior, this highly controlled analysis of multiple maternal influences reveals that maternal demographics and physical traits make a significant enabling or disabling contribution to child functioning in FASD.
A Hybrid Physics-Based Data-Driven Approach for Point-Particle Force Modeling
NASA Astrophysics Data System (ADS)
Moore, Chandler; Akiki, Georges; Balachandar, S.
2017-11-01
This study improves upon the physics-based pairwise interaction extended point-particle (PIEP) model. The PIEP model leverages a physical framework to predict fluid-mediated interactions between solid particles. While the PIEP model is a powerful tool, its pairwise assumption leads to increased error in flows with high particle volume fractions. To reduce this error, a regression algorithm is used to model the differences between the current PIEP model's predictions and the results of direct numerical simulations (DNS) for an array of monodisperse solid particles subjected to various flow conditions. The resulting statistical model and the physical PIEP model are superimposed to construct a hybrid, physics-based data-driven PIEP model. It must be noted that the performance of a pure data-driven approach without the model form provided by the physical PIEP model is substantially inferior. The hybrid model's predictive capabilities are analyzed using additional DNS. In every case tested, the hybrid PIEP model's predictions are more accurate than those of the physical PIEP model. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship Program under Grant No. DGE-1315138 and the U.S. DOE, NNSA, ASC Program, as a Cooperative Agreement under Contract No. DE-NA0002378.
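The hybrid idea above, superimposing a statistical correction on a physics-based baseline, can be sketched as follows. The physics model, "DNS" data, and features here are hypothetical stand-ins, not the actual PIEP model or DNS results; the point is that the regression learns only the residual, so the physics supplies the model form.

```python
import random

def physics_force(phi):
    # hypothetical pairwise-style physics prediction of drag vs. volume fraction
    return 1.0 + 2.0 * phi

def fit_line(xs, ys):
    # ordinary least squares for y = a + b*x, fit to the physics-model residuals
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# stand-in "DNS" data that deviate from the physics model at high volume fraction
random.seed(0)
phi = [i / 20 for i in range(1, 11)]
dns = [physics_force(p) + 3.0 * p ** 2 + random.gauss(0.0, 0.01) for p in phi]

# learn the residual statistically and superimpose it on the physics baseline
a, b = fit_line(phi, [d - physics_force(p) for d, p in zip(dns, phi)])

def hybrid_force(p):
    # physics baseline plus the statistically learned correction
    return physics_force(p) + a + b * p
```

A purely data-driven fit of the same functional complexity would have to learn the whole force law from ten points; here it only needs to learn the small systematic residual.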
Relating triggering processes in lab experiments with earthquakes.
NASA Astrophysics Data System (ADS)
Baro Urbea, J.; Davidsen, J.; Kwiatek, G.; Charalampidou, E. M.; Goebel, T.; Stanchits, S. A.; Vives, E.; Dresen, G.
2016-12-01
Statistical relations such as the Gutenberg-Richter law, the Omori-Utsu law and the productivity of aftershocks were first observed in seismology, but are also common to other physical phenomena exhibiting avalanche dynamics, such as solar flares, rock fracture, structural phase transitions and even stock market transactions. All these examples exhibit spatio-temporal correlations that can be explained as triggering processes: instead of being activated as a response to external driving or fluctuations, some events are a consequence of previous activity. Although different plausible explanations have been suggested in each system, the reason for the ubiquity of such statistical laws remains unknown. However, the case of rock fracture may exhibit a physical connection with seismology. It has been suggested that some features of seismology have a microscopic origin and are reproducible over a vast range of scales. This hypothesis has motivated mechanical experiments to generate artificial catalogues of earthquakes at a laboratory scale (so-called labquakes) under controlled conditions. Microscopic fractures in lab tests release elastic waves that are recorded as ultrasonic (kHz-MHz) acoustic emission (AE) events by means of piezoelectric transducers. Here, we analyse the statistics of labquakes recorded during the failure of small samples of natural rocks and artificial porous materials under different controlled compression regimes. Temporal and spatio-temporal correlations are identified in certain cases. Specifically, we distinguish between background and triggered events, revealing some differences in their statistical properties. We fit the data to statistical models of seismicity. As a particular case, we explore the branching-process approach simplified in the Epidemic Type Aftershock Sequence (ETAS) model. We evaluate the empirical spatio-temporal kernel of the model and investigate the physical origins of triggering.
Our analysis of the focal mechanisms implies that the occurrence of the empirical laws extends well beyond purely frictional sliding events, in contrast to what is often assumed.
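A minimal sketch of the ETAS conditional intensity discussed above, with the modified Omori law as its triggering kernel. All parameter values are illustrative, not fitted to any catalogue.

```python
import math

def omori_utsu_rate(t, K=100.0, c=0.1, p=1.1):
    # Modified Omori law: aftershock rate a time t after a mainshock,
    # n(t) = K / (t + c)**p  (parameters illustrative)
    return K / (t + c) ** p

def etas_intensity(t, history, mu=0.02, K=0.05, alpha=1.0, c=0.1, p=1.1, m0=2.0):
    # ETAS conditional intensity: a constant background rate mu plus the sum of
    # Omori-type contributions from all earlier events (t_i, m_i), where the
    # productivity grows exponentially with magnitude above the cutoff m0.
    trig = sum(K * math.exp(alpha * (m - m0)) / (t - ti + c) ** p
               for ti, m in history if ti < t)
    return mu + trig
```

Fitting mu, K, alpha, c and p to a labquake catalogue, and comparing them with values from tectonic seismicity, is one concrete way to probe whether the triggering mechanism is shared across scales.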
A combination of routine blood analytes predicts fitness decrement in elderly endurance athletes.
Haslacher, Helmuth; Ratzinger, Franz; Perkmann, Thomas; Batmyagmar, Delgerdalai; Nistler, Sonja; Scherzer, Thomas M; Ponocny-Seliger, Elisabeth; Pilger, Alexander; Gerner, Marlene; Scheichenberger, Vanessa; Kundi, Michael; Endler, Georg; Wagner, Oswald F; Winker, Robert
2017-01-01
Endurance sports are enjoying greater popularity, particularly among new target groups such as the elderly. Predictors of future physical capacities providing a basis for training adaptations are in high demand. We therefore aimed to estimate the future physical performance of elderly marathoners (runners/bicyclists) using a set of easily accessible standard laboratory parameters. To this end, 47 elderly marathon athletes underwent physical examinations including bicycle ergometry and a blood draw at baseline and after a three-year follow-up period. In order to compile a statistical model containing baseline laboratory results allowing prediction of follow-up ergometry performance, the cohort was subgrouped into a model training (n = 25) and a test sample (n = 22). The model containing significant predictors in univariate analysis (alanine aminotransferase, urea, folic acid, myeloperoxidase and total cholesterol) presented with high statistical significance and excellent goodness of fit (R2 = 0.789, ROC-AUC = 0.951±0.050) in the model training sample and was validated in the test sample (ROC-AUC = 0.786±0.098). Our results suggest that standard laboratory parameters could be particularly useful for predicting future physical capacity in elderly marathoners. It hence merits further research whether these conclusions can be translated to other disciplines or age groups.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harlim, John, E-mail: jharlim@psu.edu; Mahdi, Adam, E-mail: amahdi@ncsu.edu; Majda, Andrew J., E-mail: jonjon@cims.nyu.edu
2014-01-15
A central issue in contemporary science is the development of nonlinear data-driven statistical-dynamical models for time series of noisy partial observations from nature or a complex model. It has been established recently that ad hoc quadratic multi-level regression models can have finite-time blow-up of statistical solutions and/or pathological behavior of their invariant measure. Recently, a new class of physics-constrained nonlinear regression models was developed to ameliorate this pathological behavior. Here a new finite ensemble Kalman filtering algorithm is developed for estimating the state, the linear and nonlinear model coefficients, and the model and observation noise covariances from available partial noisy observations of the state. Several stringent tests and applications of the method are developed here. In the most complex application, the perfect model has 57 degrees of freedom involving a zonal (east-west) jet, two topographic Rossby waves, and 54 nonlinearly interacting Rossby waves; the perfect model has significant non-Gaussian statistics in the zonal jet with blocked and unblocked regimes and a non-Gaussian skewed distribution due to interaction with the other 56 modes. We only observe the zonal jet contaminated by noise and apply the ensemble filter algorithm for estimation. Numerically, we find that a three-dimensional nonlinear stochastic model with one level of memory mimics the statistical effect of the other 56 modes on the zonal jet in an accurate fashion, including the skewed non-Gaussian distribution and autocorrelation decay. On the other hand, a similar stochastic model with zero memory levels fails to capture the crucial non-Gaussian behavior of the zonal jet from the perfect 57-mode model.
Ocean Surface Wave Optical Roughness - Innovative Measurement and Modeling
2008-01-01
Our effort seeks to provide a more comprehensive description of the physical and optical roughness of the sea surface than has previously been reported (e.g., Jessup and Phadnis, 2005; "Statistics of breaking waves observed as whitecaps in the open sea," Journal of Physical Oceanography, 16, 290-297, 1986).
Statistical physics of interacting neural networks
NASA Astrophysics Data System (ADS)
Kinzel, Wolfgang; Metzler, Richard; Kanter, Ido
2001-12-01
Recent results on the statistical physics of time series generation and prediction are presented. A neural network is trained on quasi-periodic and chaotic sequences, and its overlap with the sequence generator as well as its prediction errors are calculated numerically. For each network there exists a sequence for which it completely fails to make predictions. Two interacting networks show a transition to perfect synchronization. A pool of interacting networks shows good coordination in the minority game, a model of competition in a closed market. Finally, as a demonstration, a perceptron predicts bit sequences produced by human beings.
Statistical and engineering methods for model enhancement
NASA Astrophysics Data System (ADS)
Chang, Chia-Jung
Models that describe the performance of a physical process are essential for quality prediction, experimental planning, process control and optimization. Engineering models developed from the underlying physics/mechanics of the process, such as analytic models or finite element models, are widely used to capture the deterministic trend of the process. However, there usually exists stochastic randomness in the system, which may introduce discrepancy between physics-based model predictions and observations in reality. Alternatively, statistical models can be developed to obtain predictions purely based on the data generated from the process. However, such models tend to perform poorly when predictions are made away from the observed data points. This dissertation contributes to model enhancement research by integrating physics-based models and statistical models to mitigate their individual drawbacks and to provide models with better accuracy by combining the strengths of both. The proposed model enhancement methodologies comprise two streams: (1) data-driven enhancement approaches and (2) engineering-driven enhancement approaches. Through these efforts, more adequate models are obtained, which leads to better performance in system forecasting, process monitoring and decision optimization. Among data-driven enhancement approaches, the Gaussian Process (GP) model provides a powerful methodology for calibrating a physical model in the presence of model uncertainties. However, if the data contain systematic experimental errors, the GP model can lead to an unnecessarily complex adjustment of the physical model. In Chapter 2, we propose a novel enhancement procedure, named “Minimal Adjustment”, which brings the physical model closer to the data by making minimal changes to it.
This is achieved by approximating the GP model with a linear regression model and then applying simultaneous variable selection over the model and experimental bias terms. Two real examples and simulations are presented to demonstrate the advantages of the proposed approach. Different from enhancing the model from a data-driven perspective, an alternative approach is to adjust the model by incorporating additional domain or engineering knowledge when available. This often leads to models that are very simple and easy to interpret. The concepts of engineering-driven enhancement are carried out through two applications to demonstrate the proposed methodologies. In the first application, which focuses on polymer composite quality, nanoparticle dispersion has been identified as a crucial factor affecting the mechanical properties. Transmission Electron Microscopy (TEM) images are commonly used to represent nanoparticle dispersion without further quantification of its characteristics. In Chapter 3, we develop an engineering-driven nonhomogeneous Poisson random field modeling strategy to characterize the nanoparticle dispersion status of nanocomposite polymers, quantitatively representing nanomaterial quality through image data. The model parameters are estimated with Bayesian MCMC techniques to overcome the challenge of a limited amount of accessible data due to time-consuming sampling schemes. The second application statistically calibrates engineering-driven force models of the laser-assisted micro milling (LAMM) process, facilitating a systematic understanding and optimization of the targeted process. In Chapter 4, a force prediction interval is derived by incorporating the variability in the runout parameters as well as the variability in the measured cutting forces. The experimental results indicate that the model predicts the cutting force profile with good accuracy using a 95% confidence interval.
To conclude, this dissertation draws attention to model enhancement, which has considerable impact on the modeling, design, and optimization of various processes and systems. The fundamental methodologies of model enhancement are developed and then applied to various applications. These research activities produce engineering-compliant models for adequate system prediction from observational data with complex variable relationships and uncertainty, facilitating process planning, monitoring, and real-time control.
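The "Minimal Adjustment" idea, approximating the GP adjustment by a linear regression that nudges the physical model toward the data, can be sketched roughly as follows. The physical model and the linear discrepancy form below are hypothetical illustrations, not the dissertation's actual formulation.

```python
def physical_model(x, theta):
    # hypothetical physics-based prediction with calibration parameter theta
    return theta * x

def fit_discrepancy(xs, ys, theta):
    # Least-squares fit of the residual y - f(x; theta) as d0 + d1*x,
    # a linear-regression stand-in for the GP discrepancy term.
    resid = [y - physical_model(x, theta) for x, y in zip(xs, ys)]
    n = len(xs)
    mx, mr = sum(xs) / n, sum(resid) / n
    d1 = sum((x - mx) * (r - mr) for x, r in zip(xs, resid)) / sum((x - mx) ** 2 for x in xs)
    d0 = mr - d1 * mx
    return d0, d1

def adjusted_model(x, theta, d0, d1):
    # physics prediction plus the additive adjustment
    return physical_model(x, theta) + d0 + d1 * x
```

In the dissertation's procedure a variable-selection step would then zero out whichever of the discrepancy and bias terms are unnecessary, so the adjustment stays minimal.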
Exploring Explanations of Subglacial Bedform Sizes Using Statistical Models.
Hillier, John K; Kougioumtzoglou, Ioannis A; Stokes, Chris R; Smith, Michael J; Clark, Chris D; Spagnolo, Matteo S
2016-01-01
Sediments beneath modern ice sheets exert a key control on their flow, but are largely inaccessible except through geophysics or boreholes. In contrast, palaeo-ice sheet beds are accessible, and typically characterised by numerous bedforms. However, the interaction between bedforms and ice flow is poorly constrained and it is not clear how bedform sizes might reflect ice flow conditions. To better understand this link we present a first exploration of a variety of statistical models to explain the size distribution of some common subglacial bedforms (i.e., drumlins, ribbed moraine, MSGL). By considering a range of models, constructed to reflect key aspects of the physical processes, it is possible to infer that the size distributions are most effectively explained when the dynamics of ice-water-sediment interaction associated with bedform growth is fundamentally random. A 'stochastic instability' (SI) model, which integrates random bedform growth and shrinking through time with exponential growth, is preferred and is consistent with other observations of palaeo-bedforms and geophysical surveys of active ice sheets. Furthermore, we give a proof-of-concept demonstration that our statistical approach can bridge the gap between geomorphological observations and physical models, directly linking measurable size-frequency parameters to properties of ice sheet flow (e.g., ice velocity). Moreover, statistically developing existing models as proposed allows quantitative predictions to be made about sizes, making the models testable; a first illustration of this is given for a hypothesised repeat geophysical survey of bedforms under active ice. Thus, we further demonstrate the potential of size-frequency distributions of subglacial bedforms to assist the elucidation of subglacial processes and better constrain ice sheet models.
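A toy version of the stochastic-instability idea, random multiplicative growth and shrinking with a small net exponential drift, can be simulated as follows. All parameters are illustrative; the qualitative point is the right-skewed, lognormal-like size distribution that such dynamics produce.

```python
import math
import random

def simulate_bedforms(n=2000, steps=200, growth=0.01, noise=0.05, h0=1.0, seed=1):
    # Each bedform's size evolves by multiplicative random growth/shrinking with
    # a small positive drift (net exponential growth), as in the SI-model sketch.
    rng = random.Random(seed)
    sizes = []
    for _ in range(n):
        h = h0
        for _ in range(steps):
            h *= math.exp(growth + noise * rng.gauss(0.0, 1.0))
        sizes.append(h)
    return sizes
```

Because log-size is a sum of independent increments, the mean log-size grows linearly in time (here steps * growth) while the spread grows like the square root of time, which is the kind of measurable size-frequency prediction a repeat geophysical survey could test.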
Stochastic modeling of sunshine number data
NASA Astrophysics Data System (ADS)
Brabec, Marek; Paulescu, Marius; Badescu, Viorel
2013-11-01
In this paper, we present a unified statistical modeling framework for estimation and forecasting of sunshine number (SSN) data. The sunshine number was proposed earlier to describe sunshine time series in qualitative terms (Theor Appl Climatol 72 (2002) 127-136) and has since been shown to be useful not only for theoretical purposes but also for practical considerations, e.g. those related to the development of photovoltaic energy production. Statistical modeling and prediction of SSN as a binary time series has, however, been a challenging problem. Our statistical model for SSN time series is based on an underlying stochastic process formulation of Markov chain type. We show how its transition probabilities can be efficiently estimated within a logistic regression framework. In fact, our logistic Markovian model can be fitted relatively easily via a maximum likelihood approach. This is optimal in many respects, and it also enables us to use formalized statistical inference theory to obtain not only point estimates of the transition probabilities and functions of interest, but also the related uncertainties, as well as to test various hypotheses of practical interest. It is straightforward to deal with non-homogeneous transition probabilities in this framework. Very importantly from both physical and practical points of view, the logistic Markov model class allows us to test hypotheses about how SSN depends on various external covariates (e.g. elevation angle, solar time) and about details of the dynamic model (order and functional shape of the Markov kernel). Therefore, using the generalized additive model (GAM) approach, we can fit and compare models of various complexity while keeping the physical interpretation of the statistical model and its parts.
After introducing the Markovian model and general approach for identification of its parameters, we will illustrate its use and performance on high resolution SSN data from the Solar Radiation Monitoring Station of the West University of Timisoara.
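For the simplest homogeneous case, the logistic-Markov connection described above can be sketched as follows: with only the lagged state as covariate, the logistic-regression MLE reduces to the empirical transition frequencies, whose logits are the fitted coefficients. This sketch assumes both states occur as predecessors in the series and omits the external covariates and GAM machinery of the full model.

```python
import math

def fit_markov_logistic(series):
    # MLE of P(X_t = 1 | X_{t-1} = s) for a binary series. With only the lagged
    # state as covariate these are the empirical transition frequencies, and the
    # logistic-regression coefficients are their logits.
    counts = {0: [0, 0], 1: [0, 0]}            # counts[prev][current]
    for prev, cur in zip(series, series[1:]):
        counts[prev][cur] += 1
    probs = {s: counts[s][1] / sum(counts[s]) for s in (0, 1)}
    beta0 = math.log(probs[0] / (1 - probs[0]))            # intercept (prev = 0)
    beta1 = math.log(probs[1] / (1 - probs[1])) - beta0    # effect of prev = 1
    return probs, (beta0, beta1)
```

Non-homogeneous transition probabilities are then just the same logits made functions of covariates such as elevation angle or solar time.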
NASA Astrophysics Data System (ADS)
Xu, Y.; Jones, A. D.; Rhoades, A.
2017-12-01
Precipitation is a key component of the hydrologic cycle, and changing precipitation regimes contribute to more intense and frequent drought and flood events around the world. Numerical climate modeling is a powerful tool to study climatology and to predict future changes. Despite continuous improvement in numerical models, long-term precipitation prediction remains a challenge, especially at regional scales. To improve numerical simulations of precipitation, it is important to find out where the uncertainty in precipitation simulations comes from. There are two types of uncertainty in numerical model predictions. One is related to uncertainty in the input data, such as the model's boundary and initial conditions. These uncertainties would propagate to the final model outcomes even if the numerical model exactly replicated the true world. But a numerical model cannot exactly replicate the true world. The other type of uncertainty is therefore related to errors in the model physics, such as the parameterization of sub-grid scale processes, i.e., given precise input conditions, how much error could be generated by the imprecise model. Here, we build two statistical models based on a neural network algorithm to predict the long-term variation of precipitation over California: one uses "true world" information derived from observations, and the other uses "modeled world" information from model inputs and outputs of the North America Coordinated Regional Downscaling Project (NA CORDEX). We derive multiple climate-feature metrics as predictors for the statistical model to represent the impact of global climate on local hydrology, and include topography as a predictor to represent local control. We first compare the predictors between the true world and the modeled world to determine the errors contained in the input data.
By perturbing the predictors in the statistical model, we estimate how much uncertainty in the model's final outcomes is accounted for by each predictor. By comparing the statistical models derived from true-world and modeled-world information, we assess the errors in the physics of the numerical models. This work provides unique insight for assessing the performance of numerical climate models and can be used to guide improvement of precipitation prediction.
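The perturbation step described above can be sketched as follows, with a hypothetical linear stand-in for the trained statistical model: perturb one predictor at a time and measure how much the output varies.

```python
import random

def model(features):
    # hypothetical trained statistical model: weighted sum of feature metrics
    weights = [0.5, 2.0, 0.1]
    return sum(w * f for w, f in zip(weights, features))

def sensitivity(predict, base, idx, scale=0.1, n=200, seed=0):
    # Standard deviation of the model output when only predictor `idx` is
    # perturbed; larger values attribute more output uncertainty to it.
    rng = random.Random(seed)
    outs = []
    for _ in range(n):
        x = list(base)
        x[idx] += rng.gauss(0.0, scale)
        outs.append(predict(x))
    mean = sum(outs) / n
    return (sum((o - mean) ** 2 for o in outs) / n) ** 0.5
```

For a trained neural network the same loop applies unchanged; only `predict` is swapped out.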
Theory and analysis of statistical discriminant techniques as applied to remote sensing data
NASA Technical Reports Server (NTRS)
Odell, P. L.
1973-01-01
Classification of remote earth resources sensing data according to normed exponential density statistics is reported. The use of density models appropriate for several physical situations provides an exact solution for the probabilities of classifications associated with the Bayes discriminant procedure even when the covariance matrices are unequal.
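A Bayes discriminant rule with unequal variances can be sketched in one dimension as follows; with unequal (co)variances the log-density comparison yields a quadratic rather than linear decision boundary, which is the situation the abstract addresses. This is a generic illustration, not the report's exact normed exponential formulation.

```python
import math

def gaussian_logpdf(x, mu, var):
    # log density of a univariate Gaussian
    return -0.5 * (math.log(2.0 * math.pi * var) + (x - mu) ** 2 / var)

def bayes_classify(x, classes):
    # classes: list of (prior, mean, variance); pick the class maximizing
    # log prior + log density. Unequal variances make the boundary quadratic.
    scores = [math.log(p) + gaussian_logpdf(x, m, v) for p, m, v in classes]
    return max(range(len(classes)), key=scores.__getitem__)
```

With equal priors and equal variances this reduces to nearest-mean classification; the interesting case, as in the abstract, is when the covariances differ.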
Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John
2018-03-07
DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis are employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied to single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data.
This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
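The two information-theoretic quantities used above can be sketched for discrete methylation-state distributions as follows. This is a generic implementation of the definitions, not the paper's genome-wide pipeline.

```python
import math

def shannon_entropy(p):
    # entropy (in bits) of a discrete distribution, e.g. over methylation states
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def js_distance(p, q):
    # Jensen-Shannon distance between test and reference distributions:
    # square root of the JS divergence, bounded in [0, 1] with log base 2.
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    jsd = shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))
    return math.sqrt(max(jsd, 0.0))   # guard against tiny negative round-off
```

Identical distributions give distance 0, disjoint ones give distance 1, which makes the measure convenient for ranking candidate differentially methylated regions.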
Estimating trends in the global mean temperature record
NASA Astrophysics Data System (ADS)
Poppick, Andrew; Moyer, Elisabeth J.; Stein, Michael L.
2017-06-01
Given uncertainties in physical theory and numerical climate simulations, the historical temperature record is often used as a source of empirical information about climate change. Many historical trend analyses appear to de-emphasize physical and statistical assumptions: examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for internal variability in nonparametric rather than parametric ways. However, given a limited data record and the presence of internal variability, estimating radiatively forced temperature trends in the historical record necessarily requires some assumptions. Ostensibly empirical methods can also involve an inherent conflict in assumptions: they require data records that are short enough for naive trend models to be applicable, but long enough for long-timescale internal variability to be accounted for. In the context of global mean temperatures, empirical methods that appear to de-emphasize assumptions can therefore produce misleading inferences, because the trend over the twentieth century is complex and the scale of temporal correlation is long relative to the length of the data record. We illustrate here how a simple but physically motivated trend model can provide better-fitting and more broadly applicable trend estimates and can allow for a wider array of questions to be addressed. In particular, the model allows one to distinguish, within a single statistical framework, between uncertainties in the shorter-term vs. longer-term response to radiative forcing, with implications not only on historical trends but also on uncertainties in future projections. We also investigate the consequence on inferred uncertainties of the choice of a statistical description of internal variability. 
While nonparametric methods may seem to avoid making explicit assumptions, we demonstrate how even misspecified parametric statistical methods, if attuned to the important characteristics of internal variability, can result in more accurate uncertainty statements about trends.
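The contrast between a time-based and a forcing-based trend regression can be illustrated with synthetic, noise-free data; all numbers below are made up, and a real analysis would also have to model internal variability.

```python
def ols(xs, ys):
    # ordinary least squares intercept/slope for y = a + b*x
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

years = list(range(1900, 2001))
# synthetic "forcing": zero until 1950, then accelerating (units illustrative)
forcing = [0.01 * max(0, y - 1950) ** 1.5 / 10 for y in years]
temp = [0.4 * f for f in forcing]      # noise-free linear response to forcing

a_f, b_f = ols(forcing, temp)          # forcing as covariate: recovers sensitivity 0.4
a_t, b_t = ols(years, temp)            # time as covariate: a line misfits the ramp
```

Even in this idealized setting, regressing on time leaves structured residuals because the twentieth-century trend is not linear in time, whereas the forcing-based model fits exactly by construction.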
Current algebra, statistical mechanics and quantum models
NASA Astrophysics Data System (ADS)
Vilela Mendes, R.
2017-11-01
Results obtained in the past for free boson systems at zero and nonzero temperatures are revisited to clarify the physical meaning of current algebra reducible functionals, which are associated with systems with density fluctuations, leading to observable effects on phase transitions. Using current algebra as a tool for the formulation of quantum statistical mechanics amounts to the construction of unitary representations of diffeomorphism groups. Two mathematically equivalent procedures exist for this purpose. One searches for quasi-invariant measures on configuration spaces, the other for a cyclic vector in Hilbert space. Here, one argues that the second approach is closer to physical intuition when modelling complex systems. An example of the application of the current algebra methodology to the pairing phenomenon in two-dimensional fermion systems is discussed.
NASA Astrophysics Data System (ADS)
Zhao, Runchen; Ientilucci, Emmett J.
2017-05-01
Hyperspectral remote sensing systems provide spectral data composed of hundreds of narrow spectral bands. Spectral remote sensing systems can be used to identify targets, for example, without physical interaction. Often it is of interest to characterize the spectral variability of targets or objects. The purpose of this paper is to identify and characterize the LWIR spectral variability of targets based on an improved earth-observing statistical performance model, known as the Forecasting and Analysis of Spectroradiometric System Performance (FASSP) model. FASSP contains three basic modules: a scene model, a sensor model, and a processing model. Instead of using mean surface reflectance only as input to the model, FASSP transfers user-defined statistical characteristics of a scene through the image chain (i.e., from source to sensor). The radiative transfer model MODTRAN is used to simulate the radiative transfer based on user-defined atmospheric parameters. To retrieve class emissivity and temperature statistics, or temperature/emissivity separation (TES), a LWIR atmospheric compensation method is necessary. The FASSP model has a method to transform statistics in the visible (i.e., ELM) but currently does not have a LWIR TES algorithm in place. This paper addresses the implementation of such a TES algorithm and its associated transformation of statistics.
Didarloo, Alireza; Shojaeizadeh, Davoud; Ardebili, Hassan Eftekhar; Niknami, Shamsaddin; Hajizadeh, Ebrahim; Alizadeh, Mohammad
2011-10-01
Findings of most studies indicate that the only way to control diabetes and prevent its debilitating effects is through the continuous performance of self-care behaviors. Physical activity is a non-pharmacological method of diabetes treatment and, because of its positive effects on diabetic patients, it is being increasingly considered by researchers and practitioners. This study aimed at determining factors influencing physical activity among diabetic women in Iran, using the extended theory of reasoned action. A sample of 352 women with type 2 diabetes, referred to a Diabetes Clinic in Khoy, Iran, participated in the study. Appropriate instruments were designed to measure the desired variables (knowledge of diabetes, personal beliefs, subjective norms, perceived self-efficacy, behavioral intention, and physical activity behavior). The reliability and validity of the instruments were examined and approved. Statistical analyses were conducted with inferential statistical techniques (independent t-test, correlations, and regressions) using the SPSS package. The findings of this investigation indicated that, among the constructs of the model, self-efficacy was the strongest predictor of intentions among women with type 2 diabetes and both directly and indirectly affected physical activity. In addition to self-efficacy, diabetic patients' physical activity was also influenced by other variables of the model and by sociodemographic factors. Our findings suggest that the high ability of the theory of reasoned action, extended by self-efficacy, in forecasting and explaining physical activity can be a basis for educational intervention. Educational interventions based on the proposed model are necessary for improving diabetic patients' physical activity behavior and controlling the disease.
Statistical Modelling of the Soil Dielectric Constant
NASA Astrophysics Data System (ADS)
Usowicz, Boguslaw; Marczewski, Wojciech; Bogdan Usowicz, Jerzy; Lipiec, Jerzy
2010-05-01
The dielectric constant of soil is a physical property that is highly sensitive to water content. It underlies several electrical techniques for determining water content, both direct (TDR, FDR, and others based on electrical conductance and/or capacitance effects) and indirect RS (Remote Sensing) methods. This work is devoted to a statistical approach to modelling the dielectric constant as a property that accounts for a wide range of soil compositions, porosities, and mass densities across the unsaturated water-content range. Similar models are usually developed for a few particular soil types, so changing the soil type requires switching to another model or adjusting it by re-parametrizing the soil compounds; this makes it difficult to compare and transfer results between models. The presented model was instead developed for a generic representation of soil as a hypothetical mixture of spheres, each representing a soil fraction in its proper phase state. The model generates a serial-parallel mesh of conductive and capacitive paths, which is analysed for its total conductive or capacitive property. The model was first developed to determine thermal conductivity and is now extended to the dielectric constant by analysing the capacitive mesh. The analysis proceeds by statistical means obeying the physical laws governing the serial-parallel branching of the representative electrical mesh. The physical relevance of the analysis is established electrically, but the definition of the electrical mesh is controlled statistically by the parametrization of compound fractions: by setting the number of representative spheres per unit volume per fraction, and the number of fractions. In this way the model can cover the properties of nearly all soil types and all phase states, within the Lorenz and Knudsen conditions. 
In effect, the model can generate a hypothetical representative of any soil type, which enables clear comparison with results from other soil-type-dependent models. The paper focuses on properly representing the range of porosity in commonly occurring soils. This work aims at implementing the statistical-physical model of the dielectric constant in the model CMEM (Community Microwave Emission Model), applicable to SMOS (Soil Moisture and Ocean Salinity, ESA Mission) data. The model accepts soil-fraction definitions in common physical measures and, in contrast to other empirical models, does not need calibrating. It does not depend on recognizing the soil by type; instead it offers control of accuracy through proper determination of the soil compound fractions. SMOS employs CMEM driven only by the sand-clay-silt composition, whereas soil data in common use are split into tens or even hundreds of soil types depending on the region. We hope that determining three-element sand-clay-silt compositions in a few fractions may help resolve the question of the relevance of soil data as input to CMEM for SMOS. Traditionally employed soil types are currently converted to sand-clay-silt compositions, but these hardly capture other specific properties such as porosity. The model should bring advantageous effects in validating SMOS observation data, which is the aim of Cal/Val project 3275 in the campaigns for the SVRT (SMOS Validation and Retrieval Team). Acknowledgements. This work was funded in part by the PECS - Programme for European Cooperating States, No. 98084 "SWEX/R - Soil Water and Energy Exchange/Research".
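The serial-parallel idea can be illustrated in miniature by the classical Wiener bounds, where purely series and purely parallel arrangements of the soil phases bracket the effective dielectric constant. The permittivity and volume-fraction values below are illustrative, not the model's actual parametrization:

```python
# Wiener bounds as a minimal illustration of serial vs parallel mixing:
# the effective permittivity of a soil-like mixture of solid, water, and
# air lies between the series (harmonic) and parallel (arithmetic) means.
eps = {"solid": 4.7, "water": 80.0, "air": 1.0}   # illustrative permittivities
vol = {"solid": 0.5, "water": 0.2, "air": 0.3}    # volume fractions, sum to 1

parallel = sum(vol[k] * eps[k] for k in eps)         # layers along the field
series = 1.0 / sum(vol[k] / eps[k] for k in eps)     # layers across the field

print(f"parallel (upper) bound: {parallel:.2f}")
print(f"series (lower) bound:   {series:.2f}")
```

A serial-parallel mesh model of the kind described above yields effective values between these two extremes, with the statistics of the mesh deciding where in the interval the mixture falls.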
Emergent dynamic structures and statistical law in spherical lattice gas automata.
Yao, Zhenwei
2017-12-01
Various lattice gas automata have been proposed in the past decades to simulate physics and address a host of problems on collective dynamics arising in diverse fields. In this work, we employ the lattice gas model defined on the sphere to investigate the curvature-driven dynamic structures and analyze the statistical behaviors in equilibrium. Under the simple propagation and collision rules, we show that the uniform collective movement of the particles on the sphere is geometrically frustrated, leading to several nonequilibrium dynamic structures not found in the planar lattice, such as the emergent bubble and vortex structures. With the accumulation of the collision effect, the system ultimately reaches equilibrium in the sense that the distribution of the coarse-grained speed approaches the two-dimensional Maxwell-Boltzmann distribution despite the population fluctuations in the coarse-grained cells. The emergent regularity in the statistical behavior of the system is rationalized by mapping our system to a generalized random walk model. This work demonstrates the capability of the spherical lattice gas automaton in revealing the lattice-guided dynamic structures and simulating the equilibrium physics. It suggests the promising possibility of using lattice gas automata defined on various curved surfaces to explore geometrically driven nonequilibrium physics.
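The equilibrium claim above, that coarse-grained speeds approach the two-dimensional Maxwell-Boltzmann distribution, can be checked in a few lines: in 2D, Gaussian velocity components give a Rayleigh speed distribution with mean σ√(π/2). A quick Monte Carlo sketch (of the limiting distribution only, not the lattice gas itself):

```python
import numpy as np

rng = np.random.default_rng(1)

# In 2D, Maxwell-Boltzmann velocity components are independent Gaussians,
# so the speed follows a Rayleigh distribution with mean sigma*sqrt(pi/2).
sigma = 1.0
vx = rng.normal(0, sigma, 100_000)
vy = rng.normal(0, sigma, 100_000)
speed = np.hypot(vx, vy)

mean_speed = speed.mean()
expected = sigma * np.sqrt(np.pi / 2)
print(f"sampled mean speed: {mean_speed:.3f}, Rayleigh prediction: {expected:.3f}")
```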
Emergent dynamic structures and statistical law in spherical lattice gas automata
NASA Astrophysics Data System (ADS)
Yao, Zhenwei
2017-12-01
Statistical sensitivity analysis of a simple nuclear waste repository model
NASA Astrophysics Data System (ADS)
Ronen, Y.; Lucius, J. L.; Blow, E. M.
1980-06-01
This work is a preliminary step in a comprehensive sensitivity analysis of the modeling of a nuclear waste repository. The purpose of the complete analysis is to determine which modeling parameters and physical data are most important in determining key design performance criteria, and then to obtain the uncertainty in the design for safety considerations. The theory for a statistical screening design methodology is developed for later use in the overall program. The theory was applied to the test case of determining the relative importance of modeling parameters for the sensitivity of the near-field temperature distribution in a single-level salt repository. The exact values of the sensitivities to these physical and modeling parameters were then obtained using direct methods of recalculation. The sensitivity coefficients found to be important for the sample problem were the thermal loading, the distance between the spent-fuel canisters, and the canister radius. Other important parameters were those related to salt properties at a point of interest in the repository.
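A statistical screening of this kind can be sketched with one-at-a-time perturbations and normalized sensitivity coefficients. The toy temperature model below is a hypothetical stand-in (its functional form and nominal values are invented), not the repository model of the paper; only the ranking logic is illustrated:

```python
import math

def peak_temperature(thermal_load, spacing, radius):
    # Toy model: temperature rises with load, falls with canister spacing,
    # and rises only weakly with canister radius. Purely illustrative.
    return 300.0 + 40.0 * thermal_load / spacing + 5.0 * math.log(1 + radius)

nominal = {"thermal_load": 1.5, "spacing": 3.0, "radius": 0.3}

sensitivities = {}
for name, value in nominal.items():
    bumped = dict(nominal, **{name: value * 1.01})   # +1% perturbation
    t0 = peak_temperature(**nominal)
    t1 = peak_temperature(**bumped)
    sensitivities[name] = (t1 - t0) / t0 / 0.01      # % change in T per % in p

ranked = sorted(sensitivities, key=lambda k: abs(sensitivities[k]), reverse=True)
print("importance ranking:", ranked)
```

With these assumed values the screening ranks the thermal loading and canister spacing well above the radius, mirroring the kind of ranking the paper's screening design produces before the exact sensitivities are recomputed directly.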
Testing physical models for dipolar asymmetry with CMB polarization
NASA Astrophysics Data System (ADS)
Contreras, D.; Zibin, J. P.; Scott, D.; Banday, A. J.; Górski, K. M.
2017-12-01
The cosmic microwave background (CMB) temperature anisotropies exhibit a large-scale dipolar power asymmetry. To determine whether this is due to a real, physical modulation or is simply a large statistical fluctuation requires the measurement of new modes. Here we forecast how well CMB polarization data from Planck and future experiments will be able to confirm or constrain physical models for modulation. Fitting several such models to the Planck temperature data allows us to provide predictions for polarization asymmetry. While for some models and parameters Planck polarization will decrease error bars on the modulation amplitude by only a small percentage, we show, importantly, that cosmic-variance-limited (and in some cases even Planck) polarization data can decrease the errors by considerably better than the expectation of √2 based on simple ℓ-space arguments. We project that if the primordial fluctuations are truly modulated (with parameters as indicated by Planck temperature data) then Planck will be able to make a 2σ detection of the modulation model with 20%-75% probability, increasing to 45%-99% when cosmic-variance-limited polarization is considered. We stress that these results are quite model dependent. Cosmic variance in temperature is important: combining statistically isotropic polarization with temperature data will spuriously increase the significance of the temperature signal with 30% probability for Planck.
2012-01-01
Under this project, we first developed and implemented advanced, physically complete forward EMI models for detection and discrimination at live-UXO sites. ... Shubitidze of Sky Research and Dartmouth College conceived, implemented, and tested most of the approaches presented in this report.
Forecasting defoliation by the gypsy moth in oak stands
Robert W. Campbell; Joseph P. Standaert
1974-01-01
A multiple-regression model is presented that reflects statistically significant correlations between defoliation by the gypsy moth, the dependent variable, and a series of biotic and physical independent variables. Both possible uses and shortcomings of this model are discussed.
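A multiple-regression model of this form can be sketched as an ordinary least-squares fit. The predictors below (egg-mass density as a biotic variable, site dryness as a physical one) and all coefficients are hypothetical placeholders for the paper's variables:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical predictors: egg-mass density (biotic) and site dryness
# (physical), with defoliation generated from assumed true coefficients.
n = 80
egg_mass = rng.uniform(0, 10, n)
dryness = rng.uniform(0, 1, n)
defoliation = 5.0 + 6.0 * egg_mass + 20.0 * dryness + rng.normal(0, 3, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), egg_mass, dryness])
coef, *_ = np.linalg.lstsq(X, defoliation, rcond=None)
print("intercept, egg-mass, dryness coefficients:", np.round(coef, 1))
```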
AGARD Bulletin. Technical Programme, 1981.
1980-08-01
ionospheric effect models and their associated codes. Physical, statistical, and hybrid models will be described in a comprehensive manner. Descriptions... will be to review: the various conventional modes of optical correction required either by ametropias or by normal or pathological drops in visual...
Current Status of Nuclear Physics Research
NASA Astrophysics Data System (ADS)
Bertulani, Carlos A.; Hussein, Mahir S.
2015-12-01
In this review, we discuss the current status of research in nuclear physics being carried out in different centers in the world. For this purpose, we supply a short account of the development of the field over the last nine decades, since the discovery of the neutron. The evolution of the physics of the atomic nucleus went through many stages as more data became available. We briefly discuss models introduced to discern the physics behind the experimental discoveries, such as the shell model, the collective model, the statistical model, and the interacting boson model. Some of these models may seem to be in conflict with each other, but the conflict was shown to be only apparent. The richness of the ideas and abundance of theoretical models attest to the important fact that the nucleus is a truly singular system, in the sense that it evolves from two-body bound states such as the deuteron, to few-body bound states such as 4He, 7Li, and 9Be, and up the ladder to heavier bound nuclei containing more than 200 nucleons. Clearly, statistical mechanics, usually employed in systems with a very large number of particles, would seemingly not work for such finite systems as nuclei, nor do other theories that are applicable to condensed matter. The richness of nuclear physics stems from these restrictions. New theories and models are presently being developed. Theories of the structure and reactions of neutron-rich and proton-rich nuclei, called exotic nuclei, halo nuclei, or Borromean nuclei, deal with the wealth of experimental data that became available in the last 35 years. Furthermore, nuclear astrophysics and stellar and Big Bang nucleosynthesis have become more mature subjects. Due to limited space, this review covers only a few selected topics, mainly those on which the authors have worked. 
The intended readers of this review are nuclear physicists, physicists in other areas, and graduate students interested in pursuing a career in nuclear physics.
Attempting to physically explain space-time correlation of extremes
NASA Astrophysics Data System (ADS)
Bernardara, Pietro; Gailhard, Joel
2010-05-01
Spatial and temporal clustering of hydro-meteorological extreme events is well documented. Moreover, the statistical parameters characterizing their local frequencies of occurrence show clear spatial patterns. Thus, in order to robustly assess hydro-meteorological hazard, statistical models need to be able to take into account spatial and temporal dependencies. Statistical models that consider long-term correlation for quantifying and qualifying temporal and spatial dependencies are available, such as the multifractal approach. Furthermore, the development of regional frequency analysis techniques allows estimating the frequency of occurrence of extreme events while taking into account spatial patterns in extreme-quantile behaviour. However, in order to understand the origin of spatio-temporal clustering, an attempt should be made to find a physical explanation. Here, statistical evidence of spatio-temporal correlation and spatial patterns in extreme behaviour is presented for a large database of more than 400 rainfall and discharge series in France. In particular, the spatial distribution of multifractal and Generalized Pareto distribution parameters shows evident correlation patterns in the behaviour of the frequency of occurrence of extremes. It is then shown that the identification of atmospheric circulation patterns (weather types) can physically explain the temporal clustering of extreme rainfall events (seasonality) and the spatial pattern of the frequency of occurrence. Moreover, coupling this information with hydrological modelling of a watershed (as in the Schadex approach), an explanation of the spatio-temporal distribution of extreme discharges can also be provided. We finally show that a hydro-meteorological approach (such as the Schadex approach) can explain and take into account the space and time dependencies of hydro-meteorological extreme events.
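The extreme-value side of such an analysis typically models threshold exceedances with the Generalized Pareto distribution (GPD). A minimal sketch, assuming synthetic exponential exceedances (true shape ξ = 0) and a method-of-moments fit rather than the paper's actual estimation procedure:

```python
import numpy as np

rng = np.random.default_rng(3)

# Peaks-over-threshold: exceedances above a high threshold are modelled
# with a GPD. Method-of-moments estimators from the sample mean m and
# variance v: xi = (1 - m^2/v) / 2, sigma = m * (m^2/v + 1) / 2.
exceedances = rng.exponential(scale=2.0, size=50_000)

m, v = exceedances.mean(), exceedances.var()
xi = 0.5 * (1.0 - m * m / v)          # shape parameter
sigma = 0.5 * m * (m * m / v + 1.0)   # scale parameter

print(f"fitted shape xi = {xi:.3f}, scale sigma = {sigma:.3f}")
```

For exponential data the fit should recover ξ ≈ 0 and σ ≈ 2; mapping such fitted parameters over a station network is one way spatial patterns like those described above become visible.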
Data mining and statistical inference in selective laser melting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamath, Chandrika
Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.
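A tiny sketch of the screening-plus-surrogate step described above, under assumed inputs: the toy melt-pool model, its parameter ranges, and its noise level are all invented for illustration. A cheap linear surrogate is fit to a batch of "simulations" and inputs are ranked by standardized coefficients:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy stand-in for an expensive SLM simulation: melt-pool depth as a
# function of laser power, scan speed, and a nearly irrelevant hatch
# spacing (hypothetical functional form).
def simulate_depth(power, speed, hatch):
    return 0.05 * power / np.sqrt(speed) + 0.1 * hatch + rng.normal(0, 0.5, power.shape)

n = 200
power = rng.uniform(100, 400, n)     # W
speed = rng.uniform(0.5, 2.0, n)     # m/s
hatch = rng.uniform(0.08, 0.12, n)   # mm
depth = simulate_depth(power, speed, hatch)

# Standardize inputs so coefficient magnitudes are comparable, then fit
# a linear surrogate by least squares and rank input importance.
X = np.column_stack([power, speed, hatch])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
A = np.column_stack([np.ones(n), Xs])
coef, *_ = np.linalg.lstsq(A, depth, rcond=None)

names = ["power", "speed", "hatch"]
ranking = [names[i] for i in np.argsort(-np.abs(coef[1:]))]
print("screened importance:", ranking)
```

Only the inputs that survive such a screen would then be carried into the more expensive surrogate-building and uncertainty-analysis stages.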
Data mining and statistical inference in selective laser melting
Kamath, Chandrika
2016-01-11
NASA Astrophysics Data System (ADS)
Havens, S.; Marks, D. G.; Kormos, P.; Hedrick, A. R.; Johnson, M.; Robertson, M.; Sandusky, M.
2017-12-01
In the Western US, operational water supply managers rely on statistical techniques to forecast the volume of water left to enter the reservoirs. As the climate changes and demand increases for stored water used for irrigation, flood control, power generation, and ecosystem services, water managers have begun to move from statistical techniques towards physically based models. To assist with the transition, a new open source framework was developed, the Spatial Modeling for Resources Framework (SMRF), to automate and simplify the most common forcing data distribution methods. SMRF is computationally efficient and can be implemented for both research and operational applications. Currently, SMRF is able to generate all of the forcing data required to run physically based snow or hydrologic models at 50-100 m resolution over regions of 500-10,000 km², and has been successfully applied in real-time and historical applications for the Boise River Basin in Idaho, USA, the Tuolumne and San Joaquin River Basins in California, USA, and the Reynolds Creek Experimental Watershed in Idaho, USA. These applications use meteorological station measurements and numerical weather prediction model outputs as input data. SMRF has significantly streamlined the modeling workflow, decreased model set-up time from weeks to days, and made near real-time application of physics-based snow and hydrologic models possible.
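One of the simplest forcing-distribution methods a framework like this might automate is inverse-distance weighting of station measurements to a grid point. The sketch below uses made-up station coordinates and temperatures, not SMRF's actual distribution algorithms:

```python
import numpy as np

# Inverse-distance weighting (IDW) of point measurements to a grid cell:
# nearby stations get more weight, and a query at a station returns the
# station value exactly. Coordinates and temperatures are illustrative.
stations = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])   # x, y in km
temps = np.array([-2.0, 1.0, -5.0])                          # air temp, deg C

def idw(point, xy, values, power=2.0):
    d = np.linalg.norm(xy - point, axis=1)
    if np.any(d < 1e-9):                 # query lands exactly on a station
        return float(values[np.argmin(d)])
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))

grid_value = idw(np.array([5.0, 4.0]), stations, temps)
print(f"interpolated temperature at (5, 4): {grid_value:.2f} C")
```

IDW is only one of the common distribution methods (alongside detrended kriging and elevation-based regressions); the value of a framework is wiring all of them to station and weather-model inputs automatically.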
Towards physical principles of biological evolution
NASA Astrophysics Data System (ADS)
Katsnelson, Mikhail I.; Wolf, Yuri I.; Koonin, Eugene V.
2018-03-01
Biological systems reach organizational complexity that far exceeds the complexity of any known inanimate objects. Biological entities undoubtedly obey the laws of quantum physics and statistical mechanics. However, is modern physics sufficient to adequately describe, model and explain the evolution of biological complexity? Detailed parallels have been drawn between statistical thermodynamics and the population-genetic theory of biological evolution. Based on these parallels, we outline new perspectives on biological innovation and major transitions in evolution, and introduce a biological equivalent of thermodynamic potential that reflects the innovation propensity of an evolving population. Deep analogies have also been suggested to exist between the properties of biological entities and processes, and those of frustrated states in physics, such as glasses. Such systems are characterized by frustration, whereby local states with minimal free energy conflict with the global minimum, resulting in ‘emergent phenomena’. We extend such analogies by examining frustration-type phenomena, such as conflicts between different levels of selection, in biological evolution. These frustration effects appear to drive the evolution of biological complexity. We further address evolution in multidimensional fitness landscapes from the point of view of percolation theory and suggest that percolation at a level above the critical threshold dictates the tree-like evolution of complex organisms. Taken together, these multiple connections between fundamental processes in physics and biology imply that construction of a meaningful physical theory of biological evolution might not be a futile effort. However, it is unrealistic to expect that such a theory can be created in one fell swoop; if it ever comes into being, this can only happen through integration of multiple physical models of evolutionary processes. 
Furthermore, the existing framework of theoretical physics is unlikely to suffice for adequate modeling of the biological level of complexity, and new developments within physics itself are likely to be required.
Magnetorotational dynamo chimeras. The missing link to turbulent accretion disk dynamo models?
NASA Astrophysics Data System (ADS)
Riols, A.; Rincon, F.; Cossu, C.; Lesur, G.; Ogilvie, G. I.; Longaretti, P.-Y.
2017-02-01
In Keplerian accretion disks, turbulence and magnetic fields may be jointly excited through a subcritical dynamo mechanism involving magnetorotational instability (MRI). This dynamo may notably contribute to explaining the time-variability of various accreting systems, as high-resolution simulations of MRI dynamo turbulence exhibit statistical self-organization into large-scale cyclic dynamics. However, understanding the physics underlying these statistical states and assessing their exact astrophysical relevance is theoretically challenging. The study of simple periodic nonlinear MRI dynamo solutions has recently proven useful in this respect, and has highlighted the role of turbulent magnetic diffusion in the seeming impossibility of a dynamo at low magnetic Prandtl number (Pm), a common regime in disks. Arguably though, these simple laminar structures may not be fully representative of the complex, statistically self-organized states expected in astrophysical regimes. Here, we aim at closing this seeming discrepancy by reporting the numerical discovery of exactly periodic, yet semi-statistical "chimeral MRI dynamo states" which are the organized outcome of a succession of MRI-unstable, non-axisymmetric dynamical stages of different forms and amplitudes. Interestingly, these states, while reminiscent of the statistical complexity of turbulent simulations, involve the same physical principles as simpler laminar cycles, and their analysis further confirms the theory that subcritical turbulent magnetic diffusion impedes the sustainment of an MRI dynamo at low Pm. Overall, chimera dynamo cycles therefore offer an unprecedented dual physical and statistical perspective on dynamos in rotating shear flows, which may prove useful in devising more accurate, yet intuitive mean-field models of time-dependent turbulent disk dynamos. Movies associated with Fig. 1 are available at http://www.aanda.org
An efficient surrogate-based simulation-optimization method for calibrating a regional MODFLOW model
NASA Astrophysics Data System (ADS)
Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.
2017-01-01
The simulation-optimization method entails a large number of model simulations, which is computationally intensive or even prohibitive if each model simulation is extremely time-consuming. Statistical models have been examined as surrogates for the high-fidelity physical model during the simulation-optimization process to tackle this problem. Among them, Multivariate Adaptive Regression Splines (MARS), a non-parametric adaptive regression method, is superior in overcoming problems of high dimensionality and discontinuities in the data. Furthermore, the stability and accuracy of a MARS model can be improved by bootstrap aggregating, namely, bagging. In this paper, the Bagging MARS (BMARS) method is integrated into a surrogate-based simulation-optimization framework to calibrate a three-dimensional MODFLOW model, which was developed to simulate groundwater flow in an arid hardrock-alluvium region in northwestern Oman. The physical MODFLOW model is surrogated by the statistical model developed using the BMARS algorithm. The surrogate model, which is fitted and validated using a training dataset generated by the physical model, can approximate solutions rapidly. An efficient Sobol' method is employed to calculate global sensitivities of head outputs to input parameters, which are used to analyze their importance for the model outputs spatiotemporally. Only sensitive parameters are included in the calibration process to further improve computational efficiency. The normalized root mean square error (NRMSE) between measured and simulated heads at observation wells is used as the objective function to be minimized during optimization. The reasonable history match between simulated and observed heads demonstrates the feasibility of this highly efficient calibration framework.
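The surrogate-based loop described above can be caricatured in a few lines: sample an "expensive" model at a handful of parameter values, fit a cheap polynomial surrogate to the NRMSE objective, then minimize the surrogate instead of the model. This sketch uses a trivial one-parameter stand-in, not MODFLOW or BMARS:

```python
import numpy as np

# "Expensive" model: heads at four observation times, linear in a single
# parameter k (a deliberately trivial stand-in; true k = 2.0).
times = np.array([0.0, 1.0, 2.0, 3.0])

def expensive_model(k):
    return 1.0 + k * times

observed = expensive_model(2.0)          # pretend these are measured heads

def nrmse(k):
    sim = expensive_model(k)
    return np.sqrt(np.mean((sim - observed) ** 2)) / (observed.max() - observed.min())

# A handful of expensive runs, then a cheap quadratic surrogate of the
# NRMSE objective, minimized on a dense grid.
k_train = np.linspace(1.2, 2.8, 9)
y_train = np.array([nrmse(k) for k in k_train])
coef = np.polyfit(k_train, y_train, 2)

k_dense = np.linspace(1.2, 2.8, 1601)
k_best = k_dense[np.argmin(np.polyval(coef, k_dense))]
print(f"calibrated k = {k_best:.3f} (true value 2.0)")
```

In the real framework the surrogate is a bagged MARS model over many parameters, and a Sobol' sensitivity screen first trims the parameter set; the shape of the loop, though, is the same.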
May, Philip A.; Tabachnick, Barbara G.; Gossage, J. Phillip; Kalberg, Wendy O.; Marais, Anna-Susan; Robinson, Luther K.; Manning, Melanie A.; Blankenship, Jason; Buckley, David; Hoyme, H. Eugene; Adnams, Colleen M.
2013-01-01
Objective To provide an analysis of multiple predictors of cognitive and behavioral traits for children with fetal alcohol spectrum disorders (FASD). Method Multivariate correlation techniques were employed with maternal and child data from epidemiologic studies in a community in South Africa. Data on 561 first grade children with fetal alcohol syndrome (FAS), partial FAS (PFAS), and not FASD and their mothers were analyzed by grouping 19 maternal variables into categories (physical, demographic, childbearing, and drinking) and employed in structural equation models (SEM) to assess correlates of child intelligence (verbal and non-verbal) and behavior. Results A first SEM utilizing only seven maternal alcohol use variables to predict cognitive/behavioral traits was statistically significant (B = 3.10, p < .05), but explained only 17.3% of the variance. The second model incorporated multiple maternal variables and was statistically significant explaining 55.3% of the variance. Significantly correlated with low intelligence and problem behavior were demographic (B = 3.83, p < .05) (low maternal education, low socioeconomic status (SES), and rural residence) and maternal physical characteristics (B = 2.70, p < .05) (short stature, small head circumference, and low weight). Childbearing history and alcohol use composites were not statistically significant in the final complex model, and were overpowered by SES and maternal physical traits. Conclusions While other analytic techniques have amply demonstrated the negative effects of maternal drinking on intelligence and behavior, this highly-controlled analysis of multiple maternal influences reveals that maternal demographics and physical traits make a significant enabling or disabling contribution to child functioning in FASD. PMID:23751886
Learning physical descriptors for materials science by compressed sensing
NASA Astrophysics Data System (ADS)
Ghiringhelli, Luca M.; Vybiral, Jan; Ahmetcik, Emre; Ouyang, Runhai; Levchenko, Sergey V.; Draxl, Claudia; Scheffler, Matthias
2017-02-01
The availability of big data in materials science offers new routes for analyzing materials properties and functions and achieving scientific understanding. Finding structure in these data that is not directly visible by standard tools and exploitation of the scientific information requires new and dedicated methodology based on approaches from statistical learning, compressed sensing, and other recent methods from applied mathematics, computer science, statistics, signal processing, and information science. In this paper, we explain and demonstrate a compressed-sensing based methodology for feature selection, specifically for discovering physical descriptors, i.e., physical parameters that describe the material and its properties of interest, and associated equations that explicitly and quantitatively describe those relevant properties. As showcase application and proof of concept, we describe how to build a physical model for the quantitative prediction of the crystal structure of binary compound semiconductors.
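To make the feature-selection idea concrete, the core ingredient of such compressed-sensing descriptor searches can be sketched with an L1-regularized (LASSO) fit, which drives the coefficients of irrelevant candidate descriptors to exactly zero. This is a minimal illustration on synthetic data; the paper's actual pipeline (LASSO followed by an L0 refinement over a huge candidate feature space) is more elaborate, and all names and numbers below are assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Hypothetical data: 80 "materials", 50 candidate descriptors, of which
# only three actually determine the target property.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 50))
true_coef = np.zeros(50)
true_coef[[3, 17, 42]] = [1.5, -2.0, 0.8]
y = X @ true_coef + 0.01 * rng.normal(size=80)

# L1 regularization zeroes out irrelevant coefficients, recovering a
# sparse "physical descriptor" set from the candidate pool.
model = Lasso(alpha=0.05).fit(X, y)
selected = np.flatnonzero(model.coef_)
print(selected)
```

In a real descriptor search the columns of X would be nonlinear combinations of primary physical features (radii, energy levels, etc.) rather than random numbers.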
Scientific computations section monthly report, November 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckner, M.R.
1993-12-30
This progress report from the Savannah River Technology Center contains abstracts from papers from the computational modeling, applied statistics, applied physics, experimental thermal hydraulics, and packaging and transportation groups. Specific topics covered include: engineering modeling and process simulation, criticality methods and analysis, plutonium disposition.
Accelerometry-based classification of human activities using Markov modeling.
Mannini, Andrea; Sabatini, Angelo Maria
2011-01-01
Accelerometers are a popular choice as body-motion sensors: the reason lies partly in their capability of extracting information useful for automatically inferring the physical activity in which the human subject is involved, besides their role in feeding estimators of biomechanical parameters. Automatic classification of human physical activities is highly attractive for pervasive computing systems, where contextual awareness may ease human-machine interaction, and in biomedicine, where wearable sensor systems are proposed for long-term monitoring. This paper is concerned with the machine learning algorithms needed to perform the classification task. Hidden Markov Model (HMM) classifiers are studied by contrasting them with Gaussian Mixture Model (GMM) classifiers. HMMs incorporate the statistical information available on movement dynamics into the classification process, without discarding the time history of previous outcomes as GMMs do. An example of the benefits of the obtained statistical leverage is illustrated and discussed by analyzing two datasets of accelerometer time series.
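The leverage that HMMs gain from movement dynamics can be sketched with a toy two-activity example: a per-frame Gaussian classifier (GMM-style, ignoring time) against Viterbi decoding with a "sticky" transition matrix that encodes the fact that activities persist over many frames. The features, classes, and parameters are illustrative assumptions, not the paper's classifiers or datasets.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic 1-D acceleration feature: 100 frames of "rest" then 100 of "walk".
true = np.r_[np.zeros(100, int), np.ones(100, int)]
means, sigma = np.array([0.0, 1.0]), 0.8          # deliberately overlapping

x = rng.normal(means[true], sigma)

# Per-frame Gaussian log-likelihoods, one column per class.
loglik = -0.5 * ((x[:, None] - means) / sigma) ** 2

# GMM-style decision: each frame classified independently of its history.
gmm_pred = loglik.argmax(axis=1)

# Viterbi decoding with a sticky transition matrix (the HMM leverage).
logA = np.log(np.array([[0.95, 0.05], [0.05, 0.95]]))
delta = loglik[0] + np.log([0.5, 0.5])
back = np.zeros((len(x), 2), int)
for t in range(1, len(x)):
    cand = delta[:, None] + logA          # cand[i, j]: best path ending i -> j
    back[t] = cand.argmax(axis=0)
    delta = cand.max(axis=0) + loglik[t]
states = np.zeros(len(x), int)
states[-1] = delta.argmax()
for t in range(len(x) - 2, -1, -1):
    states[t] = back[t + 1, states[t + 1]]

print((gmm_pred == true).mean(), (states == true).mean())
```

With overlapping class distributions the per-frame accuracy is limited by the overlap, while the temporally smoothed decoding recovers most of the sequence.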
The mean time-limited crash rate of stock price
NASA Astrophysics Data System (ADS)
Li, Yun-Xian; Li, Jiang-Cheng; Yang, Ai-Jun; Tang, Nian-Sheng
2017-05-01
In this article we investigate the occurrence of stock market crashes over an economic cycle, combining a Bayesian approach, the Heston model, and a statistical-physics method. Specifically, the Heston model and an effective potential are employed to describe the dynamic changes of the stock price, the Bayesian approach is used to estimate the Heston model's unknown parameters, and the statistical-physics method is used to investigate the occurrence of stock market crashes by calculating the mean time-limited crash rate. Real financial data from the Shanghai Composite Index are analyzed with the proposed methods. The mean time-limited crash rate of the stock price describes the occurrence of stock market crashes over an economic cycle. Both monotonic and nonmonotonic behavior is observed in the mean time-limited crash rate versus the volatility of the stock for various cross-correlation coefficients between volatility and price. A minimum occurrence of stock market crashes, matching an optimal volatility, is also discovered.
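A minimal Monte Carlo version of the "time-limited crash rate" idea: simulate Heston-model price paths by Euler-Maruyama, with correlated price and volatility shocks, and count the fraction that cross a crash threshold within the time horizon. The parameter values and the crash definition below are illustrative assumptions, not the paper's Bayesian estimates.

```python
import numpy as np

rng = np.random.default_rng(2)
# Heston parameters: drift, mean-reversion speed, long-run variance,
# vol-of-vol, and price-volatility correlation (all illustrative).
mu, kappa, theta, xi, rho = 0.05, 2.0, 0.04, 0.3, -0.5
S0, crash_level = 100.0, 80.0
dt, n_steps, n_paths = 1 / 252, 252, 5000          # one year, daily steps

S = np.full(n_paths, S0)
v = np.full(n_paths, theta)
crashed = np.zeros(n_paths, bool)
for _ in range(n_steps):
    z1 = rng.normal(size=n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.normal(size=n_paths)
    # Euler-Maruyama step for variance (floored) and log-price.
    v = np.maximum(v + kappa * (theta - v) * dt + xi * np.sqrt(v * dt) * z2, 1e-8)
    S *= np.exp((mu - 0.5 * v) * dt + np.sqrt(v * dt) * z1)
    crashed |= S < crash_level                     # first passage below threshold

crash_rate = crashed.mean()
print(crash_rate)
```

Repeating this estimate over a grid of volatility and correlation values reproduces the kind of crash-rate-versus-volatility curves the paper studies.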
Mäki-Opas, Tomi E; de Munter, Jeroen; Maas, Jolanda; den Hertog, Frank; Kunst, Anton E
2014-08-01
This study examined the effect of the physical environment on cycling to and from school among boys and girls of Turkish and Moroccan origin living in Amsterdam. The LASER study (n = 697) was an interview study that included information on cycling to and from school and the perceived physical environment. Objective information on the physical environment was gathered from Statistics Netherlands and the Department for Research and Statistics at the Municipality of Amsterdam. Structural equation modelling with latent variables was applied, taking into account age, gender, self-assessed health, education, country of origin, and distance to school. For every unit increase in the latent variable scale for bicycle-friendly infrastructure, we observed a 21% increase in the odds of cycling to and from school. The association was only borderline statistically significant and disappeared after controlling for distance to school. An enjoyable environment was not associated with cycling to and from school after controlling for all background factors. Thus, bicycle-friendly infrastructure and an enjoyable environment were not important factors for cycling to and from school among those with no cultural cycling background.
Statistical-physical model of the hydraulic conductivity
NASA Astrophysics Data System (ADS)
Usowicz, B.; Marczewski, W.; Usowicz, J. B.; Lukowski, M. I.
2012-04-01
The water content of the unsaturated subsurface soil layer is determined by processes of mass and energy exchange between the soil, the atmosphere, and the individual members of the layered medium. These media are generally non-homogeneous on different scales, considering soil porosity, soil texture, the presence of vegetation elements in the root zone and of the canopy above the surface, and the varying, clustered biomass density of plants. That heterogeneity determines the statistically effective values of the physical properties involved. This work considers mainly those properties that determine the hydraulic conductivity of soil. This property is necessary for physically characterizing water transfer in the root zone and the access of nutrients for plants, but also for determining the water capacity at the field scale. The temporal variability of forcing conditions and the evolution of vegetation substantially affect the water capacity at large scales, driving the evolution of water conditions over the entire area between the possible extremes of floods and droughts. The dynamics of this evolution are strongly determined by vegetation but are hardly predictable in evaluations. Hydrological models must be fed with input data on the hydraulic properties of the porous soil, which are provided in this paper by means of a statistical-physical model of the hydraulic conductivity. The statistical-physical model was determined for soils typical of Euroregion Bug, Eastern Poland. It is calibrated against direct measurements at the field scale, and enables determining typical characteristics of water retention through retention curves that bind the hydraulic conductivity to the water-saturation state of the soil. The values of the hydraulic conductivity in two reference states are used for calibrating the model: one close to full saturation, the other at low water content far from saturation, for a particular soil type. The effect of calibration depends on the assumed ranges of the soil properties used to recognize the soil type; among these, the key roles are played by the bulk density, the porosity, and its dependence on the specific surface area of the soil. The aim of this work is to provide such auxiliary data to SMOS, relating the soil moisture to the water capacity when retrieving SM from SMOS L1C data. * The work was financially supported in part by the ESA Programme for European Cooperating States (PECS), No. 98084 "SWEX-R, Soil Water and Energy Exchange/Research", AO3275.
Asymptotic formulae for likelihood-based tests of new physics
NASA Astrophysics Data System (ADS)
Cowan, Glen; Cranmer, Kyle; Gross, Eilam; Vitells, Ofer
2011-02-01
We describe likelihood-based statistical tests for use in high energy physics for the discovery of new phenomena and for construction of confidence intervals on model parameters. We focus on the properties of the test procedures that allow one to account for systematic uncertainties. Explicit formulae for the asymptotic distributions of test statistics are derived using results of Wilks and Wald. We motivate and justify the use of a representative data set, called the "Asimov data set", which provides a simple method to obtain the median experimental sensitivity of a search or measurement as well as fluctuations about this expectation.
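The best-known result of this paper, the Asimov data set, yields closed-form median sensitivities. For a single-bin counting experiment with expected signal s over background b, the asymptotic discovery significance evaluated on the Asimov data n = s + b takes the simple form sketched below (this formula appears in the paper; the numerical examples are ours).

```python
import math

def asimov_significance(s, b):
    """Median discovery significance of a counting experiment:
    Z = sqrt(2 * ((s + b) * ln(1 + s/b) - s)),
    i.e. the asymptotic test statistic evaluated on the Asimov
    data set n = s + b (Cowan, Cranmer, Gross & Vitells 2011)."""
    return math.sqrt(2 * ((s + b) * math.log(1 + s / b) - s))

# For s << b this reduces to the familiar s/sqrt(b) estimate:
print(asimov_significance(10, 100))   # close to 10/sqrt(100) = 1.0
# For larger s/b it is noticeably below s/sqrt(b):
print(asimov_significance(50, 100))   # below 50/sqrt(100) = 5.0
```

The same Asimov construction extends to profile-likelihood tests with nuisance parameters, where no closed form in s and b exists but the median sensitivity is still obtained from a single representative data set.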
NASA Astrophysics Data System (ADS)
Segou, Margarita
2016-01-01
I perform a retrospective forecast experiment in the most rapidly extending continental rift worldwide, the western Corinth Gulf (wCG, Greece), aiming to predict shallow seismicity (depth <15 km) with magnitude M ≥ 3.0 for the time period between 1995 and 2013. I compare two short-term earthquake clustering models based on epidemic-type aftershock sequence (ETAS) statistics, four physics-based (CRS) models combining static stress change estimations and the rate-and-state laboratory law, and one hybrid model. For the latter models, I incorporate the stress changes imparted by 31 earthquakes with magnitude M ≥ 4.5 in the extended area of wCG. Special attention is given to the 3-D representation of active faults, acting as potential receiver planes for the estimation of static stress changes. I use reference seismicity between 1990 and 1995, corresponding to the learning phase of the physics-based models, and I evaluate the forecasts for six months following the 1995 M = 6.4 Aigio earthquake using log-likelihood performance metrics. For the ETAS realizations, I use seismic events with magnitude M ≥ 2.5 within daily update intervals to enhance their predictive power. To assess the role of background seismicity, I implement a stochastic reconstruction (aka declustering), aiming to answer whether M > 4.5 earthquakes correspond to spontaneous events and to identify, if possible, different triggering characteristics between aftershock sequences and swarm-type seismicity periods.
I find that: (1) ETAS models outperform CRS models in most time intervals, achieving a very low rejection ratio RN = 6 per cent when I test their efficiency to forecast the total number of events inside the study area, (2) the best rejection ratio for CRS models reaches RN = 17 per cent when I use varying target depths and receiver plane geometry, (3) 75 per cent of the 1995 Aigio aftershocks that occurred within the first month can be explained by static stress changes, (4) highly variable performance of both statistical and physical models is suggested by large confidence intervals of information gain per earthquake and (5) generic ETAS models can adequately predict the temporal evolution of seismicity during swarms. Furthermore, stochastic reconstruction of seismicity makes possible the identification of different triggering processes between specific seismic crises (2001, 2003-04, 2006-07) and the 1995 aftershock sequence. I find that: (1) seismic events with M ≥ 5.0 are not part of a preceding earthquake cascade, since they are characterized by a high probability of being background events (average Pback > 0.8) and (2) triggered seismicity within swarms is characterized by lower event productivity than the corresponding value during aftershock sequences. I conclude that physics-based models contribute to the determination of the `new-normal' seismicity rate at longer time intervals and that their joint implementation with statistical models is beneficial for future operational forecast systems.
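The ETAS models compared above rest on a conditional intensity: a background rate plus Omori-law aftershock contributions from every prior event, weighted exponentially by magnitude. The sketch below evaluates a standard temporal ETAS intensity for a toy catalogue; the parameter values are illustrative, not those fitted for the Corinth Gulf.

```python
import numpy as np

# Standard temporal ETAS parameters: background rate mu (events/day),
# productivity K, magnitude scaling alpha, Omori constants c and p,
# and reference magnitude M0 (all values illustrative).
mu, K, alpha, c, p, M0 = 0.2, 0.05, 1.0, 0.01, 1.2, 2.5

def etas_intensity(t, event_times, event_mags):
    """lambda(t) = mu + sum over past events of
    K * exp(alpha * (M_i - M0)) * (t - t_i + c)^(-p)."""
    past = event_times < t
    trig = (K * np.exp(alpha * (event_mags[past] - M0))
              * (t - event_times[past] + c) ** (-p))
    return mu + trig.sum()

times = np.array([0.0, 1.0, 1.5])     # days
mags = np.array([4.0, 6.4, 3.0])      # an M = 6.4 mainshock dominates

print(etas_intensity(1.6, times, mags))   # elevated just after the mainshock
print(etas_intensity(30.0, times, mags))  # rate decays back toward mu
```

Forecasting with ETAS then amounts to integrating this intensity over each update interval, with daily re-estimation as new M ≥ 2.5 events arrive.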
Exploring Explanations of Subglacial Bedform Sizes Using Statistical Models
Kougioumtzoglou, Ioannis A.; Stokes, Chris R.; Smith, Michael J.; Clark, Chris D.; Spagnolo, Matteo S.
2016-01-01
Sediments beneath modern ice sheets exert a key control on their flow, but are largely inaccessible except through geophysics or boreholes. In contrast, palaeo-ice sheet beds are accessible, and typically characterised by numerous bedforms. However, the interaction between bedforms and ice flow is poorly constrained and it is not clear how bedform sizes might reflect ice flow conditions. To better understand this link we present a first exploration of a variety of statistical models to explain the size distribution of some common subglacial bedforms (i.e., drumlins, ribbed moraine, MSGL). By considering a range of models, constructed to reflect key aspects of the physical processes, it is possible to infer that the size distributions are most effectively explained when the dynamics of ice-water-sediment interaction associated with bedform growth is fundamentally random. A ‘stochastic instability’ (SI) model, which integrates random bedform growth and shrinking through time with exponential growth, is preferred and is consistent with other observations of palaeo-bedforms and geophysical surveys of active ice sheets. Furthermore, we give a proof-of-concept demonstration that our statistical approach can bridge the gap between geomorphological observations and physical models, directly linking measurable size-frequency parameters to properties of ice sheet flow (e.g., ice velocity). Moreover, statistically developing existing models as proposed allows quantitative predictions to be made about sizes, making the models testable; a first illustration of this is given for a hypothesised repeat geophysical survey of bedforms under active ice. Thus, we further demonstrate the potential of size-frequency distributions of subglacial bedforms to assist the elucidation of subglacial processes and better constrain ice sheet models. PMID:27458921
The applications of statistical quantification techniques in nanomechanics and nanoelectronics.
Mai, Wenjie; Deng, Xinwei
2010-10-08
Although nanoscience and nanotechnology have been developing for approximately two decades and have achieved numerous breakthroughs, experimental results from nanomaterials still exhibit higher noise levels and poorer repeatability than those from bulk materials, which remains a practical issue and challenges many techniques for the quantification of nanomaterials. This work proposes a physical-statistical modeling approach and a global-fitting statistical method that use all the available discrete data or quasi-continuous curves to quantify a few targeted physical parameters, providing more accurate, efficient and reliable parameter estimates together with reasonable physical explanations. In the resonance method for measuring the elastic modulus of ZnO nanowires (Zhou et al 2006 Solid State Commun. 139 222-6), our statistical technique gives E = 128.33 GPa instead of the original E = 108 GPa, and unveils a negative bias adjustment f(0), whose cause is suggested to be a systematic bias in measuring the length of the nanowires. In the electronic measurement of the resistivity of a Mo nanowire (Zach et al 2000 Science 290 2120-3), the proposed method automatically identified the importance of accounting for the Ohmic contact resistance in the model of Ohmic behavior in nanoelectronics experiments. The 95% confidence interval of the resistivity in the proposed one-step procedure is (3.57 ± 0.0274) × 10^-5 Ω cm, which should be a more reliable and precise estimate. The statistical quantification technique should find wide applications in obtaining better estimates in the presence of the various systematic errors and bias effects that become more significant at the nanoscale.
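The "bias adjustment" idea, estimating a systematic offset jointly with the physical parameter of interest rather than fixing it at zero, can be sketched with a global nonlinear fit. The inverse-square model, the length bias b, and all numbers below are assumptions for illustration, not the paper's resonance model or data.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)
L_true = np.linspace(2.0, 6.0, 30)        # true lengths (arbitrary units)
a_true, bias = 50.0, 0.3                  # physical parameter, length bias
L_meas = L_true - bias                    # systematically short readings
f_obs = a_true / L_true**2 * (1 + 0.01 * rng.normal(size=L_true.size))

def model(L, a, b):
    # The bias parameter b absorbs the systematic length-measurement error;
    # fixing b = 0 would instead bias the estimate of a.
    return a / (L + b) ** 2

(a_hat, b_hat), _ = curve_fit(model, L_meas, f_obs, p0=(40.0, 0.0))
print(a_hat, b_hat)
```

With the bias term included, the global fit over all measurements recovers both the physical parameter and the sign and size of the systematic offset.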
Sun, Xingshu; Silverman, Timothy; Garris, Rebekah; ...
2016-07-18
In this study, we present a physics-based analytical model for copper indium gallium diselenide (CIGS) solar cells that describes the illumination- and temperature-dependent current-voltage (I-V) characteristics and accounts for the statistical shunt variation of each cell. The model is derived by solving the drift-diffusion transport equation so that its parameters are physical and, therefore, can be obtained from independent characterization experiments. The model is validated against CIGS I-V characteristics as a function of temperature and illumination intensity. This physics-based model can be integrated into a large-scale simulation framework to optimize the performance of solar modules, as well as predict the long-term output yields of photovoltaic farms under different environmental conditions.
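A stripped-down single-diode stand-in for such a physics-based I-V model: illumination scales the photocurrent, temperature enters through the thermal voltage, and a per-cell shunt resistance plays the role of the statistically varying parameter. Series resistance is neglected so the curve is explicit in V, and all parameter values are illustrative assumptions, not the paper's drift-diffusion-derived CIGS parameters.

```python
import numpy as np

def iv_current(V, G=1000.0, T=300.0, Rsh=50.0):
    """Single-diode sketch: I = Iph - I0*(exp(V/(n*Vt)) - 1) - V/Rsh."""
    k_over_q = 8.617e-5                 # Boltzmann constant / charge, V/K
    Vt = k_over_q * T                   # thermal voltage
    Iph = 3.5 * (G / 1000.0)            # photocurrent scales with irradiance (A)
    I0, n = 2e-9, 1.2                   # saturation current, ideality factor
    return Iph - I0 * np.expm1(V / (n * Vt)) - V / Rsh

V = np.linspace(0.0, 0.75, 300)
I = iv_current(V)
P = V * I
print(V[P.argmax()], P.max())           # maximum power point

# A low shunt resistance (a statistically varying defect path) drains
# current and degrades the curve:
I_shunted = iv_current(V, Rsh=5.0)
```

Drawing Rsh for each cell from a fitted distribution and summing cell curves is the kind of statistical-variation step the paper's module-scale framework performs.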
Data Analysis Techniques for Physical Scientists
NASA Astrophysics Data System (ADS)
Pruneau, Claude A.
2017-10-01
Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.
Retrieval of Atmospheric Particulate Matter Using Satellite Data Over Central and Eastern China
NASA Astrophysics Data System (ADS)
Chen, G. L.; Guang, J.; Li, Y.; Che, Y. H.; Gong, S. Q.
2018-04-01
Fine particulate matter (PM2.5) is a particle cluster with diameters less than or equal to 2.5 μm. Over the past few decades, regional air pollution composed of PM2.5 has frequently occurred over Central and Eastern China. In order to estimate the concentration, distribution and other properties of PM2.5, retrieval models built by establishing the relationship between aerosol optical depth (AOD) and PM2.5 have been widely used in many studies, including experimental models based on statistical analysis and physical models with an explicit physical mechanism. Statistical experimental models cannot be extended to other areas or historical periods because of their dependence on ground-based observations and necessary auxiliary data, which limits their further application. In this paper, a physically based model is applied to estimate the concentration of PM2.5 over Central and Eastern China from 2007 to 2016. Ground-based PM2.5 measurements were used as reference data to validate our retrieval results. The annual variation and distribution of the PM2.5 concentration in Central and Eastern China were then analysed. Results show that the annual average PM2.5 gradually increased and then decreased during 2007-2016, with the highest value in 2011.
Modeling of spacecraft charging
NASA Technical Reports Server (NTRS)
Whipple, E. C., Jr.
1977-01-01
Three types of modeling of spacecraft charging are discussed: statistical models, parametric models, and physical models. Local time dependence of circuit upset for DoD and communication satellites, and electron current to a sphere with an assumed Debye potential distribution are presented. Four regions were involved in spacecraft charging: (1) undisturbed plasma, (2) plasma sheath region, (3) spacecraft surface, and (4) spacecraft equivalent circuit.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef
Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in the support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling feasible in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale sequential data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.
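The starting point of objective 1, Bayesian inference from noisy and limited data, can be sketched with a textbook random-walk Metropolis sampler on a one-parameter model. This is exactly the expensive kernel that the project's surrogates and dimension reduction aim to accelerate; the model and all numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
theta_true = 1.3
data = theta_true + 0.5 * rng.normal(size=20)     # 20 noisy observations

def log_post(theta):
    # Gaussian likelihood (sigma = 0.5) with a standard-normal prior;
    # in a real application this would call the forward simulation.
    return -0.5 * np.sum((data - theta) ** 2) / 0.25 - 0.5 * theta**2

samples, theta = [], 0.0
lp = log_post(theta)
for _ in range(5000):
    prop = theta + 0.3 * rng.normal()             # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:      # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)

posterior_mean = np.mean(samples[1000:])          # discard burn-in
print(posterior_mean)
```

Each posterior sample costs one forward-model evaluation, which is why surrogate models and adaptive posterior approximations matter when the forward model is an expensive physical simulation.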
NASA Astrophysics Data System (ADS)
Mazzitello, Karina I.; Candia, Julián
2012-12-01
In every country, public and private agencies allocate extensive funding to collect large-scale statistical data, which in turn are studied and analyzed in order to determine local, regional, national, and international policies regarding all aspects relevant to the welfare of society. One important aspect of that process is the visualization of statistical data with embedded geographical information, which most often relies on archaic methods such as maps colored according to graded scales. In this work, we apply nonstandard visualization techniques based on physical principles. We illustrate the method with recent statistics on homicide rates in Brazil and their correlation to other publicly available data. This physics-based approach provides a novel tool that can be used by interdisciplinary teams investigating statistics and model projections in a variety of fields such as economics and gross domestic product research, public health and epidemiology, sociodemographics, political science, business and marketing, and many others.
NASA Astrophysics Data System (ADS)
Sellaoui, Lotfi; Lima, Éder Cláudio; Dotto, Guilherme Luiz; Dias, Silvio L. P.; Ben Lamine, Abdelmottaleb
Two equilibrium models based on statistical physics, i.e., a monolayer model with a single energy and a multilayer model with saturation, were developed and employed to assess the steric and energetic aspects of the adsorption of reactive violet 5 dye (RV-5) on cocoa shell activated carbon (AC) and commercial activated carbon (CAC), at different temperatures (from 298 to 323 K). The results showed that the multilayer model with saturation was able to represent the adsorption system. This model assumes that adsorption occurs through the formation of a certain number of layers. The n values ranged from 1.10 to 2.98, indicating that the adsorbate molecules interacted in an inclined position on the adsorbent surface and aggregated in solution. The study of the total number of formed layers (1 + L2) showed that steric hindrance is the dominant factor. The description of the adsorbate-adsorbent interactions by calculation of the adsorption energy indicated that the process was physisorption in nature, since the values were lower than 40 kJ mol-1.
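The simpler of the two models, the statistical-physics monolayer model with a single energy, has a Hill-type closed form Q = n·Nm/(1 + (c_half/c)^n), where the fitted n carries the steric interpretation (molecules per site, orientation) and c_half sets the energy scale. The fit below uses synthetic data and illustrative starting values, not the RV-5 measurements; the multilayer-with-saturation expression the paper actually selects is more involved.

```python
import numpy as np
from scipy.optimize import curve_fit

def monolayer(c, n, Nm, c_half):
    # Statistical-physics monolayer isotherm: n molecules per site,
    # Nm site density, c_half the half-saturation concentration.
    return n * Nm / (1 + (c_half / c) ** n)

c = np.array([5, 10, 20, 40, 80, 160, 320.0])              # mg/L
Q = np.array([5.3, 11.7, 23.3, 38.8, 53.3, 62.8, 67.7])    # mg/g (synthetic)

(n_hat, Nm_hat, ch_hat), _ = curve_fit(
    monolayer, c, Q, p0=(1.0, 70.0, 40.0), bounds=(0, np.inf))
print(n_hat, n_hat * Nm_hat, ch_hat)   # n, saturation capacity, c_half
```

An n above 1 from such a fit is read, as in the abstract, as molecules aggregating in solution before adsorbing; n below 1 would instead indicate a molecule anchored on several sites.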
Didarloo, Alireza; Ardebili, Hassan Eftekhar; Niknami, Shamsaddin; Hajizadeh, Ebrahim; Alizadeh, Mohammad
2011-01-01
Background Findings of most studies indicate that the only way to control diabetes and prevent its debilitating effects is through the continuous performance of self-care behaviors. Physical activity is a non-pharmacological method of diabetes treatment and, because of its positive effects on diabetic patients, it is being increasingly considered by researchers and practitioners. This study aimed at determining factors influencing physical activity among diabetic women in Iran, using the theory of reasoned action extended by self-efficacy. Methods A sample of 352 women with type 2 diabetes, referred to a Diabetes Clinic in Khoy, Iran, participated in the study. Appropriate instruments were designed to measure the desired variables (knowledge of diabetes, personal beliefs, subjective norms, perceived self-efficacy, behavioral intention and physical activity behavior). The reliability and validity of the instruments were examined and approved. Statistical analyses were conducted with inferential statistical techniques (independent t-test, correlations and regressions) using the SPSS package. Results The findings of this investigation indicated that, among the constructs of the model, self-efficacy was the strongest predictor of intentions among women with type 2 diabetes and both directly and indirectly affected physical activity. In addition to self-efficacy, the patients' physical activity was also influenced by other variables of the model and by sociodemographic factors. Conclusion Our findings suggest that the strong ability of the theory of reasoned action, extended by self-efficacy, to forecast and explain physical activity can serve as a basis for educational intervention. Educational interventions based on the proposed model are necessary for improving diabetic patients' physical activity behavior and controlling the disease. PMID:22111043
University Students' Conceptions of an Excellent Physical Education Teacher in China
ERIC Educational Resources Information Center
Song, Lihua; Chen, Junjun
2013-01-01
This study investigated how university students perceive an excellent physical education (PE) teacher at the university level. A sample of 2000 university students at four universities in China responded to a 53-item questionnaire. A 6-factor, 28-item model of an excellent PE teacher was subsequently generated with a good statistical fit, using…
Simulating flaring events in complex active regions driven by observed magnetograms
NASA Astrophysics Data System (ADS)
Dimitropoulou, M.; Isliker, H.; Vlahos, L.; Georgoulis, M. K.
2011-05-01
Context. We interpret solar flares as events originating in active regions that have reached the self organized critical state, by using a refined cellular automaton model with initial conditions derived from observations. Aims: We investigate whether the system, with its imposed physical elements, reaches a self organized critical state and whether well-known statistical properties of flares, such as scaling laws observed in the distribution functions of characteristic parameters, are reproduced after this state has been reached. Methods: To investigate whether the distribution functions of total energy, peak energy and event duration follow the expected scaling laws, we first applied a nonlinear force-free extrapolation that reconstructs the three-dimensional magnetic fields from two-dimensional vector magnetograms. We then locate magnetic discontinuities exceeding a threshold in the Laplacian of the magnetic field. These discontinuities are relaxed in local diffusion events, implemented in the form of cellular automaton evolution rules. Subsequent loading and relaxation steps lead the system to self organized criticality, after which the statistical properties of the simulated events are examined. Physical requirements, such as the divergence-free condition for the magnetic field vector, are approximately imposed on all elements of the model. Results: Our results show that self organized criticality is indeed reached when applying specific loading and relaxation rules. Power-law indices obtained from the distribution functions of the modeled flaring events are in good agreement with observations. Single power laws (peak and total flare energy) are obtained, as are power laws with exponential cutoff and double power laws (flare duration). The results are also compared with observational X-ray data from the GOES satellite for our active-region sample. 
Conclusions: We conclude that well-known statistical properties of flares are reproduced after the system has reached self organized criticality. A significant enhancement of our refined cellular automaton model is that it commences the simulation from observed vector magnetograms, thus facilitating energy calculation in physical units. The model described in this study remains consistent with fundamental physical requirements, and imposes physically meaningful driving and redistribution rules.
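The loading/relaxation cycle described above is the mechanism of the classic Bak-Tang-Wiesenfeld sandpile, which makes a compact stand-in for the magnetic-field automaton: the paper drives an extrapolated 3-D field and relaxes cells where a Laplacian threshold is exceeded, while here grains and a height threshold play those roles. Grid size, drive count, and threshold are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 20
grid = np.zeros((N, N), int)
sizes = []                                # avalanche ("flare") sizes

for _ in range(10000):
    i, j = rng.integers(N, size=2)
    grid[i, j] += 1                       # slow loading (cf. flux emergence)
    size = 0
    while True:
        unstable = np.argwhere(grid >= 4) # threshold test (cf. Laplacian check)
        if unstable.size == 0:
            break
        for a, b in unstable:
            grid[a, b] -= 4               # local relaxation: redistribute
            for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                x, y = a + da, b + db
                if 0 <= x < N and 0 <= y < N:
                    grid[x, y] += 1       # grains leaving the lattice are lost
            size += 1
    if size:
        sizes.append(size)

sizes = np.array(sizes)
print(len(sizes), sizes.max())            # broad, heavy-tailed size distribution
```

Once the lattice self-organizes to the critical state, a histogram of `sizes` exhibits the power-law scaling that, in the refined model, is compared against observed flare statistics in physical units.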
Statistics of Statisticians: Critical Mass of Statistics and Operational Research Groups
NASA Astrophysics Data System (ADS)
Kenna, Ralph; Berche, Bertrand
Using a recently developed model, inspired by mean field theory in statistical physics, and data from the UK's Research Assessment Exercise, we analyse the relationship between the qualities of statistics and operational research groups and the quantities of researchers in them. Similar to other academic disciplines, we provide evidence for a linear dependency of quality on quantity up to an upper critical mass, which is interpreted as the average maximum number of colleagues with whom a researcher can communicate meaningfully within a research group. The model also predicts a lower critical mass, which research groups should strive to achieve to avoid extinction. For statistics and operational research, the lower critical mass is estimated to be 9 ± 3. The upper critical mass, beyond which research quality does not significantly depend on group size, is 17 ± 6.
Landau's statistical mechanics for quasi-particle models
NASA Astrophysics Data System (ADS)
Bannur, Vishnu M.
2014-04-01
Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for the pressure and develops all of thermodynamics from it. It is a general formalism and is consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)], in which one starts from the expression for the energy density and develops thermodynamics. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, often incorrectly called the thermodynamic consistency relation, we recover other formalisms of the quasi-particle system, like that in M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied in quark-gluon plasma.
Social-relational risk factors for predicting elder physical abuse: an ecological bi-focal model.
von Heydrich, Levente; Schiamberg, Lawrence B; Chee, Grace
2012-01-01
Annually in the United States, 1 to 5 million older adults, 65 and above, are physically or sexually injured or mistreated by their caregivers in family settings. This study examined the prevalence and risk factors involved in elder physical abuse by adult child caregivers, moving from the immediate elderly parent/adult child relationship context to more distal social support contexts, utilizing a subsample of 203 elderly participants from the Midlife Development in the United States study (MIDUS II, 2004-2006). LISREL modeling examined causal pathways between elderly demographic characteristics, physical/emotional health, and behavioral and contextual characteristics from an ecological perspective. Data modeling was accomplished using Mplus, PAXW, and SYSTAT statistical software packages. Results indicate that latent factors including older adult health, social isolation of the older adult, and adult child characteristics were significantly associated with elder physical abuse, as mediated by the quality of the elderly parent/adult child relationship.
Assessing first-order emulator inference for physical parameters in nonlinear mechanistic models
Hooten, Mevin B.; Leeds, William B.; Fiechter, Jerome; Wikle, Christopher K.
2011-01-01
We present an approach for estimating physical parameters in nonlinear models that relies on an approximation to the mechanistic model itself for computational efficiency. The proposed methodology is validated and applied in two different modeling scenarios: (a) a simulation study and (b) a lower trophic level ocean ecosystem model. The approach we develop relies on the ability to predict right singular vectors (resulting from a decomposition of computer model experimental output) based on the computer model input and an experimental set of parameters. Critically, we model the right singular vectors in terms of the model parameters via a nonlinear statistical model. Specifically, we focus our attention on first-order models of these right singular vectors rather than the second-order (covariance) structure.
A rigidity transition and glassy dynamics in a model for confluent 3D tissues
NASA Astrophysics Data System (ADS)
Merkel, Matthias; Manning, M. Lisa
The origin of rigidity in disordered materials is an outstanding open problem in statistical physics. Recently, a new type of rigidity transition was discovered in a family of models for 2D biological tissues, but the mechanisms responsible for rigidity remain unclear. This is not just a statistical physics problem, but also relevant for embryonic development, cancer growth, and wound healing. To gain insight into this rigidity transition and make new predictions about biological bulk tissues, we have developed a fully 3D self-propelled Voronoi (SPV) model. The model takes into account shape, elasticity, and self-propelled motion of the individual cells. We find that in the absence of self-propulsion, this model exhibits a rigidity transition that is controlled by a dimensionless model parameter describing the preferred cell shape, with an accompanying structural order parameter. In the presence of self-propulsion, the rigidity transition appears as a glass-like transition featuring caging and aging effects. Given the similarities between this transition and jamming in particulate solids, it is natural to ask if the two transitions are related. By comparing statistics of Voronoi geometries, we show the transitions are surprisingly close but demonstrably distinct. Furthermore, an index theorem used to identify topologically protected mechanical modes in jammed systems can be extended to these vertex-type models. In our model, residual stresses govern the transition and enter the index theorem in a different way compared to jammed particles, suggesting the origin of rigidity may be different between the two.
Statistical Physics of Adaptation
2016-08-23
Nikolay Perunov, Robert A. Marsland, and Jeremy L. England, Department of Physics, Physics of Living Systems Group. Subject Areas: Biological Physics, Complex Systems, Statistical Physics. I. INTRODUCTION: It has long been understood that nonequilibrium driving can ... equilibrium may appear to have been specially selected for physical properties connected to their ability to absorb work from the particular driving environment.
NASA Astrophysics Data System (ADS)
Rodríguez, Nancy
2015-03-01
The use of mathematical tools has long proved useful in gaining understanding of complex systems in physics [1]. Recently, many researchers have realized that there is an analogy between emerging phenomena in complex social systems and complex physical or biological systems [4,5,12]. This realization has particularly benefited the modeling and understanding of crime, a ubiquitous phenomenon that is far from understood. In fact, when one is interested in the bulk behavior of patterns that emerge from small and seemingly unrelated interactions, as well as decisions that occur at the individual level, the mathematical tools developed in statistical physics, game theory, network theory, dynamical systems, and partial differential equations can be useful in shedding light on the dynamics of these patterns [2-4,6,12].
Statistical physics of crime: a review.
D'Orsogna, Maria R; Perc, Matjaž
2015-03-01
Containing the spread of crime in urban societies remains a major challenge. Empirical evidence suggests that, if left unchecked, crimes may be recurrent and proliferate. On the other hand, eradicating a culture of crime may be difficult, especially under extreme social circumstances that impair the creation of a shared sense of social responsibility. Although our understanding of the mechanisms that drive the emergence and diffusion of crime is still incomplete, recent research highlights applied mathematics and methods of statistical physics as valuable theoretical resources that may help us better understand criminal activity. We review different approaches aimed at modeling and improving our understanding of crime, focusing on the nucleation of crime hotspots using partial differential equations, self-exciting point process and agent-based modeling, adversarial evolutionary games, and the network science behind the formation of gangs and large-scale organized crime. We emphasize that statistical physics of crime can relevantly inform the design of successful crime prevention strategies, as well as improve the accuracy of expectations about how different policing interventions should impact malicious human activity that deviates from social norms. We also outline possible directions for future research, related to the effects of social and coevolving networks and to the hierarchical growth of criminal structures due to self-organization. Copyright © 2014 Elsevier B.V. All rights reserved.
Shen, Tongye; Gnanakaran, S
2009-04-22
A critical roadblock to the production of biofuels from lignocellulosic biomass is the efficient degradation of crystalline microfibrils of cellulose to glucose. A microscopic understanding of how different physical conditions affect the overall stability of the crystalline structure of microfibrils could facilitate the design of more effective protocols for their degradation. One of the essential physical interactions that stabilizes microfibrils is a network of hydrogen (H) bonds: both intrachain H-bonds between neighboring monomers of a single cellulose polymer chain and interchain H-bonds between adjacent chains. We construct a statistical mechanical model of cellulose assembly at the resolution of explicit hydrogen-bond networks. Using the transfer matrix method, the partition function and the subsequent statistical properties are evaluated. With the help of this lattice-based model, we capture the plasticity of the H-bond network in cellulose due to frustration and redundancy in the placement of H-bonds. This plasticity is responsible for the stability of cellulose over a wide range of temperatures. Stable intrachain and interchain H-bonds are identified as a function of temperature that could possibly be manipulated toward rational destruction of crystalline cellulose.
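The transfer-matrix evaluation of a partition function described above can be illustrated with a minimal sketch. This is not the authors' explicit H-bond network model; it is a generic periodic two-state chain (think of an H-bond present/absent at each site) with hypothetical couplings J and h, showing how Z is obtained as the trace of a power of the transfer matrix:

```python
import numpy as np
import itertools

def transfer_matrix_Z(J, h, beta, N):
    """Partition function of a periodic N-site two-state chain,
    Z = Tr(T^N), where T is the 2x2 transfer matrix."""
    s = np.array([1.0, -1.0])  # the two states of each site
    # Bond weight: exp(beta * (J*s_i*s_j + h*(s_i + s_j)/2)); the field h
    # is split evenly between the two bonds touching each site.
    T = np.exp(beta * (J * np.outer(s, s) + h * (s[:, None] + s[None, :]) / 2))
    return np.trace(np.linalg.matrix_power(T, N))

def brute_force_Z(J, h, beta, N):
    """Direct enumeration over all 2^N configurations (only feasible for small N)."""
    Z = 0.0
    for conf in itertools.product([1.0, -1.0], repeat=N):
        E = -sum(J * conf[i] * conf[(i + 1) % N] + h * conf[i] for i in range(N))
        Z += np.exp(-beta * E)
    return Z
```

Once Z is in hand, free energies and temperature-dependent occupation statistics follow by differentiation, which is the general route by which a lattice model of this type yields stable H-bond populations as a function of temperature.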
Lindberg, Ann-Sofie; Oksa, Juha; Antti, Henrik; Malm, Christer
2015-01-01
Physical capacity has previously been deemed important for firefighters' physical work capacity, and aerobic fitness, muscular strength, and muscular endurance are the most frequently investigated parameters of importance. Traditionally, bivariate and multivariate linear regression statistics have been used to study relationships between physical capacities and work capacities among firefighters. An alternative way to handle datasets consisting of numerous correlated variables is to use multivariate projection analyses, such as Orthogonal Projection to Latent Structures. The first aim of the present study was to evaluate the prediction and predictive power of field and laboratory tests, respectively, on firefighters' physical work capacity in selected work tasks, and to study whether valid predictions could be achieved without anthropometric data. The second aim was to externally validate selected models. The third aim was to validate selected models on firefighters and on civilians. A total of 38 (26 men and 12 women) + 90 (38 men and 52 women) subjects were included in the models and the external validation, respectively. The best prediction (R2) and predictive power (Q2) of Stairs, Pulling, Demolition, Terrain, and Rescue work capacities included field tests (R2 = 0.73 to 0.84, Q2 = 0.68 to 0.82). The best external validation was for Stairs work capacity (R2 = 0.80) and the worst for Demolition work capacity (R2 = 0.40). In conclusion, field and laboratory tests could equally well predict physical work capacities for firefighting work tasks, and models excluding anthropometric data were valid. The predictive power was satisfactory for all included work tasks except Demolition.
NASA Astrophysics Data System (ADS)
Benedict, James J.; Medeiros, Brian; Clement, Amy C.; Pendergrass, Angeline G.
2017-06-01
Precipitation distributions and extremes play a fundamental role in shaping Earth's climate and yet are poorly represented in many global climate models. Here, a suite of idealized Community Atmosphere Model (CAM) aquaplanet simulations is examined to assess the aquaplanet's ability to reproduce hydroclimate statistics of real-Earth configurations and to investigate sensitivities of precipitation distributions and extremes to model physics, horizontal grid resolution, and ocean type. Little difference in precipitation statistics is found between aquaplanets using time-constant sea-surface temperatures and those implementing a slab ocean model with a 50 m mixed-layer depth. In contrast, CAM version 5.3 (CAM5.3) produces more time mean, zonally averaged precipitation than CAM version 4 (CAM4), while CAM4 generates significantly larger precipitation variance and frequencies of extremely intense precipitation events. The largest model configuration-based precipitation sensitivities relate to choice of horizontal grid resolution in the selected range 1-2°. Refining grid resolution has significant physics-dependent effects on tropical precipitation: for CAM4, time mean zonal mean precipitation increases along the Equator and the intertropical convergence zone (ITCZ) narrows, while for CAM5.3 precipitation decreases along the Equator and the twin branches of the ITCZ shift poleward. Increased grid resolution also reduces light precipitation frequencies and enhances extreme precipitation for both CAM4 and CAM5.3 resulting in better alignment with observational estimates. A discussion of the potential implications these hydrologic cycle sensitivities have on the interpretation of precipitation statistics in future climate projections is also presented.
Statistical dependency in visual scanning
NASA Technical Reports Server (NTRS)
Ellis, Stephen R.; Stark, Lawrence
1986-01-01
A method to identify statistical dependencies in the positions of eye fixations is developed and applied to eye movement data from subjects who viewed dynamic displays of air traffic and judged future relative position of aircraft. Analysis of approximately 23,000 fixations on points of interest on the display identified statistical dependencies in scanning that were independent of the physical placement of the points of interest. Identification of these dependencies is inconsistent with random-sampling-based theories used to model visual search and information seeking.
Fire and Smoke Model Evaluation Experiment (FASMEE): Modeling gaps and data needs
Yongqiang Liu; Adam Kochanski; Kirk Baker; Ruddy Mell; Rodman Linn; Ronan Paugam; Jan Mandel; Aime Fournier; Mary Ann Jenkins; Scott Goodrick; Gary Achtemeier; Andrew Hudak; Matthew Dickson; Brian Potter; Craig Clements; Shawn Urbanski; Roger Ottmar; Narasimhan Larkin; Timothy Brown; Nancy French; Susan Prichard; Adam Watts; Derek McNamara
2017-01-01
Fire and smoke models are numerical tools for simulating fire behavior, smoke dynamics, and air quality impacts of wildland fires. Fire models are developed based either on the fundamental chemistry and physics of combustion and fire spread or on statistical analysis of experimental data (Sullivan 2009). They provide information on fire spread and fuel consumption for safe and...
NASA Technical Reports Server (NTRS)
Forbes, G. S.; Pielke, R. A.
1985-01-01
Various empirical and statistical weather-forecasting studies which utilize stratification by weather regime are described. Objective classification was used to determine the weather regime in some studies. In other cases the weather pattern was determined on the basis of a parameter representing the physical and dynamical processes relevant to the anticipated mesoscale phenomena, such as low-level moisture convergence and convective precipitation, or the Froude number and the occurrence of cold-air damming. For mesoscale phenomena already in existence, new forecasting techniques were developed. The use of cloud models in operational forecasting is discussed. Models to calculate the spatial scales of forcings and the resultant response for mesoscale systems are presented. The use of these models to represent the climatologically most prevalent systems and to perform case-by-case simulations is reviewed. Operational implementation of mesoscale data into weather forecasts, using both actual simulation output and model output statistics, is discussed.
A Bayesian Approach to Evaluating Consistency between Climate Model Output and Observations
NASA Astrophysics Data System (ADS)
Braverman, A. J.; Cressie, N.; Teixeira, J.
2010-12-01
Like other scientific and engineering problems that involve physical modeling of complex systems, climate models can be evaluated and diagnosed by comparing their output to observations of similar quantities. Though the global remote sensing data record is relatively short by climate research standards, these data offer opportunities to evaluate model predictions in new ways. For example, remote sensing data are spatially and temporally dense enough to provide distributional information that goes beyond simple moments to allow quantification of temporal and spatial dependence structures. In this talk, we propose a new method for exploiting these rich data sets using a Bayesian paradigm. For a collection of climate models, we calculate the posterior probability that each member best represents the physical system the models seek to reproduce. The posterior probability is based on the likelihood that a chosen summary statistic, computed from observations, would be obtained when the model's output is considered as a realization from a stochastic process. By exploring how posterior probabilities change with different statistics, we may paint a more quantitative and complete picture of the strengths and weaknesses of the models relative to the observations. We demonstrate our method using model output from the CMIP archive, and observations from NASA's Atmospheric Infrared Sounder.
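A toy version of this posterior calculation can be sketched under a simplifying assumption not made in the abstract: that the summary statistic has a Gaussian sampling distribution under each model, with model-specific mean and spread. All numeric values are hypothetical:

```python
import numpy as np

def model_posteriors(observed_stat, model_stats, model_sds, prior=None):
    """Posterior probability of each climate model, given the observed
    summary statistic, assuming a Gaussian sampling distribution for the
    statistic under each model (means model_stats, std devs model_sds)."""
    model_stats = np.asarray(model_stats, float)
    model_sds = np.asarray(model_sds, float)
    if prior is None:  # uniform prior over the model collection
        prior = np.full(len(model_stats), 1.0 / len(model_stats))
    # Gaussian log-likelihood of the observed statistic under each model
    loglik = -0.5 * ((observed_stat - model_stats) / model_sds) ** 2 \
             - np.log(model_sds)
    post = np.asarray(prior) * np.exp(loglik - loglik.max())  # stable exponentiation
    return post / post.sum()
```

Repeating this with different summary statistics (means, variances, spatial dependence measures) is what lets the posteriors paint a statistic-by-statistic picture of model strengths and weaknesses.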
Nutritional Status of Rural Older Adults is Linked to Physical and Emotional Health
Jung, Seung Eun; Bishop, Alex J; Kim, Minjung; Hermann, Janice; Kim, Giyeon; Lawrence, Jeannine
2017-01-01
Background: Although nutritional status is influenced by multi-dimensional aspects encompassing physical and emotional well-being, there is limited research on this complex relationship.
Objective: The purpose of this study was to examine the interplay between indicators of physical health (perceived health status and self-care capacity) and emotional well-being (depressive affect and loneliness) on rural older adults' nutritional status.
Design: The cross-sectional study was conducted from June 1, 2007 to June 1, 2008.
Participants/setting: A total of 171 community-dwelling older adults, 65 years and older, who resided within non-metro rural communities in the U.S. participated in this study.
Main outcome measures: Participants completed validated instruments measuring self-care capacity, perceived health status, loneliness, depressive affect, and nutritional status.
Statistical analyses performed: Structural equation modeling (SEM) was employed to investigate the complex interplay of physical and emotional health status with nutritional status among rural older adults. The chi-square statistic, CFI, RMSEA, and SRMR were used to assess model fit.
Results: The chi-square statistic and the other model fit indices showed the hypothesized SEM model provided a good fit to the data (χ2 (2)=2.15, p=0.34; CFI=1.00; RMSEA=0.02; SRMR=0.03). Self-care capacity was significantly related with depressive affect (γ = −0.11, p=0.03), whereas self-care capacity was not significantly related with loneliness. Perceived health status had a significant negative relationship with both loneliness (γ = −0.16, p=0.03) and depressive affect (γ = −0.22, p=0.03). Although loneliness showed no significant direct relationship with nutritional status, it showed a significant direct relationship with depressive affect (β = 0.46, p<0.01). Finally, the results demonstrated that depressive affect had a significant negative relationship with nutritional status (β = −0.30, p<0.01).
The results indicated that physical health and emotional indicators have significant multi-dimensional associations with nutritional status among rural older adults.
Conclusions: The present study provides insights into the importance of addressing both physical and emotional well-being together to reduce potential effects of poor emotional well-being on nutritional status, particularly among rural older adults with impaired physical health and self-care capacity. PMID:28274787
Data-adaptive Harmonic Decomposition and Real-time Prediction of Arctic Sea Ice Extent
NASA Astrophysics Data System (ADS)
Kondrashov, Dmitri; Chekroun, Mickael; Ghil, Michael
2017-04-01
Decline in the Arctic sea ice extent (SIE) has profound socio-economic implications and is a focus of active scientific research. Of particular interest is prediction of SIE on subseasonal time scales, i.e. from early summer into fall, when sea ice coverage in the Arctic reaches its minimum. However, subseasonal forecasting of SIE is very challenging due to the high variability of ocean and atmosphere over the Arctic in summer, as well as the shortness of the observational record and the inadequacies of physics-based models in simulating sea-ice dynamics. The Sea Ice Outlook (SIO) by the Sea Ice Prediction Network (SIPN, http://www.arcus.org/sipn) is a collaborative effort to facilitate and improve subseasonal prediction of September SIE by physics-based and data-driven statistical models. Data-adaptive Harmonic Decomposition (DAH) and Multilayer Stuart-Landau Model (MSLM) techniques [Chekroun and Kondrashov, 2017] have been successfully applied to nonlinear stochastic modeling, as well as retrospective and real-time forecasting, of the Multisensor Analyzed Sea Ice Extent (MASIE) dataset in four key Arctic regions. In particular, DAH-MSLM predictions outperformed most statistical models and physics-based models in real-time 2016 SIO submissions. The key success factors are associated with the ability of DAH to disentangle the complex regional dynamics of MASIE by data-adaptive harmonic spatio-temporal patterns that reduce the data-driven modeling effort to elemental MSLMs stacked per frequency, with a fixed and small number of model coefficients to estimate.
Novel approaches to the study of particle dark matter in astrophysics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Argüelles, C. R., E-mail: carlos.arguelles@icranet.org; Ruffini, R., E-mail: ruffini@icra.it; Rueda, J. A., E-mail: jorge.rueda@icra.it
A deep understanding of the role of dark matter in the different astrophysical scenarios of the local Universe, such as galaxies, represents a crucial step toward describing in a more consistent way the role of dark matter in cosmology. This kind of study requires the interconnection between particle physics within and beyond the Standard Model, and fundamental physics such as thermodynamics and statistics, within a fully relativistic treatment of gravity. After giving a comprehensive summary of the different types of dark matter and their role in astrophysics, we discuss the recent efforts in describing the distribution of dark matter in the centers and halos of galaxies from first principles such as gravitational interactions, quantum statistics, and particle physics, and its implications for the observations.
Spectral-Lagrangian methods for collisional models of non-equilibrium statistical states
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamba, Irene M.; Tharkabhushanam, Sri Harsha
We propose a new spectral Lagrangian based deterministic solver for the non-linear Boltzmann transport equation (BTE) in d dimensions for variable hard sphere (VHS) collision kernels with conservative or non-conservative binary interactions. The method is based on symmetries of the Fourier transform of the collision integral, where the complexity of its computation is reduced to a separate integral over the unit sphere S^(d-1). The conservation of moments is enforced by Lagrangian constraints. The resulting scheme, implemented in free space, is very versatile and adjusts in a very simple manner to several cases that involve energy dissipation due to local micro-reversibility (inelastic interactions) or elastic models of slowing-down processes. Our simulations are benchmarked with available exact self-similar solutions, exact moment equations, and analytical estimates for the homogeneous Boltzmann equation, both for elastic and inelastic VHS interactions. Benchmarking of the simulations involves the selection of a time self-similar rescaling of the numerical distribution function, which is performed using the continuous spectrum of the equation for Maxwell molecules as studied first in Bobylev et al. [A.V. Bobylev, C. Cercignani, G. Toscani, Proof of an asymptotic property of self-similar solutions of the Boltzmann equation for granular materials, Journal of Statistical Physics 111 (2003) 403-417] and generalized to a wide range of related models in Bobylev et al. [A.V. Bobylev, C. Cercignani, I.M. Gamba, On the self-similar asymptotics for generalized non-linear kinetic Maxwell models, Communications in Mathematical Physics, in press].
A Statistical Graphical Model of the California Reservoir System
NASA Astrophysics Data System (ADS)
Taeb, A.; Reager, J. T.; Turmon, M.; Chandrasekaran, V.
2017-11-01
The recent California drought has highlighted the potential vulnerability of the state's water management infrastructure to multiyear dry intervals. Due to the high complexity of the network, dynamic storage changes in California reservoirs on a state-wide scale have previously been difficult to model using either traditional statistical or physical approaches. Indeed, although there is a significant line of research on exploring models for single (or a small number of) reservoirs, these approaches are not amenable to a system-wide modeling of the California reservoir network due to the spatial and hydrological heterogeneities of the system. In this work, we develop a state-wide statistical graphical model to characterize the dependencies among a collection of 55 major California reservoirs across the state; this model is defined with respect to a graph in which the nodes index reservoirs and the edges specify the relationships or dependencies between reservoirs. We obtain and validate this model in a data-driven manner based on reservoir volumes over the period 2003-2016. A key feature of our framework is a quantification of the effects of external phenomena that influence the entire reservoir network. We further characterize the degree to which physical factors (e.g., state-wide Palmer Drought Severity Index (PDSI), average temperature, snow pack) and economic factors (e.g., consumer price index, number of agricultural workers) explain these external influences. As a consequence of this analysis, we obtain a system-wide health diagnosis of the reservoir network as a function of PDSI.
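A generic sketch of how such a dependency graph could be estimated from reservoir volume time series, here via simple partial-correlation thresholding on the inverse covariance. This is an illustrative stand-in, not the authors' actual estimator, and the data and threshold are hypothetical:

```python
import numpy as np

def dependency_graph(volumes, thresh=0.2):
    """Estimate an undirected dependency graph among reservoirs from a
    (time x reservoir) matrix of volumes: draw an edge wherever the
    partial correlation, read off the inverse covariance, exceeds thresh."""
    X = volumes - volumes.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    prec = np.linalg.pinv(cov)          # precision (inverse covariance) matrix
    d = np.sqrt(np.diag(prec))
    partial = -prec / np.outer(d, d)    # partial correlations from precision
    np.fill_diagonal(partial, 1.0)
    edges = [(i, j) for i in range(partial.shape[0])
             for j in range(i + 1, partial.shape[0])
             if abs(partial[i, j]) > thresh]
    return partial, edges
```

Partial correlations are the natural quantity here because they capture direct dependencies between two reservoirs after conditioning on all the others, which is what the edges of a graphical model are meant to encode.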
The Standard Model in the history of the Natural Sciences, Econometrics, and the social sciences
NASA Astrophysics Data System (ADS)
Fisher, W. P., Jr.
2010-07-01
In the late 18th and early 19th centuries, scientists appropriated Newton's laws of motion as a model for the conduct of any other field of investigation that would purport to be a science. This early form of a Standard Model eventually informed the basis of analogies for the mathematical expression of phenomena previously studied qualitatively, such as cohesion, affinity, heat, light, electricity, and magnetism. James Clerk Maxwell is known for his repeated use of a formalized version of this method of analogy in lectures, teaching, and the design of experiments. Economists transferring skills learned in physics made use of the Standard Model, especially after Maxwell demonstrated the value of conceiving it in abstract mathematics instead of as a concrete and literal mechanical analogy. Haavelmo's probability approach in econometrics and R. Fisher's Statistical Methods for Research Workers brought a statistical approach to bear on the Standard Model, quietly reversing the perspective of economics and the social sciences relative to that of physics. Where physicists, and Maxwell in particular, intuited scientific method as imposing stringent demands on the quality and interrelations of data, instruments, and theory in the name of inferential and comparative stability, statistical models and methods disconnected theory from data by removing the instrument as an essential component. New possibilities for reconnecting economics and the social sciences to Maxwell's sense of the method of analogy are found in Rasch's probabilistic models for measurement.
Modeling forest scenic beauty: Concepts and application to ponderosa pine
Thomas C. Brown; Terry C. Daniel
1984-01-01
Statistical models are presented which relate near-view scenic beauty of ponderosa pine stands in the Southwest to variables describing physical characteristics. The models suggest that herbage and large ponderosa pine contribute to scenic beauty, while numbers of small and intermediate-sized pine trees and downed wood, especially as slash, detract from scenic beauty....
Representation of the contextual statistical model by hyperbolic amplitudes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khrennikov, Andrei
We continue the development of a so-called contextual statistical model (here context has the meaning of a complex of physical conditions). It is shown that, besides contexts producing the conventional trigonometric cos-interference, there exist contexts producing the hyperbolic cos-interference. Starting with the corresponding interference formula of total probability we represent such contexts by hyperbolic probabilistic amplitudes or, in the abstract formalism, by normalized vectors of a hyperbolic analogue of the Hilbert space. A hyperbolic Born's rule is obtained. Incompatible observables are represented by noncommutative operators. This paper can be considered as the first step towards hyperbolic quantum probability. We also discuss possibilities of experimental verification of hyperbolic quantum mechanics: in physics of elementary particles and string theory, as well as in experiments with nonphysical systems, e.g., in psychology, cognitive sciences, and economics.
NASA Technical Reports Server (NTRS)
Smith, Eric A.; Mugnai, Alberto; Cooper, Harry J.; Tripoli, Gregory J.; Xiang, Xuwu
1992-01-01
The relationship between emerging microwave brightness temperatures (T(B)s) and vertically distributed mixtures of liquid and frozen hydrometeors was investigated, using a cloud-radiation model, in order to establish the framework for a hybrid statistical-physical rainfall retrieval algorithm. Although strong relationships were found between the T(B) values and various rain parameters, these correlations are misleading in that the T(B)s are largely controlled by fluctuations in the ice-particle mixing ratios, which in turn are highly correlated to fluctuations in liquid-particle mixing ratios. However, the empirically based T(B)-rain-rate (T(B)-RR) algorithms can still be used as tools for estimating precipitation if the hydrometeor profiles used for T(B)-RR algorithms are not specified in an ad hoc fashion.
NASA Astrophysics Data System (ADS)
Tenenbaum, Joel
This thesis applies statistical physics concepts and methods to quantitatively analyze complex systems. It is separated into four parts: (i) characteristics of earthquake systems, (ii) memory and volatility in data time series, (iii) the application of part (ii) to world financial markets, and (iv) statistical observations on the evolution of word usage. In Part I, we observe statistical patterns in the occurrence of earthquakes. We select a 14-year earthquake catalog covering the archipelago of Japan. We find that regions traditionally thought of as being too distant from one another for causal contact display remarkably high correlations, and the networks that result have a tendency to link highly connected areas with other highly connected areas. In Part II, we introduce and apply the concept of "volatility asymmetry", the primary use of which is in financial data. We explain the relation between memory and volatility asymmetry in terms of an asymmetry parameter lambda. We define a litmus test for determining whether lambda is statistically significant, propose a stochastic model based on this parameter, and use the model to further explain empirical data. In Part III, we expand on volatility asymmetry. Importing the concepts of time dependence and universality from physics, we explore the aspects of emerging (or "transition") economies in Eastern Europe as they relate to asymmetry. We find that these emerging markets in some instances behave like developed markets and in other instances do not, and that the distinction is a matter both of country and of time period, with crisis periods showing different asymmetry characteristics than "healthy" periods. In Part IV, we take note of a series of findings in econophysics showing statistical growth similarities between a variety of different areas that have in common being both (i) competitive and (ii) dynamic.
We show that this same growth distribution can be reproduced by observing the growth rates of the usage of individual words: just as companies compete for sales in a zero-sum marketing game, so do words compete for usage within a limited amount of reader man-hours.
NASA Astrophysics Data System (ADS)
Singh Pradhan, Ananta Man; Kang, Hyo-Sub; Kim, Yun-Tae
2016-04-01
This study uses a physically based approach to evaluate the factor of safety of hillslopes under different hydrological conditions on Mt. Umyeon, south of Seoul. The hydrological conditions were determined using rainfall intensity and duration from a landslide inventory covering all of Korea. Quantile regression was used to ascertain different probability warning levels on the basis of rainfall thresholds. Physically based models are easily interpreted and have high predictive capability, but rely on spatially explicit and accurate parameterization, which is commonly not possible. Statistical probabilistic methods can include other causative factors that influence slope stability, such as forest, soil, and geology, but rely on good landslide inventories of the site. This study describes a hybrid approach that combines physically based landslide susceptibility estimates for different hydrological conditions with statistical modeling. A presence-only maximum entropy model was used to combine the two and to analyze the relation of landslides to conditioning factors. About 80% of the landslides were listed among the unstable sites identified by the proposed model, demonstrating its effectiveness and accuracy in determining unstable areas and areas that require evacuation. These cumulative rainfall thresholds provide a valuable reference to guide disaster-prevention authorities in issuing warning levels, with the potential to reduce losses and save lives.
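The quantile-regression rainfall threshold step can be sketched generically: fit a power-law threshold I = a·D^b at a low quantile q of the intensity-duration cloud by minimizing the pinball (quantile) loss in log-log space. This is a minimal stand-in for the study's procedure, and all parameter values below are illustrative:

```python
import numpy as np

def quantile_threshold(duration, intensity, q=0.05, lr=0.05, iters=8000):
    """Fit a power-law rainfall threshold I = a * D**b at quantile q by
    subgradient descent on the pinball loss in log-log space."""
    x = np.log(duration)
    y = np.log(intensity)
    xm = x.mean()
    xc = x - xm                         # center x so intercept/slope decouple
    b0, b1 = np.quantile(y, q), 0.0     # start at the unconditional quantile
    for t in range(iters):
        r = y - (b0 + b1 * xc)
        # Subgradient of the pinball loss w.r.t. the prediction:
        # -q above the line, (1 - q) below it.
        g = np.where(r > 0, -q, 1.0 - q)
        step = lr / (1.0 + 0.001 * t)   # slowly decaying step size
        b0 -= step * g.mean()
        b1 -= step * (g * xc).mean()
    return np.exp(b0 - b1 * xm), b1     # a, b such that I = a * D**b
```

At convergence roughly a fraction q of the historical rainfall events falls below the fitted curve, so thresholds fitted at several quantiles yield the graded warning levels the abstract describes.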
Ea, Vuthy; Sexton, Tom; Gostan, Thierry; Herviou, Laurie; Baudement, Marie-Odile; Zhang, Yunzhe; Berlivet, Soizik; Le Lay-Taha, Marie-Noëlle; Cathala, Guy; Lesne, Annick; Victor, Jean-Marc; Fan, Yuhong; Cavalli, Giacomo; Forné, Thierry
2015-08-15
In higher eukaryotes, the genome is partitioned into large "Topologically Associating Domains" (TADs) in which the chromatin displays favoured long-range contacts. While a crumpled/fractal globule organization has received experimental support at higher-order levels, the organization principles that govern chromatin dynamics within these TADs remain unclear. Using simple polymer models, we previously showed that, in mouse liver cells, gene-rich domains tend to adopt a statistical helix shape when no significant locus-specific interaction takes place. Here, we use data from diverse 3C-derived methods to explore chromatin dynamics within mouse and Drosophila TADs. In mouse Embryonic Stem Cells (mESC), which possess large TADs (median size of 840 kb), we show that the statistical helix model, but not globule models, is relevant not only in gene-rich TADs, but also in gene-poor and gene-desert TADs. Interestingly, this statistical helix organization is considerably relaxed in mESC compared to liver cells, indicating that the impact of the constraints responsible for this organization is weaker in pluripotent cells. Finally, depletion of histone H1 in mESC alters local chromatin flexibility but not the statistical helix organization. In Drosophila, which possesses TADs of smaller sizes (median size of 70 kb), we show that, while chromatin compaction and flexibility are finely tuned according to the epigenetic landscape, chromatin dynamics within TADs are generally compatible with an unconstrained polymer configuration. Models derived from polymer physics can accurately describe the organization principles governing chromatin dynamics in both mouse and Drosophila TADs. However, constraints acting on these dynamics within mammalian TADs have a distinctive impact, resulting in a statistical helix organization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thejas, Urs G.; Somashekar, R., E-mail: rs@physics.uni-mysore.ac.in; Sangappa, Y.
A stochastic approach to explaining the variation of physical parameters in polymer composites is discussed in this study. We present a statistical model that derives the characteristic variation of physical parameters as a function of dopant concentration. Results of X-ray diffraction and conductivity studies were used to validate this function, which can be extended to any physical parameter and polymer composite. For this study we considered a polymer composite of HPMC doped with various concentrations of nickel chloride.
Impact of baryonic physics on intrinsic alignments
Tenneti, Ananth; Gnedin, Nickolay Y.; Feng, Yu
2017-01-11
We explore the effects of specific assumptions in the subgrid models of star formation and stellar and AGN feedback on intrinsic alignments of galaxies in cosmological simulations of the "MassiveBlack-II" family. Using smaller volume simulations, we explored the parameter space of the subgrid star formation and feedback model and found remarkable robustness of the observable statistical measures to the details of subgrid physics. The one observational probe most sensitive to modeling details is the distribution of misalignment angles. We hypothesize that the amount of angular momentum carried away by the galactic wind is the primary physical quantity that controls the orientation of the stellar distribution. Finally, our results are also consistent with a similar study by the EAGLE simulation team.
NASA Astrophysics Data System (ADS)
Zheng, Xu; Hao, Zhiyong; Wang, Xu; Mao, Jie
2016-06-01
High-speed-railway-train (HST) interior noise at low, medium, and high frequencies can be simulated by finite element analysis (FEA) or boundary element analysis (BEA), hybrid finite element analysis-statistical energy analysis (FEA-SEA), and statistical energy analysis (SEA), respectively. First, a new method named statistical acoustic energy flow (SAEF) is proposed, which can be applied to full-spectrum HST interior noise simulation (covering low, medium, and high frequencies) with a single model. In an SAEF model, the corresponding multi-physical-field coupling excitations are fully considered and coupled to excite the interior noise. The interior noise attenuated by the sound insulation panels of the carriage is simulated by modeling the acoustic energy inflow from the exterior excitations into the interior acoustic cavities. Rigid multi-body dynamics, fast multi-pole BEA, and large-eddy simulation with indirect BEA are employed to extract the multi-physical-field excitations, which include the wheel-rail interaction forces/secondary suspension forces, the wheel-rail rolling noise, and the aerodynamic noise, respectively. All the peak values and frequency bands of the simulated acoustic excitations are validated against those from a noise source identification test. In addition, the measured equipment noise inside the equipment compartment is used as one of the excitation sources contributing to the interior noise. Second, a fully trimmed FE carriage model is constructed, and the simulated modal shapes and frequencies agree well with the measured ones, validating the global FE carriage model as well as the local FE models of the aluminum alloy-trim composite panel; the sound transmission loss model of any composite panel is thereby indirectly validated. Finally, the SAEF model of the carriage is constructed from the validated FE model and excited by the multi-physical-field excitations. 
The results show that the trend of the simulated 1/3 octave band sound pressure spectrum agrees well with that of the on-site-measured one. The deviation between the simulated and measured overall sound pressure level (SPL) is 2.6 dB(A) and well controlled below the engineering tolerance limit, which has validated the SAEF model in the full-spectrum analysis of the high speed train interior noise.
Selected topics in high energy physics: Flavon, neutrino and extra-dimensional models
NASA Astrophysics Data System (ADS)
Dorsner, Ilja
There is already significant evidence, both experimental and theoretical, that the Standard Model of elementary particle physics is just another effective physical theory. Thus, it is crucial (a) to anticipate the experiments in search of signatures of physics beyond the Standard Model, and (b) to determine whether some theoretically preferred structure can reproduce the low-energy signature of the Standard Model. This work pursues these two directions by investigating various extensions of the Standard Model. One of them is a simple flavon model that accommodates the observed hierarchy of the charged fermion masses and mixings. We show that flavor changing and CP violating signatures of this model are equally near the present experimental limits. We find that, for a significant range of parameters, mu-e conversion can be the most sensitive place to look for such signatures. We then propose two variants of an SO(10) model in a five-dimensional framework. The first variant demonstrates that one can embed a four-dimensional flipped SU(5) model into a five-dimensional SO(10) model. This allows one to maintain the advantages of flipped SU(5) while avoiding its well-known drawbacks. The second variant shows that exact unification of the gauge couplings is possible even in the higher dimensional setting. This unification yields low-energy values of the gauge couplings that are in perfect agreement with experimental values. We show that the corrections to the usual four-dimensional running, due to the Kaluza-Klein towers of states, can be unambiguously and systematically evaluated. We also consider the various main types of models of neutrino masses and mixings from the point of view of how naturally they give the large mixing angle MSW solution to the solar neutrino problem. Special attention is given to one particular "lopsided" SU(5) model, which is then analyzed in a completely statistical manner. 
We suggest that this sort of statistical analysis should be applicable to other models of neutrino mixing.
An overview of the mathematical and statistical analysis component of RICIS
NASA Technical Reports Server (NTRS)
Hallum, Cecil R.
1987-01-01
Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.
NASA Astrophysics Data System (ADS)
Lu, Nianduan; Li, Ling; Sun, Pengxiao; Banerjee, Writam; Liu, Ming
2014-09-01
A unified physical model for the Seebeck coefficient is presented, based on the multiple-trapping-and-release theory, for amorphous oxide semiconductor thin-film transistors. According to the proposed model, the Seebeck coefficient is attributed to Fermi-Dirac statistics combined with the energy-dependent trap density of states and the gate-voltage dependence of the quasi-Fermi level. The simulation results show that the gate-voltage, energy-disorder, and temperature dependence of the Seebeck coefficient can be well described. The calculation also shows good agreement with the experimental data for an amorphous In-Ga-Zn-O thin-film transistor.
SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring
NASA Astrophysics Data System (ADS)
Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.
2013-12-01
Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically-derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIG-VISA signal model finds statistical evidence for arrivals even at stations for which the IMS station processing failed to register any detection.
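The Gaussian-process (kriging) ingredient can be illustrated with a few lines of NumPy: an RBF-kernel GP interpolates a signal property observed at a few past events and reverts to the prior, with inflated variance, far from the data. All values and variable names here are invented for illustration; this is not SIG-VISA's actual model.

```python
import numpy as np

def rbf(a, b, ell=1.0, s2=1.0):
    """Squared-exponential covariance between 1-D input vectors a and b."""
    d = a[:, None] - b[None, :]
    return s2 * np.exp(-0.5 * (d / ell) ** 2)

x = np.array([0.0, 1.0, 3.0])   # e.g. source-station distances of past events
y = np.array([2.0, 1.5, 0.5])   # an observed signal property (log-amplitude, say)

K = rbf(x, x) + 1e-6 * np.eye(len(x))   # small jitter for numerical stability
xs = np.array([1.0, 10.0])              # query: one at a datum, one far away
Ks = rbf(xs, x)

alpha = np.linalg.solve(K, y)
mean = Ks @ alpha                        # posterior mean
v = np.linalg.solve(K, Ks.T)
var = rbf(xs, xs).diagonal() - np.sum(Ks.T * v, axis=0)  # posterior variance
```

At the training point the posterior pins down the observed value with near-zero variance; at the distant query the mean returns to the prior and the variance to the prior variance, which is the "interpolate near data, extrapolate cautiously" behaviour the abstract exploits.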
A statistical mechanics model for free-for-all airplane passenger boarding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steffen, Jason H.; /Fermilab
2008-08-01
I discuss a model for free-for-all passenger boarding which is employed by some discount air carriers. The model is based on the principles of statistical mechanics where each seat in the aircraft has an associated energy which reflects the preferences of travelers. As each passenger enters the airplane they select their seats using Boltzmann statistics, proceed to that location, load their luggage, sit down, and the partition function seen by remaining passengers is modified to reflect this fact. I discuss the various model parameters and make qualitative comparisons of this passenger boarding model with those that involve assigned seats. The model can be used to predict the probability that certain seats will be occupied at different times during the boarding process. These results might provide a useful description of this boarding method. The model is a relatively unusual application of undergraduate level physics and describes a situation familiar to many students and faculty.
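The seat-selection rule in the abstract can be sketched directly (a toy re-implementation, not the paper's code): each free seat s carries a Boltzmann weight exp(-E_s/T), and the partition function is recomputed over the remaining free seats after every boarding passenger. The energies and temperature below are arbitrary illustrative values.

```python
import math
import random

def board(energies, T=1.0, seed=42):
    """Fill every seat, drawing each next seat from the Boltzmann
    distribution over the currently free seats."""
    rng = random.Random(seed)
    free = list(range(len(energies)))
    order = []
    while free:
        weights = [math.exp(-energies[s] / T) for s in free]  # Boltzmann factors
        Z = sum(weights)             # partition function over free seats only
        r, acc = rng.random() * Z, 0.0
        for s, w in zip(free, weights):
            acc += w
            if acc >= r:             # inverse-CDF sampling
                order.append(s)
                free.remove(s)
                break
    return order                     # seats in order of occupation

energies = [0.1 * s for s in range(30)]  # low-index seats preferred (low energy)
occupied_order = board(energies, T=0.5)
```

Running this shows the qualitative behaviour the abstract describes: preferred (low-energy) seats tend to fill early, and the occupation probability of any seat at any stage follows from the shrinking partition function.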
Reciprocity in directed networks
NASA Astrophysics Data System (ADS)
Yin, Mei; Zhu, Lingjiong
2016-04-01
Reciprocity is an important characteristic of directed networks and has been widely used in the modeling of World Wide Web, email, social, and other complex networks. In this paper, we take a statistical physics point of view and study the limiting entropy and free energy densities from the microcanonical ensemble, the canonical ensemble, and the grand canonical ensemble whose sufficient statistics are given by edge and reciprocal densities. The sparse case is also studied for the grand canonical ensemble. Extensions to more general reciprocal models including reciprocal triangle and star densities will likewise be discussed.
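For concreteness, the two sufficient statistics named above, edge density and reciprocal density, can be computed for a small directed graph as follows (a minimal sketch; the function and variable names are mine, not the paper's):

```python
def edge_and_reciprocal_density(n, arcs):
    """Return (edge density, reciprocal density) for a simple directed graph
    on n nodes given as a collection of (i, j) arcs with i != j."""
    arcs = set(arcs)
    m = len(arcs)
    mutual_pairs = sum(1 for (i, j) in arcs if (j, i) in arcs) // 2  # each pair counted twice
    possible_arcs = n * (n - 1)
    possible_pairs = possible_arcs // 2
    return m / possible_arcs, mutual_pairs / possible_pairs

# Toy graph on 4 nodes with exactly one mutual pair, 0 <-> 1.
density, reciprocity = edge_and_reciprocal_density(4, [(0, 1), (1, 0), (1, 2), (2, 3)])
```

Here 4 of the 12 possible arcs are present (density 1/3) and 1 of the 6 node pairs is reciprocated (reciprocal density 1/6); these are the kinds of densities the ensembles in the paper condition on.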
Health-Related Quality-of-Life Findings for the Prostate Cancer Prevention Trial
2012-01-01
Background The Prostate Cancer Prevention Trial (PCPT)—a randomized placebo-controlled study of the efficacy of finasteride in preventing prostate cancer—offered the opportunity to prospectively study effects of finasteride and other covariates on the health-related quality of life of participants in a multiyear trial. Methods We assessed three health-related quality-of-life domains (measured with the Health Survey Short Form–36: Physical Functioning, Mental Health, and Vitality scales) via questionnaires completed by PCPT participants at enrollment (3 months before randomization), at 6 months after randomization, and annually for 7 years. Covariate data obtained at enrollment from patient-completed questionnaires were included in our model. Mixed-effects model analyses and a cross-sectional presentation at three time points began at 6 months after randomization. All statistical tests were two-sided. Results For the physical function outcome (n = 16 077), neither the finasteride main effect nor the finasteride interaction with time were statistically significant. The effects of finasteride on physical function were minor and accounted for less than a 1-point difference over time in Physical Functioning scores (mixed-effect estimate = 0.07, 95% confidence interval [CI] = −0.28 to 0.42, P = .71). Comorbidities such as congestive heart failure (estimate = −5.64, 95% CI = −7.96 to −3.32, P < .001), leg pain (estimate = −2.57, 95% CI = −3.04 to −2.10, P < .001), and diabetes (estimate = −1.31, 95% CI = −2.04 to −0.57, P < .001) had statistically significant negative effects on physical function, as did current smoking (estimate = −2.34, 95% CI = −2.97 to −1.71, P < .001) and time on study (estimate = −1.20, 95% CI = −1.36 to −1.03, P < .001). Finasteride did not have a statistically significant effect on the other two dependent variables, mental health and vitality, either in the mixed-effects analyses or in the cross-sectional analysis at any of the three time points. 
Conclusion Finasteride did not negatively affect SF–36 Physical Functioning, Mental Health, or Vitality scores. PMID:22972968
NASA Astrophysics Data System (ADS)
Shvartsburg, Alexandre A.; Siu, K. W. Michael
2001-06-01
Modeling the delayed dissociation of clusters has been, over the last decade, a frontline development area in chemical physics. It is of fundamental interest how statistical kinetics methods previously validated for regular molecules and atomic nuclei may apply to clusters, as this would help to understand the transferability of statistical models for the disintegration of complex systems across various classes of physical objects. From a practical perspective, accurate simulation of unimolecular decomposition is critical for the extraction of true thermochemical values from measurements on the decay of energized clusters. Metal clusters are particularly challenging because of the multitude of low-lying electronic states that are coupled to vibrations. This has previously been accounted for by assuming an average electronic structure for a conducting cluster, approximated by the levels of an electron in a cavity. While this provides a reasonable time-averaged description, it ignores the distribution of instantaneous electronic structures in a "boiling" cluster around that average. Here we set up a new treatment that incorporates the statistical distribution of electronic levels around the average picture using random matrix theory. This approach faithfully reflects the completely chaotic "vibronic soup" nature of hot metal clusters. We find that the consideration of electronic level statistics significantly promotes electronic excitation and thus increases the magnitude of its effect. As this excitation always depresses the decay rates, the inclusion of level statistics results in slower dissociation of metal clusters.
Applications of statistical physics to the social and economic sciences
NASA Astrophysics Data System (ADS)
Petersen, Alexander M.
2011-12-01
This thesis applies statistical physics concepts and methods to quantitatively analyze socioeconomic systems. For each system we combine theoretical models and empirical data analysis in order to better understand the real-world system in relation to the complex interactions between the underlying human agents. This thesis is separated into three parts: (i) response dynamics in financial markets, (ii) dynamics of career trajectories, and (iii) a stochastic opinion model with quenched disorder. In Part I we quantify the response of U.S. markets to financial shocks, which perturb markets and trigger "herding behavior" among traders. We use concepts from earthquake physics to quantify the decay of volatility shocks after the "main shock." We also find, surprisingly, that we can make quantitative statements even before the main shock. In order to analyze market behavior before as well as after "anticipated news", we use Federal Reserve interest-rate announcements, which are regular events that are also scheduled in advance. In Part II we analyze the statistical physics of career longevity. We construct a stochastic model for career progress which has two main ingredients: (a) random forward progress in the career and (b) random termination of the career. We incorporate the rich-get-richer (Matthew) effect into ingredient (a), meaning that it is easier to move forward in the career the farther along one is in the career. We verify the model predictions by analyzing data on 400,000 scientific careers and 20,000 professional sports careers. Our model highlights the importance of early career development, showing that many careers are stunted by the relative disadvantage associated with inexperience. In Part III we analyze a stochastic two-state spin model which represents a system of voters embedded on a network. We investigate the role in consensus formation of "zealots", which are agents with time-independent opinions. 
Our main result is the unexpected finding that it is the number and not the density of zealots which determines the steady-state opinion polarization. We compare our findings with results for United States Presidential elections.
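The zealot mechanism in Part III can be made concrete with a toy voter-model simulation (my sketch, not the thesis code): ordinary agents copy a uniformly chosen agent's opinion on a complete graph, while z zealots never update. With zealots of only one type, the pinned opinion eventually dominates.

```python
import random

def voter_with_zealots(n_agents=100, z=10, steps=50000, seed=1):
    """Voter model on a complete graph; agents 0..z-1 are zealots pinned to
    opinion 1, the rest copy a uniformly chosen agent at each step.
    Returns the final fraction holding opinion 1."""
    rng = random.Random(seed)
    ops = [1] * z + [rng.choice([0, 1]) for _ in range(n_agents - z)]
    for _ in range(steps):
        i = rng.randrange(z, n_agents)   # only non-zealots ever update
        j = rng.randrange(n_agents)      # neighbour on the complete graph
        ops[i] = ops[j]
    return sum(ops) / n_agents

magnetization = voter_with_zealots()
```

Because the zealots provide a constant bias, the magnetization drifts toward the pinned opinion; with competing zealot groups (not shown), the steady-state polarization would fluctuate around a value set by the zealot numbers.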
Machine learning to analyze images of shocked materials for precise and accurate measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dresselhaus-Cooper, Leora; Howard, Marylesa; Hock, Margaret C.
A supervised machine learning algorithm, called locally adaptive discriminant analysis (LADA), has been developed to locate boundaries between identifiable image features that have varying intensities. LADA is an adaptation of image segmentation, which includes techniques that find the positions of image features (classes) using statistical intensity distributions for each class in the image. In order to place a pixel in the proper class, LADA considers the intensity at that pixel and the distribution of intensities in local (nearby) pixels. This paper presents the use of LADA to provide, with statistical uncertainties, the positions and shapes of features within ultrafast images of shock waves. We demonstrate the ability to locate image features including crystals, density changes associated with shock waves, and material jetting caused by shock waves. This algorithm can analyze images that exhibit a wide range of physical phenomena because it does not rely on comparison to a model. LADA enables analysis of images from shock physics with statistical rigor independent of underlying models or simulations.
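The core idea, classifying a pixel from its own intensity blended with the local intensity distribution, can be caricatured in a few lines (a toy cousin of the approach, not the LADA algorithm itself; class means and weights below are invented):

```python
import numpy as np

def classify(img, mu0, mu1, w=0.5):
    """Assign each pixel to class 0 or 1 by blending its own intensity with
    the 3x3 local mean (edge pixels handled by replicate padding), then
    choosing the nearer class mean."""
    pad = np.pad(img, 1, mode="edge")
    local = sum(pad[i:i + img.shape[0], j:j + img.shape[1]]
                for i in range(3) for j in range(3)) / 9.0
    score = w * img + (1 - w) * local
    return (np.abs(score - mu1) < np.abs(score - mu0)).astype(int)

img = np.zeros((6, 6))
img[:, 3:] = 1.0          # a sharp vertical boundary between two regions
img[2, 1] = 0.9           # a speckle-noise pixel inside the dark region
labels = classify(img, mu0=0.0, mu1=1.0, w=0.3)
```

The local term suppresses the isolated speckle (a pure per-pixel classifier would mislabel it) while the sharp boundary column survives, which is the trade-off a locally adaptive segmentation is designed to manage.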
Inverse statistical physics of protein sequences: a key issues review.
Cocco, Simona; Feinauer, Christoph; Figliuzzi, Matteo; Monasson, Rémi; Weigt, Martin
2018-03-01
In the course of evolution, proteins undergo important changes in their amino acid sequences, while their three-dimensional folded structure and their biological function remain remarkably conserved. Thanks to modern sequencing techniques, sequence data accumulate at unprecedented pace. This provides large sets of so-called homologous, i.e. evolutionarily related protein sequences, to which methods of inverse statistical physics can be applied. Using sequence data as the basis for the inference of Boltzmann distributions from samples of microscopic configurations or observables, it is possible to extract information about evolutionary constraints and thus protein function and structure. Here we give an overview over some biologically important questions, and how statistical-mechanics inspired modeling approaches can help to answer them. Finally, we discuss some open questions, which we expect to be addressed over the next years.
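As a cartoon of the inference step, fitting a Boltzmann distribution to observed sequence statistics, consider a single alignment column under an independent-site model p(a) ∝ exp(h_a): matching the model to the data gives fields h_a = log f_a up to an additive constant. Real approaches (e.g. direct coupling analysis) additionally infer pairwise couplings; the column below is invented.

```python
import math
from collections import Counter

column = list("AAAAGGGLLV")   # toy alignment column across 10 sequences
freqs = {a: c / len(column) for a, c in Counter(column).items()}
fields = {a: math.log(f) for a, f in freqs.items()}   # h_a = log f_a

# The inferred Boltzmann model reproduces the observed frequencies exactly:
Z = sum(math.exp(h) for h in fields.values())
model_probs = {a: math.exp(h) / Z for a, h in fields.items()}
```

This degenerate case shows why the independent-site model carries no structural information: it only restates the frequencies. Structure prediction comes from the pairwise coupling terms that the reviewed methods add on top.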
Biocultural approach of the association between maturity and physical activity in youth.
Werneck, André O; Silva, Danilo R; Collings, Paul J; Fernandes, Rômulo A; Ronque, Enio R V; Coelho-E-Silva, Manuel J; Sardinha, Luís B; Cyrino, Edilson S
2017-11-13
To test the biocultural model through direct and indirect associations between biological maturation, adiposity, cardiorespiratory fitness, feelings of sadness, social relationships, and physical activity in adolescents. This was a cross-sectional study conducted with 1,152 Brazilian adolescents aged between 10 and 17 years. Somatic maturation was estimated through Mirwald's method (peak height velocity). Physical activity was assessed through the Baecke questionnaire (occupational, leisure, and sport contexts). Body mass index, body fat (sum of skinfolds), cardiorespiratory fitness (20-m shuttle run test), self-perceptions of social relationships, and frequency of sadness feelings were obtained for statistical modeling. Somatic maturation was directly related to sport practice and leisure-time physical activity only among girls (β=0.12 and β=0.09, respectively, both p<0.05). Moreover, biological (adiposity and cardiorespiratory fitness), psychological (sadness), and social (satisfaction with social relationships) variables mediated the association between maturity and physical activity in boys, and for occupational physical activity in girls. In general, the models presented good fit coefficients. The biocultural model presents good fit, and emotional/biological factors mediate part of the relationship between somatic maturation and physical activity. Copyright © 2017 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
Louis R. Iverson; Frank R. Thompson; Stephen Matthews; Matthew Peters; Anantha Prasad; William D. Dijak; Jacob Fraser; Wen J. Wang; Brice Hanberry; Hong He; Maria Janowiak; Patricia Butler; Leslie Brandt; Chris Swanston
2016-01-01
Context. Species distribution models (SDM) establish statistical relationships between the current distribution of species and key attributes whereas process-based models simulate ecosystem and tree species dynamics based on representations of physical and biological processes. TreeAtlas, which uses DISTRIB SDM, and Linkages and LANDIS PRO, process...
A Complex Network Approach to Stylometry
Amancio, Diego Raphael
2015-01-01
Statistical methods have been widely employed to study the fundamental properties of language. In recent years, methods from complex and dynamical systems proved useful to create several language models. Despite the large amount of studies devoted to represent texts with physical models, only a limited number of studies have shown how the properties of the underlying physical systems can be employed to improve the performance of natural language processing tasks. In this paper, I address this problem by devising complex networks methods that are able to improve the performance of current statistical methods. Using a fuzzy classification strategy, I show that the topological properties extracted from texts complement the traditional textual description. In several cases, the performance obtained with hybrid approaches outperformed the results obtained when only traditional or networked methods were used. Because the proposed model is generic, the framework devised here could be straightforwardly used to study similar textual applications where the topology plays a pivotal role in the description of the interacting agents. PMID:26313921
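A minimal version of the networked text representation used in such studies, linking words that occur next to each other and reading off topological properties such as degree, might look like this (illustrative only; not the paper's pipeline):

```python
from collections import defaultdict

def word_network(text):
    """Build an undirected co-occurrence network linking adjacent words.
    Returns (edge weights, node degrees); degree counts distinct neighbours."""
    words = [w.strip(".,;:!?").lower() for w in text.split()]
    edges = defaultdict(int)
    for a, b in zip(words, words[1:]):
        if a != b:
            edges[tuple(sorted((a, b)))] += 1
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return dict(edges), dict(degree)

edges, degree = word_network("the cat sat on the mat and the cat slept")
```

Even in this tiny example the topology separates function words (hubs like "the") from content words of low degree, and per-node statistics like these are the kind of features that can be fed to a classifier alongside traditional frequency counts.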
Colloquium: Physics of the Riemann hypothesis
NASA Astrophysics Data System (ADS)
Schumayer, Dániel; Hutchinson, David A. W.
2011-04-01
Physicists become acquainted with special functions early in their studies. Consider our perennial model, the harmonic oscillator, for which we need Hermite functions, or the Laguerre functions in quantum mechanics. Here a particular number-theoretical function is chosen, the Riemann zeta function, and its influence on the realm of physics is examined, as well as how physics may be suggestive for the resolution of one of mathematics’ most famous unconfirmed conjectures, the Riemann hypothesis. Does physics hold an essential key to the solution of this more than 100-year-old problem? In this work numerous models from different branches of physics are examined, from classical mechanics to statistical physics, in which this function plays an integral role. The function is also shown to be related to quantum chaos, and its pole structure encodes when particles can undergo Bose-Einstein condensation at low temperature. Throughout these examinations light is shed on how the Riemann hypothesis can highlight physics. Naturally, the aim is not to be comprehensive, but rather to focus on the major models and give an informed starting point for the interested reader.
High-temperature behavior of a deformed Fermi gas obeying interpolating statistics.
Algin, Abdullah; Senay, Mustafa
2012-04-01
An outstanding idea originally introduced by Greenberg is to investigate whether there is equivalence between intermediate statistics, which may be different from anyonic statistics, and q-deformed particle algebra. Also, a model to be studied for addressing such an idea could possibly provide us some new consequences about the interactions of particles as well as their internal structures. Motivated mainly by this idea, in this work, we consider a q-deformed Fermi gas model whose statistical properties enable us to effectively study interpolating statistics. Starting with a generalized Fermi-Dirac distribution function, we derive several thermostatistical functions of a gas of these deformed fermions in the thermodynamical limit. We study the high-temperature behavior of the system by analyzing the effects of q deformation on the most important thermostatistical characteristics of the system such as the entropy, specific heat, and equation of state. It is shown that such a deformed fermion model in two and three spatial dimensions exhibits the interpolating statistics in a specific interval of the model deformation parameter 0 < q < 1. In particular, for two and three spatial dimensions, it is found from the behavior of the third virial coefficient of the model that the deformation parameter q interpolates completely between attractive and repulsive systems, including the free boson and fermion cases. From the results obtained in this work, we conclude that such a model could provide much physical insight into some interacting theories of fermions, and could be useful to further study the particle systems with intermediate statistics.
Grotjahn, Richard; Black, Robert; Leung, Ruby; ...
2015-05-22
This paper reviews research approaches and open questions regarding data, statistical analyses, dynamics, modeling efforts, and trends in relation to temperature extremes. Our specific focus is upon extreme events of short duration (roughly less than 5 days) that affect parts of North America. These events are associated with large scale meteorological patterns (LSMPs). Methods used to define extreme event statistics and to identify and connect LSMPs to extreme temperatures are presented. Recent advances in statistical techniques can connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. A wide array of LSMPs, ranging from synoptic to planetary scale phenomena, have been implicated as contributors to extreme temperature events. Current knowledge about the physical nature of these contributions and the dynamical mechanisms leading to the implicated LSMPs is incomplete. There is a pressing need for (a) systematic study of the physics of LSMP life cycles and (b) comprehensive model assessment of LSMP-extreme temperature event linkages and LSMP behavior. Generally, climate models capture the observed heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency, underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Climate models have been used to investigate past changes and project future trends in extreme temperatures. Overall, modeling studies have identified important mechanisms such as the effects of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs more specifically to understand the role of LSMPs in past and future extreme temperature changes. 
Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated, so more research is needed to understand the limitations of climate models and improve model skill in simulating extreme temperatures and their associated LSMPs. Furthermore, the paper concludes with unresolved issues and research questions.
Vieira, Rute; McDonald, Suzanne; Araújo-Soares, Vera; Sniehotta, Falko F; Henderson, Robin
2017-09-01
N-of-1 studies are based on repeated observations within an individual or unit over time and are acknowledged as an important research method for generating scientific evidence about the health or behaviour of an individual. Statistical analyses of n-of-1 data require accurate modelling of the outcome while accounting for its distribution, time-related trend and error structures (e.g., autocorrelation) as well as reporting readily usable contextualised effect sizes for decision-making. A number of statistical approaches have been documented but no consensus exists on which method is most appropriate for which type of n-of-1 design. We discuss the statistical considerations for analysing n-of-1 studies and briefly review some currently used methodologies. We describe dynamic regression modelling as a flexible and powerful approach, adaptable to different types of outcomes and capable of dealing with the different challenges inherent to n-of-1 statistical modelling. Dynamic modelling borrows ideas from longitudinal and event history methodologies which explicitly incorporate the role of time and the influence of past on future. We also present an illustrative example of the use of dynamic regression on monitoring physical activity during the retirement transition. Dynamic modelling has the potential to expand researchers' access to robust and user-friendly statistical methods for individualised studies.
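The dynamic-regression idea described above, regressing the outcome on time, the intervention, and its own past, can be illustrated in a few lines. The sketch below is entirely hypothetical (the series, coefficients, and variable names are invented for illustration, not taken from the paper) and uses a plain least-squares fit in NumPy rather than a dedicated package:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical n-of-1 series: 60 daily measurements with a linear time
# trend, a step intervention starting at day 30, and AR(1) carryover.
n = 60
time = np.arange(n, dtype=float)
intervention = (time >= 30).astype(float)
y = np.empty(n)
y[0] = 50.0
for t in range(1, n):
    y[t] = (10 + 0.2 * time[t] + 5.0 * intervention[t]
            + 0.6 * y[t - 1] + rng.normal(0, 1))

# Dynamic regression: regress y_t on time, the intervention indicator,
# and the lagged outcome y_{t-1}, all in one least-squares fit.
X = np.column_stack([np.ones(n - 1), time[1:], intervention[1:], y[:-1]])
beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
intercept, trend, effect, ar1 = beta
```

Including the lagged outcome as a regressor is what makes the model "dynamic": the autocorrelation is modeled explicitly instead of being left in the error term.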
NASA Astrophysics Data System (ADS)
Kolokythas, Kostantinos; Salamalikis, Vasileios; Argiriou, Athanassios; Kazantzidis, Andreas
2015-04-01
The wind is the result of complex interactions of numerous mechanisms taking place at small or large scales, so better knowledge of its behavior is essential in a variety of applications, especially in the field of power production from wind turbines. In the literature there is a considerable number of models, either physical or statistical, dealing with the problem of simulating and predicting wind speed. Among others, Artificial Neural Networks (ANNs) are widely used for the purpose of wind forecasting and, in the great majority of cases, outperform other conventional statistical models. In this study, a number of ANNs with different architectures, created and applied to a dataset of wind time series, are compared to Auto Regressive Integrated Moving Average (ARIMA) statistical models. The data consist of mean hourly wind speeds from a wind farm in a hilly Greek region and cover a period of one year (2013). The main goal is to evaluate the models' ability to simulate successfully the wind speed at a significant point (target). Goodness-of-fit statistics are computed for the comparison of the different methods. In general, the ANNs showed the best performance in estimating wind speed, prevailing over the ARIMA models.
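The ANN-versus-autoregression comparison can be sketched in miniature. The code below uses synthetic hourly speeds (not the Greek wind-farm data), a least-squares linear autoregression as a stand-in for the ARIMA family, and a single-hidden-layer network with fixed random weights (an "extreme learning machine") as a stand-in for the trained ANNs; every setting is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for mean hourly wind speeds (m/s): a daily cycle
# plus AR(1) noise.
n = 2000
t = np.arange(n)
ar = np.zeros(n)
for i in range(1, n):
    ar[i] = 0.7 * ar[i - 1] + rng.normal(0, 0.4)
y = 6 + 2 * np.sin(2 * np.pi * t / 24) + ar

# Lag matrix: predict y_t from the previous p hourly speeds.
p = 6
lags = np.column_stack([y[p - k - 1 : n - k - 1] for k in range(p)])
target = y[p:]

# Linear autoregression baseline, fitted by least squares.
A = np.column_stack([np.ones(len(target)), lags])
coef_ar, *_ = np.linalg.lstsq(A, target, rcond=None)
rmse_ar = np.sqrt(np.mean((A @ coef_ar - target) ** 2))

# "ANN" stand-in: one hidden tanh layer with fixed random weights;
# output weights fitted by least squares on hidden features + raw lags.
H = 50
W = rng.normal(0, 1, (p, H))
b = rng.normal(0, 1, H)
z = (lags - y.mean()) / y.std()
Phi = np.column_stack([np.ones(len(target)), lags, np.tanh(z @ W + b)])
coef_nn, *_ = np.linalg.lstsq(Phi, target, rcond=None)
rmse_nn = np.sqrt(np.mean((Phi @ coef_nn - target) ** 2))
```

Because the nonlinear model's feature set contains the linear model's features, its in-sample RMSE can never exceed the autoregression's; any out-of-sample advantage, as in the study, has to be established on held-out data.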
NASA Astrophysics Data System (ADS)
Tellman, B.; Schwarz, B.
2014-12-01
This talk describes the development of a web application to predict and communicate vulnerability to floods given publicly available data, disaster science, and geotech cloud capabilities. The proof of concept in the Google Earth Engine API, with initial testing on case studies in New York and Uttarakhand, India, demonstrates the potential of highly parallelized cloud computing to model socio-ecological disaster vulnerability at high spatial and temporal resolution and in near real time. Cloud computing facilitates statistical modeling with variables derived from large public social and ecological data sets, including census data, nighttime lights (NTL), and WorldPop to derive social parameters, together with elevation, satellite imagery, rainfall, and observed flood data from the Dartmouth Flood Observatory to derive biophysical parameters. While more traditional, physically based hydrological models that rely on flow algorithms and numerical methods are currently unavailable in parallelized computing platforms like Google Earth Engine, there is high potential to explore "data driven" modeling that trades physics for statistics in a parallelized environment. A data-driven approach to flood modeling with geographically weighted logistic regression has been initially tested on Hurricane Irene in southeastern New York. Comparison of model results with observed flood data reveals that the model predicts flooded pixels with 97% accuracy. Testing on multiple storms is required to further validate this initial promising approach. A statistical social-ecological flood model that could produce rapid vulnerability assessments, predicting who might require immediate evacuation and where, could serve as an early warning. This type of early warning system would be especially relevant in data-poor places lacking the computing power, high resolution data such as LiDAR and stream gauges, or hydrologic expertise to run physically based models in real time.
As the data-driven model presented relies on globally available data, the only real time data input required would be typical data from a weather service, e.g. precipitation or coarse resolution flood prediction. However, model uncertainty will vary locally depending upon the resolution and frequency of observed flood and socio-economic damage impact data.
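The "trading physics for statistics" idea can be sketched with a plain (not geographically weighted) logistic regression on simulated pixels. The predictors, coefficients, and data below are hypothetical stand-ins for the terrain and rainfall covariates described above; the fit uses iteratively reweighted least squares (IRLS), the standard Newton-type algorithm for logistic regression:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical pixels: low elevation and high rainfall raise flood odds.
n = 5000
elev = rng.normal(50, 10, n)   # metres above the river (invented)
rain = rng.normal(100, 30, n)  # storm rainfall in mm (invented)
X = np.column_stack([np.ones(n), (elev - 50) / 10, (rain - 100) / 30])
logit = -1.0 - 2.0 * X[:, 1] + 1.5 * X[:, 2]
flooded = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Logistic regression via IRLS (Newton steps), with a tiny ridge term
# for numerical stability.
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    H = X.T @ (X * W[:, None]) + 1e-6 * np.eye(3)
    beta += np.linalg.solve(H, X.T @ (flooded - p))

pred = (1 / (1 + np.exp(-X @ beta))) >= 0.5
accuracy = np.mean(pred == (flooded == 1))
```

A geographically weighted version would refit this regression at each location with distance-decayed sample weights; the core likelihood machinery is the same.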
State-Transition Structures in Physics and in Computation
NASA Astrophysics Data System (ADS)
Petri, C. A.
1982-12-01
In order to establish close connections between physical and computational processes, it is assumed that the concepts of “state” and of “transition” are acceptable both to physicists and to computer scientists, at least in an informal way. The aim of this paper is to propose formal definitions of state and transition elements on the basis of very low level physical concepts in such a way that (1) all physically possible computations can be described as embedded in physical processes; (2) the computational aspects of physical processes can be described on a well-defined level of abstraction; (3) the gulf between the continuous models of physics and the discrete models of computer science can be bridged by simple mathematical constructs which may be given a physical interpretation; (4) a combinatorial, nonstatistical definition of “information” can be given on low levels of abstraction which may serve as a basis to derive higher-level concepts of information, e.g., by a statistical or probabilistic approach. Conceivable practical consequences are discussed.
NASA Astrophysics Data System (ADS)
Sandfeld, Stefan; Budrikis, Zoe; Zapperi, Stefano; Fernandez Castellanos, David
2015-02-01
Crystalline plasticity is strongly interlinked with dislocation mechanics and nowadays is relatively well understood. Concepts and physical models of plastic deformation in amorphous materials on the other hand—where the concept of linear lattice defects is not applicable—still are lagging behind. We introduce an eigenstrain-based finite element lattice model for simulations of shear band formation and strain avalanches. Our model allows us to study the influence of surfaces and finite size effects on the statistics of avalanches. We find that even with relatively complex loading conditions and open boundary conditions, critical exponents describing avalanche statistics are unchanged, which validates the use of simpler scalar lattice-based models to study these phenomena.
NASA Astrophysics Data System (ADS)
Vavylonis, Dimitrios
2009-03-01
I will describe my experience in developing an interdisciplinary biophysics course addressed to students at the upper undergraduate and graduate level, in collaboration with colleagues in physics and biology. The students had a background in physics, biology and engineering, and for many the course was their first exposure to interdisciplinary topics. The course did not depend on a formal knowledge of equilibrium statistical mechanics. Instead, the approach was based on dynamics. I used diffusion as a universal "long time" law to illustrate scaling concepts. The importance of statistics and proper counting of states/paths was introduced by calculating the maximum accuracy with which bacteria can measure the concentration of diffuse chemicals. The use of quantitative concepts and methods was introduced through specific biological examples, focusing on model organisms and extremes at the cell level. Examples included microtubule dynamic instability, the search and capture model, molecular motor cooperativity in muscle cells, mitotic spindle oscillations in C. elegans, polymerization forces and propulsion of pathogenic bacteria, Brownian ratchets, bacterial cell division and MinD oscillations.
Rasch Model Based Analysis of the Force Concept Inventory
ERIC Educational Resources Information Center
Planinic, Maja; Ivanjek, Lana; Susac, Ana
2010-01-01
The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One of such tools is the stochastic Rasch model, which enables construction of linear…
Progress in the improved lattice calculation of direct CP-violation in the Standard Model
NASA Astrophysics Data System (ADS)
Kelly, Christopher
2018-03-01
We discuss the ongoing effort by the RBC & UKQCD collaborations to improve our lattice calculation of the measure of Standard Model direct CP violation, ɛ', with physical kinematics. We present our progress in decreasing the (dominant) statistical error and discuss other related activities aimed at reducing the systematic errors.
Reduced Order Modeling Methods for Turbomachinery Design
2009-03-01
and Materials Conference, May 2006. [45] A. Gelman, J. B. Carlin, H. S. Stern, and D. B. Rubin, Bayesian Data Analysis. New York, NY: Chapman & Hall...Macian-Juan, and R. Chawla, "A statistical methodology for quantification of uncertainty in best estimate code physical models," Annals of Nuclear Energy
NASA Astrophysics Data System (ADS)
Kariniotakis, G.; Anemos Team
2003-04-01
Objectives: Accurate forecasting of wind energy production up to two days ahead is recognized as a major contribution to reliable large-scale wind power integration. Especially in a liberalized electricity market, prediction tools enhance the position of wind energy compared to other forms of dispatchable generation. ANEMOS is a new 3.5-year R&D project, supported by the European Commission, that brings together research organizations and end-users with substantial experience in the domain. The project aims to develop advanced forecasting models that will substantially outperform current methods. Emphasis is given to situations like complex terrain and extreme weather conditions, as well as to offshore prediction, for which no specific tools currently exist. The prediction models will be implemented in a software platform and installed for online operation at onshore and offshore wind farms by the end-users participating in the project. Approach: The paper presents the methodology of the project. Initially, the prediction requirements are identified according to the profiles of the end-users. The project develops prediction models based on both a physical and an alternative statistical approach. Research on physical models gives emphasis to techniques for use in complex terrain and the development of prediction tools based on CFD techniques, advanced model output statistics or high-resolution meteorological information. Statistical models (i.e. based on artificial intelligence) are developed for downscaling, power curve representation, upscaling for prediction at regional or national level, etc. A benchmarking process is set up to evaluate the performance of the developed models and to compare them with existing ones using a number of case studies. The synergy between statistical and physical approaches is examined to identify promising areas for further improvement of forecasting accuracy.
Appropriate physical and statistical prediction models are also developed for offshore wind farms, taking into account advances in marine meteorology (interaction between wind and waves, coastal effects). The benefits of using satellite radar images for modeling local weather patterns are investigated. A next-generation forecasting software platform, ANEMOS, will be developed to integrate the various models. The tool is enhanced by advanced Information and Communication Technology (ICT) functionality and can operate in stand-alone or remote mode, or be interfaced with standard Energy or Distribution Management Systems (EMS/DMS). Contribution: The project provides an advanced technology for wind resource forecasting applicable at large scale: at a single wind farm, at regional or national level, and for both interconnected and island systems. A major milestone is the online operation of the developed software by the participating utilities for onshore and offshore wind farms and the demonstration of the economic benefits. The outcome of the ANEMOS project will support increased wind integration on two levels: operationally, through better management of wind farms, and in terms of capacity, by contributing to an increase in the installed capacity of wind farms. This is because accurate prediction of the resource reduces the risk for wind farm developers, who are then more willing to undertake new wind farm installations, especially in a liberalized electricity market environment.
Thomas-Fermi model for a bulk self-gravitating stellar object in two dimensions
NASA Astrophysics Data System (ADS)
De, Sanchari; Chakrabarty, Somenath
2015-09-01
In this article we have solved a hypothetical problem related to the stability and gross properties of two-dimensional self-gravitating stellar objects using the Thomas-Fermi model. The formalism presented here is an extension of the standard three-dimensional problem discussed in Statistical Physics, Part I, by Landau and Lifshitz. Further, the formalism presented in this article may be considered a class problem for post-graduate-level students of physics or may be assigned as part of their dissertation project.
SPIPS: Spectro-Photo-Interferometry of Pulsating Stars
NASA Astrophysics Data System (ADS)
Mérand, Antoine
2017-10-01
SPIPS (Spectro-Photo-Interferometry of Pulsating Stars) combines radial velocimetry, interferometry, and photometry to estimate physical parameters of pulsating stars, including the presence of infrared excess, color excess, Teff, and the distance/p-factor ratio. The global model-based parallax-of-pulsation method is implemented in Python. The derived parameters have a high level of confidence: statistical precision is improved (compared to other methods) by the large number of data taken into account; accuracy is improved by consistent physical modeling; and the reliability of the derived parameters is strengthened by redundancy in the data.
Euler equation existence, non-uniqueness and mesh converged statistics
Glimm, James; Sharp, David H.; Lim, Hyunkyung; Kaufman, Ryan; Hu, Wenlin
2015-01-01
We review existence and non-uniqueness results for the Euler equation of fluid flow. These results are placed in the context of physical models and their solutions. Non-uniqueness is in direct conflict with the purpose of practical simulations, so that a mitigating strategy, outlined here, is important. We illustrate these issues in an examination of mesh converged turbulent statistics, with comparison to laboratory experiments. PMID:26261361
Physics of Electronic Materials
NASA Astrophysics Data System (ADS)
Rammer, Jørgen
2017-03-01
1. Quantum mechanics; 2. Quantum tunneling; 3. Standard metal model; 4. Standard conductor model; 5. Electric circuit theory; 6. Quantum wells; 7. Particle in a periodic potential; 8. Bloch currents; 9. Crystalline solids; 10. Semiconductor doping; 11. Transistors; 12. Heterostructures; 13. Mesoscopic physics; 14. Arithmetic, logic and machines; Appendix A. Principles of quantum mechanics; Appendix B. Dirac's delta function; Appendix C. Fourier analysis; Appendix D. Classical mechanics; Appendix E. Wave function properties; Appendix F. Transfer matrix properties; Appendix G. Momentum; Appendix H. Confined particles; Appendix I. Spin and quantum statistics; Appendix J. Statistical mechanics; Appendix K. The Fermi-Dirac distribution; Appendix L. Thermal current fluctuations; Appendix M. Gaussian wave packets; Appendix N. Wave packet dynamics; Appendix O. Screening by symmetry method; Appendix P. Commutation and common eigenfunctions; Appendix Q. Interband coupling; Appendix R. Common crystal structures; Appendix S. Effective mass approximation; Appendix T. Integral doubling formula; Bibliography; Index.
Närhi, Mikko; Wetzel, Benjamin; Billet, Cyril; Toenger, Shanti; Sylvestre, Thibaut; Merolla, Jean-Marc; Morandotti, Roberto; Dias, Frederic; Genty, Goëry; Dudley, John M.
2016-01-01
Modulation instability is a fundamental process of nonlinear science, leading to the unstable breakup of a constant amplitude solution of a physical system. There has been particular interest in studying modulation instability in the cubic nonlinear Schrödinger equation, a generic model for a host of nonlinear systems including superfluids, fibre optics, plasmas and Bose–Einstein condensates. Modulation instability is also a significant area of study in the context of understanding the emergence of high amplitude events that satisfy rogue wave statistical criteria. Here, exploiting advances in ultrafast optical metrology, we perform real-time measurements in an optical fibre system of the unstable breakup of a continuous wave field, simultaneously characterizing emergent modulation instability breather pulses and their associated statistics. Our results allow quantitative comparison between experiment, modelling and theory, and are expected to open new perspectives on studies of instability dynamics in physics. PMID:27991513
NASA Astrophysics Data System (ADS)
Mfumu Kihumba, Antoine; Vanclooster, Marnik
2013-04-01
Drinking water in Kinshasa, the capital of the Democratic Republic of Congo, is provided by extracting groundwater from the local aquifer, particularly in peripheral areas. The exploited groundwater body is mainly unconfined and located within a continuous detrital aquifer, primarily composed of sedimentary formations. However, the aquifer is subjected to an increasing threat of anthropogenic pollution pressure. Understanding the detailed origin of this pollution pressure is important for sustainable drinking water management in Kinshasa. The present study aims to explain the observed nitrate pollution problem, nitrate being considered a good tracer for other pollution threats. The analysis is made in terms of physical attributes that are readily available, using a statistical modelling approach. For the nitrate data, use was made of a historical groundwater quality assessment study, for which the data were re-analysed. The physical attributes are related to the topography, land use, geology and hydrogeology of the region. Prior to the statistical modelling, intrinsic and specific vulnerability for nitrate pollution was assessed. This vulnerability assessment showed that the alluvium area in the northern part of the region is the most vulnerable area. This area consists of urban land use with poor sanitation. Re-analysis of the nitrate pollution data demonstrated that the spatial variability of nitrate concentrations in the groundwater body is high, and coherent with the fragmented land use of the region and with the intrinsic and specific vulnerability maps. For the statistical modelling, use was made of multiple regression and regression tree analysis. The results demonstrated the significant impact of land use variables on Kinshasa groundwater nitrate pollution and the need for a detailed delineation of groundwater capture zones around the monitoring stations. Key words: Groundwater, Isotopic, Kinshasa, Modelling, Pollution, Physico-chemical.
Saunders, Ruth P.; McIver, Kerry L.; Dowda, Marsha; Pate, Russell R.
2013-01-01
Objective Scales used to measure selected social-cognitive beliefs and motives for physical activity were tested among boys and girls. Methods Covariance modeling was applied to responses obtained from large multi-ethnic samples of students in the fifth and sixth grades. Results Theoretically and statistically sound models were developed, supporting the factorial validity of the scales in all groups. Multi-group longitudinal invariance was confirmed between boys and girls, overweight and normal weight students, and non-Hispanic black and white children. The construct validity of the scales was supported by hypothesized convergent and discriminant relationships within a measurement model that included correlations with physical activity (MET • min/day) measured by an accelerometer. Conclusions Scores from the scales provide valid assessments of selected beliefs and motives that are putative mediators of change in physical activity among boys and girls, as they begin the understudied transition from the fifth grade into middle school, when physical activity naturally declines. PMID:23459310
Dishman, Rod K; Saunders, Ruth P; McIver, Kerry L; Dowda, Marsha; Pate, Russell R
2013-06-01
Scales used to measure selected social-cognitive beliefs and motives for physical activity were tested among boys and girls. Covariance modeling was applied to responses obtained from large multi-ethnic samples of students in the fifth and sixth grades. Theoretically and statistically sound models were developed, supporting the factorial validity of the scales in all groups. Multi-group longitudinal invariance was confirmed between boys and girls, overweight and normal weight students, and non-Hispanic black and white children. The construct validity of the scales was supported by hypothesized convergent and discriminant relationships within a measurement model that included correlations with physical activity (MET • min/day) measured by an accelerometer. Scores from the scales provide valid assessments of selected beliefs and motives that are putative mediators of change in physical activity among boys and girls, as they begin the understudied transition from the fifth grade into middle school, when physical activity naturally declines.
Worrying trends in econophysics
NASA Astrophysics Data System (ADS)
Gallegati, Mauro; Keen, Steve; Lux, Thomas; Ormerod, Paul
2006-10-01
Econophysics has already made a number of important empirical contributions to our understanding of the social and economic world. These fall mainly into the areas of finance and industrial economics, where in each case there is a large amount of reasonably well-defined data. More recently, Econophysics has also begun to tackle other areas of economics where data is much more sparse and much less reliable. In addition, econophysicists have attempted to apply the theoretical approach of statistical physics to try to understand empirical findings. Our concerns are fourfold. First, a lack of awareness of work that has been done within economics itself. Second, resistance to more rigorous and robust statistical methodology. Third, the belief that universal empirical regularities can be found in many areas of economic activity. Fourth, the theoretical models which are being used to explain empirical phenomena. The latter point is of particular concern. Essentially, the models are based upon models of statistical physics in which energy is conserved in exchange processes. There are examples in economics where the principle of conservation may be a reasonable approximation to reality, such as primitive hunter-gatherer societies. But in the industrialised capitalist economies, income is most definitely not conserved. The process of production and not exchange is responsible for this. Models which focus purely on exchange and not on production cannot by definition offer a realistic description of the generation of income in the capitalist, industrialised economies.
Romain, Ahmed Jerôme; Bernard, Paquito; Hokayem, Marie; Gernigon, Christophe; Avignon, Antoine
2016-03-01
This study aimed to test three factorial structures conceptualizing the processes of change (POC) from the transtheoretical model and to examine the relationships between the POC and stages of change (SOC) among overweight and obese adults. Cross-sectional study. This study was conducted at the University Hospital of Montpellier, France. A sample of 289 overweight or obese participants (199 women) was enrolled in the study. Participants completed the POC and SOC questionnaires during a 5-day hospitalization for weight management. Structural equation modeling was used to compare the different factorial structures. The unweighted least-squares method was used to identify the best-fit indices for the five-factor fully correlated model (goodness-of-fit statistic = .96; adjusted goodness-of-fit statistic = .95; standardized root mean residual = .062; normed-fit index = .95; parsimonious normed-fit index = .83; parsimonious goodness-of-fit statistic = .78). The multivariate analysis of variance was significant (p < .001). A post hoc test showed that individuals in advanced SOC used more of both experiential and behavioral POC than those in preaction stages, with effect sizes ranging from .06 to .29. This study supports the validity of the factorial structure of POC concerning physical activity and confirms the assumption that, in this context, people with excess weight use both experiential and behavioral processes. These preliminary results should be confirmed in a longitudinal study. © The Author(s) 2016.
NASA Astrophysics Data System (ADS)
Mori, Kaya; Chonko, James C.; Hailey, Charles J.
2005-10-01
We have reanalyzed the 260 ks XMM-Newton observation of 1E 1207.4-5209. There are several significant improvements over previous work. First, a much broader range of physically plausible spectral models was used. Second, we have used a more rigorous statistical analysis. The standard F-distribution was not employed, but rather the exact finite statistics F-distribution was determined by Monte Carlo simulations. This approach was motivated by the recent work of Protassov and coworkers and Freeman and coworkers. They demonstrated that the standard F-distribution is not even asymptotically correct when applied to assess the significance of additional absorption features in a spectrum. With our improved analysis we do not find a third and fourth spectral feature in 1E 1207.4-5209 but only the two broad absorption features previously reported. Two additional statistical tests, one line model dependent and the other line model independent, confirmed our modified F-test analysis. For all physically plausible continuum models in which the weak residuals are strong enough to fit, the residuals occur at the instrument Au M edge. As a sanity check we confirmed that the residuals are consistent in strength and position with the instrument Au M residuals observed in 3C 273.
Model Independent Search For New Physics At The Tevatron
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choudalakis, Georgios
2008-04-01
The Standard Model of elementary particles cannot be the final theory. There are theoretical reasons to expect the appearance of new physics, possibly at the energy scale of a few TeV. Several possible theories of new physics have been proposed, each with unknown probability of being confirmed. Instead of arbitrarily choosing to examine one of those theories, this thesis is about searching for any sign of new physics in a model-independent way. This search is performed at the Collider Detector at Fermilab (CDF). The Standard Model prediction is implemented in all final states simultaneously, and an array of statistical probes is employed to search for significant discrepancies between data and prediction. The probes are sensitive to overall population discrepancies, shape disagreements in distributions of kinematic quantities of final particles, excesses of events of large total transverse momentum, and local excesses of data expected from resonances due to new massive particles. The result of this search, first in 1 fb^-1 and then in 2 fb^-1, is null, namely no considerable evidence of new physics was found.
Lynch, Brigid M; Dunstan, David W; Healy, Genevieve N; Winkler, Elisabeth; Eakin, Elizabeth; Owen, Neville
2010-02-01
Obesity and physical inactivity are poor prognostic indicators for breast cancer. Studies to date have relied on self-report measures of physical activity, which tend mainly to assess moderate-to-vigorous intensity leisure-time physical activity. We report the cross-sectional associations of objectively assessed physical activity and sedentary time with adiposity in a sample of breast cancer survivors from the United States. One hundred and eleven women from the National Health and Nutrition Examination Survey (NHANES) 2003-2004 and 2005-2006 reported a history of breast cancer. Participants wore an accelerometer for 7 days, and activity levels were summarized as moderate-to-vigorous intensity (accelerometer counts/min > or =1,952), light intensity (counts/min 100-1,951), and sedentary time (counts/min <100). Anthropometric measures were taken by study staff at examination centers. Participants spent the majority of their day in sedentary time (66%) or in light intensity activities (33%). Log moderate-to-vigorous intensity physical activity was negatively associated with adiposity (waist circumference beta = -9.805 [95% CI: -15.836, -3.775]; BMI beta = -3.576 [95% CI: -6.687, -0.464]). Light intensity physical activity was negatively associated with adiposity; however, the fully adjusted models did not retain statistical significance. Similarly, sedentary time was positively associated with adiposity, but the fully adjusted models were not statistically significant. This is the first study to describe the objectively assessed physical activity and sedentary time of breast cancer survivors. Increasing moderate-to-vigorous and light intensity physical activity, and decreasing sedentary time, may assist with weight management and improve other metabolic health outcomes for breast cancer survivors.
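The accelerometer cut-points reported above (sedentary < 100 counts/min, light 100-1951, moderate-to-vigorous >= 1952) translate directly into a classification rule. The sketch below applies them to an invented day of per-minute counts; the function name and sample data are illustrative, not from the study:

```python
import numpy as np

def classify_minutes(counts_per_min):
    """Label each minute using the cut-points from the abstract:
    sedentary < 100, light 100-1951, MVPA >= 1952 counts/min."""
    counts = np.asarray(counts_per_min)
    return np.where(counts >= 1952, "mvpa",
                    np.where(counts >= 100, "light", "sedentary"))

# Hypothetical day (1440 minutes) roughly matching the reported time
# budget: ~66% sedentary, ~33% light, ~1% MVPA.
rng = np.random.default_rng(3)
counts = rng.choice([20, 500, 2500], size=1440, p=[0.66, 0.33, 0.01])
labels = classify_minutes(counts)
share_sedentary = np.mean(labels == "sedentary")
```

Summing minutes per label over each valid wear day is the usual next step before relating activity to outcomes such as waist circumference or BMI.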
From entropy-maximization to equality-maximization: Gauss, Laplace, Pareto, and Subbotin
NASA Astrophysics Data System (ADS)
Eliazar, Iddo
2014-12-01
The entropy-maximization paradigm of statistical physics is well known to generate the omnipresent Gauss law. In this paper we establish an analogous socioeconomic model which maximizes social equality, rather than physical disorder, in the context of the distributions of income and wealth in human societies. We show that, on a logarithmic scale, the Laplace law is the socioeconomic equality-maximizing counterpart of the physical entropy-maximizing Gauss law, and that this law manifests an optimized balance between two opposing forces: (i) the rich and powerful, striving to amass ever more wealth, and thus to increase social inequality; and (ii) the masses, struggling to form more egalitarian societies, and thus to increase social equality. Our results lead from log-Gauss statistics to log-Laplace statistics, yield Paretian power-law tails of income and wealth distributions, and show how the emergence of a middle class depends on the underlying levels of socioeconomic inequality and variability. Also, in the context of asset prices with Laplace-distributed returns, our results imply that financial markets generate an optimized balance between risk and predictability.
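The two maximization problems behind the Gauss and Laplace laws can be written out explicitly. This is the standard variational computation, sketched here for orientation rather than quoted from the paper: maximizing differential entropy at fixed variance yields the Gauss law, while fixing the mean absolute deviation instead yields the Laplace law.

```latex
\max_{f}\; -\!\int f(x)\ln f(x)\,dx
\quad\text{s.t.}\quad \int f(x)\,dx = 1,\;\; \int (x-\mu)^2 f(x)\,dx = \sigma^2
\;\;\Longrightarrow\;\;
f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\,e^{-(x-\mu)^2/2\sigma^2},

\max_{f}\; -\!\int f(x)\ln f(x)\,dx
\quad\text{s.t.}\quad \int f(x)\,dx = 1,\;\; \int |x-\mu|\,f(x)\,dx = b
\;\;\Longrightarrow\;\;
f(x) = \frac{1}{2b}\,e^{-|x-\mu|/b}.
```

Applied to log-income, the second solution is the log-Laplace law discussed in the abstract, whose tails are Paretian power laws on the original scale.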
Nosedal-Sanchez, Alvaro; Jackson, Charles S.; Huerta, Gabriel
2016-07-20
A new test statistic for climate model evaluation has been developed that potentially mitigates some of the limitations that exist for observing and representing field and space dependencies of climate phenomena. Traditionally such dependencies have been ignored when climate models have been evaluated against observational data, which makes it difficult to assess whether any given model is simulating observed climate for the right reasons. The new statistic uses Gaussian Markov random fields for estimating field and space dependencies within a first-order grid point neighborhood structure. We illustrate the ability of Gaussian Markov random fields to represent empirical estimates of field and space covariances using "witch hat" graphs. We further use the new statistic to evaluate the tropical response of a climate model (CAM3.1) to changes in two parameters important to its representation of cloud and precipitation physics. Overall, the inclusion of dependency information did not alter significantly the recognition of those regions of parameter space that best approximated observations. However, there were some qualitative differences in the shape of the response surface that suggest how such a measure could affect estimates of model uncertainty.
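A first-order grid point neighborhood structure of the kind mentioned above can be encoded as a Gaussian Markov random field precision matrix. The sketch below is a generic illustration, not the paper's exact statistic: it builds a rook-neighbor precision matrix on a small grid and evaluates a Mahalanobis-type quadratic form on a hypothetical model-minus-observation residual field:

```python
import numpy as np

# First-order (rook) neighborhood on an m x m grid.
m = 8
N = m * m
A = np.zeros((N, N))
for i in range(m):
    for j in range(m):
        k = i * m + j
        if i + 1 < m:
            A[k, k + m] = A[k + m, k] = 1  # vertical neighbor
        if j + 1 < m:
            A[k, k + 1] = A[k + 1, k] = 1  # horizontal neighbor

# Proper GMRF precision matrix Q = tau * (D - rho * A); |rho| < 1 makes
# Q strictly diagonally dominant, hence positive definite.
D = np.diag(A.sum(axis=1))
tau, rho = 1.0, 0.9
Q = tau * (D - rho * A)

# Spatially aware discrepancy between model and observations: a
# quadratic form in the residual field (hypothetical residuals here).
rng = np.random.default_rng(4)
resid = rng.normal(0, 1, N)
T = resid @ Q @ resid
```

Ignoring the spatial structure amounts to replacing Q with a diagonal matrix, which is exactly the simplification the new statistic is designed to avoid.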
A social discounting model based on Tsallis’ statistics
NASA Astrophysics Data System (ADS)
Takahashi, Taiki
2010-09-01
Social decision making (e.g. social discounting and social preferences) has been attracting attention in economics, econophysics, social physics, behavioral psychology, and neuroeconomics. This paper proposes a novel social discounting model based on the deformed algebra developed in the Tsallis’ non-extensive thermostatistics. Furthermore, it is suggested that this model can be utilized to quantify the degree of consistency in social discounting in humans and analyze the relationships between behavioral tendencies in social discounting and other-regarding economic decision making under game-theoretic conditions. Future directions in the application of the model to studies in econophysics, neuroeconomics, and social physics, as well as real-world problems such as the supply of live organ donations, are discussed.
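The deformed algebra of Tsallis statistics yields a q-exponential discount function that interpolates between exponential and hyperbolic discounting of value across social distance. A minimal sketch; k and q below are illustrative, not values fitted in the paper.

```python
import math

def q_exponential_discount(value, social_distance, k=0.1, q=0.5):
    """Social discounting with the Tsallis deformed (q-)exponential:
    V(N) = V0 / exp_q(k * N), where exp_q(x) = (1 + (1-q)x)^(1/(1-q))
    and exp_1(x) = e^x. q = 1 recovers exponential discounting; q = 0
    recovers hyperbolic discounting V0 / (1 + kN). Parameters are
    illustrative only."""
    x = k * social_distance
    if abs(q - 1.0) < 1e-12:
        return value / math.exp(x)  # classical limit q -> 1
    return value / (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))
```

The single parameter q thus quantifies how consistently a person discounts across social distance, which is the use the abstract suggests.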
Physical models of collective cell motility: from cell to tissue
NASA Astrophysics Data System (ADS)
Camley, B. A.; Rappel, W.-J.
2017-03-01
In this article, we review physics-based models of collective cell motility. We discuss a range of techniques at different scales, ranging from models that represent cells as simple self-propelled particles to phase field models that can represent a cell’s shape and dynamics in great detail. We also extensively review the ways in which cells within a tissue choose their direction, the statistics of cell motion, and some simple examples of how cell-cell signaling can interact with collective cell motility. This review also covers in more detail selected recent works on collective cell motion of small numbers of cells on micropatterns, in wound healing, and the chemotaxis of clusters of cells.
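The simplest model class reviewed, self-propelled particles that align with their neighbors, can be sketched in a few lines. This is a generic Vicsek-type toy, not any specific model from the review; the box size, interaction radius, speed, and noise amplitude are arbitrary.

```python
import math
import random

def vicsek_step(angles, positions, box=10.0, radius=1.0, speed=0.1,
                noise=0.2, rng=None):
    """One update of a minimal self-propelled-particle model: each cell
    adopts the mean heading of all neighbors within `radius`, plus
    angular noise, then moves at constant speed in a periodic box."""
    rng = rng or random.Random(0)
    new_angles = []
    for i, (xi, yi) in enumerate(positions):
        sx = sy = 0.0
        for j, (xj, yj) in enumerate(positions):
            dx = (xj - xi + box / 2) % box - box / 2  # periodic distance
            dy = (yj - yi + box / 2) % box - box / 2
            if dx * dx + dy * dy <= radius * radius:
                sx += math.cos(angles[j])
                sy += math.sin(angles[j])
        new_angles.append(math.atan2(sy, sx) + noise * (rng.random() - 0.5))
    new_positions = [((x + speed * math.cos(a)) % box,
                      (y + speed * math.sin(a)) % box)
                     for (x, y), a in zip(positions, new_angles)]
    return new_angles, new_positions
```

With zero noise, two aligned neighbors stay aligned and simply translate, which is the expected ordered-phase behavior.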
Statistical strategy for anisotropic adventitia modelling in IVUS.
Gil, Debora; Hernández, Aura; Rodriguez, Oriol; Mauri, Josepa; Radeva, Petia
2006-06-01
Vessel plaque assessment by analysis of intravascular ultrasound sequences is a useful tool for cardiac disease diagnosis and intervention. Manual detection of luminal (inner) and media-adventitia (external) vessel borders is the main activity of physicians in the process of lumen narrowing (plaque) quantification. Automated adventitia segmentation is hampered by the difficulty of defining vessel border descriptors, as well as by shadows, artifacts, and blurred signal response due to the physical properties of ultrasound. In order to efficiently approach such a complex problem, we propose blending advanced anisotropic filtering operators and statistical classification techniques into a vessel border modelling strategy. Our systematic statistical analysis shows that the reported adventitia detection achieves an accuracy in the range of interobserver variability regardless of plaque nature, vessel geometry, and incomplete vessel borders.
A physical-based gas-surface interaction model for rarefied gas flow simulation
NASA Astrophysics Data System (ADS)
Liang, Tengfei; Li, Qi; Ye, Wenjing
2018-01-01
Empirical gas-surface interaction models, such as the Maxwell model and the Cercignani-Lampis model, are widely used as boundary conditions in rarefied gas flow simulations. The accuracy of these models in predicting the macroscopic behavior of rarefied gas flows is less satisfactory in some cases, especially highly non-equilibrium ones. Molecular dynamics (MD) simulation can accurately resolve the gas-surface interaction process at the atomic scale, and hence can predict macroscopic behavior accurately; it is, however, too computationally expensive to apply to real problems. In this work, a statistical physics-based gas-surface interaction model, which complies with the basic relations of a boundary condition, is developed within the framework of the washboard model. By virtue of its physical basis, the new model captures several important relations and trends that the classic empirical models fail to reproduce. As such, the new model is much more accurate than the classic models while remaining far more efficient than MD simulations. It can therefore serve as a more accurate and efficient boundary condition for rarefied gas flow simulations.
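For contrast with the washboard-based model developed in the paper, the classic Maxwell boundary condition it improves upon is easy to sketch: a molecule reflects specularly with probability 1 - alpha and is re-emitted diffusely from a wall-temperature Maxwellian otherwise. The constants below are illustrative (the mass is roughly that of an N2 molecule), and the wall normal is taken as +z.

```python
import math
import random

def maxwell_reflect(v, wall_temp=300.0, alpha=0.8, mass=4.65e-26, rng=None):
    """Reflect an incident velocity v = (vx, vy, vz) off a wall with
    inward normal +z under the Maxwell model: specular with probability
    1 - alpha, fully diffuse with probability alpha (the accommodation
    coefficient). Diffuse re-emission draws Gaussian tangential
    components and a Rayleigh-distributed normal speed, as the wall
    Maxwellian flux requires."""
    kB = 1.380649e-23  # Boltzmann constant, J/K
    rng = rng or random.Random(0)
    vx, vy, vz = v
    if rng.random() >= alpha:
        return (vx, vy, -vz)  # specular: flip the normal component
    s = math.sqrt(kB * wall_temp / mass)  # thermal speed scale
    return (rng.gauss(0.0, s),
            rng.gauss(0.0, s),
            s * math.sqrt(-2.0 * math.log(1.0 - rng.random())))
```

Setting alpha = 0 makes every collision specular, a handy limiting case for checking the implementation.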
Comparative Research Productivity Measures for Economic Departments.
ERIC Educational Resources Information Center
Huettner, David A.; Clark, William
1997-01-01
Develops a simple theoretical model to evaluate interdisciplinary differences in research productivity between economics departments and related subjects. Compares the research publishing statistics of economics, finance, psychology, geology, physics, oceanography, chemistry, and geophysics. Considers a number of factors including journal…
Physics and financial economics (1776-2014): puzzles, Ising and agent-based models.
Sornette, Didier
2014-06-01
This short review presents a selected history of the mutual fertilization between physics and economics--from Isaac Newton and Adam Smith to the present. The fundamentally different perspectives embraced in theories developed in financial economics compared with physics are dissected with the examples of the volatility smile and of the excess volatility puzzle. The role of the Ising model of phase transitions to model social and financial systems is reviewed, with the concepts of random utilities and the logit model as the analog of the Boltzmann factor in statistical physics. Recent extensions in terms of quantum decision theory are also covered. A wealth of models are discussed briefly that build on the Ising model and generalize it to account for the many stylized facts of financial markets. A summary of the relevance of the Ising model and its extensions is provided to account for financial bubbles and crashes. The review would be incomplete if it did not cover the dynamical field of agent-based models (ABMs), also known as computational economic models, of which the Ising-type models are just special ABM implementations. We formulate the 'Emerging Intelligence Market Hypothesis' to reconcile the pervasive presence of 'noise traders' with the near efficiency of financial markets. Finally, we note that evolutionary biology, more than physics, is now playing a growing role to inspire models of financial markets.
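The analogy drawn above between the logit model and the Boltzmann factor can be written out directly: choice probabilities are a softmax over utilities, with the noise level playing the role of temperature and the normalizer playing the role of a partition function. A minimal sketch with hypothetical utilities.

```python
import math

def logit_choice_probabilities(utilities, noise_level=1.0):
    """Logit choice probabilities P_i ∝ exp(U_i / T): utilities play the
    role of (negative) energies and noise_level the role of temperature,
    which is exactly the Boltzmann-factor analogy of the review."""
    m = max(utilities)  # subtract the max for numerical stability
    weights = [math.exp((u - m) / noise_level) for u in utilities]
    z = sum(weights)    # 'partition function'
    return [w / z for w in weights]
```

As the noise level drops toward zero, the choice becomes nearly deterministic, the analogue of a low-temperature limit.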
NASA Astrophysics Data System (ADS)
Young, S. L.; Kress, B. T.; Rodriguez, J. V.; McCollough, J. P.
2013-12-01
Operational specifications of space environmental hazards can be an important input for decision makers. Ideally, the specification would come from on-board sensors; for satellites without that capability, another option is to map data from remote observations to the location of the satellite. This requires a model of the physical environment and an understanding of its accuracy for mapping applications. We present a statistical comparison between magnetic field model mappings of solar energetic particle observations made by NOAA's Geostationary Operational Environmental Satellites (GOES) to the location of the Combined Release and Radiation Effects Satellite (CRRES). Because CRRES followed a geosynchronous transfer orbit that precessed in local time, we can examine the model accuracy between LEO and GEO orbits across a range of local times. We examine the accuracy of multiple magnetic field models using a variety of statistics and assess their utility for operational purposes.
Statistical Physics Experiments Using Dusty Plasmas
NASA Astrophysics Data System (ADS)
Goree, John
2016-10-01
Compared to other areas of physics research, Statistical Physics is heavily dominated by theory, with comparatively little experiment. One reason for the lack of experiments is the impracticality of tracking individual atoms and molecules within a substance. Thus, there is a need for a different kind of experimental system, one where individual particles not only move stochastically as they collide with one another, but also are large enough to allow tracking. A dusty plasma can meet this need. A dusty plasma is a partially ionized gas containing small particles of solid matter. These micron-size particles gain thousands of electronic charges by collecting more electrons than ions. Their motions are dominated by Coulomb collisions with neighboring particles. In this so-called strongly coupled plasma, the dust particles self-organize in much the same way as atoms in a liquid or solid. Unlike atoms, however, these particles are large and slow, so that they can be tracked easily by video microscopy. Advantages of dusty plasma for experimental statistical physics research include particle tracking, lack of frictional contact with solid surfaces, and avoidance of overdamped motion. Moreover, the motion of a collection of dust particles can mimic an equilibrium system with a Maxwellian velocity distribution, even though the dust particles themselves are not truly in thermal equilibrium. Nonequilibrium statistical physics can be studied by applying gradients, for example by imposing a shear flow. In this talk I will review some of our recent experiments with shear flow. First, we performed the first experimental test to verify the Fluctuation Theorem for a shear flow, showing that brief violations of the Second Law of Thermodynamics occur with the predicted probabilities for a small system. Second, we discovered a skewness of the shear-stress distribution in a shear flow. This skewness is a phenomenon that likely has wide applicability in nonequilibrium steady states. 
Third, we performed the first experimental test of a statistical physics theory (the Green-Kubo model) that is widely used by physical chemists to compute viscosity coefficients, and we found that it fails. Work supported by the U.S. Department of Energy, NSF, and NASA.
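The Green-Kubo relation tested in these experiments computes a transport coefficient as the time integral of an equilibrium autocorrelation function, here of the off-diagonal (shear) stress. A generic numerical sketch follows; the prefactor standing in for V/(k_B T) is left symbolic, and this is not the paper's actual analysis.

```python
def green_kubo_integral(stress, dt, prefactor=1.0):
    """Green-Kubo estimate: viscosity ≈ prefactor * ∫ C(t) dt, where
    C(t) is the autocorrelation of a time series of off-diagonal stress
    P_xy sampled every dt. Lags are truncated at half the record length
    to keep each estimate well averaged."""
    n = len(stress)
    mean = sum(stress) / n
    s = [x - mean for x in stress]
    acf = []
    for lag in range(n // 2):
        c = sum(s[i] * s[i + lag] for i in range(n - lag)) / (n - lag)
        acf.append(c)
    # Trapezoidal time integration of the autocorrelation function.
    integral = dt * (acf[0] / 2.0 + sum(acf[1:-1]) + acf[-1] / 2.0)
    return prefactor * integral
```

A stress record with positive short-time correlation yields a positive integral, while an uncorrelated or anti-correlated record integrates toward zero.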
Path integral molecular dynamics for exact quantum statistics of multi-electronic-state systems.
Liu, Xinzijian; Liu, Jian
2018-03-14
An exact approach to compute physical properties for general multi-electronic-state (MES) systems in thermal equilibrium is presented. The approach is extended from our recent progress on path integral molecular dynamics (PIMD), Liu et al. [J. Chem. Phys. 145, 024103 (2016)] and Zhang et al. [J. Chem. Phys. 147, 034109 (2017)], for quantum statistical mechanics when a single potential energy surface is involved. We first define an effective potential function that is numerically favorable for MES-PIMD and then derive corresponding estimators in MES-PIMD for evaluating various physical properties. Its application to several representative one-dimensional and multi-dimensional models demonstrates that MES-PIMD in principle offers a practical tool in either the diabatic or the adiabatic representation for studying exact quantum statistics of complex/large MES systems when the Born-Oppenheimer approximation, Condon approximation, and harmonic bath approximation are broken.
Estimating extreme river discharges in Europe through a Bayesian network
NASA Astrophysics Data System (ADS)
Paprotny, Dominik; Morales-Nápoles, Oswaldo
2017-06-01
Large-scale hydrological modelling of flood hazards requires adequate extreme discharge data. In practice, models based on physics are applied alongside those utilizing only statistical analysis. The former require enormous computational power, while the latter are mostly limited in accuracy and spatial coverage. In this paper we introduce an alternative statistical approach based on Bayesian networks (BNs), a graphical model for dependent random variables. We use a non-parametric BN to describe the joint distribution of extreme discharges in European rivers and variables representing the geographical characteristics of their catchments. Annual maxima of daily discharges from more than 1800 river gauges (stations with catchment areas ranging from 1.4 to 807,000 km2) were collected, together with information on terrain, land use and local climate. The (conditional) correlations between the variables are modelled through copulas, with the dependency structure defined in the network. The results show that using this method, mean annual maxima and return periods of discharges could be estimated with an accuracy similar to existing studies using physical models for Europe and better than a comparable global statistical model. Performance of the model varies slightly between regions of Europe, but is consistent between different time periods, and remains the same in a split-sample validation. Though discharge prediction under climate change is not the main scope of this paper, the BN was applied to a large domain covering all sizes of rivers in the continent both for present and future climate, as an example. Results show substantial variation in the influence of climate change on river discharges. The model can be used to provide quick estimates of extreme discharges at any location for the purpose of obtaining input information for hydraulic modelling.
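The copula-based dependence modelling described above can be illustrated with its basic building block: a bivariate Gaussian copula coupling two uniform marginals. This is a generic sketch, not the paper's fitted Bayesian network; rho, the sample size, and the seed are arbitrary.

```python
import math
import random

def gaussian_copula_pairs(rho, n=2000, seed=42):
    """Sample n pairs (u, v) of uniform marginals coupled through a
    bivariate Gaussian copula with correlation rho, the kind of
    (conditional) dependence a non-parametric BN encodes between
    catchment descriptors and extreme discharge."""
    rng = random.Random(seed)

    def phi(x):  # standard normal CDF
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        pairs.append((phi(z1), phi(z2)))  # push through the CDF: uniform margins
    return pairs

pairs = gaussian_copula_pairs(0.7)
```

Each margin is uniform on (0, 1) regardless of rho, while the joint dependence survives the marginal transform, which is exactly the separation of margins and dependence that copulas provide.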
Jacob LaFontaine; Lauren Hay; Stacey Archfield; William Farmer; Julie Kiang
2016-01-01
The U.S. Geological Survey (USGS) has developed a National Hydrologic Model (NHM) to support coordinated, comprehensive and consistent hydrologic model development, and facilitate the application of hydrologic simulations within the continental US. The portion of the NHM located within the Gulf Coastal Plains and Ozarks Landscape Conservation Cooperative (GCPO LCC) is...
Territorial Developments Based on Graffiti: a Statistical Mechanics Approach
2011-10-28
...defined on a lattice. We introduce a two-gang Hamiltonian model where agents have red or blue affiliation but are otherwise indistinguishable. ... ramifications of our results. Keywords: Territorial Formation, Spin Systems, Phase Transitions. ... In short, lattice models have proved extremely useful in the context of the physical, biological and even chemical sciences.
Greenwald, Jeffrey L; Cronin, Patrick R; Carballo, Victoria; Danaei, Goodarz; Choy, Garry
2017-03-01
With the increasing focus on reducing hospital readmissions in the United States, numerous readmission risk prediction models have been proposed, mostly developed through analyses of structured data fields in electronic medical records and administrative databases. Three areas that may have an impact on readmission but are poorly captured using structured data sources are patients' physical function, cognitive status, and psychosocial environment and support. The objective of the study was to build a discriminative model using information germane to these 3 areas to identify hospitalized patients' risk for 30-day all-cause readmission. We conducted clinician focus groups to identify language used in the clinical record regarding these 3 areas. We then created a dataset including 30,000 inpatients, 10,000 from each of 3 hospitals, and searched those records for the focus-group-derived language using natural language processing. A 30-day readmission prediction model was developed on 75% of the dataset and validated on the other 25%, and also on hospital-specific subsets. Focus group language was aggregated into 35 variables. The final model had 16 variables, a validated C-statistic of 0.74, and was well calibrated. Subset validation of the model by hospital yielded C-statistics of 0.70-0.75. Deriving a 30-day readmission risk prediction model through identification of physical, cognitive, and psychosocial issues using natural language processing yielded a model that performs similarly to the better-performing models previously published, with the added advantage of being based on clinically relevant factors and also automated and scalable. Because of the clinical relevance of the variables in the model, future research may be able to test whether targeting interventions to identified risks results in reductions in readmissions.
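The validated C-statistic of 0.74 reported above is the concordance probability, computable directly from model risk scores. A minimal sketch; the scores in the test are hypothetical, not study data.

```python
def c_statistic(scores_pos, scores_neg):
    """Concordance (C) statistic: the probability that a randomly chosen
    positive case (here, a readmitted patient) receives a higher risk
    score than a randomly chosen negative case, counting ties as 1/2.
    Equivalent to the area under the ROC curve."""
    concordant = 0.0
    for p in scores_pos:
        for q in scores_neg:
            if p > q:
                concordant += 1.0
            elif p == q:
                concordant += 0.5
    return concordant / (len(scores_pos) * len(scores_neg))
```

A value of 0.5 means no discrimination and 1.0 means perfect separation, which puts the paper's 0.74 in context.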
New mathematics for old physics: The case of lattice fluids
NASA Astrophysics Data System (ADS)
Barberousse, Anouk; Imbert, Cyrille
2013-08-01
We analyze the effects of the introduction of new mathematical tools on an old branch of physics by focusing on lattice fluids, which are cellular automata (CA)-based hydrodynamical models. We examine the nature of these discrete models, the type of novelty they bring about within scientific practice and the role they play in the field of fluid dynamics. We critically analyze Rohrlich's, Fox Keller's and Hughes' claims about CA-based models. We distinguish between different senses of the predicates "phenomenological" and "theoretical" for scientific models and argue that it is erroneous to conclude, as they do, that CA-based models are necessarily phenomenological in any sense of the term. We conversely claim that CA-based models of fluids, though at first sight blatantly misrepresenting fluids, are in fact conservative as far as the basic laws of statistical physics are concerned and no less theoretical than more traditional models in the field. Based on our case study, we propose a general discussion of the prospects of CA for modeling in physics. We finally emphasize that lattice fluids are not just exotic oddities but do bring about new advantages in the investigation of fluids' behavior.
Extracting Primordial Non-Gaussianity from Large Scale Structure in the Post-Planck Era
NASA Astrophysics Data System (ADS)
Dore, Olivier
Astronomical observations have become a unique tool to probe fundamental physics. Cosmology, in particular, emerged as a data-driven science whose phenomenological modeling has achieved great success: in the post-Planck era, key cosmological parameters are measured to percent precision. A single model reproduces a wealth of astronomical observations involving very distinct physical processes at different times. This success leads to fundamental physical questions. One of the most salient is the origin of the primordial perturbations that grew to form the large-scale structures we now observe. More and more cosmological observables point to inflationary physics as the origin of the structure observed in the universe. Inflationary physics predicts the statistical properties of the primordial perturbations, which are thought to be slightly non-Gaussian. The detection of this small deviation from Gaussianity represents the next frontier in early Universe physics. To measure it would provide direct, unique and quantitative insights about the physics at play when the Universe was only a fraction of a second old, thus probing energies untouchable otherwise. On par with the well-known relic gravitational wave radiation (the famous "B-modes"), it is one of the few probes of inflation. This departure from Gaussianity leads to a very specific signature in the large-scale clustering of galaxies. Observing large-scale structure, we can thus establish a direct connection with fundamental theories of the early universe. In the post-Planck era, large-scale structures are our most promising pathway to measuring this primordial signal. Current estimates suggest that the next generation of space- or ground-based large-scale structure surveys (e.g. the ESA EUCLID or NASA WFIRST missions) might enable a detection of this signal. This potentially huge payoff requires us to solidify the theoretical predictions supporting these measurements. 
Even if the exact signal we are looking for is of unknown amplitude, it is obvious that we must measure it as well as these groundbreaking data sets will permit. We propose to develop the supporting theoretical work to the point where the complete non-Gaussian signature can be extracted from these data sets. We will do so by developing three complementary directions: - We will develop the appropriate formalism to measure and model galaxy clustering on the largest scales. - We will study the impact of non-Gaussianity on higher-order statistics, the most promising statistics for our purpose. - We will make explicit the connection between these observables and the microphysics of a large class of inflation models, and also identify fundamental limitations to this interpretation.
Building mental models by dissecting physical models.
Srivastava, Anveshna
2016-01-01
When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions, require greater supervision to ensure focused learning; models that are too constrained require less supervision, but can be constructed mechanically, with little to no conceptual engagement. We propose "model-dissection" as an alternative to "model-building," whereby instructors could make efficient use of supervisory resources, while simultaneously promoting focused learning. We report empirical results from a study conducted with biology undergraduate students, where we demonstrate that asking them to "dissect" out specific conceptual structures from an already built 3D physical model leads to a greater improvement in performance than asking them to build the 3D model from simpler components. Using questionnaires to measure understanding both before and after model-based interventions for two cohorts of students, we find that both the "builders" and the "dissectors" improve in the post-test, but it is the latter group who show statistically significant improvement. These results, in addition to the intrinsic time-efficiency of "model dissection," suggest that it could be a valuable pedagogical tool. © 2015 The International Union of Biochemistry and Molecular Biology.
Urns and Chameleons: two metaphors for two different types of measurements
NASA Astrophysics Data System (ADS)
Accardi, Luigi
2013-09-01
The awareness of the physical possibility of models of space alternative to the Euclidean one began to emerge towards the end of the 19th century. At the end of the 20th century a similar awareness emerged concerning the physical possibility of models of the laws of chance alternative to the classical probabilistic model (the Kolmogorov model). In geometry, the mathematical construction of several non-Euclidean models of space preceded their applications in physics, which came with the theory of relativity, by about a century. In physics the opposite situation took place. In fact, while the first examples of non-Kolmogorov probabilistic models emerged in quantum physics approximately one century ago, at the beginning of the 1900s, the awareness that this new mathematical formalism reflected a new mathematical model of the laws of chance had to wait until the early 1980s. In this long time interval the classical and the new probabilistic models were both used in the description and interpretation of quantum phenomena and negatively interfered with each other because of the absence, for many decades, of a mathematical theory that clearly delimited their respective domains of application. The result of this interference was the emergence of the so-called "paradoxes of quantum theory". For several decades there have been many different attempts to solve these paradoxes, giving rise to what K. Popper baptized "the great quantum muddle": a debate which has been at the core of the philosophy of science for more than 50 years. However, these attempts have led to contradictions between the two fundamental theories of contemporary physics: quantum theory and the theory of relativity. 
Quantum probability identifies the reason for the emergence of non-Kolmogorov models, and therefore of the so-called paradoxes of quantum theory, in the difference between passive measurements that "read pre-existent properties" (the urn metaphor) and measurements that read "a response to an interaction" (the chameleon metaphor). The non-trivial point is that one can prove that, while the urn scheme cannot lead to empirical data outside classical probability, response-based measurements can give rise to non-classical statistics. The talk will include entirely classical examples of non-classical statistics and potential applications to economic, sociological or biomedical phenomena.
The effect of the flipped model on achievement in an introductory college physics course
NASA Astrophysics Data System (ADS)
Winter, Joshua Brian
The flipped or inverted classroom model is one in which the time and place for traditional lecture and homework are reversed. Traditional lecture is replaced by online videos assigned as homework. This frees up time in class to be spent with more student centered activities such as discussion based concept questions and group problem solving. While growing in popularity, research on the effectiveness of this format is sparse. In this quasi-experimental study, two sections of an introductory algebra-based college physics course were examined over a five week period. Each section was taught with either the traditional or flipped model and physics knowledge achieved was compared using independent samples t-tests on both the instructor's unit exam and the Mechanics Baseline Test pre/posttest normalized gain. Results indicated that there was no statistically significant difference between the flipped model and the traditional lecture format. Avenues for further research are discussed.
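The outcome measure used in the study, the pre/post normalized gain on the Mechanics Baseline Test, is straightforward to compute. A minimal sketch of the measure only; the study's independent-samples t-tests are not reproduced, and the scores below are hypothetical.

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain g = (post - pre) / (max - pre) for a
    pre/post concept inventory, in percent-score units: the fraction of
    the available improvement a student actually achieved."""
    if pre >= max_score:
        raise ValueError("pre-test score already at ceiling")
    return (post - pre) / (max_score - pre)

def mean_gain(pre_scores, post_scores, max_score=100.0):
    """Class-average normalized gain over paired pre/post scores."""
    gains = [normalized_gain(a, b, max_score)
             for a, b in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)
```

Normalizing by the available headroom lets sections with different pre-test scores be compared on one scale, which is why the study uses gain rather than raw post-test scores.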
An undergraduate course, and new textbook, on ``Physical Models of Living Systems''
NASA Astrophysics Data System (ADS)
Nelson, Philip
2015-03-01
I'll describe an intermediate-level course on ``Physical Models of Living Systems.'' The only prerequisite is first-year university physics and calculus. The course is a response to rapidly growing interest among undergraduates in several science and engineering departments. Students acquire several research skills that are often not addressed in traditional courses, including: basic modeling skills, probabilistic modeling skills, data analysis methods, computer programming using a general-purpose platform like MATLAB or Python, dynamical systems, particularly feedback control. These basic skills, which are relevant to nearly any field of science or engineering, are presented in the context of case studies from living systems, including: virus dynamics; bacterial genetics and evolution of drug resistance; statistical inference; superresolution microscopy; synthetic biology; naturally evolved cellular circuits. Publication of a new textbook by WH Freeman and Co. is scheduled for December 2014. Supported in part by EF-0928048 and DMR-0832802.
Forecasting Ocean Waves: Comparing a Physics-Based Model with Statistical Models
2011-01-01
Buoy stations referenced include 46029 (135 m), 46211 (38 m; CDIP 036), 42039 (307 m), 42040 (165 m), and 42007 (14 m), with boundary forcing from NCEP WW3 ENP at 15′×15′ resolution into SWAN. Acronyms and abbreviations: CenGOOS, Central Gulf Ocean Observing System; CDIP, Coastal Data Information Program; CNW, Coastal Northwest; SWAN, Simulating WAves Nearshore.
Geospace environment modeling 2008--2009 challenge: Dst index
Rastätter, L.; Kuznetsova, M.M.; Glocer, A.; Welling, D.; Meng, X.; Raeder, J.; Wittberger, M.; Jordanova, V.K.; Yu, Y.; Zaharia, S.; Weigel, R.S.; Sazykin, S.; Boynton, R.; Wei, H.; Eccles, V.; Horton, W.; Mays, M.L.; Gannon, J.
2013-01-01
This paper reports the metrics-based results of the Dst index part of the 2008–2009 GEM Metrics Challenge. The 2008–2009 GEM Metrics Challenge asked modelers to submit results for four geomagnetic storm events and five different types of observations that can be modeled by statistical, climatological or physics-based models of the magnetosphere-ionosphere system. We present the results of 30 model settings that were run at the Community Coordinated Modeling Center and at the institutions of various modelers for these events. To measure the performance of each of the models against the observations, we use comparisons of 1 hour averaged model data with the Dst index issued by the World Data Center for Geomagnetism, Kyoto, Japan, and direct comparison of 1 minute model data with the 1 minute Dst index calculated by the United States Geological Survey. The latter index can be used to calculate spectral variability of model outputs in comparison to the index. We find that model rankings vary widely by skill score used. None of the models consistently perform best for all events. We find that empirical models perform well in general. Magnetohydrodynamics-based models of the global magnetosphere with inner magnetosphere physics (ring current model) included and stand-alone ring current models with properly defined boundary conditions perform well and are able to match or surpass results from empirical models. Unlike in similar studies, the statistical models used in this study found their challenge in the weakest events rather than the strongest events.
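One common measure of the kind this challenge varies across models is a mean-squared-error skill score against a reference prediction. A generic sketch with the observed mean (climatology) as the default reference; this default is an assumption for illustration, not the challenge's exact metric definitions.

```python
def skill_score(observed, modeled, reference=None):
    """MSE skill score 1 - MSE(model)/MSE(reference): 1.0 for a perfect
    model, 0.0 for a model no better than the reference, negative for a
    model worse than the reference. The reference defaults to the
    constant climatological mean of the observations."""
    n = len(observed)
    if reference is None:
        m = sum(observed) / n
        reference = [m] * n
    mse_model = sum((o - p) ** 2 for o, p in zip(observed, modeled)) / n
    mse_ref = sum((o - r) ** 2 for o, r in zip(observed, reference)) / n
    return 1.0 - mse_model / mse_ref
```

Because the score depends on the chosen reference and error norm, different skill scores can rank the same models differently, consistent with the paper's finding that rankings vary widely by metric.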
Tilson, Julie K; Marshall, Katie; Tam, Jodi J; Fetters, Linda
2016-04-22
A primary barrier to the implementation of evidence based practice (EBP) in physical therapy is therapists' limited ability to understand and interpret statistics. Physical therapists demonstrate limited skills and report low self-efficacy for interpreting results of statistical procedures. While standards for physical therapist education include statistics, little empirical evidence is available to inform what should constitute such curricula. The purpose of this study was to conduct a census of the statistical terms and study designs used in physical therapy literature and to use the results to make recommendations for curricular development in physical therapist education. We conducted a bibliometric analysis of 14 peer-reviewed journals associated with the American Physical Therapy Association over 12 months (Oct 2011-Sept 2012). Trained raters recorded every statistical term appearing in identified systematic reviews, primary research reports, and case series and case reports. Investigator-reported study design was also recorded. Terms representing the same statistical test or concept were combined into a single, representative term. Cumulative percentage was used to identify the most common representative statistical terms. Common representative terms were organized into eight categories to inform curricular design. Of 485 articles reviewed, 391 met the inclusion criteria. These 391 articles used 532 different terms which were combined into 321 representative terms; 13.1 (sd = 8.0) terms per article. Eighty-one representative terms constituted 90% of all representative term occurrences. Of the remaining 240 representative terms, 105 (44%) were used in only one article. The most common study design was prospective cohort (32.5%). Physical therapy literature contains a large number of statistical terms and concepts for readers to navigate. However, in the year sampled, 81 representative terms accounted for 90% of all occurrences. 
These "common representative terms" can be used to inform curricula to promote physical therapists' skills, competency, and confidence in interpreting statistics in their professional literature. We make specific recommendations for curriculum development informed by our findings.
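The cumulative-percentage step described above, finding the smallest set of representative terms that covers 90% of occurrences, can be sketched in a few lines (the term tally is invented):

```python
from collections import Counter

def covering_terms(occurrences, coverage=0.90):
    """Smallest set of terms, most frequent first, accounting for at least
    `coverage` of all term occurrences."""
    counts = Counter(occurrences)
    total = sum(counts.values())
    chosen, covered = [], 0
    for term, n in counts.most_common():
        chosen.append(term)
        covered += n
        if covered / total >= coverage:
            break
    return chosen

# toy tally: 6 "mean", 3 "sd", 1 "anova" -> "mean" + "sd" already reach 90%
tally = ["mean"] * 6 + ["sd"] * 3 + ["anova"]
core = covering_terms(tally)   # ["mean", "sd"]
```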
NASA Astrophysics Data System (ADS)
Navascués, Miguel
2014-02-01
In 2003, Leggett introduced his model of crypto-nonlocality based on considerations on the reality of photon polarization [A. J. Leggett, Found. Phys. 33, 1469 (2003), 10.1023/A:1026096313729]. In this paper, we prove that, contrary to hints in subsequent literature, crypto-nonlocality does not follow naturally from the postulate that polarization is a realistic variable. More explicitly, consider physical theories where (a) faster-than-light communication is impossible, (b) all physical photon states have a definite polarization, and (c) given two separate photons, if we measure one of them and post-select on the result, the measurement statistics of the remaining system correspond to a photon state. We show that the outcomes of any two-photon polarization experiment in these theories must follow the statistics generated by measuring a separable two-qubit quantum state. Consequently, in such experiments any instance of entanglement detection—and not necessarily a Leggett inequality violation—can be regarded as a refutation of this class of theories.
Automated sampling assessment for molecular simulations using the effective sample size
Zhang, Xin; Bhatt, Divesh; Zuckerman, Daniel M.
2010-01-01
To quantify the progress in the development of algorithms and forcefields used in molecular simulations, a general method for the assessment of sampling quality is needed. Statistical mechanics principles suggest that the populations of physical states characterize equilibrium sampling in a fundamental way. We therefore develop an approach for analyzing the variances in state populations, which quantifies the degree of sampling in terms of the effective sample size (ESS). The ESS estimates the number of statistically independent configurations contained in a simulated ensemble. The method is applicable to both traditional dynamics simulations as well as more modern (e.g., multicanonical) approaches. Our procedure is tested in a variety of systems from toy models to atomistic protein simulations. We also introduce a simple automated procedure to obtain approximate physical states from dynamic trajectories: this allows sample-size estimation in systems for which physical states are not known in advance. PMID:21221418
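A minimal sketch of the state-population-variance idea, assuming a binary in-state/out-of-state indicator and a simple contiguous-block estimate of the population variance (a simplification of the paper's procedure):

```python
import random

def effective_sample_size(traj, n_blocks=100):
    """ESS from the variance of a state's population across contiguous
    trajectory blocks: for n_eff independent samples, Var(p_hat) = p(1-p)/n_eff."""
    n = len(traj) // n_blocks
    pops = [sum(traj[i * n:(i + 1) * n]) / n for i in range(n_blocks)]
    p = sum(pops) / n_blocks
    var = sum((q - p) ** 2 for q in pops) / (n_blocks - 1)
    return n_blocks * p * (1.0 - p) / var

random.seed(1)
traj = [random.random() < 0.5 for _ in range(100_000)]  # uncorrelated toy "trajectory"
ess = effective_sample_size(traj)   # roughly the true count of 100,000
```

For a correlated molecular-dynamics trajectory the same estimator returns an ESS far below the number of saved frames.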
Millimeter wave attenuation prediction using a piecewise uniform rain rate model
NASA Technical Reports Server (NTRS)
Persinger, R. R.; Stutzman, W. L.; Bostian, C. W.; Castle, R. E., Jr.
1980-01-01
A piecewise uniform rain rate distribution model is introduced as a quasi-physical model of real rain along earth-space millimeter wave propagation paths. It permits calculation of the total attenuation from specific attenuation in a simple fashion. The model predictions are verified by comparison with direct attenuation measurements for several frequencies, elevation angles, and locations. Also, coupled with the Rice-Holmberg rain rate model, attenuation statistics are predicted from rainfall accumulation data.
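The piecewise uniform idea, total attenuation as the sum of per-segment specific attenuation, can be sketched as follows; the power-law form a = kR^α and the coefficient values are a standard ITU-style assumption for illustration, not the paper's exact formulation:

```python
def total_attenuation(segments, k=0.0188, alpha=1.217):
    """Total path attenuation (dB): sum over segments of length (km) times the
    specific attenuation a = k * R**alpha (dB/km) for that segment's uniform
    rain rate R (mm/h). k and alpha are illustrative power-law coefficients;
    real values depend on frequency and polarization."""
    return sum(length * k * rate ** alpha for length, rate in segments)

# hypothetical path: 2 km at 10 mm/h, 1 km at 50 mm/h, 3 km rain-free
atten_db = total_attenuation([(2.0, 10.0), (1.0, 50.0), (3.0, 0.0)])
```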
Objectively-Measured Physical Activity and Cognitive Functioning in Breast Cancer Survivors
Marinac, Catherine R.; Godbole, Suneeta; Kerr, Jacqueline; Natarajan, Loki; Patterson, Ruth E.; Hartman, Sheri J.
2015-01-01
Purpose To explore the relationship between objectively measured physical activity and cognitive functioning in breast cancer survivors. Methods Participants were 136 postmenopausal breast cancer survivors. Cognitive functioning was assessed using a comprehensive computerized neuropsychological test. Physical activity was assessed over 7 days using hip-worn accelerometers. Linear regression models examined associations between minutes per day of physical activity at various intensities and individual cognitive functioning domains. The partially adjusted model controlled for primary confounders (model 1), and subsequent adjustments were made for chemotherapy history (model 2) and BMI (model 3). Interaction and stratified models examined BMI as an effect modifier. Results Moderate-to-vigorous physical activity (MVPA) was associated with Information Processing Speed. Specifically, ten minutes of MVPA was associated with a 1.35-point higher score (out of 100) on the Information Processing Speed domain in the partially adjusted model, and a 1.29-point higher score when chemotherapy was added to the model (both p<.05). There was a significant BMI × MVPA interaction (p=.051). In models stratified by BMI (<25 vs. ≥25 kg/m2), the favorable association between MVPA and Information Processing Speed was stronger in the subsample of overweight and obese women (p<.05), but not statistically significant in the leaner subsample. Light-intensity physical activity was not significantly associated with any of the measured domains of cognitive function. Conclusions MVPA may have favorable effects on Information Processing Speed in breast cancer survivors, particularly among overweight or obese women. Implications for Cancer Survivors Interventions targeting increased physical activity may enhance aspects of cognitive function among breast cancer survivors. PMID:25304986
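A sketch of the kind of interaction model described above (a BMI × MVPA product term in an ordinary-least-squares fit), on invented, noise-free data; the coefficients and design are illustrative, not the study's:

```python
def ols(X, y):
    """Least squares via normal equations (X^T X) b = X^T y,
    solved by Gaussian elimination with partial pivoting."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)]
         + [sum(r[i] * yi for r, yi in zip(X, y))] for i in range(k)]
    for i in range(k):
        piv = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            A[r] = [a - f * b for a, b in zip(A[r], A[i])]
    b = [0.0] * k
    for i in reversed(range(k)):
        b[i] = (A[i][k] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    return b

# hypothetical, noise-free data: score = 50 + 0.1*MVPA + 0.0*BMI + 0.03*MVPA*BMI
rows, scores = [], []
for mvpa in (0, 10, 20, 30):
    for bmi in (22, 27, 32):
        rows.append([1.0, mvpa, bmi, mvpa * bmi])
        scores.append(50 + 0.1 * mvpa + 0.0 * bmi + 0.03 * mvpa * bmi)
b0, b1, b2, b3 = ols(rows, scores)   # recovers the generating coefficients
```

A nonzero product-term coefficient (b3 here) is what "BMI is an effect modifier" means in regression terms: the MVPA slope changes with BMI.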
Applied technology section. Monthly report, March 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckner, M.R.
1994-04-20
This is a monthly report detailing research currently being conducted at the Savannah River Technology Center. The research areas covered are engineering modeling and simulation, applied statistics, applied physics, experimental thermal hydraulics, and packaging and transportation.
Statistical physics approach to earthquake occurrence and forecasting
NASA Astrophysics Data System (ADS)
de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio
2016-04-01
There is striking evidence that the dynamics of the Earth crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimensions, and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow one to discriminate among different physical mechanisms responsible for earthquake triggering.
In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different levels of prediction. In this review we also briefly discuss how the statistical mechanics approach can be applied to non-tectonic earthquakes and to other natural stochastic processes, such as volcanic eruptions and solar flares.
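One of the phenomenological laws underpinning such models, the Gutenberg-Richter magnitude distribution, can be sampled and its b-value recovered in a few lines; the b-value, magnitude cutoff, and sample size below are illustrative:

```python
import math
import random

def sample_magnitudes(n, b=1.0, m_min=2.0, rng=random):
    """Inverse-transform samples from the Gutenberg-Richter law
    N(>M) ~ 10**(-b*(M - m_min)); M - m_min is exponentially distributed."""
    return [m_min - math.log10(1.0 - rng.random()) / b for _ in range(n)]

def b_value_mle(mags, m_min=2.0):
    """Maximum-likelihood b-value estimate: b = log10(e) / mean(M - m_min)."""
    mean_excess = sum(m - m_min for m in mags) / len(mags)
    return math.log10(math.e) / mean_excess

random.seed(42)
b_hat = b_value_mle(sample_magnitudes(100_000))   # close to the true b = 1.0
```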
James, Andrew I W; Young, Andrew W
2013-01-01
To explore the relationships between verbal aggression, physical aggression and inappropriate sexual behaviour following acquired brain injury. Multivariate statistical modelling of observed verbal aggression, physical aggression and inappropriate sexual behaviour utilizing demographic, pre-morbid, injury-related and neurocognitive predictors. Clinical records of 152 participants with acquired brain injury were reviewed, providing an important data set as disordered behaviours had been recorded at the time of occurrence with the Brain Injury Rehabilitation Trust (BIRT) Aggression Rating Scale and complementary measures of inappropriate sexual behaviour. Three behavioural components (verbal aggression, physical aggression and inappropriate sexual behaviour) were identified and subjected to separate logistic regression modelling in a sub-set of 77 participants. Successful modelling was achieved for both verbal and physical aggression (correctly classifying 74% and 65% of participants, respectively), with use of psychotropic medication and poorer verbal function increasing the odds of aggression occurring. Pre-morbid history of aggression predicted verbal but not physical aggression. No variables predicted inappropriate sexual behaviour. Verbal aggression, physical aggression and inappropriate sexual behaviour following acquired brain injury appear to reflect separate clinical phenomena rather than general behavioural dysregulation. Clinical markers that indicate an increased risk of post-injury aggression were not related to inappropriate sexual behaviour.
Fracture Networks from a deterministic physical model as 'forerunners' of Maze Caves
NASA Astrophysics Data System (ADS)
Ferer, M. V.; Smith, D. H.; Lace, M. J.
2013-12-01
"Fractures are the chief forerunners of caves because they transmit water much more rapidly than intergranular pores" [1]. Thus, cave networks can follow the fracture networks from which Karst caves formed by a variety of processes. Traditional models of continental Karst define water flow through subsurface geologic formations, slowly dissolving the rock along the pathways (e.g. water saturated with respect to carbon dioxide flowing through fractured carbonate formations). We have developed a deterministic, physical model of fracturing in a model geologic layer of a given thickness, when that layer is strained in one direction and subsequently in a perpendicular direction. It was observed that the connected fracture networks from our model visually resemble maps of maze caves. Since these detailed cave maps offer critical tools in modeling cave development patterns and conduit flow in Karst systems, we were able to test the qualitative resemblance by using statistical analyses to compare our model networks in geologic layers of four different thicknesses with the corresponding statistical analyses of four different maze caves, formed in a variety of geologic settings. The statistical studies performed are: i) standard box-counting to determine if either the caves or the model networks are fractal. We found that both are fractal with a fractal dimension Df ≈ 1.75. ii) for each section inside a closed path, we determined the area and perimeter-length, enabling a study of the tortuosity of the networks. From the dependence of the section's area upon its perimeter-length, we have found a power-law behavior (for sufficiently large sections) characterized by a 'tortuosity' exponent. These exponents have similar values for both the model networks and the maze caves. The best agreement is between our thickest model layer and the maze-like part of Wind Cave in South Dakota, where the data from the model and the cave overlie each other.
For the present networks from the physical model, we assumed that the geologic layer was of uniform thickness and that the strains in the two directions were the same. The latter may not be the case for the Brazilian cave Toca da Boa Vista. These assumptions can be easily modified in our computer code to reflect different geologic histories. Even so, the quantitative agreement suggests that our model networks are statistically realistic both for the 'forerunners' of caves and for general fracture networks in geologic layers, which should assist the study of underground fluid flow in many applications for which fracture patterns and fluid flow are difficult to determine (e.g., hydrology, watershed management, oil recovery, carbon dioxide sequestration, etc.). Keywords: Fracture Networks, Karst, Caves, Structurally Variable Pathways, hydrogeological modeling. [1] Arthur N. Palmer, Cave Geology, Cave Books, Dayton, OH (2007).
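The box-counting analysis in step i) can be sketched as follows; the sanity check below uses a straight line, whose dimension should be 1, rather than a cave map:

```python
import math

def box_counting_dimension(points, scales=(2, 4, 8, 16, 32, 64)):
    """Fit the slope of log N(s) vs log s, where N(s) is the number of
    occupied cells of an s-by-s grid covering the unit square."""
    xs, ys = [], []
    for s in scales:
        occupied = {(int(px * s), int(py * s)) for px, py in points}
        xs.append(math.log(s))
        ys.append(math.log(len(occupied)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# sanity check: points densely covering a straight line have dimension ~1
line = [(i / 10_000, i / 10_000) for i in range(10_000)]
d = box_counting_dimension(line)   # ~1.0
```

Applied to the pixels of a digitized cave map, the same slope estimate gives the Df ≈ 1.75 type of result quoted above.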
NASA Astrophysics Data System (ADS)
Saakian, David B.
2012-03-01
We map the Markov-switching multifractal model (MSM) onto the random energy model (REM). The MSM is, like the REM, an exactly solvable model in one-dimensional space with nontrivial correlation functions. According to our results, four different statistical physics phases are possible in random walks with multifractal behavior. We also introduce the continuous branching version of the model, calculate the moments, and prove multiscaling behavior. Different phases have different multiscaling properties.
Near Earth Asteroid Characteristics for Asteroid Threat Assessment
NASA Technical Reports Server (NTRS)
Dotson, Jessie
2015-01-01
Information about the physical characteristics of Near Earth Asteroids (NEAs) is needed to model behavior during atmospheric entry, to assess the risk of an impact, and to model possible mitigation techniques. The intrinsic properties of interest to entry and mitigation modelers, however, rarely are directly measurable. Instead we measure other properties and infer the intrinsic physical properties, so determining the complete set of characteristics of interest is far from straightforward. In addition, for the majority of NEAs, only the basic measurements exist, so properties must often be inferred from statistics of the population of more completely characterized objects. We will provide an assessment of the current state of knowledge about the physical characteristics of importance to asteroid threat assessment. In addition, an ongoing effort to collate NEA characteristics into a readily accessible database for use by the planetary defense community will be discussed.
Patterson, Megan S; Goodson, Patricia
2017-05-01
Compulsive exercise, a form of unhealthy exercise often associated with prioritizing exercise and feeling guilty when exercise is missed, is a common precursor to and symptom of eating disorders. College-aged women are at high risk of exercising compulsively compared with other groups. Social network analysis (SNA) is a theoretical perspective and methodology that allows researchers to observe the effects of relational dynamics on the behaviors of people. SNA was used to assess the relationship between compulsive exercise and body dissatisfaction, physical activity, and network variables. Descriptive statistics were conducted using SPSS, and quadratic assignment procedure (QAP) analyses were conducted using UCINET. QAP regression analysis revealed a statistically significant model (R² = .375, P < .0001) predicting compulsive exercise behavior. Physical activity, body dissatisfaction, and network variables were statistically significant predictor variables in the QAP regression model. In our sample, women who are connected to "important" or "powerful" people in their network are likely to have higher compulsive exercise scores. This result provides healthcare practitioners with key target points for intervention within similar groups of women. For scholars researching eating disorders and associated behaviors, this study supports looking into group dynamics and network structure in conjunction with body dissatisfaction and exercise frequency.
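A minimal sketch of the QAP idea: because relabeling nodes preserves network structure, the null distribution comes from jointly permuting rows and columns of one matrix. Real QAP regression (as in UCINET) samples permutations of larger networks; this toy version enumerates all relabelings of a hypothetical 4-node network:

```python
from itertools import permutations

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def off_diagonal(m, perm=None):
    """Off-diagonal ties of matrix m, optionally with nodes relabeled by perm."""
    idx = range(len(m)) if perm is None else perm
    return [m[i][j] for i in idx for j in idx if i != j]

def qap_correlation(x, y):
    """Exact QAP: correlate off-diagonal ties, then build the null
    distribution by jointly permuting rows and columns of y."""
    obs = pearson(off_diagonal(x), off_diagonal(y))
    null = [pearson(off_diagonal(x), off_diagonal(y, p))
            for p in permutations(range(len(x)))]
    return obs, sum(s >= obs for s in null) / len(null)

# tiny 4-node network with distinct tie weights, correlated with itself:
x = [[0, 12, 13, 14], [21, 0, 23, 24], [31, 32, 0, 34], [41, 42, 43, 0]]
obs, p = qap_correlation(x, x)   # obs ~ 1.0; only the identity relabeling matches
```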
Evaluating performances of simplified physically based landslide susceptibility models.
NASA Astrophysics Data System (ADS)
Capparelli, Giovanna; Formetta, Giuseppe; Versace, Pasquale
2015-04-01
Rainfall induced shallow landslides cause significant damage, including loss of life and property. Predicting locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually two main approaches are used to accomplish this task: statistical or physically based models. This paper presents a package of GIS based models for landslide susceptibility analysis. It was integrated in the NewAge-JGrass hydrological model using the Object Modeling System (OMS) modeling framework. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness of fit (GOF) indices by comparing pixel-by-pixel model results and measurement data. Moreover, the package integration in NewAge-JGrass allows the use of other components, such as geographic information system tools to manage input-output processes, and automatic calibration algorithms to estimate model parameters. The system offers the possibility to investigate and fairly compare the quality and robustness of models and model parameters, according to a procedure that includes: i) model parameter estimation by optimizing each GOF index separately, ii) model evaluation in the ROC plane using each optimal parameter set, and iii) GOF robustness evaluation by assessing sensitivity to input parameter variation. This procedure was repeated for all three models. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, Average Index (AI) optimization coupled with model M3 is the best modeling solution for our test case. This research was funded by PON Project No.
01_01503 "Integrated Systems for Hydrogeological Risk Monitoring, Early Warning and Mitigation Along the Main Lifelines", CUP B31H11000370005, in the framework of the National Operational Program for "Research and Competitiveness" 2007-2013.
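The pixel-by-pixel comparison behind the ROC-plane evaluation in step ii) can be sketched as follows (the binary maps are invented):

```python
def roc_point(observed, predicted):
    """Pixel-by-pixel comparison of binary maps: returns (FPR, TPR),
    the model's position in the ROC plane (closer to (0, 1) is better)."""
    tp = sum(o and p for o, p in zip(observed, predicted))
    fp = sum((not o) and p for o, p in zip(observed, predicted))
    fn = sum(o and (not p) for o, p in zip(observed, predicted))
    tn = sum((not o) and (not p) for o, p in zip(observed, predicted))
    return fp / (fp + tn), tp / (tp + fn)

obs  = [1, 1, 1, 0, 0, 0, 0, 0]   # hypothetical landslide / no-landslide pixels
pred = [1, 1, 0, 1, 0, 0, 0, 0]
fpr, tpr = roc_point(obs, pred)
```

Each optimized parameter set yields one such (FPR, TPR) point, which is how the three models are compared in the ROC plane.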
Perspective: Sloppiness and emergent theories in physics, biology, and beyond.
Transtrum, Mark K; Machta, Benjamin B; Brown, Kevin S; Daniels, Bryan C; Myers, Christopher R; Sethna, James P
2015-07-07
Large scale models of physical phenomena demand the development of new statistical and computational tools in order to be effective. Many such models are "sloppy," i.e., exhibit behavior controlled by a relatively small number of parameter combinations. We review an information theoretic framework for analyzing sloppy models. This formalism is based on the Fisher information matrix, which is interpreted as a Riemannian metric on a parameterized space of models. Distance in this space is a measure of how distinguishable two models are based on their predictions. Sloppy model manifolds are bounded with a hierarchy of widths and extrinsic curvatures. The manifold boundary approximation can extract the simple, hidden theory from complicated sloppy models. We attribute the success of simple effective models in physics to their likewise emerging from complicated processes exhibiting a low effective dimensionality. We discuss the ramifications and consequences of sloppy models for biochemistry and science more generally. We suggest that the reason our complex world is understandable is due to the same fundamental reason: simple theories of macroscopic behavior are hidden inside complicated microscopic processes.
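The eigenvalue hierarchy behind "sloppiness" shows up already in a toy sum-of-exponentials model, a standard illustration of the phenomenon; the model, sample times, and parameter values below are illustrative, not from the paper:

```python
import math

def fisher_information(thetas, times):
    """Gauss-Newton Fisher information J^T J for the model
    y(t) = sum_k exp(-theta_k * t), with unit-variance data at `times`."""
    J = [[-t * math.exp(-th * t) for th in thetas] for t in times]
    k = len(thetas)
    return [[sum(row[i] * row[j] for row in J) for j in range(k)] for i in range(k)]

def eig2(m):
    """Eigenvalues of a symmetric 2x2 matrix (largest first)."""
    tr = m[0][0] + m[1][1]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    disc = (tr * tr / 4 - det) ** 0.5
    return tr / 2 + disc, tr / 2 - disc

times = [0.5 * i for i in range(1, 11)]
g = fisher_information([1.0, 1.2], times)   # two nearly degenerate decay rates
lam_stiff, lam_sloppy = eig2(g)             # eigenvalues spread over orders of magnitude
```

The large eigenvalue belongs to the "stiff" combination (roughly the sum of decay rates); the tiny one to the "sloppy" combination (their difference), which the data barely constrain.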
Theoretical approaches to the steady-state statistical physics of interacting dissipative units
NASA Astrophysics Data System (ADS)
Bertin, Eric
2017-02-01
The aim of this review is to provide a concise overview of some of the generic approaches that have been developed to deal with the statistical description of large systems of interacting dissipative ‘units’. The latter notion includes, e.g. inelastic grains, active or self-propelled particles, bubbles in a foam, low-dimensional dynamical systems like driven oscillators, or even spatially extended modes like Fourier modes of the velocity field in a fluid. We first review methods based on the statistical properties of a single unit, starting with elementary mean-field approximations, either static or dynamic, that describe a unit embedded in a ‘self-consistent’ environment. We then discuss how this basic mean-field approach can be extended to account for spatial dependences, in the form of space-dependent mean-field Fokker-Planck equations, for example. We also briefly review the use of kinetic theory in the framework of the Boltzmann equation, which is an appropriate description for dilute systems. We then turn to descriptions in terms of the full N-body distribution, starting from exact solutions of one-dimensional models, using a matrix-product ansatz method when correlations are present. Since exactly solvable models are scarce, we also present some approximation methods which can be used to determine the N-body distribution in a large system of dissipative units. These methods include the Edwards approach for dense granular matter and the approximate treatment of multiparticle Langevin equations with colored noise, which models systems of self-propelled particles. Throughout this review, emphasis is put on methodological aspects of the statistical modeling and on formal similarities between different physical problems, rather than on the specific behavior of a given system.
A generalized statistical model for the size distribution of wealth
NASA Astrophysics Data System (ADS)
Clementi, F.; Gallegati, M.; Kaniadakis, G.
2012-12-01
In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and find excellent agreement with the data, superior to that of any other model known in the literature.
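A sketch of the distribution's building block, assuming the κ-generalized survival function takes the form P(X > x) = exp_κ(−βx^α) with the Kaniadakis κ-exponential; the parameter values are illustrative, not fitted:

```python
import math

def exp_kappa(x, kappa):
    """Kaniadakis kappa-exponential; reduces to exp(x) as kappa -> 0."""
    return (math.sqrt(1.0 + kappa ** 2 * x ** 2) + kappa * x) ** (1.0 / kappa)

def ccdf(x, alpha=2.0, beta=1.0, kappa=0.5):
    """kappa-generalized survival function P(X > x) = exp_kappa(-beta * x**alpha):
    Weibull-like body at small x, Pareto power-law tail at large x
    (illustrative parameter values)."""
    return exp_kappa(-beta * x ** alpha, kappa)
```

The κ parameter deforms the exponential into a power-law tail, which is what lets one functional form fit both the bulk and the top of the wealth distribution.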
NASA Astrophysics Data System (ADS)
Green, Rebecca E.; Gould, Richard W., Jr.; Ko, Dong S.
2008-06-01
We developed statistically based optical models to estimate tripton (sediment/detrital) and colored dissolved organic matter (CDOM) absorption coefficients (a_sd, a_g) from physical hydrographic and atmospheric properties. The models were developed for northern Gulf of Mexico shelf waters using multi-year satellite and physical data. First, empirical algorithms for satellite-derived a_sd and a_g were developed, based on comparison with a large data set of cruise measurements from northern Gulf shelf waters; these algorithms were then applied to a time series of ocean color (SeaWiFS) satellite imagery for 2002-2005. Unique seasonal timing was observed in satellite-derived optical properties, with a_sd peaking most often in fall/winter on the shelf, in contrast to summertime peaks observed in a_g. Next, the satellite-derived values were coupled with the physical data to form multiple regression models. A suite of physical forcing variables was tested for inclusion in the models: discharge from the Mississippi River and Mobile Bay, Alabama; gridded fields for winds, precipitation, solar radiation, sea surface temperature and height (SST, SSH); and modeled surface salinity and currents (Navy Coastal Ocean Model, NCOM). For satellite-derived a_sd and a_g time series (2002-2004), correlation and stepwise regression analyses revealed the most important physical forcing variables. Over our region of interest, the best predictors of tripton absorption were wind speed, river discharge, and SST, whereas dissolved absorption was best predicted by east-west wind speed, river discharge, and river discharge lagged by 1 month. These results suggest the importance of vertical mixing (as a function of winds and thermal stratification) in controlling a_sd distribution patterns over large regions of the shelf, in comparison to advection as the most important control on a_g.
The multiple linear regression models for estimating a_sd and a_g were applied on a pixel-by-pixel basis and results were compared to monthly SeaWiFS composite imagery. The models performed well in resolving seasonal and interannual optical variability in model development years (2002-2004) (mean error of 32% for a_sd and 29% for a_g) and in predicting shelfwide optical patterns in a year independent of model development (2005; mean error of 41% for a_sd and 46% for a_g). The models provide insight into the dominant processes controlling optical distributions in this region, and they can be used to predict the optical fields from the physical properties at monthly timescales.
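The stepwise-selection step can be sketched as greedy forward selection on residuals; the forcing names and data below are invented, and the study's actual entry/exit criteria are not reproduced:

```python
def simple_fit(x, y):
    """Slope and intercept of a one-variable least-squares fit."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return b, my - b * mx

def abs_corr(x, r):
    """|Pearson correlation|, 0 if either series is constant."""
    n = len(x)
    mx, mr = sum(x) / n, sum(r) / n
    cov = sum((xi - mx) * (ri - mr) for xi, ri in zip(x, r))
    vx = sum((xi - mx) ** 2 for xi in x)
    vr = sum((ri - mr) ** 2 for ri in r)
    return abs(cov) / (vx * vr) ** 0.5 if vx * vr > 0 else 0.0

def forward_select(predictors, y, n_steps=2):
    """Greedy forward stepwise selection: at each step add the predictor most
    correlated with the current residuals, then regress the residuals on it."""
    resid, chosen = list(y), []
    for _ in range(n_steps):
        best = max((name for name in predictors if name not in chosen),
                   key=lambda name: abs_corr(predictors[name], resid))
        chosen.append(best)
        b, a = simple_fit(predictors[best], resid)
        resid = [ri - (a + b * xi) for ri, xi in zip(resid, predictors[best])]
    return chosen

# hypothetical forcings: wind drives most of the signal, discharge adds the rest
forcings = {"wind": [1, 2, 3, 4, 5, 6],
            "discharge": [2, 1, 4, 3, 6, 5],
            "constant": [5, 5, 5, 5, 5, 5]}
absorption = [2 * w + 0.5 * d for w, d in zip(forcings["wind"], forcings["discharge"])]
order = forward_select(forcings, absorption)   # ["wind", "discharge"]
```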
Evaluation of WRF Parameterizations for Air Quality Applications over the Midwest USA
NASA Astrophysics Data System (ADS)
Zheng, Z.; Fu, K.; Balasubramanian, S.; Koloutsou-Vakakis, S.; McFarland, D. M.; Rood, M. J.
2017-12-01
Reliable predictions from Chemical Transport Models (CTMs) for air quality research require accurate gridded weather inputs. In this study, a sensitivity analysis of 17 Weather Research and Forecast (WRF) model runs was conducted to explore the optimum configuration in six physics categories (i.e., cumulus, surface layer, microphysics, land surface model, planetary boundary layer, and longwave/shortwave radiation) for the Midwest USA. WRF runs were initially conducted over four days in May 2011 for a 12 km × 12 km domain over the contiguous USA and a nested 4 km × 4 km domain over the Midwest USA (i.e., Illinois and adjacent areas including Iowa, Indiana, and Missouri). Model outputs were evaluated statistically by comparison with meteorological observations (DS337.0, METAR data, and the Water and Atmospheric Resources Monitoring Network) and resulting statistics were compared to benchmark values from the literature. Identified optimum configurations of physics parametrizations were then evaluated for the whole months of May and October 2011 to evaluate WRF model performance for Midwestern spring and fall seasons. This study demonstrated that, for the chosen physics options, WRF predicted temperature (Index of Agreement (IOA) = 0.99), pressure (IOA = 0.99), relative humidity (IOA = 0.93), wind speed (IOA = 0.85), and wind direction (IOA = 0.97) well. However, WRF did not predict daily precipitation satisfactorily (IOA = 0.16). Developed gridded weather fields will be used as inputs to a CTM ensemble consisting of the Comprehensive Air Quality Model with Extensions to study impacts of chemical fertilizer usage on regional air quality in the Midwest USA.
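The IOA statistic quoted above is commonly Willmott's index of agreement, which is a one-liner to compute (the station values below are invented):

```python
def index_of_agreement(obs, model):
    """Willmott's index of agreement, d in [0, 1]; 1 = perfect match."""
    n = len(obs)
    mean_obs = sum(obs) / n
    num = sum((m - o) ** 2 for o, m in zip(obs, model))
    den = sum((abs(m - mean_obs) + abs(o - mean_obs)) ** 2
              for o, m in zip(obs, model))
    return 1.0 - num / den

temps_obs   = [15.0, 18.0, 21.0, 19.0]   # hypothetical station observations (deg C)
temps_model = [14.0, 18.5, 20.0, 19.5]
d = index_of_agreement(temps_obs, temps_model)
```

Unlike a plain correlation, the denominator scales errors by departures from the observed mean, so systematic bias also lowers the score.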
NASA Astrophysics Data System (ADS)
Huang, Dong; Liu, Yangang
2014-12-01
Subgrid-scale variability is one of the main reasons why parameterizations are needed in large-scale models. Although some parameterizations have started to address the issue of subgrid variability by introducing a subgrid probability distribution function for relevant quantities, the spatial structure has typically been ignored and thus the subgrid-scale interactions cannot be accounted for physically. Here we present a new statistical-physics-like approach whereby the spatial autocorrelation function can be used to physically capture the net effects of subgrid cloud interaction with radiation. The new approach is able to faithfully reproduce the Monte Carlo 3D simulation results with several orders of magnitude less computational cost, allowing for more realistic representation of cloud radiation interactions in large-scale models.
Garrido-Balsells, José María; Jurado-Navas, Antonio; Paris, José Francisco; Castillo-Vazquez, Miguel; Puerta-Notario, Antonio
2015-03-09
In this paper, a novel and deeper physical interpretation of the recently published Málaga or ℳ statistical distribution is provided. This distribution, which has gained wide acceptance in the scientific community, models the optical irradiance scintillation induced by atmospheric turbulence. Here, the analytical expressions previously published are modified in order to express them by a mixture of the known Generalized-K and discrete Binomial and Negative Binomial distributions. In particular, the probability density function (pdf) of the ℳ model is now obtained as a linear combination of Generalized-K pdfs, in which the coefficients depend directly on the parameters of the ℳ distribution. In this way, the Málaga model can be physically interpreted as a superposition of different optical sub-channels, each of them described by the corresponding Generalized-K fading model and weighted by the ℳ-dependent coefficients. The expressions here proposed are simpler than the equations of the original ℳ model and are validated by means of numerical simulations, generating ℳ-distributed random sequences and their associated histograms. This novel interpretation of the Málaga statistical distribution provides a valuable tool for analyzing the performance of atmospheric optical channels under every turbulence condition.
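A sampling sketch of the mixture reading, assuming each Generalized-K component can be generated as the product of two independent unit-mean gamma variates; the mixture weights and shape parameters below are illustrative, not the paper's:

```python
import random

def generalized_k_sample(k, m, rng=random):
    """Unit-mean Generalized-K variate: product of two independent unit-mean
    gamma variates (large-scale times small-scale turbulence fading)."""
    return rng.gammavariate(k, 1.0 / k) * rng.gammavariate(m, 1.0 / m)

def mixture_sample(weights, k, ms, rng=random):
    """Draw from a finite mixture of Generalized-K components, mirroring the
    interpretation of the Malaga model as a weighted sum of Generalized-K pdfs
    (weights and shapes here are hypothetical)."""
    u, acc = rng.random(), 0.0
    for w, m in zip(weights, ms):
        acc += w
        if u <= acc:
            return generalized_k_sample(k, m, rng)
    return generalized_k_sample(k, ms[-1], rng)

random.seed(7)
samples = [mixture_sample([0.4, 0.6], k=2.0, ms=[1.0, 3.0]) for _ in range(200_000)]
mean = sum(samples) / len(samples)   # each component has unit mean, so ~1
```

A histogram of such samples is exactly the kind of numerical check the paper uses to validate the mixture expressions.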
Discrete Model for the Structure and Strength of Cementitious Materials
NASA Astrophysics Data System (ADS)
Balopoulos, Victor D.; Archontas, Nikolaos; Pantazopoulou, Stavroula J.
2017-12-01
Cementitious materials are characterized by brittle behavior in direct tension and by transverse dilatation (due to microcracking) under compression. Microcracking causes increasingly larger transverse strains and a phenomenological Poisson's ratio that gradually increases to about ν =0.5 and beyond, at the limit point in compression. This behavior is due to the underlying structure of cementitious pastes which is simulated here with a discrete physical model. The computational model is generic, assembled from a statistically generated, continuous network of flaky dendrites consisting of cement hydrates that emanate from partially hydrated cement grains. In the actual amorphous material, the dendrites constitute the solid phase of the cement gel and interconnect to provide the strength and stiffness against load. The idealized dendrite solid is loaded in compression and tension to compute values for strength and Poisson's effects. Parametric studies are conducted, to calibrate the statistical parameters of the discrete model with the physical and mechanical characteristics of the material, so that the familiar experimental trends may be reproduced. The model provides a framework for the study of the mechanical behavior of the material under various states of stress and strain and can be used to model the effects of additives (e.g., fibers) that may be explicitly simulated in the discrete structure.
PARTICLE PHYSICS: CERN Collider Glimpses Supersymmetry--Maybe.
Seife, C
2000-07-14
Last week, particle physicists at the CERN laboratory in Switzerland announced that by smashing together matter and antimatter in four experiments, they detected an unexpected effect in the sprays of particles that ensued. The anomaly is subtle, and physicists caution that it might still be a statistical fluke. If confirmed, however, it could mark the long-sought discovery of a whole zoo of new particles--and the end of a long-standing model of particle physics.
Using entropy to cut complex time series
NASA Astrophysics Data System (ADS)
Mertens, David; Poncela Casasnovas, Julia; Spring, Bonnie; Amaral, L. A. N.
2013-03-01
Using techniques from statistical physics, physicists have modeled and analyzed human phenomena ranging from academic citation rates to disease spreading to vehicular traffic jams. The last decade's explosion of digital information and the growing ubiquity of smartphones have led to a wealth of human self-reported data. This wealth of data comes at a cost, including non-uniform sampling and statistically significant but physically insignificant correlations. In this talk I present our work using entropy to identify stationary sub-sequences of self-reported human weight from a weight management web site. Our entropic approach, inspired by the Infomap network community detection algorithm, is far less biased by rare fluctuations than more traditional time series segmentation techniques. Supported by the Howard Hughes Medical Institute.
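A minimal sketch of the underlying idea, that entropy can separate stationary from non-stationary stretches of a weight series (the series, change point, and bin count are invented for illustration; the authors' Infomap-inspired segmentation is more sophisticated):

```python
import numpy as np

def shannon_entropy(x, lo, hi, bins=20):
    """Shannon entropy (in nats) of the histogram of the samples."""
    counts, _ = np.histogram(x, bins=bins, range=(lo, hi))
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(2)
# Invented weight record with a change point: stationary, then drifting
stationary = 80 + rng.normal(0, 0.5, 300)
drifting = 80 - 0.02 * np.arange(300) + rng.normal(0, 0.5, 300)
series = np.concatenate([stationary, drifting])

lo, hi = series.min(), series.max()
h_first = shannon_entropy(series[:300], lo, hi)
h_second = shannon_entropy(series[300:], lo, hi)
print(h_first, h_second)   # the drifting half spreads over more bins
```

The drifting segment occupies more histogram bins and therefore carries higher entropy, which is the signal a segmentation procedure can threshold on.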
Thermodynamic limit of random partitions and dispersionless Toda hierarchy
NASA Astrophysics Data System (ADS)
Takasaki, Kanehisa; Nakatsu, Toshio
2012-01-01
We study the thermodynamic limit of random partition models for the instanton sum of 4D and 5D supersymmetric U(1) gauge theories deformed by some physical observables. The physical observables correspond to external potentials in the statistical model. The partition function is reformulated in terms of the density function of Maya diagrams. The thermodynamic limit is governed by a limit shape of Young diagrams associated with dominant terms in the partition function. The limit shape is characterized by a variational problem, which is further converted to a scalar-valued Riemann-Hilbert problem. This Riemann-Hilbert problem is solved with the aid of a complex curve, which may be thought of as the Seiberg-Witten curve of the deformed U(1) gauge theory. This solution of the Riemann-Hilbert problem is identified with a special solution of the dispersionless Toda hierarchy that satisfies a pair of generalized string equations. The generalized string equations for the 5D gauge theory are shown to be related to hidden symmetries of the statistical model. The prepotential and the Seiberg-Witten differential are also considered.
AGR-1 Thermocouple Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Einerson
2012-05-01
This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis is identified.
This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in the presentation of AGR-1 measured data (Chapter 2) and the interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.
Tulsky, David S.; Jette, Alan; Kisala, Pamela A.; Kalpakjian, Claire; Dijkers, Marcel P.; Whiteneck, Gale; Ni, Pengsheng; Kirshblum, Steven; Charlifue, Susan; Heinemann, Allen W.; Forchheimer, Martin; Slavin, Mary; Houlihan, Bethlyn; Tate, Denise; Dyson-Hudson, Trevor; Fyffe, Denise; Williams, Steve; Zanca, Jeanne
2012-01-01
Objective: To develop a comprehensive set of patient-reported items to assess multiple aspects of physical functioning relevant to the lives of people with spinal cord injury (SCI) and to evaluate the underlying structure of physical functioning. Design: Cross-sectional. Setting: Inpatient and community. Participants: Item pools of physical functioning were developed, refined, and field-tested in a large sample of 855 individuals with traumatic spinal cord injury stratified by diagnosis, severity, and time since injury. Interventions: None. Main Outcome Measure: SCI-FI measurement system. Results: Confirmatory factor analysis (CFA) indicated that a 5-factor model, including basic mobility, ambulation, wheelchair mobility, self-care, and fine motor, had the best model fit and was most closely aligned conceptually with feedback received from individuals with SCI and SCI clinicians. When just the items making up basic mobility were tested in CFA, the fit statistics indicated strong support for a unidimensional model. Similar results were demonstrated for each of the other four factors, indicating unidimensional models. Conclusions: Though unidimensional or 2-factor (mobility and upper extremity) models of physical functioning underlie outcome measures in the general population, the underlying structure of physical function in SCI is more complex. A 5-factor solution allows for comprehensive assessment of key domain areas of physical functioning. These results informed the structure and development of the SCI-FI measurement system of physical functioning. PMID:22609299
Non Kolmogorov Probability Models Outside Quantum Mechanics
NASA Astrophysics Data System (ADS)
Accardi, Luigi
2009-03-01
This paper is devoted to the analysis of the main conceptual problems in the interpretation of QM: reality, locality, determinism, physical state, the Heisenberg principle, "deterministic" and "exact" theories, laws of chance, the notion of event, statistical invariants, adaptive realism, EPR correlations and, finally, the EPR-chameleon experiment.
Statistical Description of Associative Memory
NASA Astrophysics Data System (ADS)
Samengo, Inés
2003-03-01
The storage of memories, in the brain, induces some kind of modification in the structural and functional properties of a neural network. Here, a few neuropsychological and neurophysiological experiments are reviewed, suggesting that the plastic changes taking place during memory storage are governed, among other things, by the correlations in the activity of a set of neurons. The Hopfield model is briefly described, showing the way the methods of statistical physics can be useful to describe the storage and retrieval of memories.
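A minimal Hopfield-style sketch of storage and retrieval (network size, pattern count, and corruption level are arbitrary illustrative choices): correlated activity is stored in Hebbian weights, and a corrupted cue relaxes back toward the stored memory.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 100, 5                        # neurons, stored patterns

patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N      # Hebbian weights from pattern correlations
np.fill_diagonal(W, 0.0)             # no self-coupling

def recall(state, sweeps=10):
    """Zero-temperature asynchronous dynamics: each neuron aligns with its field."""
    state = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Retrieve pattern 0 from a corrupted cue (15 of 100 bits flipped)
cue = patterns[0].copy()
cue[rng.choice(N, size=15, replace=False)] *= -1
overlap = recall(cue) @ patterns[0] / N
print(overlap)                       # close to 1 when retrieval succeeds
```

At this low storage load (5 patterns for 100 neurons) the corrupted cue lies well inside the basin of attraction of the stored memory.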
1983-04-01
SIGNIFICANCE AND EXPLANATION: Many different codes for the simulation of semiconductor devices such as transistors, diodes, and thyristors are already in circulation. These only partially take into account the consequences introduced by degenerate semiconductors (e.g., the invalidity of Boltzmann statistics, bandgap narrowing). Equation (2.10) of the report can be physically interpreted as the application of Boltzmann statistics.
Resolving Structural Variability in Network Models and the Brain
Klimm, Florian; Bassett, Danielle S.; Carlson, Jean M.; Mucha, Peter J.
2014-01-01
Large-scale white matter pathways crisscrossing the cortex create a complex pattern of connectivity that underlies human cognitive function. Generative mechanisms for this architecture have been difficult to identify in part because little is known in general about mechanistic drivers of structured networks. Here we contrast network properties derived from diffusion spectrum imaging data of the human brain with 13 synthetic network models chosen to probe the roles of physical network embedding and temporal network growth. We characterize both the empirical and synthetic networks using familiar graph metrics, but presented here in a more complete statistical form, as scatter plots and distributions, to reveal the full range of variability of each measure across scales in the network. We focus specifically on the degree distribution, degree assortativity, hierarchy, topological Rentian scaling, and topological fractal scaling—in addition to several summary statistics, including the mean clustering coefficient, the shortest path-length, and the network diameter. The models are investigated in a progressive, branching sequence, aimed at capturing different elements thought to be important in the brain, and range from simple random and regular networks, to models that incorporate specific growth rules and constraints. We find that synthetic models that constrain the network nodes to be physically embedded in anatomical brain regions tend to produce distributions that are most similar to the corresponding measurements for the brain. We also find that network models hardcoded to display one network property (e.g., assortativity) do not in general simultaneously display a second (e.g., hierarchy). This relative independence of network properties suggests that multiple neurobiological mechanisms might be at play in the development of human brain network architecture. 
Together, the network models that we develop and employ provide a potentially useful starting point for the statistical inference of brain network structure from neuroimaging data. PMID:24675546
What can we learn from noise? — Mesoscopic nonequilibrium statistical physics —
KOBAYASHI, Kensuke
2016-01-01
Mesoscopic systems, small electric circuits working in the quantum regime, offer us a unique experimental stage to explore quantum transport in a tunable and precise way. The purpose of this Review is to show how they can contribute to statistical physics. We introduce the significance of fluctuation, or equivalently noise, as noise measurement enables us to address the fundamental aspects of a physical system. The significance of the fluctuation theorem (FT) in statistical physics is noted. We explain what information can be deduced from current noise measurements in mesoscopic systems. As an important application of noise measurement to statistical physics, we describe our experimental work on the current and current noise in an electron interferometer, which is the first experimental test of the FT in the quantum regime. Our attempt will shed new light on the research field of mesoscopic quantum statistical physics. PMID:27477456
Testing the Predictive Power of Coulomb Stress on Aftershock Sequences
NASA Astrophysics Data System (ADS)
Woessner, J.; Lombardi, A.; Werner, M. J.; Marzocchi, W.
2009-12-01
Empirical and statistical models of clustered seismicity are usually strongly stochastic and perceived to be uninformative in their forecasts, since only marginal distributions are used, such as the Omori-Utsu and Gutenberg-Richter laws. In contrast, so-called physics-based aftershock models, based on seismic rate changes calculated from Coulomb stress changes and rate-and-state friction, make more specific predictions: anisotropic stress shadows and multiplicative rate changes. We test the predictive power of models based on Coulomb stress changes against statistical models, including the popular Short Term Earthquake Probabilities and Epidemic-Type Aftershock Sequences models: We score and compare retrospective forecasts on the aftershock sequences of the 1992 Landers, USA, the 1997 Colfiorito, Italy, and the 2008 Selfoss, Iceland, earthquakes. To quantify predictability, we use likelihood-based metrics that test the consistency of the forecasts with the data, including modified and existing tests used in prospective forecast experiments within the Collaboratory for the Study of Earthquake Predictability (CSEP). Our results indicate that a statistical model performs best. Moreover, two Coulomb model classes seem unable to compete: Models based on deterministic Coulomb stress changes calculated from a given fault-slip model, and those based on fixed receiver faults. One model of Coulomb stress changes does perform well and sometimes outperforms the statistical models, but its predictive information is diluted, because of uncertainties included in the fault-slip model. Our results suggest that models based on Coulomb stress changes need to incorporate stochastic features that represent model and data uncertainty.
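The likelihood-based scoring of competing forecasts can be sketched as follows (the catalog is synthetic and the parameter values invented; real CSEP tests score forecasts over spatial and magnitude bins as well): daily counts drawn from a modified Omori-Utsu law are scored under the decaying forecast and under a flat-rate null.

```python
import math

import numpy as np

def omori_rate(t, K, c, p):
    """Modified Omori-Utsu aftershock rate K / (t + c)^p (events per day)."""
    return K / (t + c) ** p

def poisson_loglike(counts, expected):
    """Joint log-likelihood of observed daily counts under a Poisson forecast,
    the kind of consistency score used in CSEP-style likelihood tests."""
    return sum(n * math.log(mu) - mu - math.lgamma(n + 1)
               for n, mu in zip(counts, expected))

rng = np.random.default_rng(7)
days = np.arange(100)
true_rate = omori_rate(days, K=200.0, c=1.0, p=1.1)
counts = rng.poisson(true_rate)            # synthetic aftershock catalog

# Score the decaying forecast against a flat-rate null forecast
flat = np.full_like(true_rate, counts.mean())
ll_omori = poisson_loglike(counts, true_rate)
ll_flat = poisson_loglike(counts, flat)
print(ll_omori > ll_flat)
```

The forecast matching the generative decay wins the likelihood comparison; the same machinery, with rate maps from Coulomb stress or ETAS in place of `omori_rate`, underlies the model scoring described above.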
NASA Technical Reports Server (NTRS)
Cohen, C.
1981-01-01
A hierarchy of experiments was run, starting with an all-water planet with zonally symmetric sea surface temperatures, then adding, one at a time, flat continents, mountains, surface physics, and realistic sea surface temperatures. The model was run with the sun fixed at a perpetual January. Ensemble means and standard deviations were computed, and the t-test was used to determine the statistical significance of the results. The addition of realistic surface physics does not affect the model climatology to as large an extent as does the addition of mountains. Departures from zonal symmetry of the SST field result in a better simulation of the real atmosphere.
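The ensemble t-test step can be sketched like this (ensemble sizes, means, and spreads are hypothetical numbers, not the experiment's values), using Welch's unequal-variance form:

```python
import math

import numpy as np

def welch_t(a, b):
    """Welch's t statistic and effective degrees of freedom for two samples."""
    va, vb = a.var(ddof=1) / a.size, b.var(ddof=1) / b.size
    t = (a.mean() - b.mean()) / math.sqrt(va + vb)
    dof = (va + vb) ** 2 / (va ** 2 / (a.size - 1) + vb ** 2 / (b.size - 1))
    return t, dof

rng = np.random.default_rng(4)
# Hypothetical ensembles of a regional January-mean temperature (K),
# without and with mountains added to the model
flat_run = rng.normal(288.0, 1.0, 8)
mountain_run = rng.normal(285.0, 1.0, 8)

t, dof = welch_t(mountain_run, flat_run)
print(t, dof)   # |t| well above ~2 flags a statistically significant change
```

Comparing |t| against the critical value for the computed degrees of freedom is exactly the significance screen applied to each added model component.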
Error and Uncertainty Quantification in the Numerical Simulation of Complex Fluid Flows
NASA Technical Reports Server (NTRS)
Barth, Timothy J.
2010-01-01
The failure of numerical simulation to predict physical reality is often a direct consequence of the compounding effects of numerical error arising from finite-dimensional approximation and physical model uncertainty resulting from inexact knowledge and/or statistical representation. In this topical lecture, we briefly review systematic theories for quantifying numerical errors and restricted forms of model uncertainty occurring in simulations of fluid flow. A goal of this lecture is to elucidate both positive and negative aspects of applying these theories to practical fluid flow problems. Finite-element and finite-volume calculations of subsonic and hypersonic fluid flow are presented to contrast the differing roles of numerical error and model uncertainty for these problems.
NASA Astrophysics Data System (ADS)
Zhang, Yufeng; Zhang, Xiangzhi; Wang, Yan; Liu, Jiangen
2017-01-01
With the help of the R-matrix approach, we present the Toda lattice systems, which have extensive applications in statistical physics and quantum physics. By constructing a new discrete integrable formula via the R-matrix, the discrete expanding integrable models of the Toda lattice systems and their Lax pairs are generated. By applying the construction formula again, we obtain the corresponding (2+1)-dimensional Toda lattice systems and their Lax pairs, as well as their (2+1)-dimensional discrete expanding integrable models. Finally, some conservation laws of a (1+1)-dimensional generalised Toda lattice system and a new (2+1)-dimensional lattice system are generated, respectively.
Global Precipitation Measurement (GPM) Ground Validation (GV) Science Implementation Plan
NASA Technical Reports Server (NTRS)
Petersen, Walter A.; Hou, Arthur Y.
2008-01-01
For pre-launch algorithm development and post-launch product evaluation, Global Precipitation Measurement (GPM) Ground Validation (GV) goes beyond direct comparisons of surface rain rates between ground and satellite measurements to provide the means for improving retrieval algorithms and model applications. Three approaches to GPM GV include direct statistical validation (at the surface), precipitation physics validation (in a vertical column), and integrated science validation (4-dimensional). These three approaches support five themes: core satellite error characterization; constellation satellite validation; development of physical models of snow, cloud water, and mixed phase; development of cloud-resolving models (CRM) and land-surface models to bridge observations and algorithms; and development of coupled CRM-land surface modeling for basin-scale water budget studies and natural hazard prediction. This presentation describes the implementation of these approaches.
Development of Turbulent Biological Closure Parameterizations
2011-09-30
LONG-TERM GOAL: The long-term goals of this project are: (1) to develop a theoretical framework to quantify turbulence-induced NPZ interactions; (2) to apply the theory to develop parameterizations to be used in realistic coupled physical-biological environmental numerical models. OBJECTIVES: Connect the Goodman and Robinson (2008) statistically based pdf theory to Advection-Diffusion-Reaction (ADR) modeling of NPZ interaction.
NASA Astrophysics Data System (ADS)
Riandry, M. A.; Ismet, I.; Akhsan, H.
2017-09-01
This study aims to produce a valid and practical statistical physics course handout on distribution function materials based on STEM. The Rowntree development model is used to produce this handout. The model consists of three stages: planning, development, and evaluation. In this study, the evaluation stage used Tessmer formative evaluation, which consists of 5 stages: self-evaluation, expert review, one-to-one evaluation, small group evaluation, and field test. However, the handout was tested only on validity and practicality aspects, so the field test stage was not implemented. The data collection techniques were walkthroughs and questionnaires. Subjects of this study were students in the 6th and 8th semesters of the 2016/2017 academic year in the Physics Education Study Program of Sriwijaya University. The average result of the expert review was 87.31% (very valid category). One-to-one evaluation yielded an average result of 89.42%, and small group evaluation 85.92%. Across the one-to-one and small group evaluation stages, the average student response to the handout was 87.67% (very practical category). Based on these results, it can be concluded that the handout is valid and practical.
Kim, Jinhyuk; Nakamura, Toru; Kikuchi, Hiroe; Yoshiuchi, Kazuhiro; Sasaki, Tsukasa; Yamamoto, Yoshiharu
2015-07-01
The objective evaluation of depressive mood is considered to be useful for the diagnosis and treatment of depressive disorders. Thus, we investigated psychobehavioral correlates, particularly the statistical associations between momentary depressive mood and behavioral dynamics measured objectively, in patients with major depressive disorder (MDD) and healthy subjects. Patients with MDD (n = 14) and healthy subjects (n = 43) wore a watch-type computer device and rated their momentary symptoms using ecological momentary assessment. Spontaneous physical activity in daily life, referred to as locomotor activity, was also continuously measured by an activity monitor built into the device. A multilevel modeling approach was used to model the associations between changes in depressive mood scores and the local statistics of locomotor activity simultaneously measured. We further examined the cross validity of such associations across groups. The statistical model established indicated that worsening of the depressive mood was associated with the increased intermittency of locomotor activity, as characterized by a lower mean and higher skewness. The model was cross validated across groups, suggesting that the same psychobehavioral correlates are shared by both healthy subjects and patients, although the latter had significantly higher mean levels of depressive mood scores. Our findings suggest the presence of robust as well as common associations between momentary depressive mood and behavioral dynamics in healthy individuals and patients with depression, which may lead to the continuous monitoring of the pathogenic processes (from healthy states) and pathological states of MDD.
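The local statistics linking mood to intermittent activity, a windowed mean and skewness, can be computed as below (the synthetic "steady" and "bursty" records are stand-ins for accelerometer counts, not patient data):

```python
import numpy as np

def local_stats(activity, window=60):
    """Windowed mean and skewness; a lower mean with higher skewness marks
    more intermittent (burst-dominated) behavior."""
    out = []
    for start in range(0, len(activity) - window + 1, window):
        w = activity[start:start + window]
        m, s = w.mean(), w.std()
        skew = ((w - m) ** 3).mean() / s ** 3 if s > 0 else 0.0
        out.append((m, skew))
    return np.array(out)

rng = np.random.default_rng(3)
steady = rng.gamma(5.0, 20.0, 600)                               # sustained activity
bursty = rng.exponential(100.0, 600) * (rng.random(600) < 0.3)   # intermittent bursts

s_steady = local_stats(steady)
s_bursty = local_stats(bursty)
print(s_steady.mean(axis=0), s_bursty.mean(axis=0))
```

The bursty record shows exactly the signature described above: a lower windowed mean together with a higher windowed skewness.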
Cross-Section Measurements via the Activation Technique at the Cologne Clover Counting Setup
NASA Astrophysics Data System (ADS)
Heim, Felix; Mayer, Jan; Netterdon, Lars; Scholz, Philipp; Zilges, Andreas
The activation technique is a widely used method for determining cross-section values of charged-particle-induced reactions at astrophysically relevant energies. Since network calculations of nucleosynthesis processes often depend on reaction rates calculated within the scope of the Hauser-Feshbach statistical model, these cross sections can be used to improve nuclear-physics input parameters such as optical-model potentials (OMP), γ-ray strength functions, and nuclear level densities. In order to extend the available experimental database, the 108Cd(α, n)111Sn reaction cross section was investigated at ten energies between 10.2 and 13.5 MeV. As this reaction at these energies is sensitive almost exclusively to the α-decay width, the results were compared to statistical model calculations using different models for the α-OMP. The irradiation as well as the subsequent γ-ray counting were performed at the Institute for Nuclear Physics of the University of Cologne using the 10 MV FN-Tandem accelerator and the Cologne Clover Counting Setup. This setup consists of two clover-type high-purity germanium (HPGe) detectors in a close face-to-face geometry, covering a solid angle of almost 4π.
Nevers, Meredith; Byappanahalli, Muruleedhara; Phanikumar, Mantha S.; Whitman, Richard L.
2016-01-01
Mathematical models have been widely applied to surface waters to estimate rates of settling, resuspension, flow, dispersion, and advection in order to calculate the movement of particles that influence water quality. Of particular interest are the movement, survival, and persistence of microbial pathogens or their surrogates, which may contaminate recreational water, drinking water, or shellfish. Most models devoted to microbial water quality have focused on fecal indicator organisms (FIO), which act as surrogates for pathogens and viruses. Process-based modeling and statistical modeling have been used to track contamination events to their sources and to predict future events. The two types of models require different levels of expertise and input: process-based models rely on theoretical physical constructs to explain present conditions and biological distributions, while data-based, statistical models use extant paired data to do the same. The selection of the appropriate model and the interpretation of results are critical to proper use of these tools in microbial source tracking. Integration of the modeling approaches could provide insight for tracking and predicting contamination events in real time. A review of modeling efforts reveals that process-based modeling holds great promise for microbial source tracking; further, combining the understanding of physical processes influencing FIO contamination developed with process-based models with molecular characterization of the population by gene-based (i.e., biological) or chemical markers may be an effective approach for locating sources and remediating contamination to better protect human health.
Inference from the small scales of cosmic shear with current and future Dark Energy Survey data
MacCrann, N.; Aleksić, J.; Amara, A.; ...
2016-11-05
Cosmic shear is sensitive to fluctuations in the cosmological matter density field, including on small physical scales, where matter clustering is affected by baryonic physics in galaxies and galaxy clusters, such as star formation, supernovae feedback and AGN feedback. While muddying any cosmological information that is contained in small scale cosmic shear measurements, this does mean that cosmic shear has the potential to constrain baryonic physics and galaxy formation. We perform an analysis of the Dark Energy Survey (DES) Science Verification (SV) cosmic shear measurements, now extended to smaller scales, and using the Mead et al. 2015 halo model to account for baryonic feedback. While the SV data has limited statistical power, we demonstrate using a simulated likelihood analysis that the final DES data will have the statistical power to differentiate among baryonic feedback scenarios. We also explore some of the difficulties in interpreting the small scales in cosmic shear measurements, presenting estimates of the size of several other systematic effects that make inference from small scales difficult, including uncertainty in the modelling of intrinsic alignment on nonlinear scales, `lensing bias', and shape measurement selection effects. For the latter two, we make use of novel image simulations. While future cosmic shear datasets have the statistical power to constrain baryonic feedback scenarios, there are several systematic effects that require improved treatments in order to make robust conclusions about baryonic feedback.
Statistical physics of community ecology: a cavity solution to MacArthur’s consumer resource model
NASA Astrophysics Data System (ADS)
Advani, Madhu; Bunin, Guy; Mehta, Pankaj
2018-03-01
A central question in ecology is to understand the ecological processes that shape community structure. Niche-based theories have emphasized the important role played by competition in maintaining species diversity. Many of these insights have been derived using MacArthur’s consumer resource model (MCRM) or its generalizations. Most theoretical work on the MCRM has focused on small ecosystems with a few species and resources. However, theoretical insights derived from small ecosystems may not scale up to large ecosystems with many resources and species, because large systems with many interacting components often display new emergent behaviors that cannot be understood or deduced by analyzing smaller systems. To address these shortcomings, we develop a statistical-physics-inspired cavity method to analyze the MCRM when both the number of species and the number of resources are large. Unlike previous work in this limit, our theory addresses resource dynamics and resource depletion and demonstrates that species generically and consistently perturb their environments and significantly modify available ecological niches. We show how our cavity approach naturally generalizes niche theory to large ecosystems by accounting for the effect of collective phenomena on species invasion and ecological stability. Our theory suggests that such phenomena are a generic feature of large, natural ecosystems and must be taken into account when analyzing and interpreting community structure. It also highlights the important role that statistical-physics-inspired approaches can play in furthering our understanding of ecology.
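A direct simulation of MCRM-type dynamics at moderate size helps motivate the many-species, many-resources limit the cavity method targets (the rates, consumption preferences, and system sizes below are arbitrary choices for a sketch; the paper's analysis is analytic, not a simulation):

```python
import numpy as np

rng = np.random.default_rng(5)
S, M = 20, 15                        # many species, many resources

c = rng.uniform(0.0, 0.1, (S, M))    # random consumption preferences
m = rng.uniform(0.3, 0.5, S)         # maintenance costs
K = rng.uniform(5.0, 10.0, M)        # resource carrying capacities

N = np.full(S, 0.1)                  # initial species abundances
R = K.copy()                         # resources start at carrying capacity
dt, steps = 0.01, 20_000
for _ in range(steps):
    dN = N * (c @ R - m)                     # growth minus maintenance
    dR = R * (K - R) - (N @ c) * R           # logistic supply minus consumption
    N = np.clip(N + dt * dN, 0.0, None)
    R = np.clip(R + dt * dR, 0.0, None)

survivors = int((N > 1e-3).sum())
print(survivors, "of", S, "species persist on", M, "resources")
```

Species deplete resources away from their carrying capacities, which is exactly the self-consistent environmental perturbation the cavity treatment captures.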
Physics-based statistical learning approach to mesoscopic model selection.
Taverniers, Søren; Haut, Terry S; Barros, Kipton; Alexander, Francis J; Lookman, Turab
2015-11-01
In materials science and many other research areas, models are frequently inferred without considering their generalization to unseen data. We apply statistical learning using cross-validation to obtain an optimally predictive coarse-grained description of a two-dimensional kinetic nearest-neighbor Ising model with Glauber dynamics (GD) based on the stochastic Ginzburg-Landau equation (sGLE). The latter is learned from GD "training" data using a log-likelihood analysis, and its predictive ability for various complexities of the model is tested on GD "test" data independent of the training data. Using two different error metrics, we perform a detailed analysis of the error between magnetization time trajectories simulated using the learned sGLE coarse-grained description and those obtained using the GD model. We show that both for equilibrium and out-of-equilibrium GD training trajectories, the standard phenomenological description using a quartic free energy does not always yield the most predictive coarse-grained model. Moreover, increasing the amount of training data can shift the optimal model complexity to higher values. Our results are promising in that they pave the way for the use of statistical learning as a general tool for materials modeling and discovery.
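The cross-validation logic, selecting model complexity by held-out predictive error rather than training fit, can be sketched with a generic polynomial-regression stand-in (not the sGLE/Glauber setting of the paper; the data and degrees are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 80)
y = 1.5 * x - 0.8 * x ** 3 + rng.normal(0, 0.1, x.size)   # cubic truth + noise

def cv_error(degree, k=5):
    """k-fold cross-validated mean squared error of a polynomial fit."""
    idx = rng.permutation(x.size)
    folds = np.array_split(idx, k)
    errs = []
    for test in folds:
        train = np.setdiff1d(idx, test)
        coef = np.polyfit(x[train], y[train], degree)
        errs.append(np.mean((np.polyval(coef, x[test]) - y[test]) ** 2))
    return float(np.mean(errs))

scores = {d: cv_error(d) for d in range(1, 10)}
best = min(scores, key=scores.get)
print("selected degree:", best)   # the linear fit underfits the cubic signal
```

Held-out error penalizes both underfitting and overfitting, which is how the optimal sGLE complexity is identified in the study, and why more training data can shift it upward.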
Application of statistical mechanical methods to the modeling of social networks
NASA Astrophysics Data System (ADS)
Strathman, Anthony Robert
With the recent availability of large-scale social data sets, social networks have become open to quantitative analysis via the methods of statistical physics. We examine the statistical properties of a real large-scale social network, generated from cellular phone call-trace logs. We find this network, like many other social networks, to be assortative (r = 0.31) and clustered (i.e., strongly transitive, C = 0.21). We measure fluctuation scaling to identify the presence of internal structure in the network and find that structural inhomogeneity effectively disappears at the scale of a few hundred nodes, though there is no sharp cutoff. We introduce an agent-based model of social behavior, designed to model the formation and dissolution of social ties. The model is a modified Metropolis algorithm containing agents operating under the basic sociological constraints of reciprocity, communication need and transitivity. The model introduces the concept of a social temperature. We go on to show that this simple model reproduces the global statistical network features (incl. assortativity, connected fraction, mean degree, clustering, and mean shortest path length) of the real network data and undergoes two phase transitions, one from a "gas" to a "liquid" state and the second from a liquid to a glassy state, as a function of this social temperature.
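A Metropolis-style tie-formation model of this kind can be sketched in a few lines. The toy energy terms and parameter values below are assumptions for illustration, not the dissertation's actual Hamiltonian:

```python
import numpy as np

def social_metropolis(n=30, T=1.0, k_target=4, sweeps=100, seed=0):
    """Toy Metropolis dynamics for formation/dissolution of directed social
    ties at a "social temperature" T. The energy rewards reciprocity and
    transitivity and penalizes unmet communication need (hypothetical weights).
    """
    rng = np.random.default_rng(seed)
    A = np.zeros((n, n), dtype=int)

    def energy(A):
        reciprocity = np.sum(A * A.T) / 2.0       # mutual dyads
        transitivity = np.trace(A @ A @ A) / 6.0  # directed triangles
        need = np.sum((A.sum(axis=1) - k_target) ** 2)
        return -1.0 * reciprocity - 0.1 * transitivity + 1.0 * need

    E = energy(A)
    for _ in range(sweeps * n):
        i, j = rng.integers(n, size=2)
        if i == j:
            continue
        A[i, j] ^= 1                              # propose toggling a tie
        E_new = energy(A)
        if E_new <= E or rng.random() < np.exp((E - E_new) / T):
            E = E_new                             # accept the move
        else:
            A[i, j] ^= 1                          # reject and revert
    return A
```

Sweeping T in such a model is what reveals the temperature-driven transitions described above; the weights here are tuned only so that the toy system neither empties nor saturates.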
NASA Astrophysics Data System (ADS)
Li, S.; Rupp, D. E.; Hawkins, L.; Mote, P.; McNeall, D. J.; Sarah, S.; Wallom, D.; Betts, R. A.
2017-12-01
This study investigates the potential to reduce known summer hot/dry biases over the Pacific Northwest in the UK Met Office's atmospheric model (HadAM3P) by simultaneously varying multiple model parameters. The bias-reduction process proceeds through a series of steps: 1) generation of a perturbed physics ensemble (PPE) through the volunteer computing network weather@home; 2) using machine learning to train "cheap" and fast statistical emulators of the climate model, in order to rule out regions of parameter space that lead to model variants that do not satisfy observational constraints, where the observational constraints (e.g., top-of-atmosphere energy flux, magnitude of the annual temperature cycle, summer/winter temperature and precipitation) are introduced sequentially; 3) designing a new PPE by "pre-filtering" using the emulator results. Steps 1) through 3) are repeated until results are considered satisfactory (3 times in our case). The process includes a sensitivity analysis to find dominant parameters for various model output metrics, which reduces the number of parameters to be perturbed with each new PPE. Relative to observational uncertainty, we achieve regional improvements without introducing large biases in other parts of the globe. Our results illustrate the potential of using machine learning to train cheap and fast statistical emulators of the climate model, in combination with PPEs, for systematic model improvement.
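The emulate-and-pre-filter step can be sketched in miniature. The one-parameter "model" and the constraint value below are purely illustrative stand-ins for HadAM3P and the observational constraints:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical expensive model: bias as a function of one physics parameter.
def expensive_model(theta):
    return (theta - 0.3) ** 2 * 10.0   # assumed response surface

rng = np.random.default_rng(0)
theta_train = rng.uniform(0, 1, 15)[:, None]   # a small "PPE" of runs
y_train = expensive_model(theta_train).ravel()

# Cheap statistical emulator trained on the ensemble.
emulator = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                                    normalize_y=True).fit(theta_train, y_train)

# Pre-filter: keep only parameter values whose emulated bias passes a constraint.
theta_grid = np.linspace(0, 1, 200)[:, None]
pred = emulator.predict(theta_grid)
kept = theta_grid[pred < 1.0]        # illustrative constraint: bias < 1
```

A new ensemble would then be drawn only from `kept`, and the loop repeated as in steps 1) through 3).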
Statistical methods for biodosimetry in the presence of both Berkson and classical measurement error
NASA Astrophysics Data System (ADS)
Miller, Austin
In radiation epidemiology, the true dose received by those exposed cannot be assessed directly. Physical dosimetry uses a deterministic function of the source term, distance and shielding to estimate dose. For the atomic bomb survivors, the physical dosimetry system is well established. The classical measurement errors plaguing the location and shielding inputs to the physical dosimetry system are well known. Adjusting for the associated biases requires an estimate for the classical measurement error variance, for which no data-driven estimate exists. In this case, an instrumental variable solution is the most viable option to overcome the classical measurement error indeterminacy. Biological indicators of dose may serve as instrumental variables. Specification of the biodosimeter dose-response model requires identification of the radiosensitivity variables, for which we develop statistical definitions and variables. More recently, researchers have recognized Berkson error in the dose estimates, introduced by averaging assumptions for many components in the physical dosimetry system. We show that Berkson error induces a bias in the instrumental variable estimate of the dose-response coefficient, and then address the estimation problem. This model is specified by developing an instrumental variable mixed measurement error likelihood function, which is then maximized using a Monte Carlo EM Algorithm. These methods produce dose estimates that incorporate information from both physical and biological indicators of dose, as well as the first instrumental variable based data-driven estimate for the classical measurement error variance.
Daikoku, Tatsuya; Takahashi, Yuji; Futagami, Hiroko; Tarumoto, Nagayoshi; Yasuda, Hideki
2017-02-01
In real-world auditory environments, humans are exposed to overlapping auditory information, such as human voices and musical instruments, even during routine physical activities such as walking and cycling. The present study investigated how concurrent physical exercise affects incidental and intentional learning of overlapping auditory streams, and whether physical fitness modulates learning performance. Participants were divided into lower- and higher-fitness groups of 11 each, based on their VO2max values. They were presented with simultaneous auditory sequences, each with a distinct statistical regularity (i.e., statistical learning), while pedaling on a bike or while sitting on the bike at rest. In experiment 1, they were instructed to attend to one of the two sequences and ignore the other. In experiment 2, they were instructed to attend to both sequences. After exposure to the sequences, learning effects were evaluated with a familiarity test. In experiment 1, statistical learning of the ignored sequences during concurrent pedaling was better in participants with high than with low physical fitness, whereas for the attended sequence there was no significant difference in learning performance between high- and low-fitness participants. Furthermore, there was no significant effect of physical fitness on learning while resting. In experiment 2, participants with both high and low physical fitness could perform intentional statistical learning of the two simultaneous sequences in both the exercise and rest sessions. Improved physical fitness might thus facilitate incidental, but not intentional, statistical learning of simultaneous auditory sequences during concurrent physical exercise.
NASA Astrophysics Data System (ADS)
2015-01-01
The third International Conference on Mathematical Modeling in Physical Sciences (IC-MSQUARE) took place in Madrid, Spain, from Thursday 28 to Sunday 31 August 2014. The Conference was attended by more than 200 participants and hosted about 350 oral, poster, and virtual presentations. More than 600 authors pre-registered. The third IC-MSQUARE consisted of different and diverging workshops and thus covered various research fields where Mathematical Modeling is used, such as Theoretical/Mathematical Physics, Neutrino Physics, Non-Integrable Systems, Dynamical Systems, Computational Nanoscience, Biological Physics, Computational Biomechanics, Complex Networks, Stochastic Modeling, Fractional Statistics, DNA Dynamics, Macroeconomics, etc. The scientific program was rather dense: after the Keynote and Invited Talks each morning, three parallel oral sessions and one poster session ran every day. Nevertheless, according to all attendees, the program was excellent, with a high level of talks and a fruitful scientific environment that gave everyone a creative time. We would like to thank the Keynote Speaker and the Invited Speakers for their significant contribution to IC-MSQUARE. We also would like to thank the Members of the International Advisory and Scientific Committees as well as the Members of the Organizing Committee.
PREFACE: 4th International Conference on Mathematical Modeling in Physical Sciences (IC-MSquare2015)
NASA Astrophysics Data System (ADS)
Vlachos, Dimitrios; Vagenas, Elias C.
2015-09-01
The 4th International Conference on Mathematical Modeling in Physical Sciences (IC-MSQUARE) took place in Mykonos, Greece, from Friday 5th June to Monday 8th June 2015. The Conference was attended by more than 150 participants and hosted about 200 oral, poster, and virtual presentations. There were more than 600 pre-registered authors. The 4th IC-MSQUARE consisted of different and diverging workshops and thus covered various research fields where Mathematical Modeling is used, such as Theoretical/Mathematical Physics, Neutrino Physics, Non-Integrable Systems, Dynamical Systems, Computational Nanoscience, Biological Physics, Computational Biomechanics, Complex Networks, Stochastic Modeling, Fractional Statistics, DNA Dynamics, Macroeconomics etc. The scientific program was rather intense as after the Keynote and Invited Talks in the morning, three parallel oral and one poster session were running every day. However, according to all attendees, the program was excellent with a high quality of talks creating an innovative and productive scientific environment for all attendees. We would like to thank the Keynote Speaker and the Invited Speakers for their significant contribution to IC-MSQUARE. We also would like to thank the Members of the International Advisory and Scientific Committees as well as the Members of the Organizing Committee.
NASA Astrophysics Data System (ADS)
Barbaro, Alethea
2015-03-01
Agent-based models have been widely applied in theoretical ecology to explain migrations and other collective animal movements [2,5,8]. As D'Orsogna and Perc have expertly highlighted in [6], the recent emergence of crime modeling has opened another interesting avenue for mathematical investigation. The area of crime modeling is particularly suited to agent-based models, because these models offer a great deal of flexibility and also ease communication among criminologists, law enforcement, and modelers.
A Virtual Study of Grid Resolution on Experiments of a Highly-Resolved Turbulent Plume
NASA Astrophysics Data System (ADS)
Maisto, Pietro M. F.; Marshall, Andre W.; Gollner, Michael J.; Fire Protection Engineering Department Collaboration
2017-11-01
An accurate representation of sub-grid scale turbulent mixing is critical for modeling fire plumes and smoke transport. In this study, PLIF and PIV diagnostics are used with the saltwater modeling technique to provide highly-resolved instantaneous field measurements in unconfined turbulent plumes useful for statistical analysis, physical insight, and model validation. The effect of resolution was investigated employing a virtual interrogation window (of varying size) applied to the high-resolution field measurements. Motivated by LES low-pass filtering concepts, the high-resolution experimental data in this study can be analyzed within the interrogation windows (i.e. statistics at the sub-grid scale) and on interrogation windows (i.e. statistics at the resolved scale). A dimensionless resolution threshold (L/D*) criterion was determined to achieve converged statistics on the filtered measurements. Such a criterion was then used to establish the relative importance between large and small-scale turbulence phenomena while investigating specific scales for the turbulent flow. First order data sets start to collapse at a resolution of 0.3D*, while for second and higher order statistical moments the interrogation window size drops down to 0.2D*.
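The distinction drawn above between statistics within interrogation windows (sub-grid scale) and on interrogation windows (resolved scale) can be illustrated with simple block filtering of a 2D field. The synthetic field below is a sketch of the idea, not the experimental pipeline:

```python
import numpy as np

def block_filter_stats(field, w):
    """Split a 2D field into w x w interrogation windows.

    Returns the resolved-scale variance (variance of the window means,
    i.e. statistics *on* windows) and the mean sub-grid variance
    (statistics *within* windows).
    """
    ny, nx = field.shape
    assert ny % w == 0 and nx % w == 0
    # Reshape into (rows of windows, cols of windows, w, w) blocks.
    blocks = field.reshape(ny // w, w, nx // w, w).swapaxes(1, 2)
    resolved = blocks.mean(axis=(2, 3)).var()
    subgrid = blocks.var(axis=(2, 3)).mean()
    return resolved, subgrid

rng = np.random.default_rng(42)
f = rng.normal(size=(64, 64))            # stand-in for a PLIF/PIV field
res, sub = block_filter_stats(f, 8)      # interrogation window of 8 x 8
```

By the law of total variance, the resolved-scale variance and the mean sub-grid variance sum exactly to the total variance of the field, which is a useful sanity check when varying the window size L.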
NASA Astrophysics Data System (ADS)
Bourke, Sarah A.; Hermann, Kristian J.; Hendry, M. Jim
2017-11-01
Elevated groundwater salinity associated with produced water, leaching from landfills or secondary salinity can degrade arable soils and potable water resources. Direct-push electrical conductivity (EC) profiling enables rapid, relatively inexpensive, high-resolution in-situ measurements of subsurface salinity, without requiring core collection or installation of groundwater wells. However, because the direct-push tool measures the bulk EC of both solid and liquid phases (ECa), incorporation of ECa data into regional or historical groundwater data sets requires the prediction of pore water EC (ECw) or chloride (Cl-) concentrations from measured ECa. Statistical linear regression and physically based models for predicting ECw and Cl- from ECa profiles were tested on a brine plume in central Saskatchewan, Canada. A linear relationship between ECa/ECw and porosity was more accurate for predicting ECw and Cl- concentrations than a power-law relationship (Archie's Law). Despite clay contents of up to 96%, the addition of terms to account for electrical conductance in the solid phase did not improve model predictions. In the absence of porosity data, statistical linear regression models adequately predicted ECw and Cl- concentrations from direct-push ECa profiles (ECw = 5.48 ECa + 0.78, R² = 0.87; Cl- = 1,978 ECa - 1,398, R² = 0.73). These statistical models can be used to predict ECw in the absence of lithologic data and will be particularly useful for initial site assessments. The more accurate linear physically based model can be used to predict ECw and Cl- as porosity data become available and the site-specific ECw-Cl- relationship is determined.
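The statistical regression step amounts to an ordinary least-squares fit of ECw on ECa. A sketch with hypothetical paired measurements (not the Saskatchewan site data; the noise-free slope and intercept are borrowed from the reported equation only to make the example concrete):

```python
import numpy as np

# Hypothetical paired measurements: bulk EC from direct-push profiling (ec_a)
# and pore-water EC from sampled wells (ec_w), both in arbitrary units.
ec_a = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
ec_w = 5.48 * ec_a + 0.78 + np.array([0.1, -0.2, 0.05, 0.15, -0.1, 0.0])

# Least-squares linear model ECw = slope * ECa + intercept, plus R^2.
slope, intercept = np.polyfit(ec_a, ec_w, 1)
pred = slope * ec_a + intercept
r2 = 1 - np.sum((ec_w - pred) ** 2) / np.sum((ec_w - ec_w.mean()) ** 2)
```

Once calibrated, the fitted line converts an entire direct-push ECa profile to predicted ECw without lithologic data, as in the initial site assessments described above.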
Takasaki, Hiroshi; Treleaven, Julia; Johnston, Venerina; Jull, Gwendolen
2013-08-15
Cross-sectional. To conduct a preliminary analysis of the physical, cognitive, and psychological domains contributing to self-reported driving difficulty after adjusting for neck pain, dizziness, and relevant demographics in chronic whiplash-associated disorders (WAD) using hierarchical regression modeling. Pain is a risk factor for car crashes, and dizziness may affect fitness to drive. Both symptoms are common in chronic WAD and difficulty driving is a common complaint in this group. Chronic WAD is often accompanied by physical, cognitive, and psychological impairments. These impairments may contribute to self-reported driving difficulty beyond neck pain, dizziness, and relevant demographics. Forty individuals with chronic WAD participated. Dependent variables were the magnitude of self-reported driving difficulty assessed in the strategic, tactical, and operational levels of the Neck Pain Driving Index. Three models were developed to assess the contributions of independent variables (physical, cognitive, and psychological domains) to each of the 3 dependent variables after adjusting for neck pain intensity, dizziness, and driving demographics. The measures included were: physical domain-range and maximum speed of head rotation, performances during gaze stability, eye-head coordination, and visual dependency tests; cognitive domain-self-reported cognitive symptoms including fatigue and the trail making tests; and psychological domain-general stress, traumatic stress, depression, and fear of neck movements and driving. Symptom duration was relevant to driving difficulty in the strategic and tactical levels. The cognitive domain increased statistical power to estimate the strategic and operational levels (P < 0.1) beyond other contributors. The physical domain increased statistical power to estimate the tactical level (P < 0.1) beyond other contributors. 
Physical and cognitive impairments independently contributed to self-reported driving difficulty in chronic WAD beyond neck pain, dizziness, and symptom duration. Level of Evidence: 3.
Detector noise statistics in the non-linear regime
NASA Technical Reports Server (NTRS)
Shopbell, P. L.; Bland-Hawthorn, J.
1992-01-01
The statistical behavior of an idealized linear detector in the presence of threshold and saturation levels is examined. It is assumed that the noise is governed by the statistical fluctuations in the number of photons emitted by the source during an exposure. Since physical detectors cannot have infinite dynamic range, our model illustrates that all devices have non-linear regimes, particularly at high count rates. The primary effect is a decrease in the statistical variance about the mean signal due to a portion of the expected noise distribution being removed via clipping. Higher order statistical moments are also examined, in particular, skewness and kurtosis. In principle, the expected distortion in the detector noise characteristics can be calibrated using flatfield observations with count rates matched to the observations. For this purpose, some basic statistical methods that utilize Fourier analysis techniques are described.
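The central effect described above, variance shrinking when the expected noise distribution is clipped at threshold and saturation levels, can be demonstrated with a simulated Poisson-noise detector. The count rates and clipping levels below are illustrative assumptions, not the paper's calibration procedure:

```python
import numpy as np

def clipped_moments(mean_counts, threshold, saturation, n=200000, seed=0):
    """Photon-noise statistics of an idealized detector with finite range.

    Counts below `threshold` or above `saturation` are clipped, removing
    part of the Poisson noise distribution and shrinking the variance
    about the mean signal.
    """
    rng = np.random.default_rng(seed)
    counts = rng.poisson(mean_counts, n).astype(float)
    clipped = np.clip(counts, threshold, saturation)
    return counts.var(), clipped.var()

# Saturation placed about one sigma above the mean clips the upper tail.
raw_var, clip_var = clipped_moments(mean_counts=1000, threshold=0,
                                    saturation=1030)
```

For an ideal Poisson process the raw variance equals the mean count rate; the clipped variance falls below it, which is exactly the non-linear-regime signature the paper proposes to calibrate with matched-count-rate flatfields.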
Finding the Root Causes of Statistical Inconsistency in Community Earth System Model Output
NASA Astrophysics Data System (ADS)
Milroy, D.; Hammerling, D.; Baker, A. H.
2017-12-01
Baker et al (2015) developed the Community Earth System Model Ensemble Consistency Test (CESM-ECT) to provide a metric for software quality assurance by determining statistical consistency between an ensemble of CESM outputs and new test runs. The test has proved useful for detecting statistical difference caused by compiler bugs and errors in physical modules. However, detection is only the necessary first step in finding the causes of statistical difference. The CESM is a vastly complex model comprised of millions of lines of code which is developed and maintained by a large community of software engineers and scientists. Any root cause analysis is correspondingly challenging. We propose a new capability for CESM-ECT: identifying the sections of code that cause statistical distinguishability. The first step is to discover CESM variables that cause CESM-ECT to classify new runs as statistically distinct, which we achieve via Randomized Logistic Regression. Next we use a tool developed to identify CESM components that define or compute the variables found in the first step. Finally, we employ the application Kernel GENerator (KGEN) created in Kim et al (2016) to detect fine-grained floating point differences. We demonstrate an example of the procedure and advance a plan to automate this process in our future work.
Gupta, C K; Mishra, G; Mehta, S C; Prasad, J
1993-01-01
Lung volumes, capacities, diffusion and alveolar volumes, together with physical characteristics (age, height and weight), were recorded for 186 healthy school children (96 boys and 90 girls) in the 10-17 years age group. The objective was to study the relative importance of physical characteristics as regressor variables in regression models to estimate lung functions. We observed that height is best correlated with all the lung functions. Inclusion of all physical characteristics in the models offered little gain compared with models using height alone as the regressor variable. We also found that exponential models were not only statistically valid but also fared better than the linear ones. We conclude that lung functions covary with height and other physical characteristics but do not depend upon them. The rate of increase in the functions depends upon the initial lung functions. Further, we propose models and provide ready reckoners giving estimates of lung functions with 95 per cent confidence limits, based on heights from 125 to 170 cm, for the age group of 10 to 17 years.
Gupta, Nidhi; Heiden, Marina; Mathiassen, Svend Erik; Holtermann, Andreas
2016-05-01
We aimed at developing and evaluating statistical models predicting objectively measured occupational time spent sedentary or in physical activity from self-reported information available in large epidemiological studies and surveys. Two-hundred-and-fourteen blue-collar workers responded to a questionnaire containing information about personal and work related variables, available in most large epidemiological studies and surveys. Workers also wore accelerometers for 1-4 days measuring time spent sedentary and in physical activity, defined as non-sedentary time. Least-squares linear regression models were developed, predicting objectively measured exposures from selected predictors in the questionnaire. A full prediction model based on age, gender, body mass index, job group, self-reported occupational physical activity (OPA), and self-reported occupational sedentary time (OST) explained 63% (adjusted R²) of the variance of both objectively measured time spent sedentary and in physical activity since these two exposures were complementary. Single-predictor models based only on self-reported information about either OPA or OST explained 21% and 38%, respectively, of the variance of the objectively measured exposures. Internal validation using bootstrapping suggested that the full and single-predictor models would show almost the same performance in new datasets as in that used for modelling. Both full and single-predictor models based on self-reported information typically available in most large epidemiological studies and surveys were able to predict objectively measured occupational time spent sedentary or in physical activity, with explained variances ranging from 21-63%.
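The internal validation step can be sketched with a Harrell-style bootstrap optimism correction for R². The synthetic predictors below are stand-ins for the questionnaire variables, not the study data:

```python
import numpy as np

def fit_r2(X, y):
    """Ordinary least squares with intercept; returns coefficients and R^2."""
    Xd = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return beta, 1 - resid @ resid / np.sum((y - y.mean()) ** 2)

def bootstrap_optimism(X, y, n_boot=200, seed=0):
    """Optimism-corrected R^2: apparent R^2 minus the mean gap between
    bootstrap-sample fit and its performance on the original sample."""
    rng = np.random.default_rng(seed)
    Xd = np.column_stack([np.ones(len(X)), X])
    _, apparent = fit_r2(X, y)
    optimism = []
    for _ in range(n_boot):
        idx = rng.integers(len(y), size=len(y))       # resample with replacement
        beta_b, r2_boot = fit_r2(X[idx], y[idx])
        resid = y - Xd @ beta_b                        # refit applied to original
        r2_orig = 1 - resid @ resid / np.sum((y - y.mean()) ** 2)
        optimism.append(r2_boot - r2_orig)
    return apparent, apparent - np.mean(optimism)

rng = np.random.default_rng(3)
X = rng.normal(size=(214, 6))   # stand-ins for age, BMI, job group, OPA, OST, ...
y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=214)
apparent, corrected = bootstrap_optimism(X, y)
```

A small gap between apparent and corrected R² is what the abstract means by "almost the same performance in new datasets".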
A New Approach to Monte Carlo Simulations in Statistical Physics
NASA Astrophysics Data System (ADS)
Landau, David P.
2002-08-01
Monte Carlo simulations [1] have become a powerful tool for the study of diverse problems in statistical/condensed matter physics. Standard methods sample the probability distribution for the states of the system, most often in the canonical ensemble, and over the past several decades enormous improvements have been made in performance. Nonetheless, difficulties arise near phase transitions, due to critical slowing down near 2nd-order transitions and to metastability near 1st-order transitions, and these complications limit the applicability of the method. We shall describe a new Monte Carlo approach [2] that uses a random walk in energy space to determine the density of states directly. Once the density of states is known, all thermodynamic properties can be calculated. This approach can be extended to multi-dimensional parameter spaces and should be effective for systems with complex energy landscapes, e.g., spin glasses, protein folding models, etc. Generalizations should produce a broadly applicable optimization tool. 1. A Guide to Monte Carlo Simulations in Statistical Physics, D. P. Landau and K. Binder (Cambridge U. Press, Cambridge, 2000). 2. Fugao Wang and D. P. Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E64, 056101-1 (2001).
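The random walk in energy space of Ref. [2] can be demonstrated on a toy system whose density of states is known exactly: N non-interacting spins, where the energy is the number of up spins and g(E) is the binomial coefficient C(N, E). This is a sketch of the Wang-Landau scheme, not the paper's Ising implementation:

```python
import numpy as np
from math import comb

def wang_landau(n_spins=12, f_final=1e-4, flatness=0.8, seed=0):
    """Wang-Landau sampling of ln g(E) for N non-interacting spins.

    A flip from E to E' is accepted with probability min(1, g(E)/g(E')),
    driving a flat histogram in energy while ln g is built up; the
    modification factor ln f is halved whenever the histogram is flat.
    """
    rng = np.random.default_rng(seed)
    spins = rng.integers(2, size=n_spins)
    E = spins.sum()
    ln_g = np.zeros(n_spins + 1)
    hist = np.zeros(n_spins + 1)
    ln_f = 1.0
    while ln_f > f_final:
        for _ in range(10000):
            i = rng.integers(n_spins)
            E_new = E + (1 - 2 * spins[i])
            if np.log(rng.random()) < ln_g[E] - ln_g[E_new]:
                spins[i] ^= 1
                E = E_new
            ln_g[E] += ln_f
            hist[E] += 1
        if hist.min() > flatness * hist.mean():
            ln_f /= 2.0
            hist[:] = 0
    # Normalize so the total number of states equals 2^N.
    g = np.exp(ln_g - ln_g.max())
    return g * 2.0 ** n_spins / g.sum()

g = wang_landau()
exact = np.array([comb(12, k) for k in range(13)], dtype=float)
```

Once g(E) has converged, the partition function and all thermodynamic averages follow at any temperature without further sampling, which is the key advantage stated in the abstract.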
NASA Astrophysics Data System (ADS)
Skitka, J.; Marston, B.; Fox-Kemper, B.
2016-02-01
Sub-grid turbulence models for planetary boundary layers are typically constructed additively, starting with local flow properties and including non-local (KPP) or higher order (Mellor-Yamada) parameters until a desired level of predictive capacity is achieved or a manageable threshold of complexity is surpassed. Such approaches are necessarily limited in general circumstances, like global circulation models, by their being optimized for particular flow phenomena. By building a model reductively, starting with the infinite hierarchy of turbulence statistics, truncating at a given order, and stripping degrees of freedom from the flow, we offer the prospect of a turbulence model and investigative tool that is equally applicable to all flow types and able to take full advantage of the wealth of nonlocal information in any flow. Direct statistical simulation (DSS) that is based upon expansion in equal-time cumulants can be used to compute flow statistics of arbitrary order. We investigate the feasibility of a second-order closure (CE2) by performing simulations of the ocean boundary layer in a quasi-linear approximation for which CE2 is exact. As oceanographic examples, wind-driven Langmuir turbulence and thermal convection are studied by comparison of the quasi-linear and fully nonlinear statistics. We also characterize the computational advantages and physical uncertainties of CE2 defined on a reduced basis determined via proper orthogonal decomposition (POD) of the flow fields.
Volatility behavior of visibility graph EMD financial time series from Ising interacting system
NASA Astrophysics Data System (ADS)
Zhang, Bo; Wang, Jun; Fang, Wen
2015-08-01
A financial market dynamics model is developed and investigated using a stochastic Ising system, the Ising model being the most popular ferromagnetic model in statistical physics. Applying two graph-based analyses and the multiscale entropy method, we investigate and compare the statistical volatility behavior of return time series and the corresponding IMF series derived from the empirical mode decomposition (EMD) method. Real stock market indices are also studied comparatively against simulation data from the proposed model. Further, we find that the degree distribution of the visibility graph for the simulation series has power-law tails, and the assortative network exhibits the mixing-pattern property. All these features are in agreement with the real market data, confirming that the financial model built on the Ising system is reasonable.
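The natural visibility graph used in such graph-based analyses maps a time series to a network: two time points are connected if the straight line between them clears every intermediate sample. A minimal brute-force implementation (fine for short series; a sketch, not the paper's code):

```python
import numpy as np

def visibility_edges(series):
    """Natural visibility graph: nodes are time points; (a, b) are linked
    if every intermediate sample lies strictly below the line joining them."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                series[c] < series[a]
                + (series[b] - series[a]) * (c - a) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

def degree_sequence(series):
    """Node degrees of the visibility graph, used for tail/assortativity tests."""
    deg = np.zeros(len(series), dtype=int)
    for a, b in visibility_edges(series):
        deg[a] += 1
        deg[b] += 1
    return deg
```

Power-law tails in the resulting degree distribution are the fractal signature the abstract compares between simulated and real index series.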
McDonough, Christine M.; Jette, Alan M.; Ni, Pengsheng; Bogusz, Kara; Marfeo, Elizabeth E; Brandt, Diane E; Chan, Leighton; Meterko, Mark; Haley, Stephen M.; Rasch, Elizabeth K.
2014-01-01
Objectives: To build a comprehensive item pool representing work-relevant physical functioning and to test the factor structure of the item pool. These developmental steps represent initial outcomes of a broader project to develop instruments for the assessment of function within the context of Social Security Administration (SSA) disability programs. Design: Comprehensive literature review; gap analysis; item generation with expert panel input; stakeholder interviews; cognitive interviews; cross-sectional survey administration; and exploratory and confirmatory factor analyses to assess item pool structure. Setting: In-person and semi-structured interviews; internet and telephone surveys. Participants: A sample of 1,017 SSA claimants, and a normative sample of 999 adults from the US general population. Interventions: Not applicable. Main Outcome Measure: Model fit statistics. Results: The final item pool consisted of 139 items. Within the claimant sample 58.7% were white; 31.8% were black; 46.6% were female; and the mean age was 49.7 years. Initial factor analyses revealed a 4-factor solution which included more items and allowed separate characterization of: 1) Changing and Maintaining Body Position, 2) Whole Body Mobility, 3) Upper Body Function and 4) Upper Extremity Fine Motor. The final 4-factor model included 91 items. Confirmatory factor analyses for the 4-factor models for the claimant and the normative samples demonstrated very good fit. Fit statistics for claimant and normative samples respectively were: Comparative Fit Index = 0.93 and 0.98; Tucker-Lewis Index = 0.92 and 0.98; Root Mean Square Error of Approximation = 0.05 and 0.04. Conclusions: The factor structure of the Physical Function item pool closely resembled the hypothesized content model. The four scales relevant to work activities offer promise for providing reliable information about claimant physical functioning relevant to work disability. PMID:23542402
McDonough, Christine M; Jette, Alan M; Ni, Pengsheng; Bogusz, Kara; Marfeo, Elizabeth E; Brandt, Diane E; Chan, Leighton; Meterko, Mark; Haley, Stephen M; Rasch, Elizabeth K
2013-09-01
To build a comprehensive item pool representing work-relevant physical functioning and to test the factor structure of the item pool. These developmental steps represent initial outcomes of a broader project to develop instruments for the assessment of function within the context of Social Security Administration (SSA) disability programs. Comprehensive literature review; gap analysis; item generation with expert panel input; stakeholder interviews; cognitive interviews; cross-sectional survey administration; and exploratory and confirmatory factor analyses to assess item pool structure. In-person and semistructured interviews and Internet and telephone surveys. Sample of SSA claimants (n=1017) and a normative sample of adults from the U.S. general population (n=999). Not applicable. Model fit statistics. The final item pool consisted of 139 items. Within the claimant sample, 58.7% were white; 31.8% were black; 46.6% were women; and the mean age was 49.7 years. Initial factor analyses revealed a 4-factor solution, which included more items and allowed separate characterization of: (1) changing and maintaining body position, (2) whole body mobility, (3) upper body function, and (4) upper extremity fine motor. The final 4-factor model included 91 items. Confirmatory factor analyses for the 4-factor models for the claimant and the normative samples demonstrated very good fit. Fit statistics for claimant and normative samples, respectively, were: Comparative Fit Index=.93 and .98; Tucker-Lewis Index=.92 and .98; and root mean square error approximation=.05 and .04. The factor structure of the physical function item pool closely resembled the hypothesized content model. The 4 scales relevant to work activities offer promise for providing reliable information about claimant physical functioning relevant to work disability. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Chakraborty, Arup
No medical procedure has saved more lives than vaccination. But today some pathogens have evolved that defy successful vaccination using the empirical paradigms pioneered by Pasteur and Jenner. One characteristic of many pathogens for which successful vaccines do not exist is that they present themselves in various guises; HIV is an extreme example because of its high mutability. This highly mutable virus can evade natural or vaccine-induced immune responses, often by mutating at multiple sites linked by compensatory interactions. I will describe first how, by bringing to bear ideas from statistical physics (e.g., maximum entropy models, Hopfield models, Feynman variational theory) together with in vitro experiments and clinical data, the fitness landscape of HIV is beginning to be defined with explicit account of collective mutational pathways. I will describe how this knowledge can be harnessed for vaccine design. Finally, I will describe how ideas at the intersection of evolutionary biology, immunology, and statistical physics can help guide the design of strategies that may be able to induce broadly neutralizing antibodies.
Changes in physical activity, self-efficacy and depressive symptoms in adolescent girls.
Neissaar, Inga; Raudsepp, Lennart
2011-08-01
The purpose of this study was to examine the longitudinal relationships between naturally occurring changes in leisure-time physical activity, depressive symptoms and self-efficacy in adolescent girls. We also aimed to test whether depressive symptoms would moderate the self-efficacy-physical activity relationship. Participants were 181 urban adolescent girls. Physical activity was measured using the 3-Day Physical Activity Recall. Self-efficacy and depressive symptoms were assessed using questionnaires. Body height and body mass were measured and body mass index (BMI) was calculated. Data were collected on three occasions over a 2-year period. There was a decrease in physical activity and self-efficacy and an increase in depressive symptoms across the three measurement occasions. There were statistically significant and negative relationships between initial level and change for physical activity and depressive symptoms. Initially higher levels of physical activity were related to initially lower levels of depressive symptoms, and change in physical activity across time was inversely associated with change in levels of depressive symptoms across measurements. There were statistically significant and positive relationships between initial level and change for physical activity and self-efficacy after controlling for the effect of BMI. Latent growth modeling (LGM) also indicated a moderating effect of depressive symptoms on the self-efficacy-physical activity relationship. Girls who had high initial levels of self-efficacy and smaller increases in depressive symptoms had the lowest decline in physical activity participation. Our results encourage the design of interventions that reduce depressive symptoms and increase self-efficacy as a possible means of increasing adolescent girls' physical activity.
Rain rate duration statistics derived from the Mid-Atlantic coast rain gauge network
NASA Technical Reports Server (NTRS)
Goldhirsh, Julius
1993-01-01
A rain gauge network comprising 10 tipping bucket rain gauges located on the Mid-Atlantic coast of the United States has been in continuous operation since June 1, 1986. Rain rate distributions and estimated slant path fade distributions at 20 GHz and 30 GHz covering the first five-year period were derived from the gauge network measurements, and these results were described by Goldhirsh. In this effort, rain rate time duration statistics are presented. The rain duration statistics are of interest for better understanding the physical nature of precipitation and for providing a database which modelers may use to derive slant path fade duration statistics. Such statistics are important for better assessing optimal coding procedures over defined bandwidths.
Record statistics of financial time series and geometric random walks
NASA Astrophysics Data System (ADS)
Sabir, Behlool; Santhanam, M. S.
2014-09-01
The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of select stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5≤α≤1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of geometric random walk series are in good agreement with those obtained from empirical stock data.
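The record ages studied above are easy to reproduce by simulation. The sketch below is a hypothetical minimal implementation (not the authors' code): it generates a geometric random walk and extracts the sequence of upper-record ages; the drift, volatility, and seed values are illustrative assumptions.

```python
import math
import random

def geometric_random_walk(n, mu=0.0005, sigma=0.01, seed=7):
    """Geometric random walk: x_{t+1} = x_t * exp(mu + sigma * xi_t)."""
    random.seed(seed)
    x = 1.0
    series = [x]
    for _ in range(n - 1):
        x *= math.exp(mu + sigma * random.gauss(0.0, 1.0))
        series.append(x)
    return series

def record_ages(series):
    """Age (in time steps) for which each successive upper record survives."""
    ages = []
    current_max = series[0]   # the initial value is the first record
    last_record_t = 0
    for t in range(1, len(series)):
        if series[t] > current_max:
            ages.append(t - last_record_t)  # close out the previous record
            current_max = series[t]
            last_record_t = t
    ages.append(len(series) - last_record_t)  # age of the final (open) record
    return ages

series = geometric_random_walk(10000)
ages = record_ages(series)
```

Since the record ages partition the time axis, they always sum to the series length; the power-law claim of the paper would then be checked on the histogram of `ages` over many realizations.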
NASA Astrophysics Data System (ADS)
Vallianatos, F.; Tzanis, A.; Michas, G.; Papadakis, G.
2012-04-01
Since the middle of summer 2011, an increase in the seismicity rates of the volcanic complex of Santorini Island, Greece, has been observed. In the present work, the temporal distribution of seismicity, as well as the magnitude distribution of earthquakes, has been studied using the concept of Non-Extensive Statistical Physics (NESP; Tsallis, 2009) along with the evolution of the Shannon entropy H (also called information entropy). The analysis is based on the earthquake catalogue of the Geodynamic Institute of the National Observatory of Athens for the period July 2011-January 2012 (http://www.gein.noa.gr/). Non-Extensive Statistical Physics, which is a generalization of Boltzmann-Gibbs statistical physics, is a suitable framework for studying complex systems. The observed distributions of seismicity rates at Santorini can be fitted exceptionally well with NESP models. This implies the inherent complexity of the Santorini volcanic seismicity, the applicability of NESP concepts to volcanic earthquake activity, and the usefulness of NESP in investigating phenomena exhibiting multifractality and long-range coupling effects. Acknowledgments. This work was supported in part by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project entitled "Integrated understanding of Seismicity, using innovative Methodologies of Fracture mechanics along with Earthquake and non extensive statistical physics - Application to the geodynamic system of the Hellenic Arc. SEISMO FEAR HELLARC". GM and GP wish to acknowledge the partial support of the Greek State Scholarships Foundation (ΙΚΥ).
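NESP analyses of this kind typically fit observed distributions with the Tsallis q-exponential. The abstract does not give the fitted form, so the function below is a hedged sketch of the standard q-exponential from the NESP literature, which reduces to the ordinary exponential as q → 1:

```python
import math

def q_exponential(x, q):
    """Tsallis q-exponential: exp_q(x) = [1 + (1-q)x]^(1/(1-q)),
    defined as 0 where the bracket is non-positive; -> exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0.0:
        return 0.0  # Tsallis cutoff: the q-exponential vanishes here
    return base ** (1.0 / (1.0 - q))
```

A cumulative inter-event-time or magnitude distribution would then be modeled as `q_exponential(-x / x0, q)` and the pair (q, x0) adjusted to the data.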
A new statistical method for characterizing the atmospheres of extrasolar planets
NASA Astrophysics Data System (ADS)
Henderson, Cassandra S.; Skemer, Andrew J.; Morley, Caroline V.; Fortney, Jonathan J.
2017-10-01
By detecting light from extrasolar planets, we can measure their compositions and bulk physical properties. The technologies used to make these measurements are still in their infancy, and a lack of self-consistency suggests that previous observations have underestimated their systematic errors. We demonstrate a statistical method, newly applied to exoplanet characterization, which uses a Bayesian formalism to account for underestimated error bars. We use this method to compare photometry of a substellar companion, GJ 758b, with custom atmospheric models. Our method produces a probability distribution of atmospheric model parameters including temperature, gravity, cloud model (fsed) and chemical abundance for GJ 758b. This distribution is less sensitive to highly variant data and appropriately reflects a greater uncertainty on parameter fits.
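One common way to implement such a formalism (a sketch under assumptions; the paper's exact likelihood may differ) is to multiply all quoted error bars by a free inflation factor f and let the normalization term of the Gaussian likelihood penalize overly large f:

```python
import math

def log_like(residuals, sigmas, f):
    """Gaussian log-likelihood with every quoted error bar inflated by f.
    The log(2*pi*s2) term penalizes needlessly large f."""
    ll = 0.0
    for r, s in zip(residuals, sigmas):
        s2 = (f * s) ** 2
        ll += -0.5 * (r * r / s2 + math.log(2.0 * math.pi * s2))
    return ll

# Toy data: the scatter is about twice the quoted error bars,
# so the likelihood should peak near f ~ 2.
residuals = [2.1, -1.8, 2.3, -2.2, 1.9, -2.0]
sigmas = [1.0] * 6
grid = [0.5 + 0.05 * i for i in range(71)]  # f in [0.5, 4.0]
best = max(grid, key=lambda f: log_like(residuals, sigmas, f))
```

The maximum-likelihood f is sqrt(mean(r^2/sigma^2)) ≈ 2.06 for this toy data; in a full Bayesian treatment f would be marginalized over rather than optimized.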
Statistical mechanics of competitive resource allocation using agent-based models
NASA Astrophysics Data System (ADS)
Chakraborti, Anirban; Challet, Damien; Chatterjee, Arnab; Marsili, Matteo; Zhang, Yi-Cheng; Chakrabarti, Bikas K.
2015-01-01
Demand outstrips available resources in most situations, which gives rise to competition, interaction and learning. In this article, we review a broad spectrum of multi-agent models of competition (El Farol Bar problem, Minority Game, Kolkata Paise Restaurant problem, Stable marriage problem, Parking space problem and others) and the methods used to understand them analytically. We emphasize the power of concepts and tools from statistical mechanics to understand and fully explain collective phenomena such as phase transitions and long memory, and the mapping between agent heterogeneity and physical disorder. As these methods can be applied to any large-scale model of competitive resource allocation made up of heterogeneous adaptive agents with non-linear interactions, they provide a prospective unifying paradigm for many scientific disciplines.
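The Minority Game mentioned above admits a compact simulation: agents hold random strategy tables mapping the recent history of outcomes to an action, play their best-scoring strategy, and the minority side wins. The following minimal implementation is illustrative (parameter values are arbitrary, not taken from the review):

```python
import random

def minority_game(n_agents=101, n_strategies=2, memory=3, steps=500, seed=1):
    """Minimal Minority Game. Returns the attendance (net choice) time series."""
    random.seed(seed)
    n_hist = 2 ** memory
    # Each strategy maps each possible history (0..n_hist-1) to an action in {-1, +1}.
    strategies = [[[random.choice((-1, 1)) for _ in range(n_hist)]
                   for _ in range(n_strategies)] for _ in range(n_agents)]
    scores = [[0] * n_strategies for _ in range(n_agents)]
    history = random.randrange(n_hist)
    attendance = []
    for _ in range(steps):
        actions = []
        for a in range(n_agents):
            s = max(range(n_strategies), key=lambda k: scores[a][k])
            actions.append(strategies[a][s][history])
        A = sum(actions)               # attendance; odd n_agents => never zero
        minority = -1 if A > 0 else 1  # the minority side wins this round
        for a in range(n_agents):      # reward every strategy that predicted it
            for k in range(n_strategies):
                if strategies[a][k][history] == minority:
                    scores[a][k] += 1
        history = ((history << 1) | (1 if minority == 1 else 0)) % n_hist
        attendance.append(A)
    return attendance

att = minority_game()
```

The statistical-mechanics analyses reviewed in the article study the fluctuations of `att` as a function of the ratio between the number of histories and the number of agents.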
Souza, Michele; Eisenmann, Joey; Chaves, Raquel; Santos, Daniel; Pereira, Sara; Forjaz, Cláudia; Maia, José
2016-10-01
In this paper, three different statistical approaches were used to investigate short-term tracking of cardiorespiratory and performance-related physical fitness among adolescents. Data were obtained from the Oporto Growth, Health and Performance Study and comprised 1203 adolescents (549 girls) divided into two age cohorts (10-12 and 12-14 years) followed for three consecutive years, with annual assessment. Cardiorespiratory fitness was assessed with 1-mile run/walk test; 50-yard dash, standing long jump, handgrip, and shuttle run test were used to rate performance-related physical fitness. Tracking was expressed in three different ways: auto-correlations, multilevel modelling with crude and adjusted model (for biological maturation, body mass index, and physical activity), and Cohen's Kappa (κ) computed in IBM SPSS 20.0, HLM 7.01 and Longitudinal Data Analysis software, respectively. Tracking of physical fitness components was (1) moderate-to-high when described by auto-correlations; (2) low-to-moderate when crude and adjusted models were used; and (3) low according to Cohen's Kappa (κ). These results demonstrate that when describing tracking, different methods should be considered since they provide distinct and more comprehensive views about physical fitness stability patterns.
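Of the three tracking statistics used above, Cohen's kappa is the simplest to state in code: agreement between two categorical ratings (e.g., fitness tertile at two time points), corrected for chance agreement. A plain-Python sketch (illustrative; the study computed kappa with the Longitudinal Data Analysis software):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two categorical ratings of the same subjects.
    kappa = (p_observed - p_expected) / (1 - p_expected)."""
    assert len(r1) == len(r2) and len(r1) > 0
    n = len(r1)
    cats = sorted(set(r1) | set(r2))
    po = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)  # chance agreement
    return (po - pe) / (1.0 - pe)
```

Perfect agreement gives kappa = 1, while agreement at exactly the chance level gives kappa = 0, which is why kappa-based tracking estimates come out lower than raw auto-correlations.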
NASA Astrophysics Data System (ADS)
Peck, Myron A.; Arvanitidis, Christos; Butenschön, Momme; Canu, Donata Melaku; Chatzinikolaou, Eva; Cucco, Andrea; Domenici, Paolo; Fernandes, Jose A.; Gasche, Loic; Huebert, Klaus B.; Hufnagl, Marc; Jones, Miranda C.; Kempf, Alexander; Keyl, Friedemann; Maar, Marie; Mahévas, Stéphanie; Marchal, Paul; Nicolas, Delphine; Pinnegar, John K.; Rivot, Etienne; Rochette, Sébastien; Sell, Anne F.; Sinerchia, Matteo; Solidoro, Cosimo; Somerfield, Paul J.; Teal, Lorna R.; Travers-Trolet, Morgan; van de Wolfshaar, Karen E.
2018-02-01
We review and compare four broad categories of spatially-explicit modelling approaches currently used to understand and project changes in the distribution and productivity of living marine resources including: 1) statistical species distribution models, 2) physiology-based, biophysical models of single life stages or the whole life cycle of species, 3) food web models, and 4) end-to-end models. Single pressures are rare and, in the future, models must be able to examine multiple factors affecting living marine resources such as interactions between: i) climate-driven changes in temperature regimes and acidification, ii) reductions in water quality due to eutrophication, iii) the introduction of alien invasive species, and/or iv) (over-)exploitation by fisheries. Statistical (correlative) approaches can be used to detect historical patterns which may not be relevant in the future. Advancing predictive capacity of changes in distribution and productivity of living marine resources requires explicit modelling of biological and physical mechanisms. New formulations are needed which (depending on the question) will need to strive for more realism in ecophysiology and behaviour of individuals, life history strategies of species, as well as trophodynamic interactions occurring at different spatial scales. Coupling existing models (e.g. physical, biological, economic) is one avenue that has proven successful. However, fundamental advancements are needed to address key issues such as the adaptive capacity of species/groups and ecosystems. The continued development of end-to-end models (e.g., physics to fish to human sectors) will be critical if we hope to assess how multiple pressures may interact to cause changes in living marine resources including the ecological and economic costs and trade-offs of different spatial management strategies. 
Given the strengths and weaknesses of the various types of models reviewed here, confidence in projections of changes in the distribution and productivity of living marine resources will be increased by assessing model structural uncertainty through biological ensemble modelling.
NASA Astrophysics Data System (ADS)
Vallianatos, Filippos; Kouli, Maria
2013-08-01
The Digital Elevation Model (DEM) of the island of Crete, with a resolution of approximately 20 meters, was used to delineate watersheds by computing the flow direction and using it in the Watershed function. The Watershed function uses a raster of flow direction to determine the contributing area. The routine Geographic Information Systems procedure was applied, and the watersheds as well as the stream network (using a threshold of 2000 cells, i.e. the minimum number of cells that constitute a stream) were extracted from the hydrologically corrected (free of sinks) DEM. A few thousand watersheds were delineated and their areal extent calculated. Of these, 300 watersheds were finally selected for further analysis, as watersheds of extremely small area were excluded in order to avoid possible artifacts. Our analysis approach is based on the basic principles of complexity theory and on the Tsallis entropy introduced in the framework of non-extensive statistical physics. This concept has been successfully used for the analysis of a variety of complex dynamic systems, including natural hazards, where fractality and long-range interactions are important. The analysis indicates that the statistical distribution of watersheds can be successfully described with the theoretical estimates of non-extensive statistical physics, implying the complexity that characterizes their occurrence.
Bastolla, Ugo
2014-01-01
The properties of biomolecules depend both on physics and on the evolutionary process that formed them. These two points of view produce a powerful synergism. Physics sets the stage and the constraints that molecular evolution has to obey, and evolutionary theory helps in rationalizing the physical properties of biomolecules, including protein folding thermodynamics. To complete the parallelism, protein thermodynamics is founded on the statistical mechanics in the space of protein structures, and molecular evolution can be viewed as statistical mechanics in the space of protein sequences. In this review, we will integrate both points of view, applying them to detecting selection on the stability of the folded state of proteins. We will start discussing positive design, which strengthens the stability of the folded against the unfolded state of proteins. Positive design justifies why statistical potentials for protein folding can be obtained from the frequencies of structural motifs. Stability against unfolding is easier to achieve for longer proteins. On the contrary, negative design, which consists in destabilizing frequently formed misfolded conformations, is more difficult to achieve for longer proteins. The folding rate can be enhanced by strengthening short-range native interactions, but this requirement contrasts with negative design, and evolution has to trade-off between them. Finally, selection can accelerate functional movements by favoring low frequency normal modes of the dynamics of the native state that strongly correlate with the functional conformation change. PMID:24970217
How to assess the impact of a physical parameterization in simulations of moist convection?
NASA Astrophysics Data System (ADS)
Grabowski, Wojciech
2017-04-01
A numerical model capable of simulating moist convection (e.g., a cloud-resolving model or a large-eddy simulation model) consists of a fluid flow solver combined with required representations (i.e., parameterizations) of physical processes. The latter typically include cloud microphysics, radiative transfer, and unresolved turbulent transport. Traditional approaches to investigating the impacts of such parameterizations on convective dynamics involve parallel simulations with different parameterization schemes or with different scheme parameters. Such methodologies are not reliable because of the natural variability of a cloud field that is affected by the feedback between the physics and dynamics. For instance, changing the cloud microphysics typically leads to a different realization of the cloud-scale flow, and separating dynamical and microphysical impacts is difficult. This presentation will describe a novel modeling methodology, piggybacking, that allows studying the impact of a physical parameterization on cloud dynamics with confidence. The focus will be on the impact of the cloud microphysics parameterization. Specific examples of the piggybacking approach will include simulations concerning the hypothesized deep convection invigoration in polluted environments, the validity of the saturation adjustment in modeling condensation in moist convection, and the separation of physical impacts from statistical uncertainty in simulations applying particle-based Lagrangian microphysics, the super-droplet method.
NASA Astrophysics Data System (ADS)
Zhang, Weijia; Fuller, Robert G.
1998-05-01
A demographic database for the 139 Nobel prize winners in physics from 1901 to 1990 has been created from a variety of sources. The results of our statistical study are discussed in the light of the implications for physics teaching.
Statistics of Radial Ship Extent as Seen by a Seeker
2014-06-01
Auckland in pure and applied mathematics and physics, and a Master of Science in physics from the same university with a thesis in applied accelerator...does not demand contributions from two angle bins to one extent bin, unlike the rectangle; this is a very big advantage of the ellipse model. However...waveform that mimics the full length of a ship. This allows more economical use to be made of available false-target generation resources. I wish to
Statistical Analysis of Physiological Signals
NASA Astrophysics Data System (ADS)
Ruiz, María G.; Pérez, Leticia
2003-07-01
In spite of two hundred years of clinical practice, homeopathy still lacks a scientific basis. Its fundamental laws, the similia principle and the activity of the so-called ultra-high dilutions, are controversial issues that fit into neither mainstream medicine nor current physical chemistry. Aside from its clinical efficacy, the identification of physical-chemistry parameters as markers of the homeopathic effect would allow the construction of mathematical models [1], which in turn could provide clues regarding the mechanism involved.
Space-time-modulated stochastic processes
NASA Astrophysics Data System (ADS)
Giona, Massimiliano
2017-10-01
Starting from the physical problem associated with the Lorentz transformation of a Poisson-Kac process in inertial frames, the concept of space-time-modulated stochastic processes is introduced for processes possessing finite propagation velocity. This class of stochastic processes provides a two-way coupling between the stochastic perturbation acting on a physical observable and the evolution of the physical observable itself, which in turn influences the statistical properties of the stochastic perturbation during its evolution. The definition of space-time-modulated processes requires the introduction of two functions: a nonlinear amplitude modulation, controlling the intensity of the stochastic perturbation, and a time-horizon function, which modulates its statistical properties, providing irreducible feedback between the stochastic perturbation and the physical observable influenced by it. The latter property is the peculiar fingerprint of this class of models that makes them suitable for extension to generic curved space-times. Considering Poisson-Kac processes as prototypical examples of stochastic processes possessing finite propagation velocity, the balance equations for the probability density functions associated with their space-time modulations are derived. Several examples highlighting the peculiarities of space-time-modulated processes are thoroughly analyzed.
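The prototypical Poisson-Kac process referred to above is a telegraph process: motion at constant speed b whose direction flips at the events of a Poisson process of rate a, so sample paths have bounded velocity. A minimal Euler-type simulation (illustrative parameter values, not from the paper):

```python
import random

def poisson_kac(T=10.0, dt=1e-3, b=1.0, a=2.0, seed=3):
    """Telegraph (Poisson-Kac) sample path: dx/dt = b * s(t), where the
    sign s(t) flips at Poisson-distributed event times of rate a."""
    random.seed(seed)
    x, s = 0.0, 1
    path = [x]
    for _ in range(int(T / dt)):
        if random.random() < a * dt:  # probability of a direction flip in dt
            s = -s
        x += b * s * dt               # finite propagation velocity |dx/dt| = b
        path.append(x)
    return path

path = poisson_kac()
```

Because the speed is bounded by b, every path satisfies |x(t)| ≤ b·t, in contrast to Brownian motion; space-time modulation would make the amplitude b and the statistics of the flips depend on the observable itself.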
Mutual information, neural networks and the renormalization group
NASA Astrophysics Data System (ADS)
Koch-Janusz, Maciej; Ringel, Zohar
2018-06-01
Physical systems differing in their microscopic details often display strikingly similar behaviour when probed at macroscopic scales. Those universal properties, largely determining their physical characteristics, are revealed by the powerful renormalization group (RG) procedure, which systematically retains `slow' degrees of freedom and integrates out the rest. However, the important degrees of freedom may be difficult to identify. Here we demonstrate a machine-learning algorithm capable of identifying the relevant degrees of freedom and executing RG steps iteratively without any prior knowledge about the system. We introduce an artificial neural network based on a model-independent, information-theoretic characterization of a real-space RG procedure, which performs this task. We apply the algorithm to classical statistical physics problems in one and two dimensions. We demonstrate RG flow and extract the Ising critical exponent. Our results demonstrate that machine-learning techniques can extract abstract physical concepts and consequently become an integral part of theory- and model-building.
Transforming the advanced lab: Part I - Learning goals
NASA Astrophysics Data System (ADS)
Zwickl, Benjamin; Finkelstein, Noah; Lewandowski, H. J.
2012-02-01
Within the physics education research community relatively little attention has been given to laboratory courses, especially at the upper-division undergraduate level. As part of transforming our senior-level Optics and Modern Physics Lab at the University of Colorado Boulder we are developing learning goals, revising curricula, and creating assessments. In this paper, we report on the establishment of our learning goals and a surrounding framework that have emerged from discussions with a wide variety of faculty, from a review of the literature on labs, and from identifying the goals of existing lab courses. Our goals go beyond those of specific physics content and apparatus, allowing instructors to personalize them to their contexts. We report on four broad themes and associated learning goals: Modeling (math-physics-data connection, statistical error analysis, systematic error, modeling of engineered "black boxes"), Design (of experiments, apparatus, programs, troubleshooting), Communication, and Technical Lab Skills (computer-aided data analysis, LabVIEW, test and measurement equipment).
Introducing Multisensor Satellite Radiance-Based Evaluation for Regional Earth System Modeling
NASA Technical Reports Server (NTRS)
Matsui, T.; Santanello, J.; Shi, J. J.; Tao, W.-K.; Wu, D.; Peters-Lidard, C.; Kemp, E.; Chin, M.; Starr, D.; Sekiguchi, M.;
2014-01-01
Earth System modeling has become more complex, and its evaluation using satellite data has also become more difficult due to model and data diversity. Therefore, the fundamental methodology of using satellite direct measurements with instrumental simulators should be addressed especially for modeling community members lacking a solid background of radiative transfer and scattering theory. This manuscript introduces principles of multisatellite, multisensor radiance-based evaluation methods for a fully coupled regional Earth System model: NASA-Unified Weather Research and Forecasting (NU-WRF) model. We use a NU-WRF case study simulation over West Africa as an example of evaluating aerosol-cloud-precipitation-land processes with various satellite observations. NU-WRF-simulated geophysical parameters are converted to the satellite-observable raw radiance and backscatter under nearly consistent physics assumptions via the multisensor satellite simulator, the Goddard Satellite Data Simulator Unit. We present varied examples of simple yet robust methods that characterize forecast errors and model physics biases through the spatial and statistical interpretation of various satellite raw signals: infrared brightness temperature (Tb) for surface skin temperature and cloud top temperature, microwave Tb for precipitation ice and surface flooding, and radar and lidar backscatter for aerosol-cloud profiling simultaneously. Because raw satellite signals integrate many sources of geophysical information, we demonstrate user-defined thresholds and a simple statistical process to facilitate evaluations, including the infrared-microwave-based cloud types and lidar/radar-based profile classifications.
NASA Astrophysics Data System (ADS)
Bordogna, Clelia María; Albano, Ezequiel V.
2007-02-01
The aim of this paper is twofold. On the one hand we present a brief overview on the application of statistical physics methods to the modelling of social phenomena focusing our attention on models for opinion formation. On the other hand, we discuss and present original results of a model for opinion formation based on the social impact theory developed by Latané. The presented model accounts for the interaction among the members of a social group under the competitive influence of a strong leader and the mass media, both supporting two different states of opinion. Extensive simulations of the model are presented, showing that they led to the observation of a rich scenery of complex behaviour including, among others, critical behaviour and phase transitions between a state of opinion dominated by the leader and another dominated by the mass media. The occurrence of interesting finite-size effects reveals that, in small communities, the opinion of the leader may prevail over that of the mass media. This observation is relevant for the understanding of social phenomena involving a finite number of individuals, in contrast to actual physical phase transitions that take place in the thermodynamic limit. Finally, we give a brief outlook of open questions and lines for future work.
NASA Astrophysics Data System (ADS)
Bouaziz, Nadia; Ben Manaa, Marwa; Ben Lamine, Abdelmottaleb
2017-11-01
The hydrogen absorption-desorption isotherms on the LaNi3.8Al1.2-xMnx alloy at temperature T = 433 K are studied through various theoretical models. The analytical expressions of these models were derived using the grand canonical ensemble of statistical physics under some simplifying hypotheses. Among these models, an adequate model presenting a good correlation with the experimental curves was selected. The physicochemical parameters intervening in the absorption-desorption processes and appearing in the model expressions could be deduced directly from the experimental isotherms by numerical simulation. Six parameters of the model are adjusted, namely the numbers of hydrogen atoms per site n1 and n2, the receptor site densities N1m and N2m, and the energetic parameters P1 and P2. The behaviors of these parameters are discussed in relation to the absorption and desorption processes to better understand and compare the two phenomena. From the energetic parameters we calculated the sorption energies, which typically range between 266 and 269.4 kJ/mol for the absorption process and between 267 and 269.5 kJ/mol for the desorption process, comparable to usual chemical bond energies. Using the adopted model expression, the thermodynamic potential functions which govern the absorption/desorption process, such as the internal energy Eint, the Gibbs free enthalpy G and the entropy Sa, are derived.
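The abstract does not reproduce the model expression, but grand-canonical two-site models of this family are commonly written as a double-Hill isotherm. The sketch below assumes that standard form (an assumption, not the paper's stated equation), with the six parameters named as in the abstract:

```python
def two_site_isotherm(P, n1, N1m, P1, n2, N2m, P2):
    """Assumed double-Hill isotherm from grand-canonical statistical physics:
    Q(P) = n1*N1m / (1 + (P1/P)^n1) + n2*N2m / (1 + (P2/P)^n2),
    where P1, P2 are the half-saturation (energetic) pressure parameters."""
    return (n1 * N1m / (1.0 + (P1 / P) ** n1)
            + n2 * N2m / (1.0 + (P2 / P) ** n2))
```

In the high-pressure limit the expression saturates at n1·N1m + n2·N2m, the total absorbed quantity, and fitting Q(P) to the measured isotherm yields the six adjusted parameters.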
NASA Technical Reports Server (NTRS)
1981-01-01
The application of statistical methods to recorded ozone measurements is discussed. A long-term depletion of ozone at the magnitudes predicted by the NAS would be harmful to most forms of life. Empirical prewhitening filters, whose derivation is independent of the underlying physical mechanisms, were analyzed. Statistical analysis serves as a check on the physical models. Time series analysis separates variations into systematic and random parts, ensures errors are uncorrelated, and identifies significant phase-lag dependencies. The use of time series modeling to enhance the capability of detecting trends is discussed.
NASA Astrophysics Data System (ADS)
Prudden, R.; Arribas, A.; Tomlinson, J.; Robinson, N.
2017-12-01
The Unified Model is a numerical model of the atmosphere used at the UK Met Office (and numerous partner organisations, including the Korean Meteorological Agency, the Australian Bureau of Meteorology and the US Air Force) for both weather and climate applications. Specifically, dynamical models such as the Unified Model are now a central part of weather forecasting. Starting from basic physical laws, these models make it possible to predict events such as storms before they have even begun to form. The Unified Model can be simply described as having two components: one component solves the Navier-Stokes equations (usually referred to as the "dynamics"); the other solves relevant sub-grid physical processes (usually referred to as the "physics"). Running weather forecasts requires substantial computing resources - for example, the UK Met Office operates the largest operational High Performance Computer in Europe - and the cost of a typical simulation is split roughly 50% in the "dynamics" and 50% in the "physics". There is therefore a strong incentive to reduce the cost of weather forecasts, and Machine Learning is a possible option because, once a machine learning model has been trained, it is often much faster to run than a full simulation. This is the motivation for a technique called model emulation: the idea is to build a fast statistical model which closely approximates a far more expensive simulation. In this paper we discuss the use of Machine Learning as an emulator to replace the "physics" component of the Unified Model. Various approaches and options will be presented, and the implications for further model development, operational running of forecasting systems, development of data assimilation schemes, and development of ensemble prediction techniques will be discussed.
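The emulation idea can be illustrated end-to-end in a few lines: fit a cheap statistical model to input/output pairs of an expensive routine, then query the fit instead of the routine. The toy below (pure Python, in no way the Met Office system) emulates a stand-in "physics" function with a least-squares polynomial:

```python
import math
import random

def expensive_physics(x):
    """Stand-in for a costly sub-grid parameterization (toy example)."""
    return math.sin(2.0 * x) + 0.5 * x * x

def fit_emulator(xs, ys, degree=6):
    """Least-squares polynomial emulator via the normal equations."""
    m = degree + 1
    # Normal equations A c = b for the monomial basis 1, x, ..., x^degree.
    A = [[sum(x ** (i + j) for x in xs) for j in range(m)] for i in range(m)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(m)]
    # Gaussian elimination with partial pivoting.
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for k in range(col, m):
                A[r][k] -= f * A[col][k]
            b[r] -= f * b[col]
    coeffs = [0.0] * m
    for i in reversed(range(m)):
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j]
                                for j in range(i + 1, m))) / A[i][i]
    return lambda x: sum(ci * x ** i for i, ci in enumerate(coeffs))

random.seed(0)
xs = [random.uniform(-1.0, 1.0) for _ in range(200)]
ys = [expensive_physics(x) for x in xs]      # "training" runs of the physics
emulator = fit_emulator(xs, ys)              # fast surrogate
err = max(abs(emulator(x) - expensive_physics(x)) for x in xs)
```

Once trained, `emulator(x)` costs a handful of multiplications regardless of how expensive `expensive_physics` is; the open questions raised in the abstract concern accuracy, stability and generalization when this replaces a real parameterization.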
Efficient evaluation of wireless real-time control networks.
Horvath, Peter; Yampolskiy, Mark; Koutsoukos, Xenofon
2015-02-11
In this paper, we present a system simulation framework for the design and performance evaluation of complex wireless cyber-physical systems. We describe the simulator architecture and the specific developments required to simulate cyber-physical systems relying on multi-channel, multi-hop mesh networks. We introduce realistic and efficient physical layer models and a system simulation methodology which provides statistically significant performance evaluation results with low computational complexity. The capabilities of the proposed framework are illustrated with the example of WirelessHART, a centralized, real-time, multi-hop mesh network designed for industrial control and monitoring applications.
Preface: Special Topic on Single-Molecule Biophysics
NASA Astrophysics Data System (ADS)
Makarov, Dmitrii E.; Schuler, Benjamin
2018-03-01
Single-molecule measurements are now almost routinely used to study biological systems and processes. The scope of this special topic emphasizes the physics side of single-molecule observations, with the goal of highlighting new developments in physical techniques as well as conceptual insights that single-molecule measurements bring to biophysics. This issue also comprises recent advances in theoretical physical models of single-molecule phenomena, interpretation of single-molecule signals, and fundamental areas of statistical mechanics that are related to single-molecule observations. A particular goal is to illustrate the increasing synergy between theory, simulation, and experiment in single-molecule biophysics.
Statistical representation of a spray as a point process
NASA Astrophysics Data System (ADS)
Subramaniam, S.
2000-10-01
The statistical representation of a spray as a finite point process is investigated. One objective is to develop a better understanding of how single-point statistical information contained in descriptions such as the droplet distribution function (ddf), relates to the probability density functions (pdfs) associated with the droplets themselves. Single-point statistical information contained in the droplet distribution function (ddf) is shown to be related to a sequence of single surrogate-droplet pdfs, which are in general different from the physical single-droplet pdfs. It is shown that the ddf contains less information than the fundamental single-point statistical representation of the spray, which is also described. The analysis shows which events associated with the ensemble of spray droplets can be characterized by the ddf, and which cannot. The implications of these findings for the ddf approach to spray modeling are discussed. The results of this study also have important consequences for the initialization and evolution of direct numerical simulations (DNS) of multiphase flows, which are usually initialized on the basis of single-point statistics such as the droplet number density in physical space. If multiphase DNS are initialized in this way, this implies that even the initial representation contains certain implicit assumptions concerning the complete ensemble of realizations, which are invalid for general multiphase flows. Also the evolution of a DNS initialized in this manner is shown to be valid only if an as yet unproven commutation hypothesis holds true. Therefore, it is questionable to what extent DNS that are initialized in this manner constitute a direct simulation of the physical droplets. Implications of these findings for large eddy simulations of multiphase flows are also discussed.
NASA Astrophysics Data System (ADS)
Crosby, N.; Georgoulis, M.; Vilmer, N.
1999-10-01
Solar burst observations in the deka-keV energy range from the WATCH experiment aboard the GRANAT spacecraft were used to construct frequency distributions of measured X-ray flare parameters (Crosby et al., 1998). The results of the study show that: (1) the overall distribution functions are robust power laws extending over a number of decades, and the typical event parameters (total counts, peak count rates, durations) are all correlated with each other; (2) the overall distribution functions are the convolution of significantly different distribution functions built on parts of the whole data set filtered by event duration; these "partial" frequency distributions are still power laws over several decades, with a slope that systematically decreases with increasing duration; (3) no correlation is found between the elapsed time between successive bursts arising from the same active region and the peak intensity of the flare. In this paper, we attempt a tentative comparison between the statistical properties of self-organized critical (SOC) cellular-automaton flare models (see e.g. Lu and Hamilton (1991), Georgoulis and Vlahos (1996, 1998)) and the respective properties of the WATCH flare data. Despite the inherent weaknesses of the SOC models in simulating a number of physical processes in the active region, it is found that most of the observed statistical properties can be reproduced using the SOC models, including the various frequency distributions and scatter plots. We finally conclude that, even if SOC models must be refined to improve their physical links to MHD approaches, they nevertheless represent a good approach for describing the rapid energy dissipation and magnetic field annihilation in complex, magnetized plasmas. References: Crosby, N., Vilmer, N., Lund, N. and Sunyaev, R., 1998, A&A, 334, 299-313; Crosby, N., Lund, N., Vilmer, N. and Sunyaev, R., 1998, A&A Suppl. Ser., 130, 233; Georgoulis, M. and Vlahos, L., 1996, Astrophys. J. Lett., 469, L135; Georgoulis, M. and Vlahos, L., 1998, in preparation; Lu, E.T. and Hamilton, R.J., 1991, Astrophys. J., 380, L89.
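The power-law slopes reported above are typically estimated from the event samples themselves. A minimal sketch of how such a slope can be recovered, using the standard maximum-likelihood estimator for a power-law density p(x) ~ x^-alpha above a threshold xmin (the sample values and the chosen slope here are synthetic, not the WATCH data):

```python
import numpy as np

rng = np.random.default_rng(0)

def powerlaw_mle(x, xmin):
    """Maximum-likelihood estimate of alpha for p(x) ~ x^-alpha, x >= xmin."""
    x = np.asarray(x, dtype=float)
    x = x[x >= xmin]
    return 1.0 + x.size / np.sum(np.log(x / xmin))

# Synthetic "peak count rates" drawn from a pure power law (inverse-CDF sampling)
alpha_true, xmin = 1.8, 1.0
u = rng.random(200_000)
samples = xmin * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

alpha_hat = powerlaw_mle(samples, xmin)
```

In practice one would also bootstrap the uncertainty on the slope and check how it shifts when the sample is filtered by event duration, as done in the study.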
NASA Technical Reports Server (NTRS)
Petty, Grant W.
1990-01-01
A reasonably rigorous basis for understanding and extracting the physical information content of Special Sensor Microwave/Imager (SSM/I) satellite images of the marine environment is provided. To this end, a comprehensive algebraic parameterization is developed for the response of the SSM/I to a set of nine atmospheric and ocean surface parameters. The brightness temperature model includes a closed-form approximation to microwave radiative transfer in a non-scattering atmosphere and fitted models for surface emission and scattering based on geometric optics calculations for the roughened sea surface. The combined model is empirically tuned using suitable sets of SSM/I data and coincident surface observations. The brightness temperature model is then used to examine the sensitivity of the SSM/I to realistic variations in the scene being observed and to evaluate the theoretical maximum precision of global SSM/I retrievals of integrated water vapor, integrated cloud liquid water, and surface wind speed. A general minimum-variance method for optimally retrieving geophysical parameters from multichannel brightness temperature measurements is outlined, and several global statistical constraints of the type required by this method are computed. Finally, a unified set of efficient statistical and semi-physical algorithms is presented for obtaining fields of surface wind speed, integrated water vapor, cloud liquid water, and precipitation from SSM/I brightness temperature data. 
Features include: a semi-physical method for retrieving integrated cloud liquid water at 15 km resolution with rms errors as small as approximately 0.02 kg/sq m; a 3-channel statistical algorithm for integrated water vapor, constructed to have improved linear response to water vapor and reduced sensitivity to precipitation; and two complementary indices of precipitation activity (based on 37 GHz attenuation and 85 GHz scattering, respectively), each of which is relatively insensitive to variations in other environmental parameters.
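The minimum-variance retrieval outlined in the abstract can be sketched with a linearized forward model: a gain matrix K = B Hᵀ (H B Hᵀ + R)⁻¹ blends a prior guess with the multichannel measurement. All numbers below are invented for illustration, not SSM/I channel responses:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linearized forward model: 3 geophysical parameters
# (e.g. water vapor, cloud liquid, wind speed) -> 4 channel brightness temps.
H = np.array([[1.5, 0.3, 0.1],
              [0.4, 2.0, 0.2],
              [0.2, 0.5, 1.8],
              [1.0, 1.0, 0.5]])
B = np.diag([1.0, 0.5, 2.0])       # prior (background) covariance, assumed
R = 0.05 * np.eye(4)               # channel noise covariance, assumed

x_true = np.array([2.0, 0.3, 7.0])
x_b = np.array([1.5, 0.5, 6.0])    # prior guess
y = H @ x_true + rng.multivariate_normal(np.zeros(4), R)

# Gauss-Markov (minimum-variance) update
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_hat = x_b + K @ (y - H @ x_b)
```

The posterior estimate should land closer to the truth than the prior whenever the channels carry real information about the parameters, which is the sensitivity question the abstract examines.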
Persistence of discrimination: Revisiting Axtell, Epstein and Young
NASA Astrophysics Data System (ADS)
Weisbuch, Gérard
2018-02-01
We reformulate an earlier model of the "Emergence of classes..." proposed by Axtell et al. (2001) using more elaborate cognitive processes allowing a statistical physics approach. The thorough analysis of the phase space and of the basins of attraction leads to a reconsideration of the previous social interpretations: our model predicts the reinforcement of discrimination biases and their long term stability rather than the emergence of classes.
MyPMFs: a simple tool for creating statistical potentials to assess protein structural models.
Postic, Guillaume; Hamelryck, Thomas; Chomilier, Jacques; Stratmann, Dirk
2018-05-29
Evaluating the model quality of protein structures that evolve in environments with particular physicochemical properties requires scoring functions adapted to their specific residue compositions and/or structural characteristics. Thus, computational methods developed for structures from the cytosol cannot work properly on membrane or secreted proteins. Here, we present MyPMFs, an easy-to-use tool that allows users to train statistical potentials of mean force (PMFs) on the protein structures of their choice, with all parameters being adjustable. We demonstrate its use by creating an accurate statistical potential for transmembrane protein domains. We also show its usefulness for studying the influence of the physical environment on residue interactions within protein structures. Our open-source software is freely available for download at https://github.com/bibip-impmc/mypmfs.
Electromagnetic sinc Schell-model beams and their statistical properties.
Mei, Zhangrong; Mao, Yonghua
2014-09-22
A class of electromagnetic sources with sinc Schell-model correlations is introduced. The conditions on source parameters guaranteeing that the source generates a physical beam are derived. The evolution of the statistical properties of the electromagnetic stochastic beams generated by this new source, propagating in free space and in atmospheric turbulence, is investigated with the help of the weighted superposition method and numerical simulations. It is demonstrated that the intensity distributions of such beams exhibit unique features upon free-space propagation, producing a double-layer flat-top profile that is shape-invariant in the far field. This feature makes the new beam particularly suitable for some special laser-processing applications. The influences of atmospheric turbulence with a non-Kolmogorov power spectrum on the statistical properties of the new beams are analyzed in detail.
Advanced statistical energy analysis
NASA Astrophysics Data System (ADS)
Heron, K. H.
1994-09-01
A high-frequency theory, advanced statistical energy analysis (ASEA), is developed which takes account of the mechanism of tunnelling, using a ray-theory approach to track the power flowing around a plate or beam network and statistical energy analysis (SEA) to take care of any residual power. ASEA divides the energy of each sub-system into energy that is freely available for transfer to other sub-systems and energy that is fixed within the sub-system. ASEA can be interpreted as a series of mathematical models, the first of which is identical to standard SEA, with subsequent higher-order models converging on an accurate prediction. Using a structural assembly of six rods as an example, ASEA is shown to converge onto the exact results while SEA is shown to overpredict by up to 60 dB.
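The standard SEA balance that ASEA generalizes is a small linear system: at a band-center frequency, external input power is balanced by internal dissipation plus coupling power flow between sub-systems. A two-subsystem sketch with invented loss factors (not the six-rod assembly of the paper):

```python
import numpy as np

omega = 2 * np.pi * 1000.0        # band-center frequency in rad/s (assumed)
eta1, eta2 = 0.01, 0.02           # internal damping loss factors (assumed)
eta12, eta21 = 0.005, 0.003       # coupling loss factors (assumed)
P = np.array([1.0, 0.0])          # external input power (W), only into subsystem 1

# SEA power balance: omega * [[eta1+eta12, -eta21], [-eta12, eta2+eta21]] @ E = P
C = omega * np.array([[eta1 + eta12, -eta21],
                      [-eta12,       eta2 + eta21]])
E = np.linalg.solve(C, P)         # subsystem energies

# Sanity check: total dissipated power equals total injected power
P_diss = omega * (eta1 * E[0] + eta2 * E[1])
```

ASEA's refinement is to split each E into a freely available and a fixed part and iterate; the zeroth iterate reduces exactly to the system above.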
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Dong; Liu, Yangang
2014-12-18
Subgrid-scale variability is one of the main reasons why parameterizations are needed in large-scale models. Although some parameterizations have begun to address subgrid variability by introducing a subgrid probability distribution function for relevant quantities, the spatial structure has typically been ignored, so subgrid-scale interactions cannot be accounted for physically. Here we present a new statistical-physics-like approach whereby the spatial autocorrelation function is used to physically capture the net effects of subgrid cloud interaction with radiation. The new approach faithfully reproduces Monte Carlo 3D simulation results at several orders of magnitude less computational cost, allowing for more realistic representation of cloud-radiation interactions in large-scale models.
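The key object in this approach is the spatial autocorrelation function of the subgrid field. A toy sketch (not the authors' scheme): a one-dimensional cloud-property transect with exponential autocorrelation, generated as an AR(1) sequence, from which the decorrelation length can be estimated:

```python
import numpy as np

rng = np.random.default_rng(2)

n, phi = 100_000, 0.9              # grid points and lag-1 correlation (assumed)
x = np.empty(n)
x[0] = rng.normal()
eps = rng.normal(size=n) * np.sqrt(1 - phi**2)   # keeps unit marginal variance
for i in range(1, n):
    x[i] = phi * x[i - 1] + eps[i]               # AR(1): exponential autocorrelation

def acf(x, lag):
    """Empirical autocorrelation at a given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rho1 = acf(x, 1)
L = -1.0 / np.log(rho1)            # e-folding decorrelation length, in grid units
```

The estimated decorrelation length is the statistic that, per the abstract, stands in for explicit 3D subgrid structure in the radiation calculation.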
NASA Technical Reports Server (NTRS)
Davis, Brynmor; Kim, Edward; Piepmeier, Jeffrey; Hildebrand, Peter H. (Technical Monitor)
2001-01-01
Many new Earth remote-sensing instruments are embracing both the advantages and the added complexity that result from interferometric or fully polarimetric operation. To increase instrument understanding and functionality, a model of the signals these instruments measure is presented. A stochastic model is used because it recognizes the non-deterministic nature of any real-world measurement while also providing a tractable mathematical framework. A stationary, Gaussian-distributed model structure is proposed. Temporal and spectral correlation measures provide a statistical description of the physical properties of coherence and polarization state. From this relationship the model is mathematically defined. The model is shown to be unique for any set of physical parameters. A method of realizing the model (necessary for applications such as synthetic calibration-signal generation) is given and computer simulation results are presented. The signals are constructed using the output of a multi-input multi-output linear filter system driven with white noise.
MUSiC - Model-independent search for deviations from Standard Model predictions in CMS
NASA Astrophysics Data System (ADS)
Pieta, Holger
2010-02-01
We present an approach for a model-independent search in CMS. By systematically scanning the data for deviations from the Standard Model Monte Carlo expectations, such an analysis can help to understand the detector and tune event generators. By minimizing the theoretical bias, the analysis is furthermore sensitive to a wide range of models for new physics, including the countless models not yet thought of. After sorting the events into classes defined by their particle content (leptons, photons, jets and missing transverse energy), a minimally prejudiced scan is performed on a number of distributions. Advanced statistical methods are used to determine the significance of the deviating regions, rigorously taking systematic uncertainties into account. A number of benchmark scenarios, including common models of new physics and possible detector effects, have been used to gauge the power of such a method.
Estimating and Testing the Sources of Evoked Potentials in the Brain.
ERIC Educational Resources Information Center
Huizenga, Hilde M.; Molenaar, Peter C. M.
1994-01-01
The source of an event-related brain potential (ERP) is estimated from multivariate measures of ERP on the head under several mathematical and physical constraints on the parameters of the source model. Statistical aspects of estimation are discussed, and new tests are proposed. (SLD)
Are Three Sheets Enough? Using Toilet Paper to Teach Science and Mathematics
ERIC Educational Resources Information Center
Woolverton, Christopher J.; Woolverton, Lyssa N.
2006-01-01
Toilet paper (TP) composition and physical characteristics were used to model scientific investigations that combined several "National Science Education Standards." Experiments with TP permitted the integration of TP history, societal change resulting from invention, mathematics (including geometry and statistics), germ theory, and personal…
NASA Astrophysics Data System (ADS)
Sutherland, Richard L.
2002-12-01
Polarization properties and electro-optical switching behavior of holographic polymer-dispersed liquid-crystal (HPDLC) reflection and transmission gratings are studied. A theoretical model is developed that combines anisotropic coupled-wave theory with an elongated liquid-crystal-droplet switching model and includes the effects of a statistical orientational distribution of droplet-symmetry axes. Angle- and polarization-dependent switching behaviors of HPDLC gratings are elucidated, and the effects on dynamic range are described. A new type of electro-optical switching not seen in ordinary polymer-dispersed liquid crystals, to the best of the author's knowledge, is presented and given a physical interpretation. The model provides valuable insight into the physics of these gratings and can be applied to the design of HPDLC holographic optical elements.
Natural Selection as Coarsening
NASA Astrophysics Data System (ADS)
Smerlak, Matteo
2017-11-01
Analogies between evolutionary dynamics and statistical mechanics, such as Fisher's second-law-like "fundamental theorem of natural selection" and Wright's "fitness landscapes", have had a deep and fruitful influence on the development of evolutionary theory. Here I discuss a new conceptual link between evolution and statistical physics. I argue that natural selection can be viewed as a coarsening phenomenon, similar to the growth of domain size in quenched magnets or to Ostwald ripening in alloys and emulsions. In particular, I show that the most remarkable features of coarsening—scaling and self-similarity—have strict equivalents in evolutionary dynamics. This analogy has three main virtues: it brings a set of well-developed mathematical tools to bear on evolutionary dynamics; it suggests new problems in theoretical evolution; and it provides coarsening physics with a new exactly soluble model.
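The selection-as-coarsening picture can be illustrated with the simplest relevant dynamics: discrete replicator updates on a fixed fitness landscape, where mean fitness is non-decreasing (Fisher's theorem) and the fittest type progressively takes over, much as the dominant domain grows in a quenched magnet. The fitness values below are invented for illustration:

```python
import numpy as np

f = np.array([1.0, 1.2, 1.5, 2.0])   # fitness of each type (assumed landscape)
p = np.full(4, 0.25)                 # initial type frequencies

mean_fitness = []
for _ in range(200):
    mean_fitness.append(p @ f)
    p = p * f / (p @ f)              # replicator (pure selection) update
```

After enough generations the population is essentially monomorphic in the fittest type, and the recorded mean-fitness trajectory is monotone, the evolutionary analogue of the coarsening growth law discussed in the abstract.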
Finding Bounded Rational Equilibria. Part 1; Iterative Focusing
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2004-01-01
A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality characterizing all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. It has recently been shown that the same information-theoretic mathematical structure, known as Probability Collectives (PC), underlies both issues. This relationship between statistical physics and game theory allows techniques and insights from the one field to be applied to the other. In particular, PC provides a formal model-independent definition of the degree of rationality of a player and of bounded rationality equilibria. This pair of papers extends previous work on PC by introducing new computational approaches to effectively find bounded rationality equilibria of common-interest (team) games.
NASA Technical Reports Server (NTRS)
Hough, D. H.; Readhead, A. C. S.
1989-01-01
A complete, flux-density-limited sample of double-lobed radio quasars is defined, with nuclei bright enough to be mapped with the Mark III VLBI system. It is shown that the statistics of linear size, nuclear strength, and curvature are consistent with the assumption of random source orientations and simple relativistic beaming in the nuclei. However, these statistics are also consistent with the effects of interaction between the beams and the surrounding medium. The distribution of jet velocities in the nuclei, as measured with VLBI, will provide a powerful test of physical theories of extragalactic radio sources.
NASA Astrophysics Data System (ADS)
Nelson, Philip
2015-03-01
I'll describe an intermediate-level course on ``Physical Models of Living Systems.'' The only prerequisite is first-year university physics and calculus. The course is a response to rapidly growing interest among undergraduates in a broad range of science and engineering majors. Students acquire several research skills that are often not addressed in traditional courses:
Science of Ball Lightning (Fire Ball)
NASA Astrophysics Data System (ADS)
Ohtsuki, Yoshi-Hiko
1989-08-01
The Table of Contents for the full book PDF is as follows: * Organizing Committee * Preface * Ball Lightning -- The Continuing Challenge * Hungarian Ball Lightning Observations in 1987 * Nature of Ball Lightning in Japan * Phenomenological and Psychological Analysis of 150 Austrian Ball Lightning Reports * Physical Problems and Physical Properties of Ball Lightning * Statistical Analysis of the Ball Lightning Properties * A Fluid-Dynamical Model for Ball Lightning and Bead Lightning * The Lifetime of Hill's Vortex * Electrical and Radiative Properties of Ball Lightning * The Candle Flame as a Model of Ball Lightning * A Model for Ball Lightning * The High-Temperature Physico-Chemical Processes in the Lightning Storm Atmosphere (A Physico-Chemical Model of Ball Lightning) * New Approach to Ball Lightning * A Calculation of Electric Field of Ball Lightning * The Physical Explanation to the UFO over Xinjiang, Northern West China * Electric Reconnection, Critical Ionization Velocity, Ponderomotive Force, and Their Applications to Triggered and Ball Lightning * The PLASMAK™ Configuration and Ball Lightning * Experimental Research on Ball Lightning * Performance of High-Voltage Test Facility Designed for Investigation of Ball Lightning * List of Participants
Model-Based Anomaly Detection for a Transparent Optical Transmission System
NASA Astrophysics Data System (ADS)
Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.
In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.
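The decision rules combining a physics model with a statistical data model can be caricatured as a residual test: the model predicts the signal, the measurement model supplies the noise level, and an alarm fires when a standardized residual is improbably large. The profile, noise level, and 5-sigma threshold below are invented for illustration, not the chapter's Raman-amplifier rules:

```python
import numpy as np

rng = np.random.default_rng(3)

predicted = np.linspace(0.0, -10.0, 50)      # physics-model power profile (dB), assumed
sigma = 0.1                                  # known measurement noise std (dB), assumed
measured = predicted + rng.normal(0, sigma, 50)
measured[30] -= 1.5                          # inject an anomalous loss at index 30

z = (measured - predicted) / sigma           # standardized residuals
alarms = np.flatnonzero(np.abs(z) > 5.0)     # 5-sigma alarm decision rule
```

Discriminating between measurement errors, anomalous losses, and pump failures, as the chapter does, then amounts to testing which physical failure mode best explains the pattern of flagged residuals.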
Flow throughout the Earth's core inverted from geomagnetic observations and numerical dynamo models
NASA Astrophysics Data System (ADS)
Aubert, Julien
2013-02-01
This paper introduces inverse geodynamo modelling, a framework for imaging flow throughout the Earth's core from observations of the geomagnetic field and its secular variation. The necessary prior information is provided by statistics from 3-D and self-consistent numerical simulations of the geodynamo. The core method is a linear estimation (or Kalman filtering) procedure, combined with standard frozen-flux core surface flow inversions in order to handle the non-linearity of the problem. The inversion scheme is successfully validated using synthetic test experiments. A set of four numerical dynamo models of increasing physical complexity and similarity to the geomagnetic field is then used to invert for flows at single epochs within the period 1970-2010, using data from the geomagnetic field models CM4 and gufm-sat-Q3. The resulting core surface flows generally provide satisfactory fits to the secular variation within the level of modelled errors, and robustly reproduce the most commonly observed patterns while additionally presenting a high degree of equatorial symmetry. The corresponding deep flows present a robust, highly columnar structure once rotational constraints are enforced to a high level in the prior models, with patterns strikingly similar to the results of quasi-geostrophic inversions. In particular, the presence of a persistent planetary-scale, eccentric westward columnar gyre circling around the inner core is confirmed. The strength of the approach is to uniquely determine the trade-off between fit to the data and complexity of the solution by clearly connecting it to first-principles physics; statistical deviations observed between the inverted flows and the standard model behaviour can then be used to quantitatively assess the shortcomings of the physical modelling. Such deviations include (i) the westward and (ii) the hemispherical character of the eccentric gyre.
A prior model with angular momentum conservation of the core-mantle inner-core system, and gravitational coupling of reasonable strength between the mantle and the inner core, is shown to produce enough westward drift to resolve statistical deviation (i). Deviation (ii) is resolved by a prior with a hemispherical buoyancy release at the inner-core boundary, with excess buoyancy below Asia. This latter result suggests that the recently proposed inner-core translational instability presently transports solid inner-core material westwards, opposite to the seismologically inferred long-term trend but consistent with the eccentricity of the geomagnetic dipole in recent times.
Bubbles, shocks and elementary technical trading strategies
NASA Astrophysics Data System (ADS)
Fry, John
2014-01-01
In this paper we provide a unifying framework for a set of seemingly disparate models of bubbles, shocks and elementary technical trading strategies in financial markets. Markets operate by balancing intrinsic levels of risk and return. This seemingly simple observation is commonly overlooked by academics and practitioners alike. Like others, our model has its origins in statistical physics. However, under our approach, changes in market regime can be explicitly shown to represent a phase transition from random to deterministic behaviour in prices. This structure leads to an improved physical and econometric model. We develop models for bubbles, shocks and elementary technical trading strategies. The list of empirical applications is both interesting and topical and includes real-estate bubbles and the ongoing Eurozone crisis. We close by comparing the results of our model with purely qualitative findings from the finance literature.
Akkus, Zeki; Camdeviren, Handan; Celik, Fatma; Gur, Ali; Nas, Kemal
2005-09-01
To determine the risk factors of osteoporosis using a multiple binary logistic regression method and to assess the risk variables for osteoporosis, which is a major and growing health problem in many countries. We presented a case-control study consisting of 126 postmenopausal healthy women as the control group and 225 postmenopausal osteoporotic women as the case group. The study was carried out in the Department of Physical Medicine and Rehabilitation, Dicle University, Diyarbakir, Turkey between 1999-2002. The data from the 351 participants were collected using a standard questionnaire that contains 43 variables. A multiple logistic regression model was then used to evaluate the data and to find the best regression model. We classified 80.1% (281/351) of the participants using the regression model. Furthermore, the specificity of the model was 67% (84/126) for the control group while the sensitivity was 88% (197/225) for the case group. Using the Kolmogorov-Smirnov test, we found the distribution of standardized residual values for the final model to be exponential (p=0.193). The receiver operating characteristic curve was found to successfully predict patients at risk for osteoporosis. This study suggests that low levels of dietary calcium intake, physical activity, and education, and a longer duration of menopause are independent predictors of the risk of low bone density in our population. Adequate dietary calcium intake in combination with daily physical activity, increasing educational level, decreasing birth rate, and duration of breast-feeding may contribute to healthy bones and play a role in the practical prevention of osteoporosis in Southeast Anatolia. In addition, the findings of the present study indicate that for a condition such as osteoporosis, which may be influenced by many variables, a multivariate statistical method such as multiple logistic regression is better than univariate statistical evaluation.
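The sensitivity/specificity figures quoted above come from thresholding a fitted logistic model. A minimal numpy sketch of the same workflow on synthetic data (the predictors, effect sizes, and sample are invented, not the Dicle University cohort):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic cohort: two illustrative predictors of osteoporosis risk
n = 1000
X = np.column_stack([
    rng.normal(800, 200, n),    # dietary calcium (mg/day), invented
    rng.normal(15, 8, n),       # years since menopause, invented
])
Xs = (X - X.mean(0)) / X.std(0)             # standardize predictors
logit = -1.0 * Xs[:, 0] + 1.2 * Xs[:, 1]    # low calcium, long menopause -> risk
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Fit the logistic model by gradient ascent on the log-likelihood
A = np.column_stack([np.ones(n), Xs])
beta = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-A @ beta))
    beta += 0.1 * A.T @ (y - p) / n

# Classify at the 0.5 threshold and score against the labels
pred = (1 / (1 + np.exp(-A @ beta))) > 0.5
sensitivity = (pred & (y == 1)).sum() / (y == 1).sum()
specificity = (~pred & (y == 0)).sum() / (y == 0).sum()
```

A production analysis would of course use a statistics package, held-out data, and an ROC sweep over thresholds rather than the fixed 0.5 cut.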
NASA Astrophysics Data System (ADS)
Mercer, Gary J.
This quantitative study examined the relationship between secondary students with math anxiety and physics performance in an inquiry-based constructivist classroom. The Revised Math Anxiety Rating Scale was used to evaluate math anxiety levels. The results were then compared to the performance on a physics standardized final examination. A simple correlation was performed, followed by a multivariate regression analysis to examine effects based on gender and prior math background. The correlation showed statistical significance between math anxiety and physics performance. The regression analysis showed statistical significance for math anxiety, physics performance, and prior math background, but did not show statistical significance for math anxiety, physics performance, and gender.
Uchino, Bert N.; Bowen, Kimberly; Carlisle, McKenzie; Birmingham, Wendy
2012-01-01
Contemporary models postulate the importance of psychological mechanisms linking perceived and received social support to physical health outcomes. In this review, we examine studies that directly tested the potential psychological mechanisms responsible for links between social support and health-relevant physiological processes (1980s to 2010). Inconsistent with existing theoretical models, no evidence was found that psychological mechanisms such as depression, perceived stress, and other affective processes are directly responsible for links between support and health. We discuss the importance of considering statistical/design issues, emerging conceptual perspectives, and limitations of our existing models for future research aimed at elucidating the psychological mechanisms responsible for links between social support and physical health outcomes. PMID:22326104
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, Garrison N.; Atamturktur, Sez; Brown, D. Andrew
2018-04-16
Rapid advancements in parallel computing over the last two decades have enabled simulations of complex, coupled systems through partitioning. In partitioned analysis, independently developed constituent models communicate, representing dependencies between multiple physical phenomena that occur in the full system. Figure 1 schematically demonstrates a coupled system with two constituent models, each resolving different physical behavior. In this figure, the constituent model denoted as the “consumer” relies upon some input parameter that is provided by the constituent model acting as a “feeder”. The role of the feeder model is to map operating conditions (i.e. those that are stimulating the process) to consumer inputs, thus providing functional inputs to the consumer model. Problems arise if the feeder model cannot be built, a challenge that is prevalent for highly complex systems in extreme operational conditions that push the limits of our understanding of the underlying physical behavior. Often, these are also the situations where separate-effect experiments isolating the physical phenomena are not available, meaning that experimentally determining the unknown constituent behavior is not possible (Bauer and Holland, 1995; Unal et al., 2013), and that integral-effect experiments that reflect the behavior of the complete system tend to be the only available observations. In this paper, the authors advocate for the usefulness of integral-effect experiments in furthering a model developer’s knowledge of the physics principles governing the system behavior of interest.
NASA Astrophysics Data System (ADS)
Ghotbi, Saba; Sotoudeheian, Saeed; Arhami, Mohammad
2016-09-01
Satellite remote-sensing AOD products from MODIS, along with appropriate meteorological parameters, were used to develop statistical models and estimate ground-level PM10. Most previous studies obtained meteorological data from synoptic weather stations, with rather sparse spatial distribution, and used it along with the 10 km AOD product to develop statistical models applicable to PM variations at regional scale (resolution of ≥10 km). In the current study, meteorological parameters were simulated at 3 km resolution using the WRF model and used along with the rather new 3 km AOD product (launched in 2014). The resulting PM statistical models were assessed for a polluted and highly variable urban area, Tehran, Iran. Despite the critical particulate pollution problem, very few PM studies have been conducted in this area. Direct PM-AOD associations were rather poor, due to factors such as variations in particle optical properties, in addition to the bright-background issue for satellite data, as the studied area is located in the semi-arid Middle East. The statistical approach of linear mixed effects (LME) was used, and three types of statistical models were examined: a single-variable LME model (using AOD as the independent variable) and multiple-variable LME models using meteorological data from two sources, the WRF model and synoptic stations. Meteorological simulations were performed using a multiscale approach with an appropriate physics configuration for the studied region, and the results showed rather good agreement with recordings of the synoptic stations. The single-variable LME model was able to explain about 61%-73% of daily PM10 variations, reflecting a rather acceptable performance. Model performance improved with the multivariable LME models incorporating meteorological data as auxiliary variables, particularly when using fine-resolution outputs from WRF (R2 = 0.73-0.81).
In addition, PM estimates were mapped at rather fine resolution for the studied city, and the resulting concentration maps were consistent with PM recordings at the existing stations.
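The LME models used in such PM-AOD studies allow the PM-AOD intercept to vary by day. A simplified fixed-effects analogue, estimating day-specific intercepts with dummy variables plus a common AOD slope by least squares, conveys the structure (all values below are synthetic, not the Tehran data, and a real analysis would treat the day intercepts as random effects):

```python
import numpy as np

rng = np.random.default_rng(5)

n_days, n_sites = 50, 20
slope_true = 120.0                           # PM10 per unit AOD, invented
day_intercepts = rng.normal(40.0, 10.0, n_days)   # day-to-day baseline shifts

day = np.repeat(np.arange(n_days), n_sites)
aod = rng.uniform(0.1, 0.9, n_days * n_sites)
pm10 = day_intercepts[day] + slope_true * aod + rng.normal(0, 5.0, day.size)

# Design matrix: one dummy column per day, plus the AOD column
D = np.zeros((day.size, n_days))
D[np.arange(day.size), day] = 1.0
A = np.column_stack([D, aod])
coef, *_ = np.linalg.lstsq(A, pm10, rcond=None)
slope_hat = coef[-1]                         # common AOD slope estimate
```

Adding meteorological columns to A is the fixed-effects analogue of the multivariable LME models whose improved R2 the study reports.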
ecode - Electron Transport Algorithm Testing v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franke, Brian C.; Olson, Aaron J.; Bruss, Donald Eugene
2016-10-05
ecode is a Monte Carlo code used for testing algorithms related to electron transport. The code can read basic physics parameters, such as energy-dependent stopping powers and screening parameters, and permits simple planar geometries of slabs or cubes. Parallelization consists of domain replication, with work distributed at the start of the calculation and statistical results gathered at the end. Some basic routines (such as input parsing, random number generation, and statistics processing) are shared with the Integrated Tiger Series codes. A variety of algorithms for uncertainty propagation are incorporated, based on the stochastic collocation and stochastic Galerkin methods; these permit uncertainty only in the total and angular scattering cross sections. The code contains algorithms for simulating stochastic mixtures of two materials. The physics is approximate, ranging from mono-energetic and isotropic scattering to screened Rutherford angular scattering and Rutherford energy-loss scattering (simple electron transport models). No production of secondary particles is implemented, and no photon physics is implemented.
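The simplest physics option mentioned, mono-energetic particles with isotropic scattering in a planar slab, makes a compact Monte Carlo illustration. This toy (one spatial dimension, no absorption, distances in mean free paths) is only in the spirit of such a code, not ecode itself:

```python
import numpy as np

rng = np.random.default_rng(6)

def transmission(tau, n=10_000):
    """Fraction of normally incident particles leaking through a purely
    scattering slab of optical thickness tau (isotropic scattering)."""
    trans = 0
    for _ in range(n):
        x, mu = 0.0, 1.0                     # depth (mean free paths), direction cosine
        while True:
            x += mu * rng.exponential(1.0)   # fly one exponentially distributed path
            if x < 0.0:                      # reflected back out the near face
                break
            if x > tau:                      # transmitted through the far face
                trans += 1
                break
            mu = rng.uniform(-1.0, 1.0)      # isotropic scatter: new direction cosine
    return trans / n

t_thin, t_thick = transmission(1.0), transmission(5.0)
```

Domain-replication parallelism, as described in the abstract, would simply run independent batches of histories with different random streams and pool the tallies at the end.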
Quantum correlations and dynamics from classical random fields valued in complex Hilbert spaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khrennikov, Andrei
2010-08-15
One of the crucial differences between mathematical models of classical and quantum mechanics (QM) is the use of the tensor product of the state spaces of subsystems as the state space of the corresponding composite system. (To describe an ensemble of classical composite systems, one uses random variables taking values in the Cartesian product of the state spaces of subsystems.) We show that, nevertheless, it is possible to establish a natural correspondence between the classical and the quantum probabilistic descriptions of composite systems. Quantum averages for composite systems (including entangled ones) can be represented as averages with respect to classical random fields. It is essentially what Albert Einstein dreamed of. QM is represented as classical statistical mechanics with infinite-dimensional phase space. While the mathematical construction is completely rigorous, its physical interpretation is a complicated problem. We present the basic physical interpretation of prequantum classical statistical field theory in Sec. II. However, this is only the first step toward a real physical theory.
Low-complexity stochastic modeling of wall-bounded shear flows
NASA Astrophysics Data System (ADS)
Zare, Armin
Turbulent flows are ubiquitous in nature and they appear in many engineering applications. Transition to turbulence, in general, increases skin-friction drag in air/water vehicles compromising their fuel-efficiency and reduces the efficiency and longevity of wind turbines. While traditional flow control techniques combine physical intuition with costly experiments, their effectiveness can be significantly enhanced by control design based on low-complexity models and optimization. In this dissertation, we develop a theoretical and computational framework for the low-complexity stochastic modeling of wall-bounded shear flows. Part I of the dissertation is devoted to the development of a modeling framework which incorporates data-driven techniques to refine physics-based models. We consider the problem of completing partially known sample statistics in a way that is consistent with underlying stochastically driven linear dynamics. Neither the statistics nor the dynamics are precisely known. Thus, our objective is to reconcile the two in a parsimonious manner. To this end, we formulate optimization problems to identify the dynamics and directionality of input excitation in order to explain and complete available covariance data. For problem sizes that general-purpose solvers cannot handle, we develop customized optimization algorithms based on alternating direction methods. The solution to the optimization problem provides information about critical directions that have maximal effect in bringing model and statistics in agreement. In Part II, we employ our modeling framework to account for statistical signatures of turbulent channel flow using low-complexity stochastic dynamical models. We demonstrate that white-in-time stochastic forcing is not sufficient to explain turbulent flow statistics and develop models for colored-in-time forcing of the linearized Navier-Stokes equations. 
We also examine the efficacy of stochastically forced linearized NS equations and their parabolized equivalents in the receptivity analysis of velocity fluctuations to external sources of excitation as well as capturing the effect of the slowly-varying base flow on streamwise streaks and Tollmien-Schlichting waves. In Part III, we develop a model-based approach to design surface actuation of turbulent channel flow in the form of streamwise traveling waves. This approach is capable of identifying the drag reducing trends of traveling waves in a simulation-free manner. We also use the stochastically forced linearized NS equations to examine the Reynolds number independent effects of spanwise wall oscillations on drag reduction in turbulent channel flows. This allows us to extend the predictive capability of our simulation-free approach to high Reynolds numbers.
[Physical activity, obesity and self-esteem in chilean schoolchildren].
Zurita-Ortega, Félix; Castro-Sánchez, Manuel; Rodríguez-Fernández, Sonia; Cofré-Boladós, Cristian; Chacón-Cuberos, Ramón; Martínez-Martínez, Asunción; Muros-Molina, José Joaquín
2017-03-01
Obesity is a worldwide epidemic and a problem for Chilean society. Aim: To analyze the relationship between physical condition, body mass index (BMI), level of physical activity and self-esteem. Material and Methods: Questionnaires to assess self-esteem (Rosenberg scale) and levels of physical activity (Physical Activity Questionnaire for Older Children, PAQ-C) were answered by 515 children aged 10.5 ± 0.5 years from 27 schools of Santiago de Chile. BMI was calculated. The Course Navette test was carried out, and vertical jump and hand dynamometry were measured. For statistical analysis, structural equation models were used. An acceptable goodness of fit for the models was found. There was a positive relationship between BMI and hand dynamometry, as well as a negative relationship between BMI and maximal oxygen consumption, jumping ability, physical activity and self-esteem. Finally, self-esteem was positively related to physical activity engagement. In these children, self-esteem was related to physical activity variables.
Identifying trends in climate: an application to the cenozoic
NASA Astrophysics Data System (ADS)
Richards, Gordon R.
1998-05-01
The recent literature on trending in climate has raised several issues, whether trends should be modeled as deterministic or stochastic, whether trends are nonlinear, and the relative merits of statistical models versus models based on physics. This article models trending since the late Cretaceous. This 68 million-year interval is selected because the reliability of tests for trending is critically dependent on the length of time spanned by the data. Two main hypotheses are tested, that the trend has been caused primarily by CO2 forcing, and that it reflects a variety of forcing factors which can be approximated by statistical methods. The CO2 data is obtained from model simulations. Several widely-used statistical models are found to be inadequate. ARIMA methods parameterize too much of the short-term variation, and do not identify low frequency movements. Further, the unit root in the ARIMA process does not predict the long-term path of temperature. Spectral methods also have little ability to predict temperature at long horizons. Instead, the statistical trend is estimated using a nonlinear smoothing filter. Both of these paradigms make it possible to model climate as a cointegrated process, in which temperature can wander quite far from the trend path in the intermediate term, but converges back over longer horizons. Comparing the forecasting properties of the two trend models demonstrates that the optimal forecasting model includes CO2 forcing and a parametric representation of the nonlinear variability in climate.
NASA Astrophysics Data System (ADS)
Bramwell, Steven T.; Gingras, Michel J. P.; Holdsworth, Peter C. W.
2013-03-01
Pauling's model of hydrogen disorder in water ice represents the prototype of a frustrated system. Over the years it has spawned several analogous models, including Anderson's model antiferromagnet and the statistical "vertex" models. Spin Ice is a sixteen vertex model of "ferromagnetic frustration" that is approximated by real materials, most notably the rare earth pyrochlores Ho2Ti2O7, Dy2Ti2O7 and Ho2Sn2O7. These "spin ice materials" have the Pauling zero point entropy and in all respects represent almost ideal realisations of Pauling's model. They provide experimentalists with unprecedented access to a wide variety of novel magnetic states and phase transitions that are located in different regions of the field-temperature phase diagram. They afford theoreticians the opportunity to explore many new features of the magnetic interactions and statistical mechanics of frustrated systems. This chapter is a comprehensive review of the physics -- both experimental and theoretical -- of spin ice. It starts with a discussion of the historic problem of water ice and its relation to spin ice and other frustrated magnets. The properties of spin ice are then discussed in three sections that deal with the zero field spin ice state, the numerous field-induced states (including the recently identified "kagomé ice") and the magnetic dynamics. Some materials related to spin ice are briefly described and the chapter is concluded with a short summary of spin ice physics.
A Localized Ensemble Kalman Smoother
NASA Technical Reports Server (NTRS)
Butala, Mark D.
2012-01-01
Numerous geophysical inverse problems prove difficult because the available measurements are indirectly related to the underlying unknown dynamic state and the physics governing the system may involve imperfect models or unobserved parameters. Data assimilation addresses these difficulties by combining the measurements and physical knowledge. The main challenge in such problems usually involves their high dimensionality and the standard statistical methods prove computationally intractable. This paper develops and addresses the theoretical convergence of a new high-dimensional Monte-Carlo approach called the localized ensemble Kalman smoother.
Reconstructing Macroeconomics Based on Statistical Physics
NASA Astrophysics Data System (ADS)
Aoki, Masanao; Yoshikawa, Hiroshi
We believe that the time has come to integrate the new approach based on statistical physics, or econophysics, into macroeconomics. Toward this goal, there must be more dialogue between physicists and economists. In this paper, we argue that there is no reason why the methods of statistical physics, so successful in many fields of natural science, cannot be usefully applied to macroeconomics, which is meant to analyze the macroeconomy comprising a large number of economic agents. It is, in fact, odd to regard the macroeconomy as a homothetic enlargement of the representative micro agent. We trust in the bright future of the new approach to macroeconomics based on statistical physics.
Inflated Uncertainty in Multimodel-Based Regional Climate Projections.
Madsen, Marianne Sloth; Langen, Peter L; Boberg, Fredrik; Christensen, Jens Hesselbjerg
2017-11-28
Multimodel ensembles are widely analyzed to estimate the range of future regional climate change projections. For an ensemble of climate models, the result is often portrayed by showing maps of the geographical distribution of the multimodel mean results and associated uncertainties represented by model spread at the grid point scale. Here we use a set of CMIP5 models to show that presenting statistics this way results in an overestimation of the projected range leading to physically implausible patterns of change on global but also on regional scales. We point out that similar inconsistencies occur in impact analyses relying on multimodel information extracted using statistics at the regional scale, for example, when a subset of CMIP models is selected to represent regional model spread. Consequently, the risk of unwanted impacts may be overestimated at larger scales as climate change impacts will never be realized as the worst (or best) case everywhere.
Multiscale volatility duration characteristics on financial multi-continuum percolation dynamics
NASA Astrophysics Data System (ADS)
Wang, Min; Wang, Jun
A random stock price model based on the multi-continuum percolation system is developed to investigate the nonlinear dynamics of stock price volatility duration, in an attempt to explain various statistical facts found in financial data and to reach a deeper understanding of mechanisms in the financial market. The continuum percolation system, usually referred to as a random coverage process or a Boolean model, is a member of a class of statistical physics systems. In this paper, multi-continuum percolation (with different values of the radius) is employed to model and reproduce the dispersal of information among investors. To test the validity of the proposed model, nonlinear analyses of the return volatility duration series are performed by multifractal detrending moving average analysis and Zipf analysis. The empirical comparison indicates similar nonlinear behaviors for the proposed model and the actual Chinese stock market.
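The Boolean model underlying continuum percolation can be illustrated with a minimal single-radius sketch: discs with random centers cover part of the plane, and "investors" at covered points are considered informed. This is hypothetical numpy code on a synthetic configuration; the paper's multi-radius system and the mapping to price dynamics are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)

# Boolean model: drop n discs with random centers in the unit square;
# grid points ("investors") receive information if covered by >= 1 disc.
n_discs, radius = 60, 0.06
centers = rng.uniform(0.0, 1.0, size=(n_discs, 2))

g = np.linspace(0.0, 1.0, 101)
gx, gy = np.meshgrid(g, g)
pts = np.stack([gx.ravel(), gy.ravel()], axis=1)

# squared distance from every grid point to every disc center
d2 = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
covered = (d2 <= radius ** 2).any(axis=1)
frac = covered.mean()
print(f"fraction of informed investors: {frac:.2f}")
```

For a homogeneous Poisson process of intensity n the expected covered fraction is 1 - exp(-n·π·r²) (ignoring edge effects), which is the kind of closed-form statistic that makes Boolean models tractable.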
Multivariate space - time analysis of PRE-STORM precipitation
NASA Technical Reports Server (NTRS)
Polyak, Ilya; North, Gerald R.; Valdes, Juan B.
1994-01-01
This paper presents the methodologies and results of the multivariate modeling and two-dimensional spectral and correlation analysis of PRE-STORM rainfall gauge data. Estimated parameters of the models for the specific spatial averages clearly indicate the eastward and southeastward wave propagation of rainfall fluctuations. A relationship between the coefficients of the diffusion equation and the parameters of the stochastic model of rainfall fluctuations is derived that leads directly to the exclusive use of rainfall data to estimate advection speed (about 12 m/s) as well as other coefficients of the diffusion equation of the corresponding fields. The statistical methodology developed here can be used for confirmation of physical models by comparison of the corresponding second-moment statistics of the observed and simulated data, for generating multiple samples of any size, for solving the inverse problem of the hydrodynamic equations, and for application in some other areas of meteorological and climatological data analysis and modeling.
A Goddard Multi-Scale Modeling System with Unified Physics
NASA Technical Reports Server (NTRS)
Tao, W.K.; Anderson, D.; Atlas, R.; Chern, J.; Houser, P.; Hou, A.; Lang, S.; Lau, W.; Peters-Lidard, C.; Kakar, R.;
2008-01-01
Numerical cloud-resolving models (CRMs), which are based on the non-hydrostatic equations of motion, have been extensively applied to cloud-scale and mesoscale processes during the past four decades. Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that CRMs agree with observations in simulating various types of clouds and cloud systems from different geographic locations. Cloud-resolving models now provide statistical information useful for developing more realistic, physically based parameterizations for climate models and numerical weather prediction models. It is also expected that Numerical Weather Prediction (NWP) and regional-scale models can be run at grid sizes similar to cloud-resolving models through nesting techniques. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. Using these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF). The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellites and field campaigns can provide initial conditions as well as validation through the Earth Satellite simulators. At Goddard, we have developed a multi-scale modeling system with unified physics. The modeling system consists of a coupled GCM-CRM (or MMF), a state-of-the-art weather research and forecasting model (WRF) and a cloud-resolving model (the Goddard Cumulus Ensemble model). In these models, the same microphysical schemes (2ICE, several 3ICE), radiation (including explicitly calculated cloud optical properties), and surface models are applied. 
In addition, a comprehensive unified Earth Satellite simulator has been developed at GSFC, which is designed to fully utilize the multi-scale modeling system. A brief review of the multi-scale modeling system with unified physics/simulator and examples is presented in this article.
Travelling Bubble Cavitation and Resulting Noise.
1981-03-02
pp. 22-26, 1968. 16. Il’ichev, V. I. "Statistical Model of the Onset of Hydrodynamic Cavitation Noise," Sixth All-Union Acoustic Conference...Collected Papers, Moscow, 1968. 17. Lyamshev, L. M. "On the Theory of Hydrodynamic Cavitation Noise," Soviet Physics-Acoustics, Vol. 15, pp. 494-498, 1970. 18
Multi-Parameter Linear Least-Squares Fitting to Poisson Data One Count at a Time
NASA Technical Reports Server (NTRS)
Wheaton, W.; Dunklee, A.; Jacobson, A.; Ling, J.; Mahoney, W.; Radocinski, R.
1993-01-01
A standard problem in gamma-ray astronomy data analysis is the decomposition of a set of observed counts, described by Poisson statistics, according to a given multi-component linear model, with underlying physical count rates or fluxes which are to be estimated from the data.
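One standard solution to this Poisson decomposition problem is the multiplicative ML-EM (Richardson-Lucy) iteration, which maximizes the Poisson likelihood of a nonnegative linear model. A hedged sketch on synthetic data, not necessarily the method used by these authors:

```python
import numpy as np

rng = np.random.default_rng(2)

# Model: expected counts mu = A @ x, observed counts n ~ Poisson(mu).
# A: assumed known response of each detector bin to each component.
A = rng.uniform(0.2, 1.0, size=(200, 3))
x_true = np.array([5.0, 2.0, 8.0])     # hypothetical component rates
n = rng.poisson(A @ x_true)

# Multiplicative ML-EM update (Richardson-Lucy): each step increases
# the Poisson log-likelihood and keeps the estimate nonnegative.
x = np.ones(3)
for _ in range(500):
    mu = A @ x
    x *= (A.T @ (n / mu)) / A.sum(axis=0)

print("estimated rates:", np.round(x, 2))
```

At the fixed point the total predicted counts equal the total observed counts, a useful convergence check; per-component errors then follow ordinary Poisson statistics of the fit.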
Ocean Surface Wave Optical Roughness - Innovative Measurement and Modeling
2008-01-01
Gemmrich et al., 2008) and microscale breaker crest length spectral density (e.g. Jessup and Phadnis , 2005) have been reported. Our effort seeks to...1986: Statistics of breaking waves observed as whitecaps in the open sea, Journal of Physical Oceanography, 16, 290-297. Jessup, A.T. and Phadnis
Gridded Calibration of Ensemble Wind Vector Forecasts Using Ensemble Model Output Statistics
NASA Astrophysics Data System (ADS)
Lazarus, S. M.; Holman, B. P.; Splitt, M. E.
2017-12-01
A computationally efficient method is developed that performs gridded post-processing of ensemble wind vector forecasts. An expansive set of idealized WRF model simulations is generated to provide physically consistent high-resolution winds over a coastal domain characterized by an intricate land/water mask. Ensemble model output statistics (EMOS) is used to calibrate the ensemble wind vector forecasts at observation locations. The local EMOS predictive parameters (mean and variance) are then spread throughout the grid utilizing flow-dependent statistical relationships extracted from the downscaled WRF winds. Using data withdrawal and 28 east-central Florida stations, the method is applied to one year of 24 h wind forecasts from the Global Ensemble Forecast System (GEFS). Compared to the raw GEFS, the approach improves both the deterministic and probabilistic forecast skill. Analysis of multivariate rank histograms indicates the post-processed forecasts are calibrated. Two downscaling case studies are presented, a quiescent easterly flow event and a frontal passage. Strengths and weaknesses of the approach are presented and discussed.
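The EMOS form, a predictive Gaussian whose mean and variance are affine in the ensemble mean and ensemble variance, can be sketched with a simple moment-based fit. Operational EMOS typically minimizes CRPS; this numpy illustration on synthetic data is only a stand-in, and all parameter values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic forecast archive: ensemble mean m and variance s2 for 1000
# cases; "truth" depends on m with noise whose variance grows with s2.
n = 1000
m = rng.normal(5.0, 3.0, size=n)          # ensemble-mean wind speed (m/s)
s2 = rng.uniform(0.5, 4.0, size=n)        # ensemble variance
obs = 1.0 + 0.8 * m + rng.normal(0.0, np.sqrt(0.5 + 1.2 * s2))

# EMOS predictive distribution: N(a + b*m, c + d*s2).
# Moment-based fit (a simple stand-in for the usual minimum-CRPS fit):
b, a = np.polyfit(m, obs, 1)              # mean parameters by least squares
resid2 = (obs - (a + b * m)) ** 2
d, c = np.polyfit(s2, resid2, 1)          # variance parameters

print(f"a={a:.2f} b={b:.2f} c={c:.2f} d={d:.2f}")
```

The fitted (a, b) correct the conditional bias of the ensemble mean, while (c, d) inflate or deflate the ensemble spread so that predictive variance matches observed error variance, which is what calibration of the rank histogram requires.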
Comparison of GEANT4 very low energy cross section models with experimental data in water.
Incerti, S; Ivanchenko, A; Karamitros, M; Mantero, A; Moretto, P; Tran, H N; Mascialino, B; Champion, C; Ivanchenko, V N; Bernal, M A; Francis, Z; Villagrasa, C; Baldacchin, G; Guèye, P; Capra, R; Nieminen, P; Zacharatou, C
2010-09-01
The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H0, H+) and (He0, He+, He2+), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant deviations from each other. 
The GEANT4-DNA physics models available in the GEANT4 toolkit have been compared in this article to available experimental data in the water vapor phase as well as to several published recommendations on the mass stopping power. These models represent a first step in the extension of the GEANT4 Monte Carlo toolkit to the simulation of biological effects of ionizing radiation.
Essiet, Inimfon Aniema; Baharom, Anisah; Shahar, Hayati Kadir; Uzochukwu, Benjamin
2017-01-01
Physical activity among university students is a catalyst for habitual physical activity in adulthood. Physical activity has many health benefits besides the improvement in academic performance. The present study assessed the predictors of physical activity among Nigerian university students using the Social Ecological Model (SEM). This cross-sectional study recruited first-year undergraduate students at the University of Uyo, Nigeria by multistage sampling. The International Physical Activity Questionnaire (IPAQ) short version was used to assess physical activity in the study. Factors were categorised according to the Socio-Ecological Model, which consisted of the individual, social environment, physical environment and policy levels. Data were analysed using the IBM SPSS statistical software, version 22. Simple and multiple logistic regression were used to determine the predictors of sufficient physical activity. A total of 342 respondents completed the study questionnaire. The majority of respondents (93.6%) reported sufficient physical activity at 7-day recall. Multivariate analysis revealed that respondents belonging to the Ibibio ethnic group were about four times more likely to be sufficiently active compared to those who belonged to other ethnic groups (AOR = 3.725, 95% CI = 1.383 to 10.032). Also, participants who had a normal weight were about four times more likely to be physically active compared to those who were underweight (AOR = 4.268, 95% CI = 1.323 to 13.772). This study concluded that physical activity levels among respondents were sufficient. It is suggested that emphasis be given to implementing interventions aimed at sustaining sufficient levels of physical activity among students.
Creation operator for spinons in one dimension
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talstra, J.C.; Strong, S.P.
1997-09-01
We propose a definition for a creation operator for the spinon, the fractional-statistics elementary excitation of the Haldane-Shastry model, and give numerical and analytical evidence that our operator creates a single spinon with nearly unit amplitude in the Heisenberg model with inverse-squared exchange. We then discuss how the operator is useful in more general contexts, such as studying the underlying spinons of other spin-chain models, like the XXX and XY models, and of the one-dimensional Hubbard model. © 1997 The American Physical Society
A new statistical approach to climate change detection and attribution
NASA Astrophysics Data System (ADS)
Ribes, Aurélien; Zwiers, Francis W.; Azaïs, Jean-Marc; Naveau, Philippe
2017-01-01
We propose here a new statistical approach to climate change detection and attribution that is based on additive decomposition and simple hypothesis testing. Most current statistical methods for detection and attribution rely on linear regression models where the observations are regressed onto expected response patterns to different external forcings. These methods do not use physical information provided by climate models regarding the expected response magnitudes to constrain the estimated responses to the forcings. Climate modelling uncertainty is difficult to take into account with regression based methods and is almost never treated explicitly. As an alternative to this approach, our statistical model is only based on the additivity assumption; the proposed method does not regress observations onto expected response patterns. We introduce estimation and testing procedures based on likelihood maximization, and show that climate modelling uncertainty can easily be accounted for. Some discussion is provided on how to practically estimate the climate modelling uncertainty based on an ensemble of opportunity. Our approach is based on the "models are statistically indistinguishable from the truth" paradigm, where the difference between any given model and the truth has the same distribution as the difference between any pair of models, but other choices might also be considered. The properties of this approach are illustrated and discussed based on synthetic data. Lastly, the method is applied to the linear trend in global mean temperature over the period 1951-2010. Consistent with the last IPCC assessment report, we find that most of the observed warming over this period (+0.65 K) is attributable to anthropogenic forcings (+0.67 ± 0.12 K, 90 % confidence range), with a very limited contribution from natural forcings (-0.01 ± 0.02 K).
Statistical Models of Fracture Relevant to Nuclear-Grade Graphite: Review and Recommendations
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Bratton, Robert L.
2011-01-01
The nuclear-grade (low-impurity) graphite needed for the fuel element and moderator material for next-generation (Gen IV) reactors displays large scatter in strength and a nonlinear stress-strain response from damage accumulation. This response can be characterized as quasi-brittle. In this expanded review, relevant statistical failure models for various brittle and quasi-brittle material systems are discussed with regard to strength distribution, size effect, multiaxial strength, and damage accumulation. This includes descriptions of the Weibull, Batdorf, and Burchell models as well as models that describe the strength response of composite materials, which involves distributed damage. Results from lattice simulations are included for a physics-based description of material breakdown. Consideration is given to the predicted transition between brittle and quasi-brittle damage behavior versus the density of damage (level of disorder) within the material system. The literature indicates that weakest-link-based failure modeling approaches appear to be reasonably robust in that they can be applied to materials that display distributed damage, provided that the level of disorder in the material is not too large. The Weibull distribution is argued to be the most appropriate statistical distribution to model the stochastic-strength response of graphite.
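A two-parameter Weibull strength distribution of the kind recommended above can be fitted to fracture data with the classical linearized Weibull plot. A minimal numpy sketch on synthetic strengths (the modulus and scale are illustrative assumptions, not graphite data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic strength data from a two-parameter Weibull distribution:
# P(failure at stress <= s) = 1 - exp(-(s/s0)^m), modulus m, scale s0.
m_true, s0_true = 10.0, 100.0            # assumed values (e.g. MPa scale)
strengths = s0_true * rng.weibull(m_true, size=400)

# Linearized Weibull plot: ln(-ln(1-F)) = m*ln(s) - m*ln(s0)
s = np.sort(strengths)
f = (np.arange(1, s.size + 1) - 0.5) / s.size   # median-rank-style estimator
y = np.log(-np.log(1.0 - f))
m_hat, b = np.polyfit(np.log(s), y, 1)
s0_hat = np.exp(-b / m_hat)
print(f"Weibull modulus ~ {m_hat:.1f}, scale ~ {s0_hat:.1f}")
```

The slope of the plot is the Weibull modulus, which quantifies the strength scatter central to weakest-link models; a low modulus (large scatter) is exactly the regime where the review cautions that quasi-brittle damage effects complicate the weakest-link picture.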
Statistical Maps of Ground Magnetic Disturbance Derived from Global Geospace Models
NASA Astrophysics Data System (ADS)
Rigler, E. J.; Wiltberger, M. J.; Love, J. J.
2017-12-01
Electric currents in space are the principal driver of magnetic variations measured at Earth's surface. These in turn induce geoelectric fields that present a natural hazard for technological systems like high-voltage power distribution networks. Modern global geospace models can reasonably simulate large-scale geomagnetic response to solar wind variations, but they are less successful at deterministic predictions of intense localized geomagnetic activity that most impacts technological systems on the ground. Still, recent studies have shown that these models can accurately reproduce the spatial statistical distributions of geomagnetic activity, suggesting that their physics are largely correct. Since the magnetosphere is a largely externally driven system, most model-measurement discrepancies probably arise from uncertain boundary conditions. So, with realistic distributions of solar wind parameters to establish its boundary conditions, we use the Lyon-Fedder-Mobarry (LFM) geospace model to build a synthetic multivariate statistical model of gridded ground magnetic disturbance. From this, we analyze the spatial modes of geomagnetic response, regress on available measurements to fill in unsampled locations on the grid, and estimate the global probability distribution of extreme magnetic disturbance. The latter offers a prototype geomagnetic "hazard map", similar to those used to characterize better-known geophysical hazards like earthquakes and floods.
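The mode-decomposition-plus-regression step described above can be sketched in miniature: extract spatial modes from a synthetic "model archive" via SVD, then regress the mode amplitudes onto a handful of station values to fill in unsampled grid points. Hypothetical numpy code, not the actual LFM-based analysis:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic archive: 500 disturbance snapshots on a 40-point grid,
# generated from 3 spatial modes (a stand-in for LFM-derived statistics).
n_grid, n_modes, n_snap = 40, 3, 500
modes = rng.normal(size=(n_grid, n_modes))
amps = rng.normal(size=(n_snap, n_modes)) * np.array([5.0, 2.0, 1.0])
archive = amps @ modes.T + 0.1 * rng.normal(size=(n_snap, n_grid))

# Empirical spatial modes (EOFs) of the archive via SVD
u, sv, vt = np.linalg.svd(archive - archive.mean(axis=0), full_matrices=False)
eofs = vt[:n_modes]                      # leading spatial modes, shape (3, 40)

# A new event observed at only 8 stations: regress EOF amplitudes on the
# sparse measurements, then reconstruct the full grid.
truth = (rng.normal(size=n_modes) * np.array([5.0, 2.0, 1.0])) @ modes.T
stations = rng.choice(n_grid, size=8, replace=False)
coef, *_ = np.linalg.lstsq(eofs[:, stations].T, truth[stations], rcond=None)
recon = coef @ eofs

err = np.sqrt(np.mean((recon - truth) ** 2)) / truth.std()
print(f"normalized reconstruction error: {err:.2f}")
```

Because the archive supplies the spatial covariance, a few stations suffice to pin down the low-order mode amplitudes; extreme-value statistics for a hazard map would then be read off the reconstructed fields rather than the sparse observations.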
NASA Astrophysics Data System (ADS)
Bastianello, Alvise; Piroli, Lorenzo; Calabrese, Pasquale
2018-05-01
We derive exact analytic expressions for the n-body local correlations in the one-dimensional Bose gas with contact repulsive interactions (Lieb-Liniger model) in the thermodynamic limit. Our results are valid for arbitrary states of the model, including ground and thermal states, stationary states after a quantum quench, and nonequilibrium steady states arising in transport settings. Calculations for these states are explicitly presented and physical consequences are critically discussed. We also show that the n-body local correlations are directly related to the full counting statistics for the particle-number fluctuations in a short interval, for which we provide an explicit analytic result.
Polymer models of interphase chromosomes
Vasquez, Paula A; Bloom, Kerry
2014-01-01
Clear organizational patterns of the genome have emerged from the statistics of population studies of fixed cells. However, how these results translate into the dynamics of individual living cells remains unexplored. We use statistical mechanics models derived from polymer physics to inquire into the effects that chromosome properties and dynamics have on the temporal and spatial behavior of the genome. Overall, changes in the properties of individual chains affect the behavior of all other chains in the domain. We explore two modifications of chain behavior: single-chain motion and chain-chain interactions. We show that there is no direct relation between these effects, as an increase in motion doesn't necessarily translate into an increase in chain interaction. PMID:25482191
Statistical Deviations From the Theoretical Only-SBU Model to Estimate MCU Rates in SRAMs
NASA Astrophysics Data System (ADS)
Franco, Francisco J.; Clemente, Juan Antonio; Baylac, Maud; Rey, Solenne; Villa, Francesca; Mecha, Hortensia; Agapito, Juan A.; Puchner, Helmut; Hubert, Guillaume; Velazco, Raoul
2017-08-01
This paper addresses a well-known problem that occurs when memories are exposed to radiation: determining whether a bit flip is isolated or belongs to a multiple event. As it is unusual to know the physical layout of the memory, this paper proposes to evaluate the statistical properties of the sets of corrupted addresses and to compare the results with a mathematical prediction model in which all of the events are single bit upsets. A set of rules that is easy to implement in common programming languages can be iteratively applied if anomalies are observed, yielding a classification of errors much closer to reality (more than 80% accuracy in our experiments).
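The flavor of such a rule can be illustrated with a toy adjacency test: compare the observed number of neighboring corrupted addresses against the expectation under an only-SBU model. Hypothetical numpy sketch with assumed memory size and upset counts; real classifications use richer, iterative and layout-aware rules:

```python
import numpy as np

rng = np.random.default_rng(6)

N = 1 << 20                      # assumed 1 Mword memory
k = 400                          # corrupted addresses observed after irradiation

def adjacent_pairs(addrs):
    """Count unordered pairs of addresses that differ by exactly 1."""
    a = np.sort(addrs)
    return int(np.count_nonzero(np.diff(a) == 1))

# Expected adjacent pairs if every upset were an isolated SBU at a
# uniformly random word: C(k,2) pairs, each adjacent with prob 2/N.
expected = k * (k - 1) / 2 * (N - 1) / (N * (N - 1) / 2)   # ~ k*(k-1)/N

# Scenario A: pure SBUs
sbu = rng.choice(N, size=k, replace=False)

# Scenario B: same total, but 50 events are 2-bit MCUs in neighboring words
base = rng.choice(N - 1, size=k - 50, replace=False)
mcu = np.concatenate([base, base[:50] + 1])

print(f"expected {expected:.2f}, SBU-only {adjacent_pairs(sbu)}, with MCUs {adjacent_pairs(mcu)}")
```

An observed count far above the only-SBU expectation flags the presence of multiple-cell upsets even without layout knowledge, which is the statistical anomaly the paper's rules are designed to detect and unwind.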
NASA Astrophysics Data System (ADS)
Mfumu Kihumba, Antoine; Ndembo Longo, Jean; Vanclooster, Marnik
2016-03-01
A multivariate statistical modelling approach was applied to explain the anthropogenic pressure of nitrate pollution on the Kinshasa groundwater body (Democratic Republic of Congo). Multiple regression and regression tree models were compared and used to identify major environmental factors that control the groundwater nitrate concentration in this region. The analyses were made in terms of physical attributes related to the topography, land use, geology and hydrogeology in the capture zone of different groundwater sampling stations. For the nitrate data, groundwater datasets from two different surveys were used. The statistical models identified the topography, the residential area, the service land (cemetery), and the surface-water land-use classes as major factors explaining nitrate occurrence in the groundwater. Also, groundwater nitrate pollution depends not on one single factor but on the combined influence of factors representing nitrogen loading sources and aquifer susceptibility characteristics. The groundwater nitrate pressure was better predicted with the regression tree model than with the multiple regression model. Furthermore, the results elucidated the sensitivity of the model performance towards the method of delineation of the capture zones. For pollution modelling at the monitoring points, therefore, it is better to identify capture-zone shapes based on a conceptual hydrogeological model rather than to adopt arbitrary circular capture zones.
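The finding that a regression tree outperforms multiple regression is easy to reproduce on synthetic data when the response is driven by threshold-style interactions, as nitrate loading plausibly is. The predictors and coefficients below are hypothetical stand-ins, not the Kinshasa data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(42)
# Hypothetical predictors: slope, residential land fraction, depth to water table
X = rng.uniform(0, 1, size=(300, 3))
# Threshold-style response: high nitrate only where residential land use is
# high AND the aquifer is shallow -- an interaction a tree captures easily
y = 30 * (X[:, 1] > 0.5) * (X[:, 2] < 0.4) + rng.normal(0, 1, 300)

lin = LinearRegression().fit(X, y)
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)
print(f"multiple regression R^2: {lin.score(X, y):.2f}")
print(f"regression tree     R^2: {tree.score(X, y):.2f}")
```

The tree recovers the two-way interaction with two splits, while the linear model can only fit additive main effects, mirroring the abstract's conclusion that pollution depends on the combined influence of factors.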
Student Understanding of Taylor Series Expansions in Statistical Mechanics
ERIC Educational Resources Information Center
Smith, Trevor I.; Thompson, John R.; Mountcastle, Donald B.
2013-01-01
One goal of physics instruction is to have students learn to make physical meaning of specific mathematical expressions, concepts, and procedures in different physical settings. As part of research investigating student learning in statistical physics, we are developing curriculum materials that guide students through a derivation of the Boltzmann…
Johnson, Jason K.; Oyen, Diane Adele; Chertkov, Michael; ...
2016-12-01
Inference and learning of graphical models are both well-studied problems in statistics and machine learning that have found many applications in science and engineering. However, exact inference is intractable in general graphical models, which suggests the problem of seeking the best approximation to a collection of random variables within some tractable family of graphical models. In this paper, we focus on the class of planar Ising models, for which exact inference is tractable using techniques of statistical physics. Based on these techniques and recent methods for planarity testing and planar embedding, we propose a greedy algorithm for learning the best planar Ising model to approximate an arbitrary collection of binary random variables (possibly from sample data). Given the set of all pairwise correlations among variables, we select a planar graph and an optimal planar Ising model defined on this graph to best approximate that set of correlations. Finally, we demonstrate our method in simulations and for two applications: modeling Senate voting records and identifying geochemical depth trends from Mars rover data.
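The topology-selection step can be caricatured as a greedy maximum-weight planar subgraph search. This is a heavily simplified sketch, not the paper's algorithm: it only picks edges by decreasing |correlation| under a planarity constraint (via `networkx.check_planarity`), and omits the Ising-model fitting and likelihood optimization entirely.

```python
import numpy as np
import networkx as nx

def greedy_planar_graph(corr):
    """Greedily add edges in decreasing |pairwise correlation|, keeping the
    graph planar -- a crude stand-in for choosing the topology on which a
    planar Ising model would then be fit."""
    n = corr.shape[0]
    edges = sorted(
        ((abs(corr[i, j]), i, j) for i in range(n) for j in range(i + 1, n)),
        reverse=True,
    )
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for w, i, j in edges:
        G.add_edge(i, j)
        if not nx.check_planarity(G)[0]:
            G.remove_edge(i, j)  # this edge would make the graph non-planar
    return G
```

On five fully correlated variables the procedure stops at 9 edges, since the full K5 is non-planar while any 9-edge subgraph of it is planar.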
NASA Astrophysics Data System (ADS)
2014-03-01
The second International Conference on Mathematical Modeling in Physical Sciences (IC-MSQUARE) took place in Prague, Czech Republic, from Sunday 1 September to Thursday 5 September 2013. The Conference was attended by more than 280 participants, hosted about 400 oral, poster, and virtual presentations, and counted more than 600 pre-registered authors. The second IC-MSQUARE comprised diverse workshops and thus covered the many research fields where Mathematical Modeling is used, such as Theoretical/Mathematical Physics, Neutrino Physics, Non-Integrable Systems, Dynamical Systems, Computational Nanoscience, Biological Physics, Computational Biomechanics, Complex Networks, Stochastic Modeling, Fractional Statistics, DNA Dynamics, and Macroeconomics. The scientific program was dense: after the Keynote and Invited Talks each morning, three parallel sessions ran every day. Nevertheless, attendees reported an excellent program with high-quality talks and a fruitful, creative scientific environment. We would like to thank the Keynote Speaker and the Invited Speakers for their significant contribution to IC-MSQUARE. We also would like to thank the Members of the International Advisory and Scientific Committees as well as the Members of the Organizing Committee. Further information on the editors, speakers and committees is available in the attached pdf.
NASA Astrophysics Data System (ADS)
Cheng, Meng; Tantivasadakarn, Nathanan; Wang, Chenjie
2018-01-01
We study Abelian braiding statistics of loop excitations in three-dimensional gauge theories with fermionic particles and the closely related problem of classifying 3D fermionic symmetry-protected topological (FSPT) phases with unitary symmetries. It is known that the two problems are related by turning FSPT phases into gauge theories through gauging the global symmetry of the former. We show that there exist certain types of Abelian loop braiding statistics that are allowed only in the presence of fermionic particles, which correspond to 3D "intrinsic" FSPT phases, i.e., those that do not stem from bosonic SPT phases. While such intrinsic FSPT phases are ubiquitous in 2D systems and in 3D systems with antiunitary symmetries, their existence in 3D systems with unitary symmetries was not confirmed previously due to the fact that strong interaction is necessary to realize them. We show that the simplest unitary symmetry to support 3D intrinsic FSPT phases is Z2×Z4. To establish the results, we first derive a complete set of physical constraints on Abelian loop braiding statistics. Solving the constraints, we obtain all possible Abelian loop braiding statistics in 3D gauge theories, including those that correspond to intrinsic FSPT phases. Then, we construct exactly soluble state-sum models to realize the loop braiding statistics. These state-sum models generalize the well-known Crane-Yetter and Dijkgraaf-Witten models.
Statistical variances of diffusional properties from ab initio molecular dynamics simulations
NASA Astrophysics Data System (ADS)
He, Xingfeng; Zhu, Yizhou; Epstein, Alexander; Mo, Yifei
2018-12-01
Ab initio molecular dynamics (AIMD) simulation is widely employed in studying diffusion mechanisms and in quantifying diffusional properties of materials. However, AIMD simulations are often limited to a few hundred atoms and a short, sub-nanosecond physical timescale, which leads to models that include only a limited number of diffusion events. As a result, the diffusional properties obtained from AIMD simulations are often plagued by poor statistics. In this paper, we re-examine the process to estimate diffusivity and ionic conductivity from the AIMD simulations and establish the procedure to minimize the fitting errors. In addition, we propose methods for quantifying the statistical variance of the diffusivity and ionic conductivity from the number of diffusion events observed during the AIMD simulation. Since an adequate number of diffusion events must be sampled, AIMD simulations should be sufficiently long and can only be performed on materials with reasonably fast diffusion. We chart the ranges of materials and physical conditions that can be accessible by AIMD simulations in studying diffusional properties. Our work provides the foundation for quantifying the statistical confidence levels of diffusion results from AIMD simulations and for correctly employing this powerful technique.
Physical activity, sedentary behavior, and vitamin D metabolites.
Hibler, Elizabeth A; Sardo Molmenti, Christine L; Dai, Qi; Kohler, Lindsay N; Warren Anderson, Shaneda; Jurutka, Peter W; Jacobs, Elizabeth T
2016-02-01
Physical activity is associated with circulating 25-hydroxyvitamin D (25(OH)D). However, the influence of activity and/or sedentary behavior on the biologically active, seco-steroid hormone 1α,25-dihydroxyvitamin D (1,25(OH)2D) is unknown. We conducted a cross-sectional analysis among ursodeoxycholic acid (UDCA) randomized trial participants (n=876) to evaluate associations between physical activity, sedentary behavior, and circulating vitamin D metabolite concentrations. Continuous vitamin D metabolite measurements and clinical thresholds were evaluated using multiple linear and logistic regression models, mutually adjusted for either 1,25(OH)2D or 25(OH)D and additional confounding factors. A statistically significant linear association between 1,25(OH)2D and moderate-vigorous physical activity per week was strongest among women (β (95% CI): 3.10 (1.51-6.35)) versus men (β (95% CI): 1.35 (0.79-2.29)) in the highest tertile of activity compared to the lowest (p-interaction=0.003). Furthermore, 25(OH)D was 1.54 ng/ml (95% CI 1.09-1.98) higher per hour increase in moderate-vigorous activity (p=0.001), and the odds of sufficient 25(OH)D status were higher among physically active participants (p=0.001). Sedentary behavior was not significantly associated with either metabolite in linear regression models, nor was a statistically significant interaction by sex identified. The current study identified novel associations between physical activity and serum 1,25(OH)2D levels, adjusted for 25(OH)D concentrations. These results identify the biologically active form of vitamin D as a potential physiologic mechanism underlying observed population-level associations of moderate-vigorous physical activity with bone health and chronic disease risk. However, future longitudinal studies are needed to further evaluate the role of physical activity and vitamin D metabolites in chronic disease prevention.
Eutrophication risk assessment in coastal embayments using simple statistical models.
Arhonditsis, G; Eleftheriadou, M; Karydis, M; Tsirtsis, G
2003-09-01
A statistical methodology is proposed for assessing the risk of eutrophication in marine coastal embayments. The procedure followed was the development of regression models relating the levels of chlorophyll a (Chl) with the concentration of the limiting nutrient--usually nitrogen--and the renewal rate of the systems. The method was applied in the Gulf of Gera, Island of Lesvos, Aegean Sea and a surrogate for renewal rate was created using the Canberra metric as a measure of the resemblance between the Gulf and the oligotrophic waters of the open sea in terms of their physical, chemical and biological properties. The Chl-total dissolved nitrogen-renewal rate regression model was the most significant, accounting for 60% of the variation observed in Chl. Predicted distributions of Chl for various combinations of the independent variables, based on Bayesian analysis of the models, enabled comparison of the outcomes of specific scenarios of interest as well as further analysis of the system dynamics. The present statistical approach can be used as a methodological tool for testing the resilience of coastal ecosystems under alternative managerial schemes and levels of exogenous nutrient loading.
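The regression-plus-Bayesian-prediction workflow described above can be sketched with synthetic data. All coefficients, variable ranges, and the scenario values below are illustrative assumptions, not the Gulf of Gera estimates; the sketch fits a Chl model on nitrogen and a renewal-rate surrogate, then draws from the predictive distribution for a management scenario.

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic stand-ins: chlorophyll a vs. total dissolved nitrogen and renewal rate
n = 80
tdn = rng.uniform(1, 10, n)           # limiting-nutrient concentration
renewal = rng.uniform(0.1, 1.0, n)    # water-renewal-rate surrogate
chl = 0.4 * tdn - 2.0 * renewal + rng.normal(0, 0.5, n)

# Ordinary least squares fit and its parameter covariance
X = np.column_stack([np.ones(n), tdn, renewal])
beta, *_ = np.linalg.lstsq(X, chl, rcond=None)
resid = chl - X @ beta
s2 = resid @ resid / (n - 3)
cov = s2 * np.linalg.inv(X.T @ X)

# Predictive distribution of Chl for one scenario of interest:
# high nutrient load (tdn = 8) and poor flushing (renewal = 0.2)
x_new = np.array([1.0, 8.0, 0.2])
draws = rng.multivariate_normal(beta, cov, size=5000) @ x_new \
        + rng.normal(0, np.sqrt(s2), 5000)
lo, hi = np.percentile(draws, [5, 95])
print(f"predicted Chl: {draws.mean():.2f} (90% interval {lo:.2f} to {hi:.2f})")
```

Comparing such predictive distributions across scenarios is what allows the outcomes of alternative nutrient-loading schemes to be contrasted probabilistically rather than as point estimates.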
NASA Astrophysics Data System (ADS)
Nold, Andreas; Goddard, Ben; Sibley, David; Kalliadasis, Serafim
2014-03-01
Multiscale effects play a predominant role in wetting phenomena such as the moving contact line. An accurate description is of paramount interest for a wide range of industrial applications, yet it is a matter of ongoing research, due to the difficulty of incorporating different physical effects in one model. Important small-scale phenomena are corrections to the attractive fluid-fluid and wall-fluid forces in inhomogeneous density distributions, which often previously have been accounted for by the disjoining pressure in an ad-hoc manner. We systematically derive a novel model for the description of a single-component liquid-vapor multiphase system which inherently incorporates these nonlocal effects. This derivation, which is inspired by statistical mechanics in the framework of colloidal density functional theory, is critically discussed with respect to its assumptions and restrictions. The model is then employed numerically to study a moving contact line of a liquid fluid displacing its vapor phase. We show how nonlocal physical effects are inherently incorporated by the model and describe how classical macroscopic results for the contact line motion are retrieved. We acknowledge financial support from ERC Advanced Grant No. 247031 and Imperial College through a DTG International Studentship.
Janik, M; Bossew, P; Kurihara, O
2018-07-15
Machine learning is a class of statistical techniques which has proven to be a powerful tool for modelling the behaviour of complex systems, in which response quantities depend on assumed controls or predictors in a complicated way. In this paper, as our first purpose, we propose the application of machine learning to reconstruct incomplete or irregularly sampled time series of indoor radon (222Rn). The physical assumption underlying the modelling is that Rn concentration in the air is controlled by environmental variables such as air temperature and pressure. The algorithms "learn" from complete sections of the multivariate series, derive a dependence model, and apply it to sections where the controls are available but not the response (Rn), in this way completing the Rn series. Three machine learning techniques are applied in this study, namely random forest, its extension called the gradient boosting machine, and deep learning. For comparison, we apply classical multiple regression in a generalized linear model version. Performance of the models is evaluated through different metrics. The performance of the gradient boosting machine is found to be superior to that of the other techniques. By applying learning machines, we show, as our second purpose, that missing data or periods of a Rn series can reasonably be reconstructed and resampled on a regular grid, provided data on appropriate physical controls are available. The techniques also identify to what degree the assumed controls contribute to imputing missing Rn values. Our third purpose, no less important from the viewpoint of physics, is identifying to what degree physical, in this case environmental, variables are relevant as Rn predictors, or in other words, which predictors explain most of the temporal variability of Rn. We show that the variables which contribute most to the Rn series reconstruction are temperature, relative humidity, and day of the year. The first two are physical predictors, while "day of the year" is a statistical proxy or surrogate for missing or unknown predictors.
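The learn-then-impute scheme can be sketched with a gradient boosting machine, one of the three techniques named above. Everything here is synthetic and illustrative: the dependence of Rn on temperature, humidity, and day of the year is a made-up stand-in for the physical controls, not the paper's fitted model.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(3)
day = np.arange(730)                              # two years of daily records
temp = 10 + 8 * np.sin(2 * np.pi * day / 365) + rng.normal(0, 1, 730)
humid = 60 - 2 * temp + rng.normal(0, 3, 730)
# Hypothetical Rn response driven by the environmental controls plus noise
radon = 40 - 1.5 * temp + 0.3 * humid \
        + 5 * np.cos(2 * np.pi * day / 365) + rng.normal(0, 2, 730)

X = np.column_stack([temp, humid, day % 365])
missing = rng.random(730) < 0.3                   # 30% of the Rn series lost

# "Learn" on complete sections, then impute where only the controls exist
model = GradientBoostingRegressor(random_state=0)
model.fit(X[~missing], radon[~missing])
imputed = model.predict(X[missing])
rmse = np.sqrt(np.mean((imputed - radon[missing]) ** 2))
print(f"imputation RMSE: {rmse:.2f} (irreducible noise level was 2.0)")
```

The fitted model's `feature_importances_` attribute plays the role of the paper's third purpose: ranking which controls explain most of the Rn variability.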
Non-gaussianity versus nonlinearity of cosmological perturbations.
Verde, L
2001-06-01
Following the discovery of the cosmic microwave background, the hot big-bang model has become the standard cosmological model. In this theory, small primordial fluctuations are subsequently amplified by gravity to form the large-scale structure seen today. Different theories for unified models of particle physics lead to different predictions for the statistical properties of the primordial fluctuations, which can be divided into two classes: Gaussian and non-Gaussian. Convincing evidence for or against Gaussian initial conditions would rule out many scenarios and point us toward a physical theory for the origin of structures. The statistical distribution of cosmological perturbations, as we observe them, can deviate from the Gaussian distribution in several different ways. Even if perturbations start off Gaussian, nonlinear gravitational evolution can introduce non-Gaussian features. Additionally, our knowledge of the Universe comes principally from the study of luminous material such as galaxies, but galaxies might not be faithful tracers of the underlying mass distribution. The relationship between fluctuations in the mass and in the galaxy distribution (bias) is often assumed to be local, but could well be nonlinear. Moreover, galaxy catalogues use the redshift as the third spatial coordinate: the resulting redshift-space map of the galaxy distribution is nonlinearly distorted by peculiar velocities. Nonlinear gravitational evolution, biasing, and redshift-space distortion introduce non-Gaussianity, even in an initially Gaussian fluctuation field. I investigate the statistical tools that allow us, in principle, to disentangle these different effects, and the observational datasets we require to do so in practice.
Reconciling intuitive physics and Newtonian mechanics for colliding objects.
Sanborn, Adam N; Mansinghka, Vikash K; Griffiths, Thomas L
2013-04-01
People have strong intuitions about the influence objects exert upon one another when they collide. Because people's judgments appear to deviate from Newtonian mechanics, psychologists have suggested that people depend on a variety of task-specific heuristics. This leaves open the question of how these heuristics could be chosen, and how to integrate them into a unified model that can explain human judgments across a wide range of physical reasoning tasks. We propose an alternative framework, in which people's judgments are based on optimal statistical inference over a Newtonian physical model that incorporates sensory noise and intrinsic uncertainty about the physical properties of the objects being viewed. This noisy Newton framework can be applied to a multitude of judgments, with people's answers determined by the uncertainty they have for physical variables and the constraints of Newtonian mechanics. We investigate a range of effects in mass judgments that have been taken as strong evidence for heuristic use and show that they are well explained by the interplay between Newtonian constraints and sensory uncertainty. We also consider an extended model that handles causality judgments, and obtain good quantitative agreement with human judgments across tasks that involve different judgment types with a single consistent set of parameters.
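The "noisy Newton" idea, optimal statistical inference over a Newtonian constraint given sensory noise, can be sketched for the mass-judgment case. This is a minimal grid-posterior sketch under assumed observation noise, not the authors' full model: momentum conservation m1(v1 - u1) = -m2(v2 - u2) constrains the mass ratio, and noisy velocity readings make the posterior over that ratio spread out.

```python
import numpy as np

def mass_ratio_posterior(obs, sigma, ratios):
    """Grid posterior over the mass ratio r = m1/m2 of two colliding objects,
    given noisy observed velocities (u1, u2 before; v1, v2 after) and the
    Newtonian constraint r*(v1 - u1) + (v2 - u2) = 0."""
    u1, u2, v1, v2 = obs
    log_post = np.zeros_like(ratios)
    for k, r in enumerate(ratios):
        residual = r * (v1 - u1) + (v2 - u2)   # violation of momentum conservation
        log_post[k] = -residual**2 / (2 * sigma**2)
    post = np.exp(log_post - log_post.max())
    return post / post.sum()

ratios = np.linspace(0.1, 5, 500)
# True ratio 2: object 1 slows a little while object 2 speeds up a lot
obs = (1.0, 0.0, 0.4, 1.2)                     # u1, u2, v1, v2 (noisy readings)
post = mass_ratio_posterior(obs, sigma=0.3, ratios=ratios)
print(f"posterior mean mass ratio: {np.sum(ratios * post):.2f}")
```

Larger `sigma` (more sensory uncertainty) widens the posterior, which is the mechanism the framework uses to reproduce judgment patterns previously attributed to heuristics.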
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amadio, G.; et al.
An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
Winfield, Kari A.
2005-01-01
Because characterizing the unsaturated hydraulic properties of sediments over large areas or depths is costly and time consuming, development of models that predict these properties from more easily measured bulk-physical properties is desirable. At the Idaho National Engineering and Environmental Laboratory, the unsaturated zone is composed of thick basalt flow sequences interbedded with thinner sedimentary layers. Determining the unsaturated hydraulic properties of sedimentary layers is one step in understanding water flow and solute transport processes through this complex unsaturated system. Multiple linear regression was used to construct simple property-transfer models for estimating the water-retention curve and saturated hydraulic conductivity of deep sediments at the Idaho National Engineering and Environmental Laboratory. The regression models were developed from 109 core sample subsets with laboratory measurements of hydraulic and bulk-physical properties. The core samples were collected at depths of 9 to 175 meters at two facilities within the southwestern portion of the Idaho National Engineering and Environmental Laboratory: the Radioactive Waste Management Complex, and the Vadose Zone Research Park southwest of the Idaho Nuclear Technology and Engineering Center. Four regression models were developed using bulk-physical property measurements (bulk density, particle density, and particle size) as the potential explanatory variables. Three representations of the particle-size distribution were compared: (1) textural-class percentages (gravel, sand, silt, and clay), (2) geometric statistics (mean and standard deviation), and (3) graphical statistics (median and uniformity coefficient). The four response variables, estimated from linear combinations of the bulk-physical properties, included saturated hydraulic conductivity and three parameters that define the water-retention curve. 
For each core sample, values of each water-retention parameter were estimated from the appropriate regression equation and used to calculate an estimated water-retention curve. The degree to which the estimated curve approximated the measured curve was quantified using a goodness-of-fit indicator, the root-mean-square error. Comparison of the root-mean-square-error distributions for each alternative particle-size model showed that the estimated water-retention curves were insensitive to the way the particle-size distribution was represented. Bulk density, the median particle diameter, and the uniformity coefficient were chosen as input parameters for the final models. The property-transfer models developed in this study allow easy determination of hydraulic properties without need for their direct measurement. Additionally, the models provide the basis for development of theoretical models that rely on physical relationships between the pore-size distribution and the bulk-physical properties of the media. With this adaptation, the property-transfer models should have greater application throughout the Idaho National Engineering and Environmental Laboratory and other geographic locations.
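The goodness-of-fit step, comparing an estimated water-retention curve against a measured one via root-mean-square error, can be sketched as follows. The van Genuchten parameterization and all parameter values here are assumptions for illustration; the report's regression equations supply the estimated parameters in practice.

```python
import numpy as np

def van_genuchten(psi, theta_r, theta_s, alpha, n):
    """Water-retention curve: volumetric water content as a function of
    matric suction psi (van Genuchten form, assumed here for illustration)."""
    m = 1 - 1 / n
    return theta_r + (theta_s - theta_r) / (1 + (alpha * psi) ** n) ** m

psi = np.logspace(-1, 4, 50)                      # suction, arbitrary units
measured = van_genuchten(psi, 0.05, 0.45, 0.02, 1.8)
# Parameters as they might come out of the property-transfer regression
estimated = van_genuchten(psi, 0.06, 0.43, 0.025, 1.7)

# Goodness of fit of the estimated curve, as in the report
rmse = np.sqrt(np.mean((estimated - measured) ** 2))
print(f"root-mean-square error: {rmse:.4f}")
```

Repeating this over all core samples and comparing the RMSE distributions across the three particle-size representations is how the report concludes the estimated curves are insensitive to that modeling choice.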
NASA Astrophysics Data System (ADS)
Turki, Imen; Laignel, Benoit; Kakeh, Nabil; Chevalier, Laetitia; Costa, Stephane
2015-04-01
This research is carried out in the framework of the Surface Water and Ocean Topography (SWOT) program, a partnership between NASA and CNES. Here, a new hybrid model is implemented for filling gaps in, and forecasting, the hourly sea-level variability by combining classical harmonic analysis with advanced statistical methods to reproduce the deterministic and stochastic processes, respectively. After simulating the mean sea-level trend and astronomical tides, the non-tidal residual surges are investigated using autoregressive moving average (ARMA) methods in two ways: (1) applying a purely statistical approach, and (2) introducing the sea-level pressure (SLP) into ARMA as a main physical process driving the residual sea level. The new hybrid model is applied to the western Atlantic and the eastern English Channel. Using the ARMA model and considering the SLP, results show that the hourly sea-level observations of the tide gauges are well reproduced, with a root-mean-square error (RMSE) ranging between 4.5 and 7 cm for gaps of 1 to 30 days and an explained variance of more than 80%. For larger gaps of months, the RMSE reaches 9 cm. The negative and positive extreme values of sea level are also well reproduced, with a mean explained variance between 70 and 85%. The statistical behavior of the 1-year modeled residual components shows good agreement with observations. Frequency analysis using the discrete wavelet transform illustrates strong correlations between the observed and modeled energy spectra and bands of variability. Accordingly, the proposed model presents a coherent, simple, and easy-to-use tool to estimate the total sea level at timescales from days to months. The ARMA model seems even more promising for filling gaps and estimating the sea level at larger scales of years once more physical processes driving its stochastic variability are introduced.
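The second ARMA variant, autoregression on the residual surge with sea-level pressure as an exogenous physical driver, can be sketched with a first-order model fitted by least squares. The synthetic surge process, its coefficients, and the inverse-barometer-style SLP response below are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(5)
T = 2000
# Synthetic hourly sea-level pressure around a 1013 hPa baseline
slp = 1013 + 10 * np.sin(2 * np.pi * np.arange(T) / 300) + rng.normal(0, 2, T)

# Synthetic non-tidal residual surge: AR(1) memory plus a pressure response
surge = np.zeros(T)
for t in range(1, T):
    surge[t] = 0.8 * surge[t - 1] - 0.5 * (slp[t] - 1013) + rng.normal(0, 1)

# Fit an AR(1)-with-exogenous-pressure model by ordinary least squares
X = np.column_stack([surge[:-1], slp[1:] - 1013])
coef, *_ = np.linalg.lstsq(X, surge[1:], rcond=None)
pred = X @ coef
rmse = np.sqrt(np.mean((pred - surge[1:]) ** 2))
print(f"AR coefficient {coef[0]:.2f}, SLP coefficient {coef[1]:.2f}, RMSE {rmse:.2f}")
```

Stepping such a model forward over a gap (feeding predictions back as lagged inputs while the SLP record supplies the exogenous term) is the gap-filling mode described in the abstract.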
The statistical analysis of global climate change studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, J.W.
1992-01-01
The focus of this work is to contribute to the enhancement of the relationship between climatologists and statisticians. The analysis of global change data has been underway for many years by atmospheric scientists. Much of this analysis includes a heavy reliance on statistics and statistical inference. Some specific climatological analyses are presented and the dependence on statistics is documented before the analysis is undertaken. The first problem presented involves the fluctuation-dissipation theorem and its application to global climate models. This problem has a sound theoretical niche in the literature of both climate modeling and physics, but a statistical analysis in which the data are obtained from the model to show the relationship graphically has not been undertaken. It is under this motivation that the author presents this problem. A second problem, concerning the standard errors in estimating global temperatures, is purely statistical in nature, although very little material exists on sampling from such a frame. This problem has not only climatological and statistical ramifications, but political ones as well. It is planned to use these results in a further analysis of global warming using actual data collected on the earth. In order to simplify the analysis of these problems, the development of a computer program, MISHA, is presented. This interactive program contains many of the routines, functions, graphics, and map projections needed by the climatologist in order to effectively enter the arena of data visualization.
Perrin, Paul B; Paredes, Alejandra Morlett; Olivera, Silvia Leonor; Lozano, Juan Esteban; Leal, Wendy Tatiana; Ahmad, Usman F; Arango-Lasprilla, Juan Carlos
2017-01-01
Research has begun to document the bivariate connections between pain in individuals with spinal cord injury (SCI) and various aspects of health related quality of life (HRQOL), such as fatigue, social functioning, mental health, and physical functioning. The purpose of this study was to construct and test a theoretical path model illuminating the stage-wise and sequential (cascading) HRQOL pathways through which pain increases physical disability in individuals with SCI in a sample from Colombia, South America. It was hypothesized that increased pain would lead to decreased energy, which would lead to decreased mental health and social functioning, which both would lead to emotional role limitations, which finally would lead to physical role limitations. A cross-sectional study assessed individuals with SCI (n = 40) in Neiva, Colombia. Participants completed a measure indexing various aspects of HRQOL. The path model overall showed excellent fit indices, and each individual path within the model was statistically significant. Pain exerted significant indirect effects through all possible mediators in the model, ultimately suggesting that energy, mental health, social functioning, and role limitations-emotional were likely pathways through which pain exerted its effects on physical disability in individuals with SCI. These findings uncover several potential nodes for clinical intervention which if targeted in the context of rehabilitation or outpatient services, could result in salubrious direct and indirect effects reverberating down the theoretical causal chain and ultimately reducing physical disability in individuals with SCI.
Lovejoy, S; de Lima, M I P
2015-07-01
Over the range of time scales from about 10 days to 30-100 years, in addition to the familiar weather and climate regimes, there is an intermediate "macroweather" regime characterized by negative temporal fluctuation exponents, implying that fluctuations tend to cancel each other out so that averages tend to converge. We show theoretically and numerically that macroweather precipitation can be modeled by a stochastic weather-climate model (the Climate Extended Fractionally Integrated Flux model, CEFIF), first proposed for macroweather temperatures, and we show numerically that a four-parameter space-time CEFIF model can approximately reproduce eight or so empirical space-time exponents. In spite of this success, CEFIF is theoretically and numerically difficult to manage. We therefore propose a simplified stochastic model in which the temporal behavior is modeled as a fractional Gaussian noise but the spatial behavior as a multifractal (climate) cascade: a spatial extension of the recently introduced ScaLIng Macroweather Model, SLIMM. Both the CEFIF and this spatial SLIMM model have a property often implicitly assumed by climatologists: that climate statistics can be "homogenized" by normalizing them with the standard deviation of the anomalies. Physically, it means that the spatial macroweather variability corresponds to different climate zones that multiplicatively modulate the local, temporal statistics. This simplified macroweather model provides a framework for macroweather forecasting that exploits the system's long-range memory and spatial correlations; for it, the forecasting problem has been solved. We test this factorization property and the model with the help of three centennial, global-scale precipitation products that we analyze jointly in space and in time.
NASA Astrophysics Data System (ADS)
Plotnitsky, Arkady
2017-06-01
The history of mathematical modeling outside physics has been dominated by the use of classical mathematical models, C-models, primarily those of a probabilistic or statistical nature. More recently, however, quantum mathematical models, Q-models, based in the mathematical formalism of quantum theory have become more prominent in psychology, economics, and decision science. The use of Q-models in these fields remains controversial, in part because it is not entirely clear whether Q-models are necessary for dealing with the phenomena in question or whether C-models would still suffice. My aim, however, is not to assess the necessity of Q-models in these fields, but instead to reflect on what the possible applicability of Q-models may tell us about the corresponding phenomena there, vis-à-vis quantum phenomena in physics. In order to do so, I shall first discuss the key reasons for the use of Q-models in physics. In particular, I shall examine the fundamental principles that led to the development of quantum mechanics. Then I shall consider a possible role of similar principles in using Q-models outside physics. Psychology, economics, and decision science borrow already available Q-models from quantum theory, rather than derive them from their own internal principles, while quantum mechanics was derived from such principles, because there was no readily available mathematical model to handle quantum phenomena, although the mathematics ultimately used in quantum mechanics did in fact exist then. I shall argue, however, that the principle perspective on mathematical modeling outside physics might help us to understand better the role of Q-models in these fields and possibly to envision new models, conceptually analogous to but mathematically different from those of quantum theory, helpful or even necessary there or in physics itself. I shall suggest one possible type of such models, singularized probabilistic, SP, models, some of which are time-dependent, TDSP-models. 
The necessity of using such models may change the nature of mathematical modeling in science and, thus, the nature of science, as it happened in the case of Q-models, which not only led to a revolutionary transformation of physics but also opened new possibilities for scientific thinking and mathematical modeling beyond physics.
Continuum-Kinetic Models and Numerical Methods for Multiphase Applications
NASA Astrophysics Data System (ADS)
Nault, Isaac Michael
This thesis presents a continuum-kinetic approach for modeling general problems in multiphase solid mechanics. In this context, a continuum model refers to any model, typically on the macro-scale, in which continuous state variables are used to capture the most important physics: conservation of mass, momentum, and energy. A kinetic model refers to any model, typically on the meso-scale, which captures the statistical motion and evolution of microscopic entities. Multiphase phenomena usually involve non-negligible micro- or meso-scopic effects at the interfaces between phases. The approach developed in the thesis attempts to combine the computational performance benefits of a continuum model with the physical accuracy of a kinetic model when applied to a multiphase problem. The approach is applied to modeling a single particle impact in Cold Spray, an engineering process that intimately involves the interaction of crystal grains with high-magnitude elastic waves. Such a situation can be classified as a multiphase application due to the discrete nature of grains on the spatial scale of the problem. For this application, a hyper-elastoplastic model is solved by a finite volume method with an approximate Riemann solver. The results of this model are compared for two types of plastic closure: a phenomenological macro-scale constitutive law, and a physics-based meso-scale Crystal Plasticity model.
Applications of modern statistical methods to analysis of data in physical science
NASA Astrophysics Data System (ADS)
Wicker, James Eric
Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970s, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960s, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model, and problems with overfitting data. We implement an information-scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information-based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960s and 1970s respectively, formed the basis of multivariate cluster analysis methodology for many years. However, several shortcomings of these methods include strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values.
In addition, we propose a generalization of the Genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance structures. We then use this new algorithm in a genetic algorithm based Expectation-Maximization process that can accurately calculate parameters describing complex clusters in a mixture model routine. Using the accuracy of this GEM algorithm, we assign information scores to cluster calculations in order to best identify the number of mixture components in a multivariate data set. We will showcase how these algorithms can be used to process multivariate data from astronomical observations.
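The restart-and-select idea behind such genetic clustering approaches can be illustrated with a minimal sketch in Python (synthetic 2-D data; a true genetic K-means would also recombine and mutate candidate solutions between generations, which this toy version omits in favor of plain best-of-population selection):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's K-means from one random seed; returns (centers, SSE)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        centers = [
            tuple(sum(coords) / len(c) for coords in zip(*c)) if c else centers[j]
            for j, c in enumerate(clusters)
        ]
    sse = sum(min(sum((a - b) ** 2 for a, b in zip(p, ctr)) for ctr in centers) for p in points)
    return centers, sse

def evolutionary_kmeans(points, k, population=12):
    """Crude stand-in for the genetic selection step: run a 'population' of
    independently seeded K-means instances and keep the fittest (lowest-SSE) one."""
    return min((kmeans(points, k, seed=s) for s in range(population)), key=lambda r: r[1])

# Two well-separated 2-D clusters (synthetic, illustrative data only).
rng = random.Random(42)
pts = [(rng.gauss(0, 0.5), rng.gauss(0, 0.5)) for _ in range(50)] + \
      [(rng.gauss(10, 0.5), rng.gauss(10, 0.5)) for _ in range(50)]
centers, sse = evolutionary_kmeans(pts, 2)
```

Here the "population" is simply a set of independently seeded runs and the "fitness" is the within-cluster sum of squared errors; the genetic operators described in the text would additionally act on candidate centers between generations.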
Re-entry survivability and risk
NASA Astrophysics Data System (ADS)
Fudge, Michael L.
1998-11-01
This paper is the culmination of the research effort which was reported on last year while still in progress. As previously reported, statistical methods for expressing the impact risk posed to space systems in general [and the International Space Station (ISS) in particular] by other resident space objects have been examined. One of the findings of this investigation is that there are legitimate physical modeling reasons for the common statistical expression of the collision risk. A combination of statistical methods and physical modeling is also used to express the impact risk posed by reentering space systems to objects of interest (e.g., people and property) on Earth. One of the largest uncertainties in expressing this risk is the estimation of the amount of material which survives reentry to impact Earth's surface. This point was demonstrated in dramatic fashion in January 1997 by the impact of an intact expendable launch vehicle (ELV) upper stage near a private residence in the continental United States. Since approximately half of the missions supporting ISS will utilize ELVs, it is appropriate to examine the methods used to estimate the amount and physical characteristics of ELV debris surviving reentry to impact Earth's surface. This report details reentry survivability estimation methodology, including the specific methodology used by ITT Systems' (formerly Kaman Sciences) 'SURVIVE' model. The major change to the model in the last twelve months has been an increase in the fidelity with which upper-atmospheric aerodynamics is modeled. This has resulted in an adjustment in the factor relating the amount of kinetic energy loss to the amount of heating entering the reentering body, and has also validated the theoretical heating expressions, removing the necessity for certain empirically based adjustments previously made to them.
Comparisons between empirical results (observations of objects which have been recovered on Earth after surviving reentry) and SURVIVE estimates are presented for selected generic upper stage or spacecraft components, a Soyuz launch vehicle second stage, and for a Delta II launch vehicle second stage and its significant components. Significant similarity is demonstrated between the type and dispersion pattern of the recovered debris from the January 1997 Delta II 2nd stage event and the simulation of that reentry and breakup.
The ambiguity of simplicity in quantum and classical simulation
NASA Astrophysics Data System (ADS)
Aghamohammadi, Cina; Mahoney, John R.; Crutchfield, James P.
2017-04-01
A system's perceived simplicity depends on whether it is represented classically or quantally. This is not so surprising, as classical and quantum physics are descriptive frameworks built on different assumptions that capture, emphasize, and express different properties and mechanisms. What is surprising is that, as we demonstrate, simplicity is ambiguous: the relative simplicity between two systems can change sign when moving between classical and quantum descriptions. Here, we associate simplicity with small model-memory. We see that the notions of absolute physical simplicity at best form a partial, not a total, order. This suggests that appeals to principles of physical simplicity, via Ockham's Razor or the 'elegance' of competing theories, may be fundamentally subjective. Recent rapid progress in quantum computation and quantum simulation suggests that the ambiguity of simplicity will strongly impact statistical inference and, in particular, model selection.
The contribution of statistical physics to evolutionary biology.
de Vladar, Harold P; Barton, Nicholas H
2011-08-01
Evolutionary biology shares many concepts with statistical physics: both deal with populations, whether of molecules or organisms, and both seek to simplify evolution in very many dimensions. Often, methodologies have undergone parallel and independent development, as with stochastic methods in population genetics. Here, we discuss aspects of population genetics that have embraced methods from physics: non-equilibrium statistical mechanics, travelling waves and Monte-Carlo methods, among others, have been used to study polygenic evolution, rates of adaptation and range expansions. These applications indicate that evolutionary biology can further benefit from interactions with other areas of statistical physics; for example, by following the distribution of paths taken by a population through time. Copyright © 2011 Elsevier Ltd. All rights reserved.
Development and evaluation of social cognitive measures related to adolescent physical activity.
Dewar, Deborah L; Lubans, David Revalds; Morgan, Philip James; Plotnikoff, Ronald C
2013-05-01
This study aimed to develop and evaluate the construct validity and reliability of modernized social cognitive measures relating to physical activity behaviors in adolescents. An instrument was developed based on constructs from Bandura's Social Cognitive Theory and included the following scales: self-efficacy, situation (perceived physical environment), social support, behavioral strategies, and outcome expectations and expectancies. The questionnaire was administered in a sample of 171 adolescents (age = 13.6 ± 1.2 years, females = 61%). Confirmatory factor analysis was employed to examine model-fit for each scale using multiple indices, including chi-square index, comparative-fit index (CFI), goodness-of-fit index (GFI), and the root mean square error of approximation (RMSEA). Reliability properties were also examined (ICC and Cronbach's alpha). Each scale represented a statistically sound measure: fit indices indicated each model to be an adequate-to-exact fit to the data; internal consistency was acceptable to good (α = 0.63-0.79); rank order repeatability was strong (ICC = 0.82-0.91). Results support the validity and reliability of social cognitive scales relating to physical activity among adolescents. As such, the developed scales have utility for the identification of potential social cognitive correlates of youth physical activity, mediators of physical activity behavior changes and the testing of theoretical models based on Social Cognitive Theory.
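For reference, the internal-consistency statistic reported above (Cronbach's alpha) can be computed directly from item-level scores; a minimal Python sketch with hypothetical numbers:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale; `items` is a list of k item-score lists,
    each holding the scores of the same n respondents."""
    k = len(items)
    n = len(items[0])
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    # Total scale score per respondent, summed across items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Toy 3-item, 5-respondent scale (hypothetical numbers, for illustration only).
scores = [
    [4, 3, 5, 2, 4],
    [4, 4, 5, 2, 3],
    [5, 3, 4, 2, 4],
]
alpha = cronbach_alpha(scores)
```

Perfectly correlated items yield alpha = 1; the 0.63-0.79 range reported in the study indicates moderately consistent, distinct items.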
Choi, Se Y; Ahn, Seung H; Choi, Jae D; Kim, Jung H; Lee, Byoung-Il; Kim, Jeong-In
2016-01-01
Objective: The purpose of this study was to compare CT image quality for evaluating urolithiasis using filtered back projection (FBP), statistical iterative reconstruction (IR) and knowledge-based iterative model reconstruction (IMR) according to various scan parameters and radiation doses. Methods: A 5 × 5 × 5 mm3 uric acid stone was placed in a physical human phantom at the level of the pelvis. 3 tube voltages (120, 100 and 80 kV) and 4 current–time products (100, 70, 30 and 15 mAs) were implemented in 12 scans. Each scan was reconstructed with FBP, statistical IR (Levels 5–7) and knowledge-based IMR (soft-tissue Levels 1–3). The radiation dose, objective image quality and signal-to-noise ratio (SNR) were evaluated, and subjective assessments were performed. Results: The effective doses ranged from 0.095 to 2.621 mSv. Knowledge-based IMR showed better objective image noise and SNR than did FBP and statistical IR. The subjective image noise of FBP was worse than that of statistical IR and knowledge-based IMR. The subjective assessment scores deteriorated below a break point of 100 kV and 30 mAs. Conclusion: At the setting of 100 kV and 30 mAs, the radiation dose can be decreased by approximately 84% while preserving subjective image quality. Advances in knowledge: Patients with urolithiasis can be evaluated with ultralow-dose non-enhanced CT using a knowledge-based IMR algorithm at a substantially reduced radiation dose with the imaging quality preserved, thereby minimizing the risks of radiation exposure while providing clinically relevant diagnostic benefits for patients. PMID:26577542
NASA Astrophysics Data System (ADS)
Challet, Damien; Marsili, M.; Ottino, Gabriele
2004-02-01
We mathematize the El Farol bar problem and transform it into a workable model. First, we find general conditions on the predictor space under which the convergence of the average attendance to the resource level does not require any intelligence on the side of the agents. Second, specializing to a particular ensemble of continuous strategies yields a model similar to the Minority Game. The statistical physics of disordered systems allows us to derive a complete understanding of the complex behavior of this model on the basis of its phase diagram.
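A minimal Minority Game simulation makes the convergence claim concrete: even with fixed random lookup-table strategies and no communication, the average attendance hovers near half the population (Python sketch; all parameter values are illustrative):

```python
import random

def minority_game(n_agents=101, memory=3, n_strategies=2, rounds=500, seed=1):
    """Toy Minority Game: each agent holds fixed random lookup-table strategies
    over the last `memory` outcomes and plays its best-scoring one each round."""
    rng = random.Random(seed)
    h = 2 ** memory  # number of distinguishable histories
    strategies = [[[rng.randrange(2) for _ in range(h)] for _ in range(n_strategies)]
                  for _ in range(n_agents)]
    scores = [[0] * n_strategies for _ in range(n_agents)]
    history = rng.randrange(h)
    attendances = []
    for _ in range(rounds):
        choices = [strategies[a][max(range(n_strategies), key=lambda s: scores[a][s])][history]
                   for a in range(n_agents)]
        attendance = sum(choices)  # how many chose side "1"
        attendances.append(attendance)
        winner = 1 if attendance < n_agents / 2 else 0  # minority side wins
        # Reward every strategy (played or not) that predicted the minority side.
        for a in range(n_agents):
            for s in range(n_strategies):
                if strategies[a][s][history] == winner:
                    scores[a][s] += 1
        history = ((history << 1) | winner) % h
    return attendances

att = minority_game()
mean_att = sum(att) / len(att)
```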
JOURNAL SCOPE GUIDELINES: Paper classification scheme
NASA Astrophysics Data System (ADS)
2005-06-01
This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org).
1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory.
2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models.
3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods.
4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics.
5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials.
6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.
Ipsen, Andreas
2017-02-03
Here, the mass peak centroid is a quantity that is at the core of mass spectrometry (MS). However, despite its central status in the field, models of its statistical distribution are often chosen quite arbitrarily and without attempts at establishing a proper theoretical justification for their use. Recent work has demonstrated that for mass spectrometers employing analog-to-digital converters (ADCs) and electron multipliers, the statistical distribution of the mass peak intensity can be described via a relatively simple model derived essentially from first principles. Building on this result, the following article derives the corresponding statistical distribution for the mass peak centroids of such instruments. It is found that for increasing signal strength, the centroid distribution converges to a Gaussian distribution whose mean and variance are determined by physically meaningful parameters and which in turn determine bias and variability of the m/z measurements of the instrument. Through the introduction of the concept of “pulse-peak correlation”, the model also elucidates the complicated relationship between the shape of the voltage pulses produced by the preamplifier and the mean and variance of the centroid distribution. The predictions of the model are validated with empirical data and with Monte Carlo simulations.
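The convergence result described above can be illustrated with a toy Monte Carlo experiment: as signal strength grows, intensity-weighted centroids of a noisy peak cluster ever more tightly around the true peak position. A Python sketch (all peak parameters are hypothetical, and Poisson counting noise is approximated by Gaussian noise of matching variance):

```python
import math
import random

def simulate_centroids(amplitude, trials=300, seed=7):
    """Simulate noisy detector samples of a Gaussian peak (true center m/z = 500.0)
    and return the intensity-weighted centroid from each trial."""
    rng = random.Random(seed)
    xs = [499.0 + 0.01 * i for i in range(201)]  # m/z grid around the peak
    expected = [amplitude * math.exp(-((x - 500.0) ** 2) / (2 * 0.05 ** 2)) for x in xs]
    cents = []
    for _ in range(trials):
        # Gaussian approximation to Poisson counting noise, clipped at zero.
        ys = [max((mu + rng.gauss(0, math.sqrt(mu))) if mu > 0 else 0.0, 0.0)
              for mu in expected]
        cents.append(sum(x * y for x, y in zip(xs, ys)) / sum(ys))
    return cents

def stdev(vals):
    m = sum(vals) / len(vals)
    return math.sqrt(sum((v - m) ** 2 for v in vals) / (len(vals) - 1))

weak, strong = simulate_centroids(50), simulate_centroids(5000)
```

With 100 times the signal, the spread of the centroid estimates shrinks roughly tenfold while the mean stays at the true peak position, consistent with the Gaussian limit described in the abstract.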
A network model for characterizing brine channels in sea ice
NASA Astrophysics Data System (ADS)
Lieblappen, Ross M.; Kumar, Deip D.; Pauls, Scott D.; Obbard, Rachel W.
2018-03-01
The brine pore space in sea ice can form complex connected structures whose geometry is critical in the governance of important physical transport processes between the ocean, sea ice, and surface. Recent advances in three-dimensional imaging using X-ray micro-computed tomography have enabled the visualization and quantification of the brine network morphology and variability. Using imaging of first-year sea ice samples at in situ temperatures, we create a new mathematical network model to characterize the topology and connectivity of the brine channels. This model provides a statistical framework where we can characterize the pore networks via two parameters, depth and temperature, for use in dynamical sea ice models. Our approach advances the quantification of brine connectivity in sea ice, which can help investigations of bulk physical properties, such as fluid permeability, that are key in both global and regional sea ice models.
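The kind of connectivity question such a network model supports, e.g. whether brine channels form a connected path from the ice surface down to the ocean, reduces to graph reachability; a Python sketch on a hypothetical miniature network (real pore networks would carry geometric attributes such as channel radii, which are omitted here):

```python
from collections import deque

def percolates(edges, top, bottom):
    """Does the pore network connect any 'top' node to any 'bottom' node?
    `edges` is a list of (node, node) pairs; BFS over the adjacency map."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    seen, queue = set(top), deque(top)
    while queue:
        node = queue.popleft()
        if node in bottom:
            return True
        for nb in adj.get(node, ()):
            if nb not in seen:
                seen.add(nb)
                queue.append(nb)
    return False

# Hypothetical miniature channel network: nodes are pore junctions,
# s* at the surface, o* at the ocean interface.
edges = [("s1", "a"), ("a", "b"), ("b", "o1"), ("s2", "c"), ("c", "d")]
connected = percolates(edges, top={"s1", "s2"}, bottom={"o1", "o2"})
```

Statistics of such reachability over many extracted networks, parameterized by depth and temperature as in the paper, is what feeds estimates of bulk properties like fluid permeability.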
Climate Change: Modeling the Human Response
NASA Astrophysics Data System (ADS)
Oppenheimer, M.; Hsiang, S. M.; Kopp, R. E.
2012-12-01
Integrated assessment models have historically relied on forward modeling including, where possible, process-based representations to project climate change impacts. Some recent impact studies incorporate the effects of human responses to initial physical impacts, such as adaptation in agricultural systems, migration in response to drought, and climate-related changes in worker productivity. Sometimes the human response ameliorates the initial physical impacts, sometimes it aggravates them, and sometimes it displaces them onto others. In these arenas, understanding of underlying socioeconomic mechanisms is extremely limited. Consequently, for some sectors where sufficient data have accumulated, empirically based statistical models of human responses to past climate variability and change have been used to infer response sensitivities which may apply under certain conditions to future impacts, allowing a broad extension of integrated assessment into the realm of human adaptation. We discuss the insights gained from and limitations of such modeling for benefit-cost analysis of climate change.
Finding Bounded Rational Equilibria. Part 2; Alternative Lagrangians and Uncountable Move Spaces
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2004-01-01
A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality characterizing all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. It has recently been shown that the same information-theoretic mathematical structure, known as Probability Collectives (PC), underlies both issues. This relationship between statistical physics and game theory allows techniques and insights from the one field to be applied to the other. In particular, PC provides a formal model-independent definition of the degree of rationality of a player and of bounded rationality equilibria. This pair of papers extends previous work on PC by introducing new computational approaches to effectively find bounded rationality equilibria of common-interest (team) games.
Understanding amyloid aggregation by statistical analysis of atomic force microscopy images
NASA Astrophysics Data System (ADS)
Adamcik, Jozef; Jung, Jin-Mi; Flakowski, Jérôme; de Los Rios, Paolo; Dietler, Giovanni; Mezzenga, Raffaele
2010-06-01
The aggregation of proteins is central to many aspects of daily life, including food processing, blood coagulation, eye cataract formation, and prion-related neurodegenerative infections. However, the physical mechanisms responsible for amyloidosis (the irreversible fibril formation of various proteins that is linked to disorders such as Alzheimer's, Creutzfeldt-Jakob and Huntington's diseases) have not yet been fully elucidated. Here, we show that different stages of amyloid aggregation can be examined by performing a statistical polymer physics analysis of single-molecule atomic force microscopy images of heat-denatured β-lactoglobulin fibrils. The atomic force microscopy analysis, supported by theoretical arguments, reveals that the fibrils have a multistranded helical shape with twisted ribbon-like structures. Our results also indicate a possible general model for amyloid fibril assembly and illustrate the potential of this approach for investigating fibrillar systems.
Cocco, Simona; Leibler, Stanislas; Monasson, Rémi
2009-01-01
Complexity of neural systems often makes impracticable explicit measurements of all interactions between their constituents. Inverse statistical physics approaches, which infer effective couplings between neurons from their spiking activity, have so far been hindered by computational complexity. Here, we present 2 complementary, computationally efficient inverse algorithms based on the Ising and “leaky integrate-and-fire” models. We apply those algorithms to reanalyze multielectrode recordings in the salamander retina in darkness and under random visual stimulus. We find strong positive couplings between nearby ganglion cells common to both stimuli, whereas long-range couplings appear under random stimulus only. The uncertainty on the inferred couplings due to limitations in the recordings (duration, small area covered on the retina) is discussed. Our methods will allow real-time evaluation of couplings for large assemblies of neurons. PMID:19666487
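The spirit of such inverse approaches, recovering couplings from observed correlations, can be shown in the smallest possible case: for two spins with P(s1, s2) proportional to exp(J*s1*s2), the correlation <s1*s2> equals tanh(J), so J can be read off from data. A Python sketch with synthetic samples (the published algorithms handle many neurons via Ising or integrate-and-fire models; this two-spin toy only illustrates the inversion idea):

```python
import math
import random

def sample_two_spins(J, n, seed=3):
    """Exactly sample n configurations of two coupled spins with
    P(s1, s2) proportional to exp(J * s1 * s2), s_i in {-1, +1}."""
    rng = random.Random(seed)
    p_aligned = math.exp(J) / (math.exp(J) + math.exp(-J))  # P(s1*s2 = +1)
    out = []
    for _ in range(n):
        prod = 1 if rng.random() < p_aligned else -1
        s1 = rng.choice((-1, 1))
        out.append((s1, s1 * prod))
    return out

def infer_coupling(samples):
    """Inverse step: recover J from the measured correlation <s1*s2> = tanh(J)."""
    corr = sum(s1 * s2 for s1, s2 in samples) / len(samples)
    return math.atanh(corr)

true_J = 0.5
J_hat = infer_coupling(sample_two_spins(true_J, 20000))
```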
Entropic Repulsion Between Fluctuating Surfaces
NASA Astrophysics Data System (ADS)
Janke, W.
The statistical mechanics of fluctuating surfaces plays an important role in a variety of physical systems, ranging from biological membranes to world sheets of strings in theories of fundamental interactions. In many applications it is a good approximation to assume that the surfaces possess no tension. Their statistical properties are then governed by curvature energies only, which allow for gigantic out-of-plane undulations. These fluctuations are the “entropic” origin of long-range repulsive forces in layered surface systems. Theoretical estimates of these forces for simple model surfaces are surveyed and compared with recent Monte Carlo simulations.
Production cost structure in US outpatient physical therapy health care.
Lubiani, Gregory G; Okunade, Albert A
2013-02-01
This paper investigates the technology cost structure in US physical therapy care. We exploit formal economic theories and rich national data on providers to tease out implications for operational cost efficiencies. The 2008-2009 dataset, comprising over 19 000 bi-weekly, site-specific physical therapy center observations across 28 US states together with Occupational Employment Statistics data (Bureau of Labor Statistics), includes measures of output, three labor types (clinical, support, and administrative), and facilities (capital). We discuss findings from the iterative seemingly unrelated regression estimation system model. The generalized translog cost estimates indicate a well-behaved underlying technology structure. We also find the following: (i) factor demands are downward sloping; (ii) pair-wise factor relationships largely reflect substitutions; (iii) factor demand for physical therapists is more inelastic compared with that for administrative staff; and (iv) diminishing scale economies exist at the 25%, 50%, and 75% output (patient visits) levels. Our findings advance the timely economic understanding of operations in an increasingly important segment of the medical care sector that has, up to now (because of data paucity), been missing from healthcare efficiency analysis. Our work further provides baseline estimates for comparing operational efficiencies in physical therapy care after implementations of the 2010 US healthcare reforms. Copyright © 2012 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Vallianatos, Filippos
2015-04-01
Despite the extreme complexity that characterizes the earthquake generation process, simple phenomenology seems to apply in the collective properties of seismicity. The best known is the Gutenberg-Richter relation. Short- and long-term clustering, power-law scaling and scale-invariance have been exhibited in the spatio-temporal evolution of seismicity, providing evidence for earthquakes as a nonlinear dynamic process. Regarding the physics of "many" earthquakes and how this can be derived from first principles, one may wonder: how can the collective properties of the set formed by all earthquakes in a given region be derived, and how does the structure of seismicity depend on its elementary constituents, the earthquakes? What are these properties? The physics of many earthquakes has to be studied with a different approach than the physics of one earthquake, making the use of statistical physics necessary to understand the collective properties of earthquakes. A natural question then arises: what type of statistical physics is appropriate to commonly describe effects from the microscale and crack-opening level up to the level of large earthquakes? An answer to this question could be non-extensive statistical physics, introduced by Tsallis (1988), as the appropriate methodological tool to describe entities with (multi)fractal distributions of their elements and where long-range interactions or intermittency are important, as in fracturing phenomena and earthquakes. In the present work, we review some fundamental properties of earthquake physics and how these are derived by means of non-extensive statistical physics. The aim is to understand aspects of the underlying physics that lead to the evolution of the earthquake phenomenon, introducing the new topic of non-extensive statistical seismology. This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project.
References:
F. Vallianatos, "A non-extensive approach to risk assessment", Nat. Hazards Earth Syst. Sci., 9, 211-216, 2009.
F. Vallianatos and P. Sammonds, "Is plate tectonics a case of non-extensive thermodynamics?", Physica A: Statistical Mechanics and its Applications, 389(21), 4989-4993, 2010.
F. Vallianatos, G. Michas, G. Papadakis and P. Sammonds, "A non extensive statistical physics view to the spatiotemporal properties of the June 1995, Aigion earthquake (M6.2) aftershock sequence (West Corinth rift, Greece)", Acta Geophysica, 60(3), 758-768, 2012.
F. Vallianatos and L. Telesca, "Statistical mechanics in earth physics and natural hazards" (editorial), Acta Geophysica, 60(3), 499-501, 2012.
F. Vallianatos, G. Michas, G. Papadakis and A. Tzanis, "Evidence of non-extensivity in the seismicity observed during the 2011-2012 unrest at the Santorini volcanic complex, Greece", Nat. Hazards Earth Syst. Sci., 13, 177-185, 2013.
F. Vallianatos and P. Sammonds, "Evidence of non-extensive statistical physics of the lithospheric instability approaching the 2004 Sumatran-Andaman and 2011 Honshu mega-earthquakes", Tectonophysics, 590, 52-58, 2013.
G. Papadakis, F. Vallianatos and P. Sammonds, "Evidence of Nonextensive Statistical Physics behavior of the Hellenic Subduction Zone seismicity", Tectonophysics, 608, 1037-1048, 2013.
G. Michas, F. Vallianatos and P. Sammonds, "Non-extensivity and long-range correlations in the earthquake activity at the West Corinth rift (Greece)", Nonlin. Processes Geophys., 20, 713-724, 2013.
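At the core of the non-extensive framework reviewed above is the Tsallis q-exponential, which reduces to the ordinary (Boltzmann-Gibbs) exponential as q -> 1 and develops a power-law tail for q > 1, the regime typically fitted to seismicity data. A minimal numerical sketch in Python:

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: exp_q(x) = [1 + (1-q)x]^(1/(1-q)) where the
    bracket is positive (zero otherwise), and exp(x) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

# q near 1 recovers the Boltzmann-Gibbs exponential; q > 1 gives the fat
# (power-law) tail; q < 1 gives a distribution with compact support.
```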
NASA Astrophysics Data System (ADS)
Calderon, Christopher P.; Weiss, Lucien E.; Moerner, W. E.
2014-05-01
Experimental advances have improved the two- (2D) and three-dimensional (3D) spatial resolution that can be extracted from in vivo single-molecule measurements. This enables researchers to quantitatively infer the magnitude and directionality of forces experienced by biomolecules in their native environment. Situations where such force information is relevant range from mitosis to directed transport of protein cargo along cytoskeletal structures. Models commonly applied to quantify single-molecule dynamics assume that effective forces and velocity in the x, y (or x, y, z) directions are statistically independent, but this assumption is physically unrealistic in many situations. We present a hypothesis testing approach capable of determining if there is evidence of statistical dependence between positional coordinates in experimentally measured trajectories; if the hypothesis of independence between spatial coordinates is rejected, then a new model accounting for 2D (3D) interactions can and should be considered. Our hypothesis testing technique is robust, meaning it can detect interactions even if the noise statistics are not well captured by the model. The approach is demonstrated on control simulations and on experimental data (directed transport of intraflagellar transport protein 88 homolog in the primary cilium).
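The flavor of such a test can be conveyed with a simple permutation test on synthetic displacement increments (Python sketch; the published approach is a formal likelihood-based hypothesis test robust to noise misspecification, which this toy correlation-based test only approximates in spirit):

```python
import random

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def independence_pvalue(dx, dy, n_perm=2000, seed=11):
    """Permutation test: is |corr(dx, dy)| larger than expected if the
    x and y displacement increments were statistically independent?"""
    rng = random.Random(seed)
    observed = abs(pearson(dx, dy))
    shuffled = list(dy)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)  # break any pairing between dx and dy
        if abs(pearson(dx, shuffled)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Coupled toy trajectory: the y increment partly copies the x increment.
rng = random.Random(5)
dx = [rng.gauss(0, 1) for _ in range(200)]
dy = [0.6 * a + rng.gauss(0, 1) for a in dx]
p = independence_pvalue(dx, dy)
```

A small p-value rejects independence, signaling that a model with coupled 2D (3D) dynamics should be considered, as the abstract recommends.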
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Qingge; Song, Gian; Gorti, Sarma B.; ...
2018-02-21
Bragg-edge imaging, which is also known as neutron radiography, has recently emerged as a novel crystalline characterization technique. Modelling of this novel technique by incorporating various features of the underlying microstructure (including the crystallographic texture, the morphological texture, and the grain size) of the material remains a subject of considerable research and development. In this paper, Inconel 718 samples made by additive manufacturing were investigated by neutron diffraction and neutron radiography techniques. The specimen features strong morphological and crystallographic textures and a highly heterogeneous microstructure. A 3D statistical full-field model is introduced by taking details of the microstructure into account to understand the experimental neutron radiography results. The Bragg-edge imaging and the total cross section were calculated based on the neutron transmission physics. A good match was obtained between the model predictions and experimental results at different incident beam angles with respect to the sample build direction. The current theoretical approach has the ability to incorporate 3D spatially resolved microstructural heterogeneity information and shows promise in understanding the 2D neutron radiography of bulk samples. With further development to incorporate the heterogeneity in lattice strain in the model, it can be used as a powerful tool in the future to better understand the neutron radiography data.
On Ruch's Principle of Decreasing Mixing Distance in classical statistical physics
NASA Astrophysics Data System (ADS)
Busch, Paul; Quadt, Ralf
1990-10-01
Ruch's Principle of Decreasing Mixing Distance is reviewed as a statistical-physical principle, and its basic support and geometric interpretation, the Ruch-Schranner-Seligman theorem, is generalized to be applicable to a large representative class of classical statistical systems.
Abbaspour, Seddigheh; Farmanbar, Rabiollah; Njafi, Fateme; Ghiasvand, Arezoo Mohamadkhani; Dehghankar, Leila
2017-01-01
Background Regular physical activity is recognized as a means of health promotion, and identifying the psycho-social variables that influence physical activity has proven to be essential. Objective To identify the relationship between decisional balance and self-efficacy in physical activity, using the transtheoretical model, among members of retirement centers in Rasht, Guilan. Methods A descriptive cross-sectional study was conducted in 2013, using convenience sampling, on 262 elderly members of retirement centers in Rasht. Data were collected using the Stages of Change, Decisional Balance, and Self-efficacy scales and the Physical Activity Scale for the Elderly (PASE). Data were analyzed using SPSS-16 software with descriptive and analytic statistics (Pearson and Spearman correlation, ANOVA, Tukey HSD, and linear and ordinal regression). Results The majority of participants were in the maintenance stage. The mean (standard deviation) physical activity score for the elderly was 119.35 (51.50). Stages of change and physical activity were significantly associated with decisional balance and self-efficacy (p<0.0001); the cons, however, showed a significant inverse association. According to the linear and ordinal regressions, the only predictor of physical activity behavior was self-efficacy. Conclusion Increasing the pros of, and self-efficacy for, physical activity can inform the design of appropriate intervention programs. PMID:28713520
Physics Teachers and Students: A Statistical and Historical Analysis of Women
NASA Astrophysics Data System (ADS)
Gregory, Amanda
2009-10-01
Historically, women have been denied an education comparable to that available to men. Since women have been allowed into institutions of higher learning, they have been studying and earning physics degrees. The aim of this poster is to discuss the statistical relationship between the number of women enrolled in university physics programs and the number of female physics faculty members. Special care has been given to examining the statistical data in the context of the social climate at the time that these women were teaching or pursuing their education.
Entrainment in the master equation
Margaliot, Michael; Grüne, Lars; Kriecherbauer, Thomas
2018-04-01
The master equation plays an important role in many scientific fields including physics, chemistry, systems biology, physical finance and sociodynamics. We consider the master equation with periodic transition rates. This may represent an external periodic excitation like the 24 h solar day in biological systems or periodic traffic lights in a model of vehicular traffic. Using tools from systems and control theory, we prove that under mild technical conditions every solution of the master equation converges to a periodic solution with the same period as the rates. In other words, the master equation entrains (or phase locks) to periodic excitations. We describe two applications of our theoretical results to important models from statistical mechanics and epidemiology. PMID:29765669
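The entrainment result can be checked numerically in the simplest setting. The sketch below uses a two-state master equation with invented 1-periodic transition rates (not an example from the paper) and iterates the period map, showing that two different initial distributions converge to the same periodic solution.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-state master equation with T-periodic transition rates (assumed forms):
#   dp0/dt = -k01(t) p0 + k10(t) p1,   dp1/dt = k01(t) p0 - k10(t) p1
T = 1.0
k01 = lambda t: 2.0 + np.sin(2 * np.pi * t / T)
k10 = lambda t: 1.5 + np.cos(2 * np.pi * t / T)

def rhs(t, p):
    a, b = k01(t), k10(t)
    return [-a * p[0] + b * p[1], a * p[0] - b * p[1]]

def flow_one_period(p0, t0):
    """Propagate a distribution across one full period of the rates."""
    sol = solve_ivp(rhs, (t0, t0 + T), p0, rtol=1e-9, atol=1e-12)
    return sol.y[:, -1]

# Two different initial distributions converge to the same T-periodic orbit
p, q = np.array([1.0, 0.0]), np.array([0.0, 1.0])
for n in range(30):                       # iterate the period map
    p, q = flow_one_period(p, n * T), flow_one_period(q, n * T)
print(np.abs(p - q).max())                # difference shrinks toward 0
```

The per-period contraction rate is governed by the total exit rates, so after a few dozen periods the two trajectories are numerically indistinguishable, which is the entrainment (phase-locking) behavior proved in the paper.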
NASA Astrophysics Data System (ADS)
De Lucia, Marco; Kempka, Thomas; Jatnieks, Janis; Kühn, Michael
2017-04-01
Reactive transport simulations, where geochemical reactions are coupled with hydrodynamic transport of reactants, are extremely time consuming and suffer from significant numerical issues. Given the high uncertainties inherently associated with the geochemical models, which also constitute the major computational bottleneck, such requirements may seem inappropriate and probably constitute the main limitation to their wide application. A promising way to ease and speed up such coupled simulations is to employ statistical surrogates instead of "full-physics" geochemical models [1]. Data-driven surrogates are reduced models trained on a set of pre-calculated "full-physics" simulations, capturing their principal features while being extremely fast to compute. Model reduction of course comes at the price of a loss of precision; however, this appears justified in the presence of large uncertainties regarding the parametrization of geochemical processes. This contribution illustrates the integration of surrogates into the flexible simulation framework currently being developed by the authors' research group [2]. The high-level language of choice for obtaining and dealing with surrogate models is R, which profits from state-of-the-art methods for the statistical analysis of large simulation ensembles. A stand-alone advective mass transport module was furthermore developed in order to add such capability to any multiphase finite-volume hydrodynamic simulator within the simulation framework. We present 2D and 3D case studies benchmarking the performance of surrogates and "full-physics" chemistry in scenarios pertaining to the assessment of geological subsurface utilization. [1] Jatnieks, J., De Lucia, M., Dransch, D., Sips, M.: "Data-driven surrogate model approach for improving the performance of reactive transport simulations.", Energy Procedia 97, 2016, p. 447-453.
[2] Kempka, T., Nakaten, B., De Lucia, M., Nakaten, N., Otto, C., Pohl, M., Chabab [Tillner], E., Kühn, M.: "Flexible Simulation Framework to Couple Processes in Complex 3D Models for Subsurface Utilization Assessment.", Energy Procedia, 97, 2016 p. 494-501.
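The surrogate idea in [1] can be sketched generically: train a cheap regressor on an ensemble of pre-calculated "full-physics" runs and query it in place of the expensive model. The stand-in analytic function, sample sizes, and choice of regressor below are illustrative assumptions (the authors' actual workflow is in R and targets geochemical models).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Stand-in for a costly "full-physics" model (assumption: the real model
# maps a few input parameters to, e.g., an equilibrium concentration).
def full_physics(x):
    return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1]) + 0.5 * x[:, 1]

rng = np.random.default_rng(1)
X_train = rng.uniform(0, 1, size=(2000, 2))   # pre-calculated ensemble
y_train = full_physics(X_train)

surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X_train, y_train)               # one-off training cost

# The surrogate replaces full_physics inside the coupled transport loop
X_new = rng.uniform(0, 1, size=(500, 2))
err = np.abs(surrogate.predict(X_new) - full_physics(X_new))
print(err.mean())                             # small loss of precision
```

The trade-off shown (a small prediction error in exchange for near-instant evaluation) is precisely the one the abstract argues is acceptable given the parametric uncertainty of the geochemistry itself.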
Sniehotta, Falko F; Gellert, Paul; Witham, Miles D; Donnan, Peter T; Crombie, Iain K; McMurdo, Marion E T
2013-09-08
Physical activity (PA) in older adults is influenced by a range of environmental, demographic, health-related, social, and psychological variables. Social cognitive psychological models assume that all influences on behaviour operate indirectly through the model's constructs, i.e., via intention and self-efficacy. We evaluated direct, indirect, and moderating relationships of a broad range of external variables with physical activity levels alongside intention and self-efficacy. We performed a cross-sectional survey of a representative and stratified (65-80 and 80+ years; deprived and affluent) sample of 584 community-dwelling people resident in Scotland. Objectively measured physical activity and questionnaire data were collected. Self-efficacy showed unique relationships with physical activity when controlling for demographic, mental health, social, environmental, and weather variables separately, but the relationship was not significant when controlling for physical health. Overall, the results indicate support for a mediation hypothesis: intention and self-efficacy statistically mediate the relationship of most domain variables with physical activity. Moderation analyses show that the relationship between social cognitions and physical activity was stronger for individuals with better physical health and lower levels of socio-economic deprivation. Social cognitive variables reflect a range of known environmental, demographic, health-related, and social correlates of physical activity; they mediate the relationships of those correlates with physical activity and account for additional variance in physical activity when external correlates are controlled for, except in the physical health domain. The finding that the social cognition-physical activity relationship is stronger for participants with better health and higher levels of affluence raises issues for the applicability of social cognitive models to the most disadvantaged older people.
Bang, Kyung Sook; Lee, In Sook; Kim, Sung Jae; Song, Min Kyung; Park, Se Eun
2016-02-01
This study was performed to determine the physical and psychological effects of an urban forest-walking program for office workers. For many workers, a sedentary lifestyle with low levels of physical activity can lead to various health problems, despite an increased interest in health promotion. Fifty-four office workers participated in this study. They were randomly assigned to an experimental group and a control group, and the experimental group performed 5 weeks of walking exercise based on the Information-Motivation-Behavioral skills model. The data were collected from October to November 2014, and SPSS 21.0 was used for the statistical analysis. The results showed that the urban forest-walking program had positive effects on physical activity level (U=65.00, p<.001), health promotion behavior (t=-2.20, p=.033), and quality of life (t=-2.42, p=.020). However, there were no statistical differences in depression, waist size, body mass index, blood pressure, or bone density between the groups. The current findings suggest the forest-walking program may have positive effects on improving physical activity, health promotion behavior, and quality of life, and can be used as an effective and efficient strategy for the physical and psychological health promotion of office workers.
Is the use of physical discipline associated with aggressive behaviors in young children?
Thompson, Richard; Kaczor, Kim; Lorenz, Douglas J.; Bennett, Berkeley L.; Meyers, Gabriel; Pierce, Mary Clyde
2016-01-01
Objectives To determine the association between the use of physical discipline and parental report of physically aggressive child behaviors in a cohort of young children without indicators of current or past physical abuse. Methods The data for this study were analyzed from an initial cohort of patients enrolled in a prospective, observational, multi-center pediatric emergency department (PED)-based study investigating bruising and familial psychosocial characteristics of children less than four years of age. Over a seven-month period, structured parental interviews were conducted regarding disciplinary practices, reported child behaviors, and familial psychosocial risk factors. Children with suspected physical abuse were excluded from this study. Trained study staff collected data using standardized questions. Consistent with grounded theory, qualitative coding was performed by two independent individuals using domains rooted in the data. Inter-rater reliability of the coding process was evaluated using the kappa statistic. Descriptive statistics were calculated and multiple logistic regression modeling was performed. Results 372 parental interviews were conducted. Parents who reported using physical discipline were 2.8 [95% CI 1.7, 4.5] times more likely to report aggressive child behaviors of hitting/kicking and throwing. Physical discipline was used on 38% of children overall, and was 2.4 [95% CI 1.4, 4.1] times more likely to be used in families with any of the psychosocial risk factors examined. Conclusions Our findings indicated that the use of physical discipline was associated with higher rates of reported physically aggressive behaviors in early childhood, as well as with the presence of familial psychosocial risk factors. PMID:26924534
Baka, Łukasz
2015-01-01
The aim of the study was to investigate the direct and indirect (mediated by job burnout) effects of job demands on mental and physical health problems. The Job Demands-Resources model was the theoretical framework of the study. Three job demands were taken into account: interpersonal conflicts at work, organizational constraints, and workload. Indicators of mental and physical health problems were depression and physical symptoms, respectively. Three hundred and sixteen Polish teachers from 8 schools participated in the study. The hypotheses were tested with tools measuring job demands (Interpersonal Conflicts at Work, Organizational Constraints, Quantitative Workload), job burnout (the Oldenburg Burnout Inventory), depression (the Beck Hopelessness Scale), and physical symptoms (the Physical Symptoms Inventory). Regression analysis with bootstrapping, using the PROCESS macro of Hayes, was applied. The results support the hypotheses partially: the indirect effects, and to some extent the direct effects, of job demands turned out to be statistically significant. The negative impact of the 3 job demands on mental (hypothesis 1, H1) and physical (hypothesis 2, H2) health was mediated by increasing job burnout. Only organizational constraints were directly associated with mental (and not physical) health. The results partially support the Job Demands-Resources model and provide further insight into the processes leading to the low well-being of teachers in the workplace. This work is available in the Open Access model and licensed under a CC BY-NC 3.0 PL license.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bessac, Julie; Constantinescu, Emil; Anitescu, Mihai
2018-03-01
We propose a statistical space-time model for predicting atmospheric wind speed based on deterministic numerical weather predictions and historical measurements. We consider a Gaussian multivariate space-time framework that combines multiple sources of past physical model outputs and measurements in order to produce a probabilistic wind speed forecast within the prediction window. We illustrate this strategy on wind speed forecasts during several months in 2012 for a region near the Great Lakes in the United States. The results show that the prediction is improved in the mean-squared sense relative to the numerical forecasts as well as in probabilistic scores. Moreover, the samples are shown to produce realistic wind scenarios based on sample spectra and space-time correlation structure.
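A minimal, scalar reduction of the idea of combining physical model output with measurements is precision-weighted Gaussian fusion. The paper's multivariate space-time framework is far richer; the wind speeds and variances below are invented for illustration.

```python
import numpy as np

def combine_gaussian(mu_nwp, var_nwp, mu_obs, var_obs):
    """Precision-weighted fusion of two independent Gaussian estimates
    of wind speed (a toy reduction of the space-time framework)."""
    w = (1 / var_nwp) / (1 / var_nwp + 1 / var_obs)
    mean = w * mu_nwp + (1 - w) * mu_obs          # posterior mean
    var = 1 / (1 / var_nwp + 1 / var_obs)         # posterior variance
    return mean, var

# NWP forecast says 8 m/s (var 4); measurement-based estimate 6 m/s (var 1)
m, v = combine_gaussian(8.0, 4.0, 6.0, 1.0)
print(m, v)   # → 6.4 0.8
```

Note that the fused variance (0.8) is smaller than either input variance, which is the basic reason combining sources improves both the mean-squared error and the probabilistic scores reported in the abstract.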
Ordered phase and non-equilibrium fluctuation in stock market
NASA Astrophysics Data System (ADS)
Maskawa, Jun-ichi
2002-08-01
We analyze the statistics of daily price changes of the stock market in the framework of a statistical physics model for the collective fluctuation of a stock portfolio. In this model the time series of price changes are coded into sequences of up and down spins, and the Hamiltonian of the system is expressed by spin-spin interactions, as in spin-glass models of disordered magnetic systems. Through the analysis of the Dow Jones industrial portfolio of 30 stock issues with this model, we find a non-equilibrium fluctuation mode at a point slightly below the boundary between the ordered and disordered phases. The remaining 29 modes are still in the disordered phase and well described by a Gibbs distribution. The variance of the fluctuation is described by the theoretical curve and is peculiarly large in the non-equilibrium mode compared with the other modes remaining in the ordinary phase.
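The coding step described above, mapping a price series to up/down spins, can be sketched directly. Estimating the full spin-glass Hamiltonian is beyond this illustration; the price series and the simple correlation-based coupling estimate are invented stand-ins.

```python
import numpy as np

def to_spins(prices):
    """Code a daily price series into up/down spins (+1/-1),
    as in the spin model described above; unchanged days are dropped."""
    moves = np.sign(np.diff(prices))
    return moves[moves != 0].astype(int)

def coupling(si, sj):
    """Empirical spin-spin correlation between two issues, a crude
    stand-in for estimating the Hamiltonian's interaction terms."""
    n = min(len(si), len(sj))
    return float(np.mean(si[:n] * sj[:n]))

a = to_spins([10, 11, 12, 11, 12, 12, 13])
print(a)            # spins for up, up, down, up, (flat dropped), up
```

With the 30 Dow issues coded this way, the pairwise couplings define the disordered-magnet analogy whose collective modes the paper analyzes.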
Cox process representation and inference for stochastic reaction-diffusion processes
NASA Astrophysics Data System (ADS)
Schnoerr, David; Grima, Ramon; Sanguinetti, Guido
2016-05-01
Complex behaviour in many systems arises from the stochastic interactions of spatially distributed particles or agents. Stochastic reaction-diffusion processes are widely used to model such behaviour in disciplines ranging from biology to the social sciences, yet they are notoriously difficult to simulate and calibrate to observational data. Here we use ideas from statistical physics and machine learning to provide a solution to the inverse problem of learning a stochastic reaction-diffusion process from data. Our solution relies on a non-trivial connection between stochastic reaction-diffusion processes and spatio-temporal Cox processes, a well-studied class of models from computational statistics. This connection leads to an efficient and flexible algorithm for parameter inference and model selection. Our approach shows excellent accuracy on numeric and real data examples from systems biology and epidemiology. Our work provides both insights into spatio-temporal stochastic systems, and a practical solution to a long-standing problem in computational modelling.
NASA Technical Reports Server (NTRS)
Korram, S.
1977-01-01
The design of general remote sensing-aided methodologies was studied to provide estimates of several important inputs to water yield forecast models. These input parameters are snow area extent, snow water content, and evapotranspiration. The study area is the Feather River Watershed (780,000 hectares) in Northern California. The general approach involved a stepwise sequence of identification of the required information, sample design, measurement/estimation, and evaluation of results. All the relevant and available information types needed in the estimation process were defined. These include Landsat, meteorological satellite, and aircraft imagery; topographic and geologic data; ground truth data; and climatic data from ground stations. A cost-effective multistage sampling approach was employed in the quantification of all the required parameters. Physical and statistical models for both snow quantification and evapotranspiration estimation were developed. These models use information obtained from aerial and ground data through an appropriate statistical sampling design.
Scaling in geology: landforms and earthquakes.
Turcotte, D L
1995-01-01
Landforms and earthquakes appear to be extremely complex; yet, there is order in the complexity. Both satisfy fractal statistics in a variety of ways. A basic question is whether the fractal behavior is due to scale invariance or is the signature of a broadly applicable class of physical processes. Both landscape evolution and regional seismicity appear to be examples of self-organized critical phenomena. A variety of statistical models have been proposed to model landforms, including diffusion-limited aggregation, self-avoiding percolation, and cellular automata. Many authors have studied the behavior of multiple slider-block models, both in terms of the rupture of a fault to generate an earthquake and in terms of the interactions between faults associated with regional seismicity. The slider-block models exhibit a remarkably rich spectrum of behavior; two slider blocks can exhibit low-order chaotic behavior. Large numbers of slider blocks clearly exhibit self-organized critical behavior. PMID:11607562
Arce, Pedro; Lagares, Juan Ignacio
2018-01-25
We have verified the GAMOS/Geant4 simulation model of a 6 MV VARIAN Clinac 2100 C/D linear accelerator by the procedure of adjusting the initial beam parameters to fit the percentage depth dose and cross-profile dose experimental data at different depths in a water phantom. Thanks to the use of a wide range of field sizes, from 2 × 2 cm² to 40 × 40 cm², a small phantom voxel size, and high statistics, fine precision in the determination of the beam parameters has been achieved. This precision has allowed us to make a thorough study of the different physics models and parameters that Geant4 offers. The three Geant4 electromagnetic physics sets of models, i.e. Standard, Livermore and Penelope, have been compared to the experiment, testing the four different models of angular bremsstrahlung distributions as well as the three available multiple-scattering models, and optimizing the most relevant Geant4 electromagnetic physics parameters. Before the fitting, a comprehensive CPU time optimization has been done, using several of the Geant4 efficiency improvement techniques plus a few more developed in GAMOS.
The Role of Probability in Developing Learners' Models of Simulation Approaches to Inference
ERIC Educational Resources Information Center
Lee, Hollylynne S.; Doerr, Helen M.; Tran, Dung; Lovett, Jennifer N.
2016-01-01
Repeated sampling approaches to inference that rely on simulations have recently gained prominence in statistics education, and probabilistic concepts are at the core of this approach. In this approach, learners need to develop a mapping among the problem situation, a physical enactment, computer representations, and the underlying randomization…
Estimation of the amount of asbestos-cement roofing in Poland.
Wilk, Ewa; Krówczyńska, Małgorzata; Pabjanek, Piotr; Mędrzycki, Piotr
2017-05-01
The unique set of physical and chemical properties of asbestos has led to many industrial applications worldwide; one of them was roof covering. Asbestos is harmful to human health, and therefore its use was legally forbidden. Since in Poland there is no adequate data on the amount of asbestos-cement roofing, the objective of this study was to estimate its quantity on the basis of a physical inventory carried out with the use of aerial imagery, and the application of selected statistical features. Data pre-processing and analysis were executed in the R statistical environment v. 3.1.0. The best random forest models were computed; the model explaining 72.9% of the variance was subsequently used to prepare a prediction map of the amount of asbestos-cement roofing in Poland. Variables defining the number of farms, the number and age of buildings, and regional differences were crucial for the analysis. The total amount of asbestos roofing in Poland was estimated at 738,068,000 m² (8.2 million tonnes). This estimate is crucial for the landfill development programme, the distribution of financial resources, and the application of monitoring policies.
Statistical Physics of Population Genetics in the Low Population Size Limit
NASA Astrophysics Data System (ADS)
Atwal, Gurinder
The understanding of evolutionary processes lends itself naturally to theory and computation, and the entire field of population genetics has benefited greatly from the influx of methods from applied mathematics for decades. However, in spite of all this effort, there are a number of key dynamical models of evolution that have resisted analytical treatment. In addition, modern DNA sequencing technologies have magnified the amount of genetic data available, revealing an excess of rare genetic variants in human genomes, challenging the predictions of conventional theory. Here I will show that methods from statistical physics can be used to model the distribution of genetic variants, incorporating selection and spatial degrees of freedom. In particular, a functional path-integral formulation of the Wright-Fisher process maps exactly to the dynamics of a particle in an effective potential, beyond the mean field approximation. In the small population size limit, the dynamics are dominated by instanton-like solutions which determine the probability of fixation in short timescales. These results are directly relevant for understanding the unusual genetic variant distribution at moving frontiers of populations.
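As a baseline for the fixation probabilities discussed above, a direct Monte Carlo simulation of the Wright-Fisher process (a standard textbook construction, not the path-integral formulation itself; population size and frequencies are invented) recovers the neutral-theory result that the fixation probability equals the initial allele frequency.

```python
import numpy as np

def wright_fisher_fixation(N, p0, s=0.0, trials=20000, rng=None):
    """Monte Carlo fixation probability of an allele at initial frequency
    p0 in a Wright-Fisher population of size N with selection coefficient
    s (neutral when s = 0)."""
    rng = rng or np.random.default_rng(0)
    fixed = 0
    for _ in range(trials):
        p = p0
        while 0.0 < p < 1.0:
            pw = p * (1 + s) / (1 + s * p)   # selection-weighted frequency
            p = rng.binomial(N, pw) / N      # resample the next generation
        fixed += (p == 1.0)                  # loop ends at loss or fixation
    return fixed / trials

# Neutral theory: fixation probability equals the initial frequency
p_fix = wright_fisher_fixation(N=50, p0=0.2)
print(p_fix)   # ≈ 0.2
```

Deviations from this baseline under selection or spatial structure are exactly the regimes where the instanton-like solutions described in the abstract become the dominant contribution.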
Preliminary Statistics from the NASA Alphasat Beacon Receiver in Milan, Italy
NASA Technical Reports Server (NTRS)
Zemba, Michael J.; Nessel, James A.; Morse, Jacquelynne R.
2015-01-01
NASA Glenn Research Center (GRC) and the Politecnico di Milano (POLIMI) have initiated a joint propagation campaign within the framework of the Alphasat propagation experiment to characterize rain attenuation, scintillation, and gaseous absorption effects of the atmosphere in the 40 GHz band. NASA GRC has developed and installed a K/Q-band (20/40 GHz) beacon receiver at the POLIMI campus in Milan, Italy, which receives the 20/40 GHz signals broadcast from the Alphasat Aldo Paraboni TDP5 beacon payload. The primary goal of these measurements is to develop a physical model to improve predictions of communications systems performance within the Q-band. Herein, we provide an overview of the design and data calibration procedure, and present 6 months of preliminary statistics of the NASA propagation terminal, which has been installed and operating in Milan since May 2014. The Q-band receiver has demonstrated a dynamic range of 40 dB at an 8-Hz sampling rate. A weather station with an optical disdrometer is also installed to characterize rain drop size distribution for correlation with physics-based models.
Preliminary Statistics from the NASA Alphasat Beacon Receiver in Milan, Italy
NASA Technical Reports Server (NTRS)
Nessel, James; Zemba, Michael; Morse, Jacquelynne; Luini, Lorenzo; Riva, Carlo
2015-01-01
NASA Glenn Research Center (GRC) and the Politecnico di Milano (POLIMI) have initiated a joint propagation campaign within the framework of the Alphasat propagation experiment to characterize rain attenuation, scintillation, and gaseous absorption effects of the atmosphere in the 40 gigahertz band. NASA GRC has developed and installed a K/Q-band (20/40 gigahertz) beacon receiver at the POLIMI campus in Milan, Italy, which receives the 20/40 gigahertz signals broadcast from the Alphasat Aldo Paraboni TDP no. 5 beacon payload. The primary goal of these measurements is to develop a physical model to improve predictions of communications systems performance within the Q-band. Herein, we provide an overview of the design and data calibration procedure, and present 6 months of preliminary statistics of the NASA propagation terminal, which has been installed and operating in Milan since May 2014. The Q-band receiver has demonstrated a dynamic range of 40 decibels at an 8-hertz sampling rate. A weather station with an optical disdrometer is also installed to characterize rain drop size distribution for correlation with physics-based models.
Application of physical scaling towards downscaling climate model precipitation data
NASA Astrophysics Data System (ADS)
Gaur, Abhishek; Simonovic, Slobodan P.
2018-04-01
The physical scaling (SP) method downscales climate model data to local or regional scales, taking into consideration the physical characteristics of the area under analysis. In this study, multiple SP-based models are tested for their effectiveness in downscaling North American Regional Reanalysis (NARR) daily precipitation data. Model performance is compared with two state-of-the-art downscaling methods: the statistical downscaling model (SDSM) and generalized linear modeling (GLM). The downscaled precipitation is evaluated with reference to recorded precipitation at 57 gauging stations located within the study region. The spatial and temporal robustness of the downscaling methods is evaluated using seven precipitation-based indices. Results indicate that the SP-based models perform best in downscaling precipitation, followed by GLM and then the SDSM model. The best-performing models are thereafter used to downscale future precipitation projections made by three general circulation models (GCMs) under two emission scenarios, representative concentration pathway (RCP) 2.6 and RCP 8.5, over the twenty-first century. The downscaled future precipitation projections indicate an increase in mean and maximum precipitation intensity as well as a decrease in the total number of dry days. Further, an increase in the frequency of short (1-day), moderately long (2-4 day), and long (more than 5-day) precipitation events is projected.
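One common ingredient of statistical downscaling, empirical quantile mapping, can be sketched as follows. This is a generic bias-correction step, not the SP method evaluated in the study, and the gamma-distributed "model" and "station" precipitation records are invented.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, future):
    """Empirical quantile mapping: transform model values so the
    historical model distribution matches the observed quantiles
    (a common statistical downscaling step; illustrative only)."""
    q = np.linspace(0, 1, 101)
    mq = np.quantile(model_hist, q)   # model quantiles
    oq = np.quantile(obs_hist, q)     # observed quantiles
    # Map each value through its model quantile onto the observed scale
    return np.interp(future, mq, oq)

rng = np.random.default_rng(2)
model_hist = rng.gamma(2.0, 3.0, 5000)   # coarse-model daily precipitation
obs_hist = rng.gamma(2.0, 4.0, 5000)     # wetter station record
corrected = quantile_map(model_hist, obs_hist, model_hist)
print(obs_hist.mean(), corrected.mean())  # means now comparable
```

Indices like those used in the study (wet-day frequency, intensity maxima) are then computed from the corrected series rather than from the raw model output.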
The GEOS Ozone Data Assimilation System: Specification of Error Statistics
NASA Technical Reports Server (NTRS)
Stajner, Ivanka; Riishojgaard, Lars Peter; Rood, Richard B.
2000-01-01
A global three-dimensional ozone data assimilation system has been developed at the Data Assimilation Office of the NASA/Goddard Space Flight Center. The Total Ozone Mapping Spectrometer (TOMS) total ozone and the Solar Backscatter Ultraviolet (SBUV or SBUV/2) partial ozone profile observations are assimilated. The assimilation into an off-line ozone transport model is done using the global Physical-space Statistical Analysis Scheme (PSAS). This system became operational in December 1999. A detailed description of the statistical analysis scheme is given, in particular of the forecast and observation error covariance models. A new global anisotropic horizontal forecast error correlation model accounts for a varying distribution of observations with latitude. Correlations are largest in the zonal direction in the tropics, where data are sparse. The forecast error variance model is proportional to the ozone field. The forecast error covariance parameters were determined by maximum likelihood estimation. The error covariance models are validated using chi-squared statistics. The analyzed ozone fields in the winter of 1992 are validated against independent observations from ozonesondes and the Halogen Occultation Experiment (HALOE). There is better than 10% agreement between mean HALOE and analysis fields between 70 and 0.2 hPa. The global root-mean-square (RMS) difference between TOMS observed and forecast values is less than 4%. The global RMS difference between SBUV observed and analyzed ozone between 50 and 3 hPa is less than 15%.
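The abstract does not give the anisotropic correlation model in closed form; the sketch below only illustrates the general idea of zonally elongated correlations with a Gaussian shape. Both the functional form and the length scales are assumptions for illustration, not the PSAS formulation.

```python
import math

def horiz_correlation(dx_km, dy_km, L_zonal=800.0, L_merid=400.0):
    """Gaussian anisotropic correlation: a longer zonal length scale makes
    correlations decay more slowly east-west than north-south, as the
    abstract describes for the data-sparse tropics."""
    return math.exp(-(dx_km / L_zonal) ** 2 - (dy_km / L_merid) ** 2)

# The same 400 km separation correlates more strongly zonally than meridionally
print(horiz_correlation(400.0, 0.0) > horiz_correlation(0.0, 400.0))
```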
NASA Technical Reports Server (NTRS)
Gardner, Adrian
2010-01-01
National Aeronautics and Space Administration (NASA) weather and atmospheric environmental organizations are insatiable consumers of geophysical, hydrometeorological and solar weather statistics. The expanding array of internetworked sensors producing targeted physical measurements has generated an almost factorial explosion of near real-time inputs to topical statistical datasets. Normalizing and value-based parsing of such statistical datasets in support of time-constrained weather and environmental alerts and warnings is essential, even with dedicated high-performance computational capabilities. What are the optimal indicators for advanced decision making? How do we recognize the line between sufficient statistical sampling and excessive, mission-destructive sampling? How do we assure that the normalization and parsing process, when interpolated through numerical models, yields accurate and actionable alerts and warnings? This presentation will address the integrated means and methods to achieve desired outputs for NASA and consumers of its data.
Probabilistic models for reactive behaviour in heterogeneous condensed phase media
NASA Astrophysics Data System (ADS)
Baer, M. R.; Gartling, D. K.; DesJardin, P. E.
2012-02-01
This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.
Nurmi, Johanna; Hagger, Martin S; Haukkala, Ari; Araújo-Soares, Vera; Hankonen, Nelli
2016-04-01
This study tested the predictive validity of a multitheory process model in which the effect of autonomous motivation from self-determination theory on physical activity participation is mediated by the adoption of self-regulatory techniques based on control theory. Finnish adolescents (N = 411, aged 17-19) completed a prospective survey including validated measures of the predictors and physical activity, at baseline and after one month (N = 177). A subsample used an accelerometer to objectively measure physical activity and further validate the physical activity self-report assessment tool (n = 44). Autonomous motivation statistically significantly predicted action planning, coping planning, and self-monitoring. Coping planning and self-monitoring mediated the effect of autonomous motivation on physical activity, although self-monitoring was the most prominent. Controlled motivation had no effect on self-regulation techniques or physical activity. Developing interventions that support autonomous motivation for physical activity may foster increased engagement in self-regulation techniques and positively affect physical activity behavior.
NASA Astrophysics Data System (ADS)
Ferrara, Emilio
2015-03-01
Containing the spread of crime in modern society is an ongoing battle: our understanding of the dynamics underlying criminal events, and of the motives of the individuals involved, is crucial to designing cost-effective prevention policies and intervention strategies. During recent years we have witnessed various research fields joining forces, sharing models and methods, toward modeling and quantitatively characterizing crime and criminal behavior.
NASA Astrophysics Data System (ADS)
Boning, Duane S.; Chung, James E.
1998-11-01
Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including the design of test structures and test masks to gather electrical or optical data; techniques for statistical decomposition and analysis of the data; and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
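The statistical decomposition phase described above separates a measured quantity such as ILD thickness into a wafer-level mean, a within-die systematic (pattern-dependent) component, and a random residual. A minimal nested-mean sketch of that idea follows; the array shapes, thickness values, and site pattern are synthetic illustrations, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_dies, n_sites = 6, 4  # 6 dies per wafer, 4 measurement sites per die

# Synthetic thickness data: wafer mean + per-site systematic pattern + noise
wafer_mean = 1000.0
site_pattern = np.array([30.0, -10.0, -5.0, -15.0])  # pattern-dependent offsets
thickness = wafer_mean + site_pattern + rng.normal(0.0, 2.0, size=(n_dies, n_sites))

# Decomposition: grand mean, within-die systematic component, random residual
grand_mean = thickness.mean()
systematic = thickness.mean(axis=0) - grand_mean   # average pattern over dies
residual = thickness - grand_mean - systematic

print(round(grand_mean, 1), np.round(systematic, 1), round(residual.std(), 2))
```

Averaging over dies cancels the random component, so `systematic` estimates the pattern-dependent offset at each site, which is exactly the term that dummy-fill rules aim to flatten.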
NASA Technical Reports Server (NTRS)
Grotjahn, Richard; Black, Robert; Leung, Ruby; Wehner, Michael F.; Barlow, Mathew; Bosilovich, Michael G.; Gershunov, Alexander; Gutowski, William J., Jr.; Gyakum, John R.; Katz, Richard W.;
2015-01-01
The objective of this paper is to review statistical methods, dynamics, modeling efforts, and trends related to temperature extremes, with a focus upon extreme events of short duration that affect parts of North America. These events are associated with large-scale meteorological patterns (LSMPs). The statistics, dynamics, and modeling sections of this paper are written to be autonomous and so can be read separately. Methods to define extreme event statistics and to identify and connect LSMPs to extreme temperature events are presented. Recent advances in statistical techniques connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. Various LSMPs, ranging from synoptic to planetary-scale structures, are associated with extreme temperature events. Current knowledge about the synoptics and the dynamical mechanisms leading to the associated LSMPs is incomplete. Systematic studies of the physics of LSMP life cycles, comprehensive model assessments of LSMP-extreme temperature event linkages, and LSMP properties are needed. Generally, climate models capture observed properties of heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency and underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Modeling studies have identified the impact of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs to more specifically understand the role of LSMPs in past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated. The paper concludes with unresolved issues and research questions.
Longinetti, E; Mariosa, D; Larsson, H; Almqvist, C; Lichtenstein, P; Ye, W; Fang, F
2017-01-01
There is a clinical impression that patients with amyotrophic lateral sclerosis (ALS) have a higher level of physical fitness and lower body mass index (BMI) than average. However, there is a lack of literature examining the relationship between cognitive fitness and ALS risk. In this study we explored the associations of both physical and cognitive fitness with future risk of ALS. Data on physical fitness, BMI, intelligence quotient (IQ) and stress resilience were collected from 1 838 376 Swedish men aged 17-20 years at conscription during 1968-2010. Their subsequent ALS diagnoses were identified through the Swedish Patient Register. Hazard ratios (HRs) and 95% CIs from flexible parametric models were used to assess age-specific associations of physical fitness, BMI, IQ and stress resilience with ALS. We identified 439 incident ALS cases during follow-up (mean age at diagnosis: 48 years). Individuals with physical fitness above the highest tertile tended to have a higher risk of ALS before the age of 45 years (range of HRs: 1.42-1.75; statistically significant associations at age 41-43 years) compared with others. Individuals with BMI ≥ 25 tended to have a lower risk of ALS at all ages (range of HRs: 0.42-0.80; statistically significant associations at age 42-48 years) compared with those with BMI < 25. Individuals with IQ above the highest tertile had a statistically significantly increased risk of ALS at an age of 56 years and above (range of HRs: 1.33-1.81), whereas individuals with stress resilience above the highest tertile had a lower risk of ALS at an age of 55 years and below (range of HRs: 0.47-0.73). Physical fitness, BMI, IQ and stress resilience in young adulthood might be associated with the development of ALS at an early age. © 2016 EAN.
Frith, Emily
2017-01-01
Background: Recent studies suggest social media may be an attractive strategy to promote mental health and wellness. There remains a need to examine the utility of individually tailored wellness messages posted to social media sites such as Facebook to facilitate positive psychological outcomes. Objective: Our aim was to extend the growing body of evidence supporting the potential for social media to enhance mental health. We evaluated the influence of an 8-week social media intervention on anxiety in college students and examined the impact of dynamic (active) versus static (passive) Facebook content on physical activity behaviors. Methods: Participants in the static group (n=21) accessed a Facebook page featuring 96 statuses. Statuses were intended to engage cognitive processes followed by behavioral processes of change, per the transtheoretical model of behavior change. Content posted on the static Facebook page was identical to that on the dynamic page; however, the static group viewed all 96 statuses on the first day of the study, while the dynamic group received only 1 to 2 of these status updates per day throughout the intervention. Anxiety was measured using the Overall Anxiety Severity and Impairment Scale (OASIS). Time spent engaging in physical activity was assessed using the International Physical Activity Questionnaire (IPAQ). Results: The OASIS change score for the dynamic Facebook group was statistically significant (P=.003), whereas the change score for the static group was not (P=.48). A statistically significant group-by-time interaction was observed (P=.03). The total IPAQ group-by-time interaction was not statistically significant (P=.06). Conclusions: We observed a decrease in anxiety and an increase in total physical activity for the dynamic group only. Dynamic social networking sites, featuring regularly updated content, may be more advantageous than websites that retain static content over time.
Trial Registration ClinicalTrials.gov NCT03363737; https://clinicaltrials.gov/ct2/show/NCT03363737 (Archived by WebCite at http://www.webcitation.org/6vXzNbOWJ) PMID:29222077
Are there laws of genome evolution?
Koonin, Eugene V
2011-08-01
Research in quantitative evolutionary genomics and systems biology led to the discovery of several universal regularities connecting genomic and molecular phenomic variables. These universals include the log-normal distribution of the evolutionary rates of orthologous genes; the power law-like distributions of paralogous family size and node degree in various biological networks; the negative correlation between a gene's sequence evolution rate and expression level; and differential scaling of functional classes of genes with genome size. The universals of genome evolution can be accounted for by simple mathematical models similar to those used in statistical physics, such as the birth-death-innovation model. These models do not explicitly incorporate selection; therefore, the observed universal regularities do not appear to be shaped by selection but rather are emergent properties of gene ensembles. Although a complete physical theory of evolutionary biology is inconceivable, the universals of genome evolution might qualify as "laws of evolutionary genomics" in the same sense "law" is understood in modern physics.
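The birth-death-innovation idea mentioned above can be sketched with a minimal Monte Carlo: gene families gain members (duplication), lose members (deletion), and new single-member families appear (innovation). The rates and run length below are arbitrary illustrative choices, not fitted parameters; the point is only that simple stochastic rules with no selection term generate a heavy-tailed family-size distribution.

```python
import random

random.seed(1)
families = [1] * 50          # start with 50 single-gene families
birth, death, innovation = 0.45, 0.45, 0.10  # event probabilities (sum to 1)

for _ in range(20000):
    r = random.random()
    if r < innovation or not families:
        families.append(1)                   # innovation: new family of size 1
    else:
        i = random.randrange(len(families))  # pick a family uniformly
        if r < innovation + birth:
            families[i] += 1                 # birth: gene duplication
        else:
            families[i] -= 1                 # death: gene loss
            if families[i] == 0:
                families.pop(i)              # family goes extinct

print(len(families), max(families))
```

After many steps the size distribution is strongly right-skewed: most families stay small while a few grow large, qualitatively matching the power-law-like family-size distributions the abstract describes.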
Physical mechanism and numerical simulation of the inception of the lightning upward leader
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li Qingmin; Lu Xinchang; Shi Wei
2012-12-15
The upward leader is a key physical process in the leader progression model of lightning shielding. The inception mechanism and criterion of the upward leader need further understanding and clarification. Based on leader discharge theory, this paper proposes the critical electric field intensity of the stable upward leader (CEFISUL) and characterizes it by the valve electric field intensity on the conductor surface, E_L, which is the basis of a new inception criterion for the upward leader. Through numerical simulation under various physical conditions, we verified that E_L is mainly related to the conductor radius, and data fitting yields the mathematical expression of E_L. We further establish a computational model for the lightning shielding performance of transmission lines based on the proposed CEFISUL criterion, which reproduces the shielding failure rate of typical UHV transmission lines. The model-based calculation results agree well with statistical data from on-site operations, which shows the effectiveness and validity of the CEFISUL criterion.
NASA Astrophysics Data System (ADS)
Arevalo, L.; Wu, D.; Jacobson, B.
2013-08-01
The main purpose of this paper is to present a physical model of long air gap electrical discharges under positive switching impulses. The development and progression of discharges in long air gaps are attributable to two intertwined physical phenomena, namely, the leader channel and the streamer zone. Experimental studies have been used to develop empirical and physical models capable of representing the streamer zone and the leader channel. The empirical models have led to improvements in the electrical design of high-voltage apparatus and insulation distances, but they cannot take into account factors associated with fundamental physics and/or the behavior of materials. The physical models have been used to describe and understand the discharge phenomena of laboratory and lightning discharges. However, because of the complex simulations necessary to reproduce real cases, they are not in widespread use in the engineering of practical applications. Hence, the aim of the work presented here is to develop a model based on the physics of the discharge, capable of validating and complementing the existing engineering models. The model presented here proposes a new geometrical approximation for the representation of the streamer and the calculation of the accumulated electrical charge. The model considers a variable streamer region that changes with the temporal and spatial variations of the electric field. The leader channel is modeled using non-local thermodynamic equilibrium equations. Furthermore, statistical delays before the inception of the first corona, and random distributions to represent the tortuous nature of the path taken by the leader channel, were included based on the behavior observed in experimental tests, with the intention of ensuring the discharge behaved in a realistic manner. For comparison purposes, two different gap configurations were simulated. A reasonable agreement was found between the physical model and the experimental test results.
NASA Astrophysics Data System (ADS)
Widlowski, J.-L.; Pinty, B.; Lopatka, M.; Atzberger, C.; Buzica, D.; Chelle, M.; Disney, M.; Gastellu-Etchegorry, J.-P.; Gerboles, M.; Gobron, N.; Grau, E.; Huang, H.; Kallel, A.; Kobayashi, H.; Lewis, P. E.; Qin, W.; Schlerf, M.; Stuckens, J.; Xie, D.
2013-07-01
The radiation transfer model intercomparison (RAMI) activity aims at assessing the reliability of physics-based radiative transfer (RT) models under controlled experimental conditions. RAMI focuses on computer simulation models that mimic the interactions of radiation with plant canopies. These models are increasingly used in the development of satellite retrieval algorithms for terrestrial essential climate variables (ECVs). Rather than applying ad hoc performance metrics, RAMI-IV makes use of existing ISO standards to enhance the rigor of its protocols evaluating the quality of RT models. ISO-13528 was developed "to determine the performance of individual laboratories for specific tests or measurements." More specifically, it aims to guarantee that measurement results fall within specified tolerance criteria from a known reference. Of particular interest to RAMI is that ISO-13528 provides guidelines for comparisons where the true value of the target quantity is unknown. In those cases, "truth" must be replaced by a reliable "conventional reference value" to enable absolute performance tests. This contribution will show, for the first time, how the ISO-13528 standard developed by the chemical and physical measurement communities can be applied to proficiency testing of computer simulation models. Step by step, the pre-screening of data, the identification of reference solutions, and the choice of proficiency statistics will be discussed and illustrated with simulation results from the RAMI-IV "abstract canopy" scenarios. Detailed performance statistics of the participating RT models will be provided and the role of the accuracy of the reference solutions as well as the choice of the tolerance criteria will be highlighted.
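Among the proficiency statistics ISO-13528 provides, a common one is the z-score of a result against an assigned (conventional reference) value. The sketch below applies that scoring to hypothetical model results; the reference value, standard deviation for proficiency assessment, and model names are all illustrative, not RAMI-IV numbers.

```python
def z_score(result, assigned_value, sigma_pt):
    """ISO-13528-style z-score. Conventionally |z| <= 2 is acceptable,
    2 < |z| < 3 gives a warning signal, |z| >= 3 an action signal."""
    return (result - assigned_value) / sigma_pt

model_results = {"modelA": 0.412, "modelB": 0.398, "modelC": 0.455}
reference, sigma = 0.400, 0.010   # hypothetical conventional reference value

for name, value in model_results.items():
    z = z_score(value, reference, sigma)
    verdict = "ok" if abs(z) <= 2 else ("warning" if abs(z) < 3 else "action")
    print(name, round(z, 2), verdict)
```

The key point the abstract makes is that when truth is unknown, the `assigned_value` must itself be a carefully established conventional reference, and the tolerance `sigma_pt` must be chosen to reflect its accuracy.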
Evaluating statistical cloud schemes: What can we gain from ground-based remote sensing?
NASA Astrophysics Data System (ADS)
Grützun, V.; Quaas, J.; Morcrette, C. J.; Ament, F.
2013-09-01
Statistical cloud schemes with prognostic probability distribution functions have become more important in atmospheric modeling, especially since they are in principle scale adaptive and capture cloud physics in more detail. While in theory the schemes have great potential, their accuracy is still questionable. High-resolution three-dimensional observational data of water vapor and cloud water, which could be used for testing them, are missing. We explore the potential of ground-based remote sensing such as lidar, microwave, and radar to evaluate prognostic distribution moments using the "perfect model approach." This means that we employ a high-resolution weather model as virtual reality and retrieve full three-dimensional atmospheric quantities and virtual ground-based observations. We then use statistics from the virtual observations to validate the modeled 3-D statistics. Since the data are entirely consistent, any discrepancy that occurs is due to the method. Focusing on the total water mixing ratio, we find that the mean can be evaluated decently, but whether the variance and skewness are reliable depends strongly on the meteorological conditions. Using a simple schematic description of different synoptic conditions, we show how statistics obtained from point or line measurements can be poor at representing the full three-dimensional distribution of water in the atmosphere. We argue that a careful analysis of measurement data and detailed knowledge of the meteorological situation are necessary to judge whether we can use the data for an evaluation of the higher moments of the humidity distribution used by a statistical cloud scheme.
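The evaluation above hinges on the first three moments of the total-water distribution, and on how badly a single sampled column can misrepresent the full 3-D field. A minimal sketch with a synthetic, skewed field (the gamma distribution and grid size are illustrative assumptions, not the study's model output):

```python
import numpy as np

def moments(x):
    """Mean, (population) variance, and skewness of a sample."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    v = x.var()
    s = ((x - m) ** 3).mean() / v ** 1.5 if v > 0 else 0.0
    return m, v, s

rng = np.random.default_rng(42)
field = rng.gamma(shape=2.0, scale=1.0, size=(50, 50, 20))  # skewed 3-D field
column = field[25, 25, :]                                   # one "profile" sample

print(moments(field.ravel()))  # moments of the full field
print(moments(column))         # moments seen by a single vertical column
```

With only 20 points, the column's variance and skewness scatter widely around the full-field values, which is exactly the representativeness problem the abstract raises for point and line measurements.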
Benchmarking test of empirical root water uptake models
NASA Astrophysics Data System (ADS)
dos Santos, Marcos Alex; de Jong van Lier, Quirijn; van Dam, Jos C.; Freire Bezerra, Andre Herman
2017-01-01
Detailed physical models describing root water uptake (RWU) are an important tool for the prediction of RWU and crop transpiration, but the hydraulic parameters involved are hardly ever available, making them less attractive for many studies. Empirical models are more readily used because of their simplicity and the associated lower data requirements. The purpose of this study is to evaluate the capability of some empirical models to mimic the RWU distribution under varying environmental conditions predicted from numerical simulations with a detailed physical model. A review of some empirical models used as sub-models in ecohydrological models is presented, and alternative empirical RWU models are proposed. All these empirical models are analogous to the standard Feddes model, but differ in how RWU is partitioned over depth or how the transpiration reduction function is defined. The parameters of the empirical models are determined by inverse modelling of simulated depth-dependent RWU. The performance of the empirical models and their optimized empirical parameters depends on the scenario. The standard empirical Feddes model only performs well in scenarios with low root length density R, i.e. for scenarios with low RWU compensation. For medium and high R, the Feddes RWU model cannot properly mimic the root uptake dynamics predicted by the physical model. The Jarvis RWU model in combination with the Feddes reduction function (JMf) only provides good predictions for low and medium R scenarios. For high R, it cannot mimic the uptake patterns predicted by the physical model. Incorporating a newly proposed reduction function into the Jarvis model improved RWU predictions. Regarding the ability of the models to predict plant transpiration, all models accounting for compensation show good performance. The Akaike information criterion (AIC) indicates that the Jarvis (2010) model (JMII), with no empirical parameters to be estimated, is the best model. The proposed models are better at predicting RWU patterns similar to those of the physical model. The statistical indices point to them as the best alternatives for mimicking the RWU predictions of the physical model.
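The Feddes reduction function discussed above is, in its standard form, a piecewise-linear water-stress factor alpha(h) of the soil water pressure head h: zero when the soil is too wet (anoxia) or too dry (wilting), one in the optimal range, with linear ramps in between. A minimal sketch follows; the four threshold heads are illustrative values in cm, not the study's calibrated parameters.

```python
def feddes_alpha(h, h1=-10.0, h2=-25.0, h3=-400.0, h4=-8000.0):
    """Piecewise-linear Feddes stress factor (h = pressure head in cm, negative).
    Uptake is zero wetter than h1 (anoxia) and drier than h4 (wilting),
    optimal between h2 and h3, and ramps linearly in between."""
    if h >= h1 or h <= h4:
        return 0.0
    if h1 > h >= h2:
        return (h1 - h) / (h1 - h2)   # wet-side ramp up
    if h2 >= h >= h3:
        return 1.0                    # optimal range
    return (h - h4) / (h3 - h4)       # dry-side ramp down

print(feddes_alpha(-100.0))  # in the optimal range
```

Actual RWU at a depth is then the potential uptake multiplied by alpha(h); the alternative reduction functions the study proposes replace this piecewise-linear shape while keeping the same overall structure.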
Statistical Mechanics of the US Supreme Court
NASA Astrophysics Data System (ADS)
Lee, Edward D.; Broedersz, Chase P.; Bialek, William
2015-07-01
We build simple models for the distribution of voting patterns in a group, using the Supreme Court of the United States as an example. The maximum entropy model consistent with the observed pairwise correlations among justices' votes, an Ising spin glass, agrees quantitatively with the data. While all correlations (perhaps surprisingly) are positive, the effective pairwise interactions in the spin glass model have both signs, recovering the intuition that ideologically opposite justices negatively influence one another. Despite the competing interactions, a strong tendency toward unanimity emerges from the model, organizing the voting patterns in a relatively simple "energy landscape." Besides unanimity, other energy minima in this landscape, or maxima in probability, correspond to prototypical voting states, such as the ideological split or a tightly correlated, conservative core. The model correctly predicts the correlation of justices with the majority and gives us a measure of their influence on the majority decision. These results suggest that simple models, grounded in statistical physics, can capture essential features of collective decision making quantitatively, even in a complex political context.
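The pairwise maximum entropy (Ising) model described above assigns each nine-vote pattern s in {-1,+1}^9 a probability proportional to exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j). The sketch below enumerates all 512 patterns exactly; the fields h and couplings J are arbitrary illustrative numbers (drawn mostly positive to echo the paper's finding of positive correlations), not values fitted to Court data.

```python
import itertools, math, random

random.seed(0)
N = 9  # nine justices
h = [random.uniform(-0.2, 0.2) for _ in range(N)]      # individual "fields"
J = {(i, j): random.uniform(-0.1, 0.4)                 # pairwise couplings
     for i in range(N) for j in range(i + 1, N)}

def energy(s):
    """Ising energy: lower energy means higher probability."""
    e = -sum(h[i] * s[i] for i in range(N))
    e -= sum(J[i, j] * s[i] * s[j] for (i, j) in J)
    return e

states = list(itertools.product([-1, 1], repeat=N))
weights = [math.exp(-energy(s)) for s in states]
Z = sum(weights)                                       # partition function
probs = {s: wt / Z for s, wt in zip(states, weights)}

# Predominantly positive couplings concentrate probability on unanimity,
# mirroring the tendency toward unanimity the model recovers in the data
unanimity = probs[(1,) * N] + probs[(-1,) * N]
print(round(unanimity, 3))
```

In the actual study, h and J are fitted so the model reproduces the observed vote means and pairwise correlations; here they are random stand-ins to make the computation concrete.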
[A strategic family medicine model for controlling borderline and mild arterial hypertension].
Uzcátegui Contreras, D; Granadillo Vera, D; Salinas, P J; Alvarez, N
1999-10-31
To investigate the relationship between the patient and his/her family as a non-pharmacological factor in arterial hypertension. To determine whether a low-sodium, low-calorie, low-fat, low-cholesterol diet decreases blood pressure. To determine whether physical exercise by the patient and his/her family helps to reduce hypertension. To observe whether psychological muscle-relaxation therapy helps to reduce hypertension. To evaluate, in the sample of families, the experience of each member, as well as their suggestions and complaints about the programme. To design a strategic model to control blood pressure by ambulatory means. Controlled intervention study, descriptive, non-randomized, prospective. SETTING: Primary care. Study group of 10 patients, 10 wives, and 12 children, and control group of 10 patients excluding family members. With both groups (study and control) there were meetings every 15 days for 6 months according to an established schedule. At the meetings, talks and pamphlets about hypertension were provided, together with physical exercise and muscle-relaxation therapy. Questionnaires were administered before and after each activity. MEASUREMENTS AND MAIN RESULTS: In both groups (study and control) there was a statistically significant (p < 0.01) reduction in weight. Systolic blood pressure decreased in both positions, seated and standing, in the study group (a statistically significant difference) but not in the control group, although the control group showed a non-significant decrease of 1.5 mmHg in the seated position. Diastolic pressure decreased significantly in the study group in both seated and standing positions, but not in the control group. Total cholesterol showed a statistically significant decrease in the study group but not in the control group. HDL-C showed a statistically significant reduction in the study group; in the control group there was a decrease, but it was not statistically significant. Triglycerides did not decrease significantly in either group.
Non-equilibrium statistical mechanics theory for the large scales of geophysical flows
NASA Astrophysics Data System (ADS)
Eric, S.; Bouchet, F.
2010-12-01
The aim of any theory of turbulence is to understand the statistical properties of the velocity field. As a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject, from the statistical mechanics basis of the theory up to applications to Jupiter's troposphere and ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations will be discussed. We also present recent results for non-equilibrium situations, for which forces and dissipation are in a statistical balance. As an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed. References: F. Bouchet and E. Simonnet, Random Changes of Flow Topology in Two-Dimensional and Geophysical Turbulence, Physical Review Letters 102 (2009), no. 9, 094504; F. Bouchet and J. Sommeria, Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures, Journal of Fluid Mechanics 464 (2002), 165-207; A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO; F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, submitted to Physics Reports. An accompanying figure shows non-equilibrium phase transitions for the 2D Navier-Stokes equations with stochastic forces: time series and probability density functions (PDFs) of the modulus of the largest-scale Fourier component, showing bistability between dipole and unidirectional flows; this bistability is predicted by statistical mechanics.
A statistical model including age to predict passenger postures in the rear seats of automobiles.
Park, Jangwoon; Ebert, Sheila M; Reed, Matthew P; Hallman, Jason J
2016-06-01
Few statistical models of rear seat passenger posture have been published, and none has taken into account the effects of occupant age. This study developed new statistical models for predicting passenger postures in the rear seats of automobiles. Postures of 89 adults with a wide range of age and body size were measured in a laboratory mock-up in seven seat configurations. Posture-prediction models for female and male passengers were separately developed by stepwise regression using age, body dimensions, seat configurations and two-way interactions as potential predictors. Passenger posture was significantly associated with age, and the effects of other two-way interaction variables depended on age. A set of posture-prediction models is presented for women and men, and the prediction results are compared with previously published models. This is the first study of passenger posture to include a large cohort of older passengers and the first to report a significant effect of age for adults. The presented models can be used to position computational and physical human models for vehicle design and assessment. Practitioner Summary: The significant effects of age, body dimensions and seat configuration on rear seat passenger posture were identified. The models can be used to accurately position computational human models or crash test dummies for older passengers in known rear seat configurations.
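The stepwise-regression procedure described above can be sketched as a forward-selection loop that adds, at each step, the candidate predictor giving the largest drop in residual sum of squares. The data below are synthetic (the actual study selected from age, body dimensions, seat configuration, and two-way interactions, with its own entry criteria).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
X = rng.normal(size=(n, 5))                                 # 5 candidate predictors
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(0, 0.5, n)   # only 0 and 3 matter

def rss(cols):
    """Residual sum of squares of an OLS fit (with intercept) on X[:, cols]."""
    A = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r)

selected, remaining = [], list(range(5))
for _ in range(2):                          # forward-select two predictors
    best = min(remaining, key=lambda c: rss(selected + [c]))
    selected.append(best)
    remaining.remove(best)

print(sorted(selected))
```

Production stepwise procedures add entry/removal tests (F-statistics or AIC) rather than a fixed number of steps; the loop above keeps only the core selection mechanic.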
NASA Astrophysics Data System (ADS)
Gottschalk, Ian P.; Hermans, Thomas; Knight, Rosemary; Caers, Jef; Cameron, David A.; Regnery, Julia; McCray, John E.
2017-12-01
Geophysical data have proven to be very useful for lithological characterization. However, quantitatively integrating the information gained from acquiring geophysical data generally requires colocated lithological and geophysical data for constructing a rock-physics relationship. In this contribution, the issue of integrating noncolocated geophysical and lithological data is addressed, and the results are applied to simulate groundwater flow in a heterogeneous aquifer in the Prairie Waters Project North Campus aquifer recharge site, Colorado. Two methods of constructing a rock-physics transform between electrical resistivity tomography (ERT) data and lithology measurements are assessed. In the first approach, maximum likelihood estimation (MLE) is used to fit a bimodal lognormal distribution to horizontal cross-sections of the ERT resistivity histogram. In the second approach, a spatial bootstrap is applied to approximate the rock-physics relationship. The rock-physics transforms provide soft data for multiple-point statistics (MPS) simulations. The subsurface models are used to run groundwater flow and tracer test simulations. Each model's uncalibrated, predicted breakthrough time is evaluated based on its agreement with measured subsurface travel times from infiltration basins to selected groundwater recovery wells. We find that incorporating geophysical information into uncalibrated flow models reduces the discrepancy with observed values, as compared to flow models without geophysical information incorporated. The integration of geophysical data also narrows the variance of predicted tracer breakthrough times substantially. Accuracy is highest and variance is lowest in breakthrough predictions generated by the MLE-based rock-physics transform. Calibrating the ensemble of geophysically constrained models would help produce a suite of realistic flow models for predictive purposes at the site.
We find that the success of breakthrough predictions is highly sensitive to the definition of the rock-physics transform; it is therefore important to model this transfer function accurately.
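The first rock-physics approach above, fitting a bimodal lognormal distribution by maximum likelihood, can be sketched as an EM fit of a two-component Gaussian mixture on log-resistivity. This is a generic illustration, not the authors' code; the synthetic low- and high-resistivity modes are assumptions.

```python
import numpy as np

def fit_bimodal_lognormal(x, iters=200):
    """EM fit of a two-component lognormal mixture: equivalently a
    two-component Gaussian mixture fitted to log(x)."""
    z = np.log(x)
    # crude initialisation: split the log-data at the median
    lo, hi = z[z <= np.median(z)], z[z > np.median(z)]
    mu = np.array([lo.mean(), hi.mean()])
    sig = np.array([lo.std() + 1e-6, hi.std() + 1e-6])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        pdf = (w / (sig * np.sqrt(2 * np.pi)) *
               np.exp(-0.5 * ((z[:, None] - mu) / sig) ** 2))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, standard deviations
        nk = r.sum(axis=0)
        w = nk / len(z)
        mu = (r * z[:, None]).sum(axis=0) / nk
        sig = np.sqrt((r * (z[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sig

rng = np.random.default_rng(1)
# synthetic "resistivity" samples with two lithological modes
x = np.concatenate([rng.lognormal(2.0, 0.3, 1500),
                    rng.lognormal(4.0, 0.4, 500)])
w, mu, sig = fit_bimodal_lognormal(x)
print(w.round(2), mu.round(2), sig.round(2))
```

The fitted component probabilities, evaluated at each resistivity value, would then serve as the soft lithology data fed to the MPS simulation.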
Thermostatted kinetic equations as models for complex systems in physics and life sciences.
Bianca, Carlo
2012-12-01
Statistical mechanics is a powerful method for understanding equilibrium thermodynamics; an equivalent theoretical framework for nonequilibrium systems has remained elusive. The thermodynamic forces driving a system away from equilibrium introduce energy that must be dissipated if nonequilibrium steady states are to be obtained. Historically, further terms were introduced, collectively called a thermostat, whose original application was to generate constant-temperature equilibrium ensembles. This review surveys kinetic models coupled with time-reversible deterministic thermostats for the modeling of large systems composed of both inert matter particles and living entities. The introduction of deterministic thermostats makes it possible to model the onset of the nonequilibrium stationary states that are typical of most real-world complex systems. The first part of the paper presents the main physical and mathematical definitions and tools: nonequilibrium phenomena, the Gauss least-constraint principle and Gaussian thermostats. The second part reviews a variety of thermostatted mathematical models in physics and the life sciences, including the Kac, Boltzmann and Jager-Segel models and the thermostatted (continuous and discrete) kinetic theory for active particles. Applications include semiconductor devices, nanosciences, biological phenomena, vehicular traffic, social and economic systems, and crowd and swarm dynamics. Copyright © 2012 Elsevier B.V. All rights reserved.
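The Gaussian (isokinetic) thermostat mentioned in this abstract admits a compact numerical illustration: Gauss's principle of least constraint fixes the multiplier alpha = F·v / (m v·v) so that the kinetic energy is held constant along the trajectory. A minimal sketch follows; the harmonic force is an arbitrary stand-in, not a model from the paper.

```python
import numpy as np

def gaussian_thermostat_step(x, v, force, m=1.0, dt=1e-4):
    """One explicit-Euler step of Gauss-thermostatted dynamics:
    dv/dt = F/m - alpha*v, with alpha = F.v / (m v.v) chosen so the
    kinetic energy is (instantaneously) conserved. Note that the
    resulting acceleration is exactly perpendicular to v."""
    F = force(x)
    alpha = np.dot(F, v) / (m * np.dot(v, v))
    v = v + dt * (F / m - alpha * v)
    x = x + dt * v
    return x, v

force = lambda x: -x  # harmonic restoring force (illustrative only)
x, v = np.array([1.0, 0.0]), np.array([0.3, 0.4])
ke0 = 0.5 * np.dot(v, v)
for _ in range(1000):
    x, v = gaussian_thermostat_step(x, v, force)
ke = 0.5 * np.dot(v, v)
print(ke0, ke)
```

Because v·(dv/dt) vanishes exactly, the kinetic energy drifts only at the O(dt²) level of the Euler discretisation; a time-reversible integrator would be used in production simulations.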
Vargas Corzo, M C; Aguilar Cordero, M J; de Teresa Galván, C; Segura Millán, D; Miranda Leon, M T; Castillo Rueda, G; Guisado Barrilao, R
2014-06-01
Intense physical exercise provokes muscle damage, which in sedentary people can increase cardiovascular risk. Phlebodium decumanum (PD) has shown immunomodulatory effects in models of moderately intense physical activity in well-conditioned groups. The aim was to evaluate the effects of PD during eccentric exercise, used as a muscle-inflammation protocol, in a sedentary population with cardiovascular risk. This is an experimental, double-blind, multigroup randomized study. Experimental Group 1 (n = 17) received PD, 9 doses of 400 mg (total 3.6 g) every 8 hours for 3 days, and Control Group 2 (n = 16) received a placebo. All subjects performed two treadmill ergospirometry tests: first, a modified Bruce protocol to rule out ischemic responses during exercise and to evaluate VO2max before the experimental phase; and second, an eccentric protocol (14% descending ramp) for 10 minutes in steady state at 70-80% VO2max, as the experimental inflammatory protocol. Intra- and inter-group comparisons were made to evaluate pre- to post-test differences in blood markers of muscle damage. The study showed statistically significant pre-post intra-group differences in the muscle damage variables (CK, LDH and myoglobin, but not cardiac troponin) and in the functional lower-limb tests (SJ and CMJ). The inter-group comparison showed less muscle damage and less functional lower-limb deterioration in Group 1 than in the control group, with statistical significance in both cases. Differences in handgrip dynamometry were not statistically significant. The eccentric exercise protocol in this study proved to be a good model for inducing muscle and functional damage in sedentary people. Short PD treatment reduced acute muscle and functional damage compared with the placebo control group in this specific population. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
Graphic analysis and multifractal on percolation-based return interval series
NASA Astrophysics Data System (ADS)
Pei, A. Q.; Wang, J.
2015-05-01
A financial time series model is developed and investigated using the oriented percolation system (one of the statistical physics systems). The nonlinear and statistical behaviors of the return-interval time series are studied for the proposed model and for the real stock market by applying the visibility graph (VG) and multifractal detrended fluctuation analysis (MF-DFA). We investigate the fluctuation behaviors of the model's return intervals for different parameter settings, and comparatively study these fluctuation patterns against those of real financial data for different threshold values. The empirical research in this work exhibits multifractal features for the corresponding financial time series. Further, the VGs derived from both the simulated data and the real data show small-world, hierarchical and highly clustered behavior, with power-law tails in the degree distributions.
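The natural visibility graph used in this abstract maps a time series to a network by connecting two observations whenever the straight line between them passes strictly above all intermediate values. A self-contained sketch on a toy series (the values are illustrative only):

```python
def visibility_edges(y):
    """Natural visibility graph of a time series: nodes i and j are
    connected iff every intermediate point lies strictly below the
    straight line joining (i, y[i]) and (j, y[j])."""
    n = len(y)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

series = [3.0, 1.0, 2.0, 0.5, 4.0]
print(sorted(visibility_edges(series)))
# → [(0, 1), (0, 2), (0, 4), (1, 2), (2, 3), (2, 4), (3, 4)]
```

Degree distributions, clustering and path lengths computed on such graphs are what underpin the small-world and power-law-tail observations reported above; this O(n²) double loop is fine for short series but would be replaced by a divide-and-conquer construction for long ones.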
Stationarity: Wanted dead or alive?
Lins, Harry F.; Cohn, Timothy A.
2011-01-01
Aligning engineering practice with natural process behavior would appear, on its face, to be a prudent and reasonable course of action. However, if we do not understand the long-term characteristics of hydroclimatic processes, how does one find the prudent and reasonable course needed for water management? We consider this question in light of three aspects of existing and unresolved issues affecting hydroclimatic variability and statistical inference: Hurst-Kolmogorov phenomena; the complications long-term persistence introduces with respect to statistical understanding; and the dependence of process understanding on arbitrary sampling choices. These problems are not easily addressed. In such circumstances, humility may be more important than physics; a simple model with well-understood flaws may be preferable to a sophisticated model whose correspondence to reality is uncertain.
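Long-term (Hurst-Kolmogorov) persistence of the kind discussed in this essay is classically quantified with rescaled-range analysis, where the slope of log(R/S) against log(window size) estimates the Hurst exponent H. A rough sketch follows; window handling and bias corrections are deliberately simplified.

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    for each window size, average R/S over non-overlapping windows,
    then regress log(R/S) on log(window size)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())  # cumulative deviations
            r = dev.max() - dev.min()              # range
            s = chunk.std()                        # scale
            if s > 0:
                rs.append(r / s)
        sizes.append(size)
        rs_vals.append(np.mean(rs))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope

rng = np.random.default_rng(42)
h = hurst_rs(rng.normal(size=4096))  # uncorrelated noise: H near 0.5
print(h)
```

For uncorrelated noise H is near 0.5, while Hurst-Kolmogorov behavior gives H > 0.5; R/S estimates are also known to be biased in small samples, which is exactly the kind of inferential fragility the essay points to.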
Physical Activity, Gender Difference, and Depressive Symptoms.
Zhang, Jun; Yen, Steven T
2015-10-01
To investigate the roles of physical activity (exercise) and sociodemographic factors in depressive symptoms among men and women in the United States. 2011 U.S. Behavioral Risk Factor Surveillance System (BRFSS). Patient Health Questionnaire Depression Scale (PHQ-8) scores are aggregated and divided into five categories. An ordered switching probability model with binary endogenous physical activity is developed to accommodate ordinality of depression categories and ameliorate statistical biases due to endogeneity of physical activity. Average treatment effects suggest physical activity ameliorates depressive symptoms among mildly and moderately depressed individuals, most notably among mildly depressed women. Gender differences exist in the roles of sociodemographic factors, with age, income, race, education, employment status, and recent mental health condition playing differentiated roles in affecting depressive symptoms. Regular physical activity reduces depressive symptoms among both men and women with mild to moderate depression, notably among women. © Health Research and Educational Trust.
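The five PHQ-8 categories referred to in this abstract are conventionally defined by the cutpoints 0-4, 5-9, 10-14, 15-19 and 20-24; a small helper makes the ordinal coding explicit. The labels below follow the standard severity names, though the paper's exact category labels are not given in the abstract.

```python
def phq8_category(score):
    """Map a PHQ-8 total score (0-24) to one of the five standard
    severity categories used to ordinalise depressive symptoms."""
    if not 0 <= score <= 24:
        raise ValueError("PHQ-8 totals range from 0 to 24")
    for upper, label in [(4, "none/minimal"), (9, "mild"),
                         (14, "moderate"), (19, "moderately severe"),
                         (24, "severe")]:
        if score <= upper:
            return label

print(phq8_category(3), phq8_category(7), phq8_category(12))
# → none/minimal mild moderate
```

An ordered-response model such as the paper's switching probability model treats these labels as ranked outcomes rather than as a continuous score.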
NASA Astrophysics Data System (ADS)
Hockicko, Peter; Krišťák, Ľuboš; Němec, Miroslav
2015-03-01
Video analysis using the program Tracker (Open Source Physics) introduces a new creative method of teaching physics into the educational process and makes the natural sciences more interesting for students. This way of exploring the laws of nature can amaze students, because this illustrative and interactive educational software inspires them to think creatively, improves their performance and helps them in studying physics. This paper deals with increasing key competencies in engineering by analysing videos of real-life situations - physical problems - using the video analysis and modelling tools of the program Tracker, together with simulations of physical phenomena from the Physics Education Technology (PhET™) Project (the VAS method of problem tasks). Statistical testing using the t-test confirmed the significance of the differences in knowledge between the experimental and control groups, differences that resulted from applying the interactive method.
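The group comparison described in this abstract rests on a two-sample t-test. A dependency-free sketch of Welch's variant (which does not assume equal group variances) is given below; the score data are invented for illustration, not taken from the study.

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic and its approximate degrees of
    freedom (Welch-Satterthwaite), for comparing two group means."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

experimental = [82, 78, 90, 85, 88, 79, 84, 91]  # hypothetical test scores
control = [70, 75, 68, 72, 74, 69, 77, 71]
t, df = welch_t(experimental, control)
print(round(t, 2), round(df, 1))
```

The t statistic would then be compared against the t distribution with the computed degrees of freedom (e.g. via scipy.stats) to obtain a p-value.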